Systemic Networks, Relational Networks, and Neural Networks. Sydney Lamb. Part II: Guangzhou, 2010 November 3, Sun Yat-sen University.


1 Systemic Networks, Relational Networks, and Neural Networks. Sydney Lamb. Part II: Guangzhou, 2010 November 3, Sun Yat-sen University

2 Topics in this presentation  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

3 Topics  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

4 Aims of SFL  SFG aims (primarily) to describe the network of choices available in a language for expressing meanings  “SFL differs from Firth, and also from Lamb, in that priority is given to the system” (Halliday, 2009: 64)  “The organizing concept of a systemic grammar is that of choice (that is, options in ‘meaning potential’…)” (Halliday 1994/2003: 434)

5 Aims of Neurocognitive linguistics (“NCL”)  NCL aims to describe the linguistic system of a language user As a dynamic system It operates Speaking, comprehending, learning, etc. It changes as it operates  Evidence that can be used Texts Findings of SFL Slips of “tongue” and mind Unintentional puns Etc.

6 NCL seeks to learn…  How information is represented in the linguistic system  How the system operates in speaking and understanding  How the linguistic system is connected to other knowledge  How the system is learned  How the system is implemented in the brain

7 The linguistic system of a language user: Two viewing platforms  Cognitive level: the cognitive system of the language user without considering its physical basis The cognitive (linguistic) system Field of study: “cognitive linguistics”  Neurocognitive level: the physical basis Neurological structures Field of study: “neurocognitive linguistics”

8 Topics  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

9 “Cognitive Linguistics”  First occurrence in print: “[The] branch of linguistic inquiry which aims at characterizing the speaker’s internal information system that makes it possible for him to speak his language and to understand sentences received from others.” (Lamb 1971)

10 Operational Plausibility  To understand how language operates, we need to have the linguistic information represented in such a way that it can be used for speaking and understanding  (A “competence model” that is not competence to perform is unrealistic)

11 Relational network notation  Thinking in cognitive linguistics was facilitated by relational network notation  Developed under the influence of the notation used by Halliday for systemic networks  Earlier steps leading to relational network notation appear in papers written in 1963

12 More on the early days  In the 1960s the linguistic system was viewed (by Hockett and Gleason and me and others) as containing items (of unspecified nature) together with their interrelationships Cf. Hockett’s “Linguistic units and their relations” (Language, 1966)  Early primitive notations showed units with connecting lines to related units

13 The next step: Nodes  The next step was to introduce nodes to go along with such connecting lines  Allowed the formation of networks – systems consisting of nodes and their interconnecting lines  Halliday’s notation (which I first saw in 1964) used different nodes for paradigmatic (‘or’) and syntagmatic (‘and’) relationships Just what I was looking for

14 From systemic networks to relational networks Three notational adaptations  Rotate 90 degrees, so that upwards would be toward meaning (at the theoretical top) and downwards would be toward phonetics (at the theoretical bottom)  Replace the brace for ‘and’ with a (more node-like appearing) triangle;  Retaining the bracket for ‘or’, allow the connecting lines to connect at a point

15 The downward OR  [diagram: ab above the node; a and b below]

16 The downward AND  [diagram: ab above the node; a and b below]

17 The 90° Rotation: Upward and Downward  Expression (phonetic or graphic) is at the bottom  Therefore, downward is toward expression  Upward is toward meaning (or other function) – more abstract  [diagram: meaning at the top of the network, expression at the bottom]

18 Orientation of Nodes  Downward AND and OR nodes: Branching on the expression side  Multiple branches to(ward) expression  Upward AND and OR nodes: Branching on the content side  Multiple branches to(ward) content

19 Downward and upward branching  [diagram: branching to a and b]

20 The meaning of up/down: Neurological interpretation  At the bottom are the interfaces to the world outside the brain: Sense organs on the input side Muscles on the output side  ‘Up’ is more abstract

21 The ordered AND  We need to distinguish simultaneous from sequential  For sequential, the ‘ordered AND’  Its two (or more) lines connect to different points at the bottom of the triangle (in the case of the ‘downward and’) to represent sequential activation  leading to sequential occurrence of items  [diagram: first a, then b]
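The sequencing behavior of the downward ordered AND can be sketched in code. This is a toy model, not part of the original notation; the function name and the time-step representation are invented for illustration:

```python
# Toy model (not Lamb's notation): a downward ordered AND emits activation
# on its lower lines at successive time steps when activated from above.
def downward_ordered_and(lower_lines):
    """Return an activation function: one pulse from above yields
    (time, line) pairs, modeling sequential occurrence of items."""
    def activate(start_time=0):
        return [(start_time + i, line) for i, line in enumerate(lower_lines)]
    return activate

node = downward_ordered_and(["a", "b"])
print(node())   # [(0, 'a'), (1, 'b')] : first a, then b
```

The list order stands in for temporal order: the same single pulse from above is realized as a sequence below.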

22 The downward ordered or  For the ‘or’ relation, we don’t have sequence since only one of the two (or more) lines is activated  But an ordering feature for this node is useful to indicate precedence So we have precedence ordering.  The line connecting to the left takes precedence If conditions allow for its activation to be realized, it will be chosen in preference to the other line

23 The downward ordered or (original notation)  [diagram: a (marked choice), b (unmarked choice, a.k.a. default)]  The marked choice takes precedence: it is chosen if the conditions that constitute the marking are present

24 The downward ordered or (revised notation)  [diagram: a (marked choice), b (unmarked choice, a.k.a. default)]  The unmarked choice is the one that goes right through. The marked choice is off to the side – either side

25 The downward ordered or (revised notation)  [diagram: a (unmarked choice, a.k.a. default), b (marked choice)]  The unmarked choice is the one that goes right through. The marked choice is off to the side – either side

26 Sometimes the unmarked choice has zero realization  [diagram: b (marked choice); unmarked choice is zero]  The unmarked choice is nothing. In other words, the marked choice is optional.
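The ordered OR's precedence behavior can be sketched as a toy function (again an invented model, with a hypothetical plural-marking example): the marked line is chosen when its conditions are present; otherwise activation falls through to the unmarked default, which may be zero:

```python
# Toy model of the downward ordered OR: the marked choice takes precedence
# when its conditions are present; otherwise the unmarked default is taken.
# The plural-marking example is hypothetical, purely for illustration.
def downward_ordered_or(marked, unmarked=None):
    """marked is a (condition, line) pair; unmarked is a line,
    or None to model zero realization."""
    condition, marked_line = marked
    def activate(context):
        if condition(context):
            return marked_line   # marked choice: conditions are present
        return unmarked          # default; None means nothing is realized
    return activate

node = downward_ordered_or((lambda ctx: ctx.get("plural", False), "-s"))
print(node({"plural": True}))   # -s: marked choice chosen
print(node({}))                 # None: unmarked choice has zero realization
```

When the unmarked line returns None, the marked element is in effect optional, matching the slide's point.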

27 Operational Plausibility  To understand how language operates, we need to have the information represented in such a way that it can be directly used for speaking and understanding  Competence as competence to perform  The information in a person’s mind is “knowing how” – not “knowing that”  Information in operational form Able to operate without manipulation from some added “performance” system

28 Relational networks: Cognitive systems that operate  Language users are able to use their languages  Such operation takes the form of activation of lines and nodes  The nodes can be defined on the basis of how they treat incoming activation

29 Nodes are defined in terms of activation: The AND  Downward ordered AND connecting k (above) to a and b (below)  Downward activation from k goes to a and later to b  Upward activation from a and later from b goes to k

30 Nodes are defined in terms of activation  Downward unordered OR  The OR condition is not achieved locally – at the node itself – it is just a node, has no intelligence. Usually there will be activation coming down from either p or q but not from both  [diagram: k above a and b, with p and q feeding k]

31 Nodes are defined in terms of activation: The OR  Downward unordered OR connecting k (above) to a and b (below)  Upward activation from either a or b goes to k  Downward activation from k goes to a and [sic] b

32 Nodes are defined in terms of activation  Downward unordered OR  The OR condition is not achieved locally – at the node itself – it is just a node, has no intelligence. Usually there will be activation coming down from either p or q but not from both  [diagram: k above a and b, with p and q feeding k]

33 The Ordered AND: Upward Activation Activation moving upward from below

34 The Ordered AND: Downward Activation Activation coming downward from above

35 Downward Activation  [diagram: AND and OR nodes, upward and downward directions]

36 Upward Activation  [diagram: AND and OR nodes, upward and downward directions]

37 Upward activation through the or  The or operates as either-or for activation going from the singular side to the plural side. For activation from the plural side to the singular side it acts locally as both-and, but in the context of other nodes the end result is usually either-or

38 Upward activation through the or bill BILL 1 BILL 2 Usually the context allows only one interpretation, as in I’ll send you a bill for it

39 Upward activation through the or  bill BILL 1 BILL 2  But if the context allows both to get through, we have a pun: A duck goes into a pub, orders a drink, and says, “Put it on my bill.”
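This disambiguation can be sketched as a toy filter (the names BILL1 and BILL2 follow the slides; the context predicate is an invented stand-in for the rest of the network):

```python
# Toy sketch: upward activation from 'bill' reaches both BILL1 (invoice)
# and BILL2 (beak); context nodes normally let only one survive.
LEXICAL_OR = {"bill": {"BILL1", "BILL2"}}

def interpret(word, context_allows):
    """Return the readings of a word that the context lets through."""
    candidates = LEXICAL_OR.get(word, set())
    return {c for c in candidates if context_allows(c)}

# "I'll send you a bill for it": context supports only the invoice reading.
print(interpret("bill", lambda c: c == "BILL1"))   # {'BILL1'}
# The duck joke: context supports both readings, so both get through: a pun.
print(sorted(interpret("bill", lambda c: True)))   # ['BILL1', 'BILL2']
```

Locally the node passes both readings upward (both-and); the either-or effect emerges only from the surrounding context, as the slides say.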

40 Zhong Guo: Shadow Meaning  [diagram: zhong guo → CHINA, with shadow meanings CENTRAL and KINGDOM]

41 The ordered OR: How does it work?  [diagram: ordered line, taken if possible; default line]  Node-internal structure (not shown in abstract notation) is required to control this operation

42 Topics  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

43 A purely relational network  After making these adaptations to systemic network notation, resulting in relational network notation (abstract form), it became apparent (one afternoon in the fall of 1964) that relational networks need not contain any items at all  The entire structure could be represented in the nodes and their interconnecting lines

44 Morpheme as item and its phonemic representation boy b - o - y Symbols? Objects?

45 Relationship of boy to its phonemes boy As a morpheme, it is just one unit Three phonemes, in sequence b o y

46 The nature of this “morphemic unit” BOY Noun b o y The object we are considering

47 The morpheme as purely relational BOY Noun b o y We can remove the symbol with no loss of information. Therefore, it is a connection, not an object boy

48 Another way of looking at it BOY Noun b o y

49 Another way of looking at it BOY Noun b o y

50 A closer look at the segments  The phonological segments also are just locations in the network – not objects  [diagram: b (cf. Bob), o, y (cf. toy), with phonological features below]

51 boy as label (not part of the structure) BOY Noun b o y boy Just a label – to make the diagram easier to read

52 Objection I  If there are no symbols, how does the system distinguish this morpheme from others?  Answer: Other morphemes necessarily have different connections  Another node with the same connections would be another (redundant) representation of the same morpheme

53 Objection II  If there are no symbols, how does the system know which morpheme it is?  Answer: If there were symbols, what would read them? Miniature eyes inside the brain?
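One way to make the "no symbols needed" point concrete is a toy network in which a node is nothing but its set of connections. This is an invented sketch; the keys n1 and n2 are mere diagram labels, like the label boy on the earlier slide, not part of the structure:

```python
# Toy network with no symbols inside it: each node is identified purely by
# its connections (upward toward meaning, downward toward expression).
# The keys "n1"/"n2" are labels for reading convenience only.
network = {
    "n1": {"up": {"BOY", "Noun"}, "down": ("b", "o", "y")},
    "n2": {"up": {"TOY", "Noun"}, "down": ("t", "o", "y")},
}

def same_morpheme(x, y):
    """Nodes with identical connections would be redundant copies of one
    morpheme; distinct connections mean distinct morphemes."""
    return network[x] == network[y]

print(same_morpheme("n1", "n2"))   # False: different connections
```

Nothing inside the nodes ever needs to be "read"; identity is carried entirely by what each node connects to, which is the answer the slides give to both objections.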

54 Relations all the way  Perhaps all of linguistic structure is relational  It’s not relationships among linguistic items; it is relations to other relations to other relations, all the way to the top – at one end – and to the bottom – at the other  In that case the linguistic system is a network of interconnected nodes

55 Objects in the mind? When the relationships are fully identified, the objects as such disappear, as they have no existence apart from those relationships “The postulation of objects as some- thing different from the terms of relationships is a superfluous axiom and consequently a metaphysical hypothesis from which linguistic science will have to be freed.” Louis Hjelmslev (1943/61)

56 Compare SF Networks – nodes and lines, plus symbols  SF networks have and and or nodes  They also have symbols for linguistic items  E.g., polarity, positive, negative  And symbols for relationships/operations:
Symbol  Meaning         Example
+       insertion       + x
/       conflation      X / Y
·       expansion       X (P · Q)
^       ordering        X ^ Z
:       preselection    : w
::      classification  :: z
=       lexification    = t

57 Syntax is also purely relational: Example: The Actor-Goal Construction CLAUSE DO-SMTHG Vt Nom Material process (type 2) Syntactic function Semantic function Variable expression

58 Syntax is also purely relational: Linked constructions CL Nom DO--SMTHG Vt Nom Material process (type 2) TOPIC-COMMENT

59 Add another type of process CL DO-TO-SMTHG THING-DESCR BE-SMTHG be Nom Vt Adj Loc

60 More of the English Clause DO-TO-SMTHG BE-SMTHG be Vt Vi to -ing CL Subj Pred Conc Past Mod Predicator FINITE

61 The system of THEME: System network for THEME SELECTION (Halliday 2004: 80)

62 THEME SELECTION  [system network: predicator theme (unmarked in imperative), adjunct theme, wh- theme (unmarked in wh-interrogative and exclamative), subject theme (unmarked in declarative and yes/no interrogative); non-wh-theme vs. other]  Direct translation of Halliday’s system network

63 Theme selection in operation  This direct translation seems not to represent the way theme selection works in the cognitive system of the person forming a clause  Rather, whatever will be the theme – the specific item, not a high-level category to which it belongs – is active at the start of clause formation  Having been activated, it comes first, as Theme  and the rest of the clause follows, as Rheme

64 (Getting ready to add Theme) BE-SMTHG Vi to -ing CL Subj Pred Conc Past Mod Predicator FINITE

65 Add Theme-Rheme BE-SMTHG Vi to -ing CL Subj Pred Predicator FINITE THEME RHEME Nom DECLARE

66 Yes-No Questions to -ing Pred VP Perf Prog Subj ASK DECLARE Finite

67 Yes-No Questions: Finite as Theme Pred Subj ASK Finite CL THEME RHEME DECLARE Nom

68 Circumstance in the Verb Phrase  [diagram: VP = Verbal Phrase + Circumstance; be, Vt, Vi, Obj]  Examples: They did it  I saw them  He was walking in the garden a couple of days ago while she was away

69 Circumstance as Theme Vi VP Vbl Phrase Circumstance THEME RHEME

70 Conclusion: Relationships all the way to.. How far? What is at the bottom?  Introductory view: it is phonetics  In the system of the speaker, we have relational network structure all the way down to the points at which muscles of the speech-producing mechanism are activated At that interface we leave the purely relational system and send activation to a different kind of physical system  For the hearer, the bottom is the cochlea, which receives activation from the sound waves of the speech hitting the ear

71 What is at the top?  Is there a place up there somewhere that constitutes an interface between a purely relational system and some different kind of structure?  Somehow at the top there must be meaning

72 What are meanings? For example, DOG  [diagram: in the mind, the concept DOG-C with perceptual properties of dogs; in the world outside, all those dogs out there and their properties]

73 How High is Up?  Downward is toward expression  Upward is toward meaning/function  Does it keep going up forever?  No — as it keeps going it arches over, through perception  Conceptual structure is at the top

74 The great cognitive arch The “Top”

75 Topics  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

76 Systemic Networks vis-à-vis Relational Networks: How related?  They operate at different levels of precision  Compare chemistry and physics Chemistry for molecules Physics for atoms  Both are valuable for their purposes

77 Different levels of investigation: Living Beings  Systems Biology  Cellular Biology  Molecular Biology  Chemistry  Physics

78 Levels of Precision  Advantages of description at a level of greater precision: Greater precision Shows relationships to other areas  Disadvantages of description at a level of greater precision: More difficult to accomplish  Therefore, can’t cover as much ground More difficult for consumer to grasp  Too many trees, not enough forest

79 Three Levels of precision for language  Systemic networks  Abstract relational network notation  Narrow relational network notation (coming up)

80 Topics  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

81 Narrow relational network notation  Developed later  Used for representing network structures in greater detail: the internal structures of the lines and nodes of the abstract notation  The original notation can be called the ‘abstract’ notation or the ‘compact’ notation

82 Toward Greater Precision The nodes evidently have internal structures Otherwise, how to account for their behavior? We can analyze them, figure out what internal structure would make them behave as they do

83 The Ordered AND: How does it know? Activation coming downward from above  How does the AND node “know” how long to wait before sending activation down the second line?  It must have internal structure to govern this function  We use the narrow notation to model the internal structure

84 Internal Structure – Narrow Network Notation As each line is bidirectional, it can be analyzed into a pair of one-way lines Likewise, the simple nodes can be analyzed as pairs of one-way nodes

85 Abstract and narrow notation  Abstract notation – also known as compact notation  The two notations are like different scales for making a map  Narrow notation shows greater detail and greater precision  Narrow notation ought to be closer to the actual neural structures 

86 Narrow and abstract network notation Narrow notation  Closer to neurological structure  Nodes represent cortical columns  Links represent neural fibers (or bundles of fibers)  Uni-directional Abstract notation  Nodes show type of relationship ( OR, AND )  Easier for representing linguistic relationships  Bidirectional  Not as close to neurological structure eat apple

87 More on the two network notations  The lines and nodes of the abstract notation represent abbreviations – hence the designation ‘abstract’  Compare the representation of a divided highway on a highway map In a more compact notation it is shown as a single line In a narrow notation it is shown as two parallel lines of opposite direction

88 Two different network notations  [diagram: abstract notation (a single bidirectional line between a and b) vs. narrow notation (a pair of one-way lines, upward and downward)]

89 Downward Nodes: Internal Structure AND OR 2 1

90 Upward Nodes: Internal Structure AND OR 2 1

91 Downward and, upward direction W 2 The ‘Wait’ Element

92 AND vs. OR In one direction their internal structures are the same In the other, it is a difference in threshold – hi or lo threshold for hi or lo degree of activation required to cross

93 Thresholds in Narrow Notation  [diagram: nodes with thresholds 1–4]  You no longer need a basic distinction AND vs. OR  You can have intermediate degrees, between AND and OR  The AND/OR distinction was a simplification anyway – doesn’t always work!
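A minimal sketch of the threshold idea (invented for illustration; real thresholds are graded, as a later slide notes): a node sums its incoming activation and fires when the sum reaches its threshold, so AND and OR fall out as mere threshold settings:

```python
# Toy threshold node: it sums incoming activation and fires when the sum
# reaches its threshold, so AND vs. OR is no longer a basic distinction.
def node(threshold):
    def fires(inputs):
        return sum(inputs) >= threshold
    return fires

or_like  = node(1)   # low threshold: any single input suffices
and_like = node(2)   # high threshold: both inputs required

print(or_like([1, 0]))        # True
print(and_like([1, 0]))       # False
print(and_like([1, 1]))       # True
# Intermediate settings lie between AND and OR, e.g. a 2-of-3 node:
print(node(2)([1, 1, 0]))     # True
```

The 2-of-3 case illustrates the slide's point that intermediate degrees, impossible in the plain AND/OR notation, come for free once thresholds are primitive.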

94 The ‘Wait’ Element  Keeps the activation alive  Activation continues to B after A has been activated  (Downward AND, downward direction)  [diagram: w between A and B]

95 Structure of the ‘Wait’ Element W 1 2
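The wait element's latching behavior might be sketched as follows (an invented toy, not a claim about the actual internal wiring):

```python
# Toy latch for the wait element: incoming activation is held alive until
# A has fired, and only then is it passed on to B.
def make_wait():
    state = {"held": False}
    def hold():
        state["held"] = True            # latch the incoming activation
    def release(a_has_fired):
        # activation continues to B only once A has occurred
        return state["held"] and a_has_fired
    return hold, release

hold, release = make_wait()
hold()
print(release(a_has_fired=False))   # False: still waiting for A
print(release(a_has_fired=True))    # True: now activation reaches B
```

The point of the latch is the one the slides raise: the node must "know" how long to wait, so some internal state is needed beyond a bare junction.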

96 Node Types in Narrow Notation T Junction Branching Blocking

97 Two Types of Connection Excitatory Inhibitory Type 1 Type 2

98 Types of inhibitory connection  Type 1 – connects to a node  Type 2 – connects to a line Used for blocking default realization For example, from the node for second there is a blocking connection to the line leading to two

99 Type 2 – connects to a line  [diagram: TWO and ORDINAL; the node for ‘second’ blocks the line to ‘two’, preventing the default ‘two’ + ‘-th’]
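A rough sketch of the blocking example (the realize function and its boolean inputs are invented; the suppletive second and the default -th follow the slide):

```python
# Toy sketch of Type-2 (line-blocking) inhibition. When TWO and ORDINAL are
# both active, the 'second' node fires and blocks the line realizing 'two',
# so the default 'two' + '-th' never surfaces.
def realize(two_active, ordinal_active):
    outputs = []
    second_fires = two_active and ordinal_active
    if second_fires:
        outputs.append("second")   # suppletive (marked) realization
    if two_active and not second_fires:
        outputs.append("two")      # default line, blocked when 'second' fires
    if ordinal_active and not second_fires:
        outputs.append("-th")      # default ordinal suffix (as in 'four-th')
    return outputs

print(realize(True, False))   # ['two']
print(realize(True, True))    # ['second']: the default realization is blocked
```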

100 Additional details of structure can be shown in narrow notation  Connections between upward and downward directions  Varying degrees of connection strength  Variation in threshold strength  Contrast

101 The two Directions 1 2 w w

102 The Two Directions w w Two Questions: 1. Are they really next to each other? 2. How do they “communicate” with each other? 1 2

103 Separate but in touch w w 1 2 Down Up In phonology, we know from aphasiology and neuroscience that they are in different parts of the cerebral cortex

104 Phonological nodes in the cortex w w 1 2 Arcuate fasciculus Frontal lobe Temporal lobe

105 Topics  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

106 Another level of precision  Systemic networks  Abstract relational network notation  Narrow relational network notation  Cortical columns and neural fibers  Neurons, axons, dendrites, neurotransmitters

107 Narrow RN notation as a set of hypotheses  Question: Are relational networks related in any way to neural networks?  We can find out  Narrow RN notation can be viewed as a set of hypotheses about brain structure and function Every property of narrow RN notation can be tested for neurological plausibility

108 Some properties of narrow RN notation, paired with their neural counterparts  Lines have direction (they are one-way) – nerve fibers carry activation in just one direction  Lines tend to come in pairs of opposite direction (“upward” and “downward”) – cortico-cortical connections are generally reciprocal  Connections are either excitatory or inhibitory – in the brain, excitatory and inhibitory connections come from different types of neurons, with two different neurotransmitters

109 More properties as hypotheses  Nodes have differing thresholds of activation – neurons have different thresholds of activation  Inhibitory connections are of two kinds – so are neural inhibitory connections (Type 2: “axo-axonal”)  Additional properties (too technical for this presentation)  All are verified  [diagram: Type 1, Type 2]

110 The node of narrow RN notation vis-à-vis neural structures  The node corresponds not to a single neuron but to a bundle of neurons  The cortical column  A column consists of neurons stacked on top of one another  All neurons within a column act together When a column is activated, all of its neurons are activated

111 The node as a cortical column  The properties of the cortical column are approximately those described by Vernon Mountcastle “[T]he effective unit of operation…is not the single neuron and its axon, but bundles or groups of cells and their axons with similar functional properties and anatomical connections.” Vernon Mountcastle, Perceptual Neuroscience (1998), p. 192

112 Three views of the gray matter Different stains show different features Nissl stain shows cell bodies of pyramidal neurons

113 The Cerebral Cortex  Grey matter Columns of neurons  White matter Inter-column connections

114 Microelectrode penetrations in the paw area of a cat’s cortex

115 The (Mini)Column  Width is about (or just larger than) the diameter of a single pyramidal cell About 30–50 µm in diameter  Extends through the six cortical layers Three to six mm in length The entire thickness of the cortex is accounted for by the columns  Roughly cylindrical in shape  If expanded by a factor of 100, the dimensions would correspond to a tube with a diameter of 1/8 inch and a length of one foot

116 Cortical column structure  Minicolumn: about 30–50 microns in diameter  Recurrent axon collaterals of pyramidal neurons activate other neurons in the same column  Inhibitory neurons can inhibit neurons of neighboring columns Function: contrast  Excitatory connections can activate neighboring columns In this case we get a bundle of contiguous columns acting as a unit

117 Levels of precision  Systemic networks  Abstract relational network notation  Narrow relational network notation  Cortical columns and neural fibers  Neurons, axons, dendrites, neurotransmitters  Intraneural structures Pre-/post-synaptic terminals Microtubules Ion channels Etc.

118 Levels of precision  Informal functional descriptions  Semi-formal functional descriptions  Systemic networks  Abstract relational network notation  Narrow relational network notation  Cortical columns and neural fibers  Neurons, axons, dendrites  Intraneural structures and processes

119 Topics  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

120 Competition vis-à-vis Halliday’s systems Halliday (not an exact quote): Putting the emphasis on systems gives recognition to the importance of Saussure's principle that everything meaningful has meaning in contrast to what could have been selected instead

121 Paradigmatic contrast: Competition  For example, /p/ vs. /k/  [diagram: mutual inhibition between a and b, thresholds 2]

122 Simplified model of minicolumn II: Inhibition of competitors  [diagram: cortical layers II–VI, with inputs from the thalamus and other cortical locations and outputs to cells in neighboring columns; cell types: pyramidal, spiny stellate, inhibitory]

123 Local and distal connections excitatory inhibitory

124 Paradigmatic contrast: Competition  [diagram: mutual inhibition between a and b]

125 Paradigmatic contrast: Competition  [diagram: mutual inhibition between a and b, thresholds 2]
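The competition shown here can be sketched as one step of mutual inhibition among candidates (an invented toy; the inhibition weight 0.6 is an arbitrary illustrative value):

```python
# Toy mutual-inhibition step: each candidate's activation is reduced in
# proportion to its competitors'; survivors are those left above zero.
def compete(activations, inhibition=0.6):
    results = {}
    for name, act in activations.items():
        rivals = sum(a for other, a in activations.items() if other != name)
        results[name] = round(act - inhibition * rivals, 3)
    return {n: a for n, a in results.items() if a > 0}

# /p/ and /k/ compete; the more strongly activated phoneme suppresses the other.
print(compete({"p": 1.0, "k": 0.4}))   # {'p': 0.76}: /k/ is inhibited away
```

This is the network-level reading of Saussure's principle quoted on the adjacent slides: a choice is meaningful by contrast with what it suppressed.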

126 Competition vis-à-vis Halliday’s systems Halliday (not an exact quote): Putting the emphasis on systems gives recognition to the importance of Saussure's principle that everything meaningful has meaning in contrast to what could have been selected instead

127 Topics  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

128 Precision vis-à-vis variability  Description at a level of greater precision encourages observation of variability  At the level of the forest, we are aware of the trees, but we tend to overlook the differences among them  At the level of the trees we clearly see the differences among them  But describing the forest at the level of detail used in describing trees would be very cumbersome  At the level of the trees we tend to overlook the differences among the leaves  At the level of the leaves we tend to overlook the differences among their component cells

129 Linguistic examples  At the cognitive level we clearly see that every person’s linguistic system is different from that of everyone else  We also see variation within the single person’s system from day to day  At the level of narrow notation we can treat Variation in connection strengths Variation in threshold strength Variation in levels of activation  We are thus able to explain prototypicality phenomena learning etc.

130 Variation in Connection Strength  Connections get stronger with use Every time the linguistic system is used, it changes  Can be indicated roughly by Thickness of connecting lines in diagrams or by Little numbers written next to lines
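A toy model of use-based strengthening (the increment and cap values are invented for illustration):

```python
# Toy model: every transmission strengthens the connection a little, up to
# a ceiling, so the system changes each time it is used.
class Connection:
    def __init__(self, strength=0.1, increment=0.05, cap=1.0):
        self.strength = strength
        self.increment = increment
        self.cap = cap

    def transmit(self, activation):
        out = activation * self.strength
        # the act of using the connection changes it: learning
        self.strength = min(self.cap, self.strength + self.increment)
        return out

c = Connection()
first = c.transmit(1.0)          # weak response at first
for _ in range(10):
    c.transmit(1.0)              # repeated use
later = c.transmit(1.0)          # stronger response after practice
print(first < later)             # True
```

The numeric strength plays the role of the line thickness or the "little numbers written next to lines" in the diagrams.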

131 Variation in threshold strength  Thresholds are not fixed They vary as a result of use – learning  Nor are they integral  What we really have are threshold functions, such that A weak amount of incoming activation produces no response A larger degree of activation results in weak outgoing activation A still higher degree of activation yields strong outgoing activation S-shaped (“sigmoid”) function N.B. All of these properties are found in neural structures

132 Threshold function  [diagram: outgoing activation as an S-shaped function of incoming activation]
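The S-shaped threshold function described on the previous slide can be written as a standard sigmoid (the midpoint and steepness parameters are illustrative choices, not values from the slides):

```python
import math

# S-shaped ("sigmoid") threshold function: weak incoming activation yields
# almost nothing, activation near the threshold yields a weak response,
# and strong activation saturates.
def threshold_fn(x, midpoint=1.0, steepness=6.0):
    """Map incoming activation x to outgoing activation in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-steepness * (x - midpoint)))

print(round(threshold_fn(0.2), 3))   # near 0: weak input, no response
print(round(threshold_fn(1.0), 3))   # 0.5: the threshold region
print(round(threshold_fn(2.0), 3))   # near 1: strong outgoing activation
```

Lowering the midpoint or raising the steepness models the threshold variation that the slide attributes to learning.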

133 Topics in this presentation  Aims of SFL and NCL  From systemic networks to relational networks  Relational networks as purely relational  Levels of precision in description  Narrow relational network notation  Narrow relational networks and neural networks  Enhanced understanding of systemic-functional choice  Enhanced appreciation of variability in language

134 Thank you for your attention!

135 References
Halliday, M.A.K. 1994/2003. Appendix: Systemic theory. In Jonathan Webster (ed.), On Language and Linguistics (Collected Works of M.A.K. Halliday, vol. 3). London: Continuum.
Halliday, M.A.K. 2009. Methods – techniques – problems. In M.A.K. Halliday & Jonathan Webster (eds.), Continuum Companion to Systemic Functional Linguistics. London: Continuum.
Hockett, Charles F. 1966. Linguistic units and their relations. Language.
Lamb, Sydney M. 1971. The crooked path of progress in cognitive linguistics. Georgetown University Round Table on Languages and Linguistics.
Lamb, Sydney M. 1999. Pathways of the Brain: The Neurocognitive Basis of Language. Amsterdam: John Benjamins.
Lamb, Sydney M. 2004a. Language as a network of relationships. In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Lamb, Sydney M. 2004b. Learning syntax: a neurocognitive approach. In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Mountcastle, Vernon W. 1998. Perceptual Neuroscience: The Cerebral Cortex. Cambridge, MA: Harvard University Press.

136 For further information..

