
1 Linguistic Structure as a Relational Network Sydney Lamb Rice University National Taiwan University 9 November 2010

2 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

3 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

4 Aims of Neurocognitive Linguistics (“NCL”)  NCL aims to understand the linguistic system of a language user As a dynamic system It operates Speaking, comprehending, learning, etc. It changes as it operates It has a locus The brain

5 NCL seeks to learn.. How information is represented in the linguistic system How the system operates in speaking and understanding How the linguistic system is connected to other knowledge How the system is learned How the system is implemented in the brain

6 The linguistic system of a language user: Two viewing platforms  Cognitive level: the cognitive system of the language user without considering its physical basis The cognitive (linguistic) system Field of study: “cognitive linguistics”  Neurocognitive level: the physical basis Neurological structures Field of study: “neurocognitive linguistics”

7 “Cognitive Linguistics”  First occurrence of the term in print: “[The] branch of linguistic inquiry which aims at characterizing the speaker’s internal information system that makes it possible for him to speak his language and to understand sentences received from others.” (Lamb 1971)

8 Operational Plausibility  To understand how language operates, we need to have the linguistic information represented in such a way that it can be used for speaking and understanding  (A “competence model” that is not competence to perform is unrealistic)

9 Operational Plausibility  To understand how language operates, we need to have the information represented in such a way that it can be directly used for speaking and understanding  Competence as competence to perform  The information in a person’s mind is “knowing how” – not “knowing that”  Information in operational form Able to operate without manipulation from some added “performance” system

10 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

11 Relational network notation  Thinking in cognitive linguistics was facilitated by relational network notation  Developed under the influence of the notation used by M.A.K. Halliday for systemic networks

12 Precursors  In the 1960s the linguistic system was viewed (by Hockett and Gleason and me and others) as containing items (of unspecified nature) together with their interrelationships Cf. Hockett’s “Linguistic units and their relations” (Language, 1966)  Early primitive notations showed units with connecting lines to related units

13 The next step: Nodes  The next step was to introduce nodes to go along with such connecting lines  Allowed the formation of networks – systems consisting of nodes and their interconnecting lines  Halliday’s notation used different nodes for paradigmatic (‘or’) and syntagmatic (‘and’) relationships Just what I was looking for

14 The downward or DIFFICULT hard difficult

15 The downward and a b

16 The ordered AND  We need to distinguish simultaneous from sequential  For sequential, the ‘ordered AND ’  Its two (or more) lines connect to different points at the bottom of the triangle (in the case of the ‘downward and’) to represent sequential activation  leading to sequential occurrence of items

17 Downward (ordered) AND Vt Nom

18 Upward and Downward  Expression (phonetic or graphic) is at the bottom  Therefore, downward is toward expression  Upward is toward meaning (or other function) – more abstract network meaning expression

19 Neurological interpretation of up/down  At the bottom are the interfaces to the world outside the brain: Sense organs on the input side Muscles on the output side  ‘Up’ is more abstract

20 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

21 Morpheme as item and its phonemic representation boy b - o - y Symbols? Objects?

22 Relationship of boy to its phonemes boy As a morpheme, it is just one unit Three phonemes, in sequence b o y

23 The nature of this “morphemic unit” BOY Noun b o y The object we are considering

24 The morpheme as purely relational BOY Noun b o y We can remove the symbol with no loss of information. Therefore, it is a connection, not an object boy

25 Another way of looking at it BOY Noun b o y

26 Another way of looking at it BOY Noun b o y

27 A closer look at the segments boy b o y Phonological features The phonological segments also are just locations in the network – not objects (Bob) (toy)

28 Relationships of boy BOY Noun b o y boy Just a label – not part of the structure

29 Objection I  If there are no symbols, how does the system distinguish this morpheme from others?  Answer: Other morphemes necessarily have different connections  Another node with the same connections would be another (redundant) representation of the same morpheme

30 Objection II  If there are no symbols, how does the system know which morpheme it is?  Answer: If there were symbols, what would read them? Miniature eyes inside the brain?

31 Relations all the way  Perhaps all of linguistic structure is relational  It’s not relationships among linguistic items; it is relations to other relations to other relations, all the way to the top – at one end – and to the bottom – at the other  In that case the linguistic system is a network of interconnected nodes
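
A minimal sketch of this idea (my own illustration, not Lamb's notation: the dict representation and node labels like "M1" are invented for exposition). The point is that a node stores no symbol; the label "boy" exists only for the reader, and a node's identity is exhausted by its connections.

```python
# A toy relational network in which nodes carry no symbolic content.
# Node names like "M1" are arbitrary handles for the reader; the network
# itself consists only of which nodes connect to which.

network = {
    "M1": {"up": ["BOY-concept", "Noun"], "down": ["b", "o", "y"]},  # the morpheme usually labeled "boy"
    "M2": {"up": ["TOY-concept", "Noun"], "down": ["t", "o", "y"]},  # the morpheme usually labeled "toy"
}

def same_morpheme(n1, n2):
    """Two nodes with identical connections would just be redundant copies
    of one and the same morpheme (cf. Objection I)."""
    return network[n1] == network[n2]

def morphemes_for_phoneme(p):
    """What a lower node 'is' amounts to the set of higher nodes it connects to."""
    return [m for m, links in network.items() if p in links["down"]]

print(same_morpheme("M1", "M2"))    # False: different connections, different morphemes
print(morphemes_for_phoneme("o"))   # ['M1', 'M2']: /o/ participates in both
```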

32 Objects in the mind? When the relationships are fully identified, the objects as such disappear, since they have no existence apart from those relationships

33 The postulation of objects as some- thing different from the terms of relationships is a superfluous axiom and consequently a metaphysical hypothesis from which linguistic science will have to be freed. Louis Hjelmslev (1943/61) Quotation

34 Syntax is also purely relational: Example: The Actor-Goal Construction CLAUSE DO-SMTHG Vt Nom Material process (type 2) Syntactic function Semantic function Variable expression

35 Syntax: Linked constructions CL Nom DO--SMTHG Vt Nom Material process (type 2) TOPIC-COMMENT

36 Add another type of process CL DO-TO-SMTHG THING-DESCR BE-SMTHG be Nom Vt Adj Loc

37 More of the English Clause DO-TO-SMTHG BE-SMTHG be Vt Vi to -ing CL Subj Pred Conc Past Mod Predicator FINITE

38 The downward ordered OR  For the ‘or’ relation, we don’t have sequence since only one of the two (or more) lines is activated  But an ordering feature for this node is useful to indicate precedence So we have precedence ordering.  One line for the marked condition If conditions allow for its activation to be realized, it will be chosen in preference to the other line  The other line is the default

39 The downward ordered or a b marked choice unmarked choice (a.k.a. default ) The unmarked choice is the line that goes right through. The marked choice is off to the side – either side

40 The downward ordered or a b unmarked choice (a.k.a. default ) marked choice The unmarked choice is the one that goes right through. The marked choice is off to the side – either side

41 Optionality Sometimes the unmarked choice is nothing b unmarked choice marked choice In other words, the marked choice is an optional constituent

42 Conclusion: Relationships all the way to.. What is at the bottom?  Introductory view: it is phonetics  In the system of the speaker, we have relational network structure all the way down to the points at which muscles of the speech-producing mechanism are activated At that interface we leave the purely relational system and send activation to a different kind of physical system  For the hearer, the bottom is the cochlea, which receives activation from the sound waves of the speech hitting the ear

43 What is at the top?  Is there a place up there somewhere that constitutes an interface between a purely relational system and some different kind of structure? This question wasn’t actually asked at first It was clear that as long as we are in language we are in a purely relational system, and that is what mattered  Somehow at the top there must be meaning

44 What are meanings? DOG C Perceptual properties of dogs All those dogs out there and their properties In the Mind The World Outside For example, DOG

45 How High is Up?  Downward is toward expression  Upward is toward meaning/function  Does it keep going up forever?  No — as it keeps going it arches over, through perception  Conceptual structure is at the top

46 The great cognitive arch The “Top”

47 Relational networks: Cognitive systems that operate  Language users are able to use their languages.  Such operation takes the form of activation of lines and nodes  The nodes can be defined on the basis of how they treat incoming activation

48 Nodes are defined in terms of activation: The downward ordered AND a b Downward activation from k goes to a and later to b Upward activation from a and later from b goes to k k

49 Nodes are defined in terms of activation a b The OR condition is not achieved locally – at the node itself – it is just a node, has no intelligence. Usually there will be activation coming down from either p or q but not from both Downward unordered OR k p q

50 Nodes are defined in terms of activation: The OR a b Upward activation from either a or b goes to k Downward activation from k goes to a and [sic] b k

51 Nodes are defined in terms of activation a b The OR condition is not achieved locally – at the node itself – it is just a node, has no intelligence. Usually there will be activation coming down from either p or q but not from both Downward unordered OR k p q

52 The Ordered AND: Upward Activation Activation moving upward from below

53 The Ordered AND: Downward Activation Activation coming downward from above

54 Upward activation through the OR The or operates as either-or for activation going from the plural side to the singular side. For activation from the singular side to the plural side it acts locally as both-and, but in the context of other nodes the end result is usually either-or

55 Upward activation through the OR bill BILL 1 BILL 2 Usually the context allows only one interpretation, as in I’ll send you a bill for it

56 Upward activation through the or bill BILL 1 BILL 2 But if the context allows both to get through, we have a pun: A duck goes into a pub and orders a drink and says, “Put it on my bill“.

57 Shadow Meanings: Zhong Guo MIDDLE CHINA KINGDOM zhong guo

58 The ordered OR: How does it work? default Ordered This line taken if possible Node-internal structure (not shown in abstract notation) is required to control this operation

59 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

60 Toward Greater Precision The nodes evidently have internal structures Otherwise, how to account for their behavior? We can analyze them, figure out what internal structure would make them behave as they do

61 The Ordered AND: How does it know? Activation coming downward from above How does the AND node “know” how long to wait before sending activation down the second line?

62 How does it know?  How does the AND node “know” how long to wait before sending activation down the second line?  It must have internal structure to govern this function  We use the narrow notation to model the internal structure

63 Internal Structure – Narrow Network Notation As each line is bidirectional, it can be analyzed into a pair of one-way lines Likewise, the simple nodes can be analyzed as pairs of one-way nodes

64 Abstract and narrow notation  Abstract notation – also known as compact notation  A diagram in abstract notation is like a map drawn to a large scale  Narrow notation shows greater detail and greater precision  Narrow notation ought to be closer to the actual neural structures 

65 Narrow relational network notation  Developed later  Used for representing network structures in greater detail internal structures of the lines and nodes of the abstract notation  The original notation can be called the ‘abstract’ notation or the ‘compact’ notation

66 Narrow and abstract network notation Narrow notation  Closer to neurological structure  Nodes represent cortical columns  Links represent neural fibers (or bundles of fibers)  Uni-directional Abstract notation  Nodes show type of relationship ( OR, AND )  Easier for representing linguistic relationships  Bidirectional  Not as close to neurological structure eat apple

67 More on the two network notations  The lines and nodes of the abstract notation represent abbreviations – hence the designation ‘abstract’  Compare the representation of a divided highway on a highway map In a more compact notation it is shown as a single line In a narrow notation it is shown as two parallel lines of opposite direction

68 Two different network notations Narrow notation: unidirectional, with separate upward and downward lines Abstract notation: a single bidirectional line

69 Downward Nodes: Internal Structure AND OR 2 1

70 Upward Nodes: Internal Structure AND OR 2 1

71 Downward AND, upward direction W 2 The ‘Wait’ Element

72 AND vs. OR In one direction their internal structures are the same In the other, it is a difference in threshold – hi or lo threshold for high or low degree of activation required to cross

73 Thresholds in Narrow Notation 1234 OR AND

74 The Beauty of the Threshold 1 – You no longer need a basic distinction AND vs. OR 2 – You can have intermediate degrees, between AND and OR 3 – The AND/OR distinction was a simplification anyway — doesn’t always work!
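
A small sketch of this point (my own illustration, with invented activation values): once each node has a threshold, AND and OR become the two ends of a single scale. With two incoming lines, a threshold near 1 behaves like an OR, a threshold near 2 like an AND, and intermediate thresholds give the in-between cases mentioned on the slide.

```python
def node_output(incoming, threshold):
    """Sum the incoming activation and fire if the threshold is reached.
    incoming: activation strengths on the node's input lines."""
    return sum(incoming) >= threshold

# Two input lines, each carrying 1.0 when active, 0.0 when not.
cases = [[1.0, 1.0], [1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]

for inputs in cases:
    or_like  = node_output(inputs, threshold=1.0)   # fires if either line is active
    and_like = node_output(inputs, threshold=2.0)   # fires only if both lines are active
    mid      = node_output(inputs, threshold=1.5)   # an intermediate degree between AND and OR
    print(inputs, or_like, and_like, mid)
```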

75 The ‘Wait’ Element w Keeps the activation alive AB Activation continues to B after A has been activated Downward AND, downward direction
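
A rough sketch of what the Wait element accomplishes in the downward direction of an ordered AND (a control-flow analogy of my own, not the narrow-notation circuitry itself): activation for the second constituent is held back until the first constituent has been realized.

```python
def ordered_and_downward(realize_a, realize_b):
    """Downward activation through an ordered AND: constituent A is realized
    first; the 'wait' is modeled here simply by not releasing activation to
    B until A has completed."""
    output = []
    output.extend(realize_a())   # activation goes down the first line
    output.extend(realize_b())   # held activation is released to the second line
    return output

print(ordered_and_downward(lambda: ["b", "o"], lambda: ["y"]))   # ['b', 'o', 'y']
```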

76 Structure of the ‘Wait’ Element W 1 2

77 Node Types in Narrow Notation T Junction Branching Blocking

78 Two Types of Connection Excitatory Inhibitory Type 1 Type 2

79 Types of inhibitory connection  Type 1 – connects to a node  Type 2 – connects to a line Used for blocking default realization For example, from the node for second there is a blocking connection to the line leading to two

80 Type 2 – Connects to a line TWO ORDINAL 2 second two -th
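
A hedged sketch of how the Type 2 (line-blocking) inhibitory connection could yield the second vs. two + -th pattern on the slide. The control flow is my own simplification of the ordered OR: the marked line is tried first, and when it is taken it blocks the default realization.

```python
def realize_ordinal_of_two(ordinal_context):
    """Ordered OR below TWO: the marked choice 'second' is preferred when the
    ORDINAL node is active; its blocking (Type 2 inhibitory) connection then
    suppresses the default line leading to 'two'."""
    if ordinal_context:
        return "second"   # marked choice; blocks the default realization
    return "two"          # default (unmarked) realization

def realize_ordinal(base, ordinal_context, irregular={"two": "second", "three": "third"}):
    """Slightly more general version: irregular forms block the regular default base + '-th'."""
    if ordinal_context:
        return irregular.get(base, base + "th")
    return base

print(realize_ordinal_of_two(True))    # second
print(realize_ordinal_of_two(False))   # two
print(realize_ordinal("four", True))   # fourth (default -th realization)
print(realize_ordinal("two", True))    # second (default blocked)
```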

81 Additional details of structure can be shown in narrow notation  Varying degrees of connection strength  Variation in threshold strength  Contrast

82 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

83 The node of narrow RN notation vis-à-vis neural structures  It is very unlikely that a node is represented by a neuron Far more likely: a bundle of neurons  At this point we turn to neuroscience  Vernon Mountcastle, Perceptual Neuroscience (1998) Cortical columns

84 The node of narrow RN notation vis-à-vis neural structures  The cortical column  A column consists of neurons stacked on top of one another  All neurons within a column act together When a column is activated, all of its neurons are activated

85 The node as a cortical column  The properties of the cortical column are approximately those described by Vernon Mountcastle “[T]he effective unit of operation…is not the single neuron and its axon, but bundles or groups of cells and their axons with similar functional properties and anatomical connections.” Vernon Mountcastle, Perceptual Neuroscience (1998), p. 192

86 Three views of the gray matter Different stains show different features Nissl stain shows cell bodies of pyramidal neurons

87 The Cerebral Cortex  Grey matter Columns of neurons  White matter Inter-column connections

88 Layers of the Cortex From top to bottom, about 3 mm

89 The Cerebral Cortex  Grey matter Columns of neurons  White matter Inter-column connections

90 The White Matter  Provides long-distance connections between cortical columns  Consists of axons of pyramidal neurons  The cell bodies of those neurons are in the gray matter  Each such axon is surrounded by a myelin sheath, which.. Provides insulation Enhances conduction of nerve impulses  The white matter is white because that is the color of myelin

91 Dimensionality of the cortex  Two dimensions: The array of nodes  The third dimension: The length (depth) of each column (through the six cortical layers) The cortico-cortical connections (white matter)

92 Topological essence of cortical structure  Two dimensions for the array of the columns  Viewed this way the cortex is an array – a two- dimensional structure – of interconnected columns

93 The (Mini)Column  Width is about (or just larger than) the diameter of a single pyramidal cell About 30–50 μm in diameter  Extends through the six cortical layers Three to six mm in length The entire thickness of the cortex is accounted for by the columns  Roughly cylindrical in shape  If expanded by a factor of 100, the dimensions would correspond to a tube with diameter of 1/8 inch and length of one foot

94 Cortical column structure  Minicolumn: about 30–50 microns in diameter  Recurrent axon collaterals of pyramidal neurons activate other neurons in same column  Inhibitory neurons can inhibit neurons of neighboring columns Function: contrast  Excitatory connections can activate neighboring columns In this case we get a bundle of contiguous columns acting as a unit

95 Narrow RN notation viewed as a set of hypotheses  Question: Are relational networks related in any way to neural networks?  A way to find out  Narrow RN notation can be viewed as a set of hypotheses about brain structure and function Each property of narrow RN notation can be tested for neurological plausibility

96 Some properties of narrow RN notation, with their neural counterparts  Lines have direction (they are one-way); nerve fibers carry activation in just one direction  Lines tend to come in pairs of opposite direction (“upward” and “downward”); cortico-cortical connections are generally reciprocal  Connections are either excitatory or inhibitory; likewise in the brain, from different types of neurons with two different neurotransmitters

97 More properties as hypotheses  Nodes have differing thresholds of activation; neurons have differing thresholds of activation  Inhibitory connections are of two kinds; in the brain, inhibitory connections are of two kinds (Type 1 to a node, Type 2 “axo-axonal”, to a line)  Additional properties – (too technical for this presentation)  All are verified

98 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

99 Levels of precision in network notation: How related?  They operate at different levels of precision  Compare chemistry and physics Chemistry for molecules Physics for atoms  Both are valuable for their purposes

100 Levels of precision  (E.g.) Systemic networks (Halliday)  Abstract relational network notation  Narrow relational network notation

101 Three levels of precision: the same downward relationship shown in systemic network notation, abstract relational network notation, and narrow relational network notation

102 Different levels of investigation: Living Beings  Systems Biology  Cellular Biology  Molecular Biology  Chemistry  Physics

103 Levels of Precision  Advantages of description at a level of greater precision: Greater precision Shows relationships to other areas  Disadvantages of description at a level of greater precision: More difficult to accomplish  Therefore, can’t cover as much ground More difficult for consumer to grasp  Too many trees, not enough forest

104 Levels of precision  Systemic networks (Halliday)  Abstract relational network notation  Narrow relational network notation  Cortical columns and neural fibers  Neurons, axons, dendrites, neurotransmitters  Intraneural structures Pre-/post-synaptic terminals Microtubules Ion channels Etc.

105 Levels of precision  Informal functional descriptions  Semi-formal functional descriptions  Systemic networks  Abstract relational network notation  Narrow relational network notation  Cortical columns and neural fibers  Neurons, axons, dendrites  Intraneural structures and processes

106 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

107 Precision vis-à-vis variability  Description at a level of greater precision encourages observation of variability  At the level of the forest, we are aware of the trees, but we tend to overlook the differences among them  At the level of the trees we clearly see the differences among them  But describing the forest at the level of detail used in describing trees would be very cumbersome  At the level of the trees we tend to overlook the differences among the leaves  At the level of the leaves we tend to overlook the differences among their component cells

108 Linguistic examples  At the cognitive level we clearly see that every person’s linguistic system is different from that of everyone else  We also see variation within the single person’s system from day to day  At the level of narrow notation we can treat Variation in connection strengths Variation in threshold strength Variation in levels of activation  We are thus able to explain prototypicality phenomena learning etc.

109 Radial categories and Prototypicality  Different connections have different strengths (weights)  More important properties have greater strengths  Example: CUP. Important (but not necessary!) properties:  Short (as compared with a glass)  Ceramic  Having a handle  Cups with these properties are more prototypical

110 The properties of a category have different weights T CUP MADE OF GLASS CERAMIC SHORT HAS HANDLE The properties are represented by nodes which are connected to lower-level nodes The cardinal node for CUP

111 Nodes have activation thresholds  The node will be activated by any of many different combinations of properties  The key word is enough – it takes enough activation from enough properties to satisfy the threshold  The node will be activated to different degrees by different combinations of properties When strongly activated, it transmits stronger activation to its downstream nodes.

112 Prototypical exemplars provide stronger and more rapid activation T CUP MADE OF GLASS CERAMIC SHORT HAS HANDLE Stronger connections carry more activation Activation threshold (can be satisfied to varying degrees)

113 Explaining Prototypicality  Cardinal category nodes get more activation from the prototypical exemplars More heavily weighted property nodes  E.g., FLYING is strongly connected to BIRD Property nodes more strongly activated  Peripheral items (e.g. EMU ) provide only weak activation, weakly satisfying the threshold (emus can’t fly)  Borderline items may or may not produce enough activation to satisfy threshold

114 Activation of different sets of properties produces greater or lesser satisfaction of the activation threshold of the cardinal node CUP MADE OF GLASS CERAMIC SHORT HAS HANDLE More important properties have stronger connections, indicated by thickness of lines Inhibitory connection
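
A sketch of the CUP example with invented connection strengths (the particular numbers are mine, not from the slides): the cardinal node sums weighted activation from its property nodes, inhibitory input (here MADE OF GLASS) subtracts, and the threshold can be satisfied to varying degrees.

```python
# Invented connection strengths from property nodes to the cardinal CUP node.
# A negative weight stands for an inhibitory connection, as for MADE OF GLASS.
weights = {
    "SHORT":         0.8,
    "CERAMIC":       0.7,
    "HAS HANDLE":    0.9,
    "MADE OF GLASS": -0.5,
}
THRESHOLD = 1.0   # activation needed for the CUP node to respond at all

def cup_activation(active_properties):
    """Degree to which the CUP threshold is satisfied (0 = not satisfied)."""
    total = sum(weights[p] for p in active_properties)
    return max(0.0, total - THRESHOLD)

prototype  = {"SHORT", "CERAMIC", "HAS HANDLE"}         # strong, rapid activation
borderline = {"SHORT", "MADE OF GLASS", "HAS HANDLE"}   # weaker: glass inhibits
peripheral = {"HAS HANDLE"}                             # not enough on its own

for exemplar in (prototype, borderline, peripheral):
    print(sorted(exemplar), "->", round(cup_activation(exemplar), 2))
```

The prototypical exemplar satisfies the threshold most strongly, the borderline one only weakly, and the peripheral one not at all, which is the graded behavior the slides describe.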

115 Explaining prototypicality: Summary  Variation in strength of connections  Many connecting properties of varying strength  Varying degrees of activation  Prototypical members receive stronger activation from more associated properties  BIRD is strongly connected to the property FLYING Emus and ostriches don’t fly But they have some properties connected with BIRD Sparrows and robins do fly  And as commonly occurring birds they have been experienced often, leading to entrenchment – stronger connections

116 Variation over time in connection strength  Connections get stronger with use Every time the linguistic system is used, it changes  Can be indicated roughly by Thickness of connecting lines in diagrams or by Little numbers written next to lines
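
A minimal sketch of "connections get stronger with use" (my own illustration; the learning increment and ceiling are arbitrary): each time a connection carries activation its strength is nudged upward, a simple way to model entrenchment.

```python
connection_strength = {("robin", "BIRD"): 0.3, ("emu", "BIRD"): 0.3}

def use_connection(pair, increment=0.1, ceiling=1.0):
    """Every use strengthens the connection a little (entrenchment)."""
    connection_strength[pair] = min(ceiling, connection_strength[pair] + increment)

# Robins are encountered far more often than emus.
for _ in range(5):
    use_connection(("robin", "BIRD"))
use_connection(("emu", "BIRD"))

print(connection_strength)   # robin->BIRD is now much stronger than emu->BIRD
```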

117 Variation in threshold strength  Thresholds are not fixed They vary as a result of use – learning  Nor are they integral  What we really have are threshold functions, such that A weak amount of incoming activation produces no response A larger degree of activation results in weak outgoing activation A still higher degree of activation yields strong outgoing activation S-shaped (“sigmoid”) function

118 Variation in threshold strength  Thresholds are not fixed They vary as a result of use – learning  Nor are they integral  What we really have are threshold functions, such that A weak amount of incoming activation produces no response A larger degree of activation results in weak outgoing activation A still higher degree of activation yields strong outgoing activation S-shaped (“sigmoid”) function N.B. All of these properties are found in neural structures

119 Threshold function Incoming activation Outgoing activation
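
The slide shows the threshold function as a curve of outgoing against incoming activation. Here is a small sketch of the S-shaped (sigmoid) function described above, with made-up midpoint and steepness parameters: weak input gives essentially no output, moderate input weak output, strong input strong output.

```python
import math

def threshold_function(incoming, midpoint=1.0, steepness=6.0):
    """Sigmoid threshold: outgoing activation as a function of incoming activation."""
    return 1.0 / (1.0 + math.exp(-steepness * (incoming - midpoint)))

for incoming in (0.2, 0.8, 1.0, 1.2, 2.0):
    print(incoming, "->", round(threshold_function(incoming), 3))
```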

120 Topics  Aims of Neurocognitive Linguistics  The origins of relational networks  Relational networks as purely relational  Narrow relational network notation  Narrow relational networks and neural networks  Levels of precision in description  Appreciating variability in language

121 Thank you for your attention!

122 References
Hockett, Charles F. 1966. “Linguistic units and their relations.” Language.
Lamb, Sydney M. 1971. “The crooked path of progress in cognitive linguistics.” Georgetown University Round Table on Languages and Linguistics.
Lamb, Sydney M. 1999. Pathways of the Brain: The Neurocognitive Basis of Language. Amsterdam: John Benjamins.
Lamb, Sydney M. 2004a. “Language as a network of relationships.” In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Lamb, Sydney M. 2004b. “Learning syntax: a neurocognitive approach.” In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Mountcastle, Vernon W. 1998. Perceptual Neuroscience: The Cerebral Cortex. Cambridge: Harvard University Press.

123 For further information..

