June 6: General Introduction and “Framing Event Variables” June 13: “I-Languages, T-Sentences, and Liars” June 20: “Words, Concepts, and Conjoinability”


1 Meanings First — Context and Content Lectures, Institut Jean Nicod
June 6: General Introduction and “Framing Event Variables”
June 13: “I-Languages, T-Sentences, and Liars”
June 20: “Words, Concepts, and Conjoinability”
June 27: “Meanings as Concept Assembly Instructions”
SLIDES POSTED BEFORE EACH TALK: terpconnect.umd.edu/~pietro (OR GOOGLE ‘pietroski’ AND FOLLOW THE LINK)
pietro@umd.edu

2 Reminders of last two weeks...
Human Language: a language that human children can naturally acquire
(D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
(C) each human language is an i-language: a biologically implementable procedure that generates expressions that connect meanings with articulations
(B) each human language is an i-language for which there is a theory of truth that is also the core of an adequate theory of meaning for that i-language

3 (D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
Good Idea: “e-positions” allow for conjunction reductions
Bad Companion Idea: “e-positions” are Tarskian variables that have mind-independent values
Alvin moved to Venice happily.
∃e∃e′∃e′′[AL(e′) & MOVED(e, e′) & TO(e, e′′) & VENICE(e′′) & HAPPILY(e)]
Alvin moved to Venice.
∃e∃e′∃e′′[AL(e′) & MOVED(e, e′) & TO(e, e′′) & VENICE(e′′)]
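Why e-positions license conjunction reduction can be sketched computationally: if event predicates are modeled as sets of events and conjunction as intersection, the entailment from “Alvin moved to Venice happily” to “Alvin moved to Venice” is just set inclusion. A minimal sketch in Python, with hypothetical event names:

```python
# Event predicates modeled as sets of events; a conjunctive Davidsonian
# analysis denotes the intersection of its conjuncts. Dropping a conjunct
# (HAPPILY) can only enlarge the set, so the conjunction-reduction
# entailment holds automatically. The sample events are hypothetical.
MOVED_TO_VENICE_BY_ALVIN = {"e1", "e2"}   # events of Alvin moving to Venice
HAPPILY = {"e1"}                          # events done happily

happily_reading = MOVED_TO_VENICE_BY_ALVIN & HAPPILY  # the richer claim
plain_reading = MOVED_TO_VENICE_BY_ALVIN              # the plainer claim

# Every witness for the richer claim witnesses the plainer one.
assert happily_reading <= plain_reading
print(sorted(happily_reading))  # ['e1']
```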

4 (D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
Good Idea: “e-positions” allow for conjunction reductions
Bad Companion Idea: “e-positions” are Tarskian variables that have mind-independent values
Alvin moved to Venice happily.
Alvin moved to Venice.
Alvin moved Torcello to Venice.
Alvin chased Pegasus.
Alvin chased Theodore happily.
Theodore chased Alvin unhappily.

5 (D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
Good Ideas:
“e-positions” allow for conjunction reductions
humans compute meanings via specific operations
Liar Sentences don’t preclude meaning theories for human i-languages
Bad Companion Ideas:
“e-positions” are Tarskian variables that have mind-independent values
as Foster’s Problem reveals, the meanings computed are truth-theoretic properties of human i-language expressions
Liar T-sentences are true (‘The first sentence is true.’ iff the first sentence is true.)

6 (D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language
Good Ideas:
“e-positions” allow for conjunction reductions
as Foster’s Problem reveals, humans compute meanings via specific operations
Liar Sentences don’t preclude meaning theories for human i-languages
Bad Companion Ideas:
characterizing meaning in truth-theoretic terms yields good analyses of specific constructions
such characterization also helps address foundational issues concerning how human linguistic expressions could exhibit meanings at all

7 Weeks 3 and 4: Short Form
In acquiring words, kids use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
Meanings are instructions for how to access and combine i-concepts
--lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address
--introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand
'ride fast'    RIDE(_)^FAST(_)
'fast horse'   FAST(_)^HORSE(_)
'horses'       HORSE(_)^PLURAL(_)
PLURAL(_) => COUNTABLE(_)

8 Weeks 3 and 4: Short Form
In acquiring words, kids use available concepts to introduce new ones.
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
Meanings are instructions for how to access and combine i-concepts
--lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address
--introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand
'fast horses'   FAST(_)^HORSES(_)
'ride horses'   RIDE(_)^∃[Θ(_, _)^HORSES(_)]

9 Weeks 3 and 4: Short Form
In acquiring words, kids use available concepts to introduce new ones.
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
Meanings are instructions for how to access and combine i-concepts
--lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address
--introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand
'fast horses'   FAST(_)^HORSES(_)
'ride horses'   RIDE(_)^∃[Θ(_, _)^HORSES(_)]
Meaning('fast horses') = JOIN{Meaning('fast'), Meaning('horses')}
Meaning('ride horses') = JOIN{Meaning('ride'), Θ[Meaning('horses')]}
                       = JOIN{fetch@'ride', Θ[Meaning('horses')]}

10 Weeks 3 and 4: Short Form
In acquiring words, kids use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
Meanings are instructions for how to access and combine i-concepts
--lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address
--introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand
'ride horses'        RIDE(_)^∃[Θ(_, _)^HORSES(_)]
'ride fast horses'   RIDE(_)^∃[Θ(_, _)^FAST(_)^HORSES(_)]
'ride horses fast'   RIDE(_)^∃[Θ(_, _)^HORSES(_)]^FAST(_)
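The idea that meanings are assembly instructions admits a toy implementation: JOIN is predicate conjunction (the slides’ “^”), and a thematic operator existentially links an event to a participant, so the composed result is always monadic. All function names and the mini-model below are illustrative assumptions, not an implementation from the lectures:

```python
# Monadic concepts as 1-place boolean functions over a small domain.
def JOIN(c1, c2):
    """Conjoin two monadic concepts; the result is again monadic."""
    return lambda e: c1(e) and c2(e)

def theta(role, concept):
    """Theta-closure: e qualifies iff some role-participant of e
    satisfies the (monadic) concept."""
    return lambda e: any(concept(y) for y in role.get(e, ()))

# Hypothetical mini-model (not from the lectures)
HORSE = lambda x: x in {"h1", "h2"}
FAST = lambda x: x in {"h1", "e1"}        # fast things: one horse, one event
RIDE = lambda e: e in {"e1", "e2"}        # riding events
theme_of = {"e1": ["h1"], "e2": ["h2"]}   # event -> its theme participants

# 'ride horses'       RIDE(_)^THETA[HORSES(_)]
ride_horses = JOIN(RIDE, theta(theme_of, HORSE))
# 'ride fast horses'  RIDE(_)^THETA[FAST(_)^HORSES(_)]
ride_fast_horses = JOIN(RIDE, theta(theme_of, JOIN(FAST, HORSE)))
# 'ride horses fast'  [RIDE(_)^THETA[HORSES(_)]]^FAST(_)
ride_horses_fast = JOIN(JOIN(RIDE, theta(theme_of, HORSE)), FAST)

print(ride_fast_horses("e1"), ride_fast_horses("e2"))  # True False
```

Note that no Tarskian variables appear in the composed predicates: conjunction and theta-closure do all the combinatorial work.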

11 Weeks 3 and 4: Very Short Form
In acquiring words, kids use available concepts to introduce i-concepts, which can be “joined” to form conjunctive monadic concepts, which may or may not have Tarskian satisfiers.
'fast horses'             FAST(_)^HORSES(_)
'ride horses'             RIDE(_)^∃[Θ(_, _)^HORSES(_)]
'ride fast horses'        RIDE(_)^∃[Θ(_, _)^FAST(_)^HORSES(_)]
'ride fast horses fast'   RIDE(_)^∃[Θ(_, _)^FAST(_)^HORSES(_)]^FAST(_)
Some Implications
Verbs do not fetch genuinely relational concepts
Verbs are not saturated by grammatical arguments
The number of arguments that a verb can/must combine with is not determined by the concept that the verb fetches

12 Words, Concepts, and Conjoinability

13 What makes humans linguistically special?
(i) Lexicalization: capacity to acquire words
(ii) Combination: capacity to combine words
(iii) Lexicalization and Combination
(iv) Distinctive concepts that get paired with signals
(v) Something else entirely
FACT: human children are the world’s best lexicalizers
SUGGESTION: focus on lexicalization is independently plausible

14 Constrained Homophony Again
A doctor rode a horse from Texas
A doctor rode a horse, and
(i) the horse was from Texas
(ii) the ride was from Texas
why not...
(iii) the doctor was from Texas

15 Leading Idea (to be explained and defended)
In acquiring words, we use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
The new concepts can be systematically conjoined in limited ways
'rode a horse from Texas'
RODE(_) & ∃[Θ(_, _) & HORSE(_) & FROM(_, TEXAS)]
RIDE(_) & PAST(_) & ∃[Θ(_, _) & HORSE(_) & ∃[FROM(_, _) & TEXAS(_)]]
RODE(_) & ∃[Θ(_, _) & HORSE(_)] & FROM(_, TEXAS)
∃y[RODE(x, y) & HORSE(y)] & FROM(x, TEXAS)

16 A doctor rode a horse from Texas
A doctor rode a horse that was from Texas
∃x{Doctor(x) & ∃y[Rode(x, y) & Horse(y) & From(y, Texas)]}
A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}

17 A doctor rode a horse from Texas
A doctor rode a horse that was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(y, Texas)]}
A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}

18 A doctor rode a horse from Texas
A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}
But why doesn’t the structure below support a different meaning:
A doctor both rode a horse and was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(x, Texas)]}
Why can’t we hear the verb phrase as a predicate that is satisfied by x iff x rode a horse & x is from Texas?

19 In acquiring words, we use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
The new concepts can be systematically conjoined in limited ways
'rode a horse from Texas'
RODE(_) & ∃[Θ(_, _) & HORSE(_) & FROM(_, TEXAS)]
RODE(_) & ∃[Θ(_, _) & HORSE(_)] & FROM(_, TEXAS)
∃y[RODE(e, x, y) & HORSE(y)] & FROM(x, TEXAS)
if 'rode' has a rider-variable, why can’t it be targeted by 'from Texas’?
Verbs don’t fetch genuinely relational concepts. A phrasal meaning leaves no choice about which variable to target.

20 In acquiring words, we use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
The new concepts can be systematically conjoined in limited ways
'rode a horse from Texas'
RODE(_)^∃[Θ(_, _)^HORSE(_)^FROM(_, TEXAS)]
RODE(_)^∃[Θ(_, _)^HORSE(_)]^FROM(_, TEXAS)
∃y[RODE(e, x, y) & HORSE(y)] & FROM(x, TEXAS)
Composition is simple and constrained, but unbounded. Phrasal meanings are generable, but always monadic. Lexicalization introduces concepts that can be systematically combined in simple ways.

21 In acquiring words, we use available concepts to introduce new ones
Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'
DISTINGUISH
Lexicalized concepts, L-concepts: RIDE(_, _), GIVE(_, _, _), ALVIN, HORSE(_), RIDE(_, _, ...), MORTAL(_, _)
Introduced concepts, I-concepts: RIDE(_), GIVE(_), CALLED(_, Sound('Alvin')), MORTAL(_), HORSE(_)
hypothesis: I-concepts exhibit less typology than L-concepts
special case: I-concepts exhibit fewer adicities than L-concepts

22 Conceptual Adicity
Two Common Metaphors: Jigsaw Puzzles, 7th Grade Chemistry
[diagram: a water molecule, H–O–H, with valences +1, -2, +1]

23 Jigsaw Metaphor
[diagram: puzzle pieces fitting together to form A THOUGHT]

24 Jigsaw Metaphor
one Monadic Concept (adicity: -1) “filled by” one Saturater (adicity: +1) yields a complete Thought: Brutus + Sang(_)
one Dyadic Concept (adicity: -2) “filled by” two Saturaters (adicity: +1) yields a complete Thought: Brutus, Caesar + KICK(_, _)
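The adicity arithmetic of the metaphor can be made concrete: each +1 saturater fills one open slot of a negative-adicity concept, and a complete Thought is reached at adicity 0. A toy sketch, with hypothetical representations:

```python
# Represent a concept as (name, adicity); a saturater has adicity +1.
# Saturation fills one slot, moving adicity toward 0 (a complete Thought).
def saturate(concept, saturater):
    name, adicity = concept
    assert adicity < 0, "no open slot to fill"
    return (f"{name}({saturater})", adicity + 1)

KICK = ("KICK", -2)                  # dyadic: two open slots
step1 = saturate(KICK, "Caesar")     # ('KICK(Caesar)', -1)
thought = saturate(step1, "Brutus")  # ('KICK(Caesar)(Brutus)', 0)
print(thought[1])  # 0: a complete Thought
```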

25 7th Grade Chemistry Metaphor
a molecule of water: H(+1) O(-2) H(+1)
a single atom with valence -2 can combine with two atoms of valence +1 to form a stable molecule

26 7th Grade Chemistry Metaphor
Brutus(+1) [Kick(-2) Caesar(+1)](-1)

27 7th Grade Chemistry Metaphor
Na(+1)Cl(-1): an atom with valence -1 can combine with an atom of valence +1 to form a stable molecule
Brutus(+1) Sang(-1)

28 Extending the Metaphor
Aggie is brown: Aggie(+1) + Brown(_)
Aggie is (a) cow: Aggie(+1) + Cow(_)
Aggie is (a) brown cow: Aggie(+1) + BrownCow(_), i.e., Brown(_) & Cow(_)

29 Extending the Metaphor
Conjoining two monadic (-1) concepts can yield a complex monadic (-1) concept: Brown(_) & Cow(_), which a saturater like Aggie (+1) can fill

30 Conceptual Adicity
TWO COMMON METAPHORS
--Jigsaw Puzzles
--7th Grade Chemistry
DISTINGUISH
Lexicalized concepts, L-concepts: RIDE(_, _), GIVE(_, _, _), ALVIN
Introduced concepts, I-concepts: RIDE(_), GIVE(_), CALLED(_, Sound(’Alvin’))
hypothesis: I-concepts exhibit less typology than L-concepts
special case: I-concepts exhibit fewer adicities than L-concepts

31 A Different (and older) Hypothesis
(1) concepts predate words
(2) words label concepts
Acquiring words is basically a process of pairing pre-existing concepts with perceptible signals
Lexicalization is a conceptually passive operation
Word combination mirrors concept combination
Sentence structure mirrors thought structure

32 Bloom: How Children Learn the Meanings of Words
word meanings are, at least primarily, concepts that kids have prior to lexicalization
learning word meanings is, at least primarily, a process of figuring out which existing concepts are paired with which word-sized signals
in this process, kids draw on many capacities—e.g., recognition of syntactic cues and speaker intentions—but no capacities specific to acquiring word meanings

33 Lidz, Gleitman, and Gleitman “Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive. Of course there are quirks and provisos to these systematic form-to-meaning-correspondences…”


35 Why Not...
Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is not a function of the number of participants logically implied by the verb meaning. A paradigmatic act of kicking has exactly two participants (kicker and kickee), and yet kick need not be transitive.
Brutus kicked Caesar the ball
Caesar was kicked
Brutus kicked
Brutus gave Caesar a swift kick
Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.
*Brutus put the ball
*Brutus put
*Brutus sneezed Caesar

36 [diagram: a perceptible signal paired with a concept of adicity n, plus quirky information, for lexical items like ‘kick’; a perceptible signal paired with a concept of adicity -1, plus quirky information, for lexical items like ‘put’]

37 Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive. Of course there are quirks and provisos to these systematic form-to-meaning-correspondences.
Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence isn’t a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and usually sneeze is intransitive. But it usually takes two to have a kicking; and yet kick can be intransitive. Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.

38 Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive. Of course there are quirks and provisos to these systematic form-to-meaning-correspondences.
Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence isn’t a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and sneeze is typically used intransitively; but a paradigmatic kicking has exactly two participants, and yet kick can be used intransitively or ditransitively. Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.

39 Quirks and Provisos, or Normal Cases?
KICK(x1, x2)          The baby kicked
RIDE(x1, x2)          Can you give me a ride?
BETWEEN(x1, x2, x3)   I am between him and her    why not: I between him her
BIGGER(x1, x2)        This is bigger than that    why not: This bigs that
MORTAL(…?...)         Socrates is mortal          A mortal wound is fatal
FATHER(…?...)         Fathers father              Fathers father future fathers
EAT/DINE/GRAZE(…?...)

40 Lexicalization as Concept-Introduction (not mere labeling)
[diagram: a perceptible signal paired with a concept of type T introduces a concept of type T*]

41 Lexicalization as Concept-Introduction (not mere labeling)
[diagram: a perceptible signal paired with Number(_) introduces NumberOf[_, Φ(_)], a concept of a different (higher) type]

42 Lexicalization as Concept-Introduction (not mere labeling)
[diagram: a perceptible signal paired with a concept of type T introduces a concept of type T*]

43 One Possible (Davidsonian) Application: Increase Adicity
a concept of adicity -1 introduces a concept of adicity -2:
ARRIVE(x) ==> ARRIVE(e, x)

44 One Possible (Davidsonian) Application: Increase Adicity
a concept of adicity -2 introduces a concept of adicity -3:
KICK(x1, x2) ==> KICK(e, x1, x2)

45 Lexicalization as Concept-Introduction: Make Monads
a concept of adicity n introduces a concept of adicity -1:
KICK(x1, x2) ==> KICK(e), not KICK(e, x1, x2)

46 Two Pictures of Lexicalization
Picture One: a perceptible signal is paired with a concept of adicity n, which yields a concept of adicity n (or n−1), plus further lexical information (regarding flexibilities)
Picture Two: a perceptible signal is paired with a concept of adicity n, which introduces a concept of adicity −1, plus further lexical information (regarding inflexibilities)

47 Experience and Growth
Language Acquisition Device in its Initial State ==> Language Acquisition Device in a Mature State (an I-Language): GRAMMAR, LEXICON
[diagram: phonological instructions link to the articulation and perception of signals; semantic instructions link to lexicalizable, lexicalized, and introduced concepts]

48 Two Pictures of Lexicalization
Picture One: a perceptible signal is paired with a concept of adicity n, which yields a concept of adicity n (or n−1), plus further lexical information (regarding flexibilities)
Picture Two: a perceptible signal is paired with a concept of adicity n, which introduces a concept of adicity −1, plus further lexical information (regarding inflexibilities)

49 Subcategorization
A verb can access a monadic concept and impose further (idiosyncratic) restrictions on complex expressions
Semantic Composition Adicity Number (SCAN)
(instructions to fetch) singular concepts: +1
(instructions to fetch) monadic concepts: -1
(instructions to fetch) dyadic concepts: -2
Property of Smallest Sentential Entourage (POSSE)
zero NPs, one NP, two NPs, …
the SCAN of every verb can be -1, while POSSEs vary: zero, one, two, …

50 POSSE facts may reflect...
...the adicities of the original concepts lexicalized
...statistics about how verbs are used (e.g., in active voice)
...prototypicality effects
...other agrammatical factors
‘put’ may have a (lexically represented) POSSE of three in part because
--the concept lexicalized was PUT(_, _, _)
--the frequency of locatives (as in ‘put the cup on the table’) is salient
and note: * I put the cup the table    ? I placed the cup
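The division of labor between SCAN and POSSE can be sketched as a toy lexicon: every verb fetches a monadic concept (SCAN −1), while POSSE is stored as separate, idiosyncratic lexical information. The entries and NP counts below are illustrative assumptions, not data from the lectures:

```python
# Each verb fetches a monadic concept (SCAN = -1 for all of them);
# POSSE -- how many NPs the smallest acceptable clause needs -- varies.
lexicon = {
    "kick":   {"scan": -1, "posse": {1, 2, 3}},  # flexible: 'The baby kicked'
    "put":    {"scan": -1, "posse": {3}},        # inflexible: *'Brutus put the ball'
    "arrive": {"scan": -1, "posse": {1}},        # *'Brutus arrived Caesar'
}

def acceptable(verb, num_nps):
    """License a clause only if its NP count is in the verb's POSSE."""
    return num_nps in lexicon[verb]["posse"]

print(acceptable("kick", 1))   # True:  'The baby kicked'
print(acceptable("put", 2))    # False: *'Brutus put the ball'
```

On this picture, the flexibility of ‘kick’ and the inflexibility of ‘put’ are facts about lexically listed POSSEs, not about the adicity of the concepts the verbs fetch.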

51 On any view: Two Kinds of Facts to Accommodate
Flexibilities
– Brutus kicked Caesar
– Caesar was kicked
– The baby kicked
– I get a kick out of you
– Brutus kicked Caesar the ball
Inflexibilities
– Brutus put the ball on the table
– *Brutus put the ball
– *Brutus put on the table

52 On any view: Two Kinds of Facts to Accommodate
Flexibilities
– The coin melted
– The jeweler melted the coin
– The fire melted the coin
– The coin vanished
– The magician vanished the coin
Inflexibilities
– Brutus arrived
– *Brutus arrived Caesar

53 Two Pictures of Lexicalization
[diagram: a perceptible signal paired with a concept of adicity n, plus further POSSE information (as for ‘put’), versus a word with SCAN -1 that introduces a concept of adicity −1]
Last Task for Today (which will carry over to next time): offer some reminders of the reasons for adopting the second picture

54 Absent Word Meanings
Striking absence of certain (open-class) lexical meanings that would be permitted if Human I-Languages permitted nonmonadic semantic types
<e,<e,<e,<e, t>>>> (instructions to fetch) tetradic concepts
<e,<e,<e, t>>> (instructions to fetch) triadic concepts
<e,<e, t>> (instructions to fetch) dyadic concepts
<e> (instructions to fetch) singular concepts

55 Proper Nouns
even English tells against the idea that lexical proper nouns label singular concepts (of type <e>)
Every Tyler I saw was a philosopher
Every philosopher I saw was a Tyler
There were three Tylers at the party
That Tyler stayed late, and so did this one
Philosophers have wheels, and Tylers have stripes
The Tylers are coming to dinner
I spotted Tyler Burge
I spotted that nice Professor Burge who we met before
proper nouns seem to fetch monadic concepts, even if they lexicalize singular concepts

56 Lexicalization as Concept-Introduction: Make Monads
a concept of adicity n introduces a concept of adicity -1:
TYLER ==> TYLER(x) = CALLED[x, SOUND(‘Tyler’)]

57 Lexicalization as Concept-Introduction: Make Monads
a concept of adicity n introduces a concept of adicity -1:
KICK(x1, x2) ==> KICK(e), not KICK(e, x1, x2)

58 Lexicalization as Concept-Introduction: Make Monads
a concept of adicity n introduces a concept of adicity -1:
TYLER ==> TYLER(x) = CALLED[x, SOUND(‘Tyler’)]

59 Absent Word Meanings
Striking absence of certain (open-class) lexical meanings that would be permitted if I-Languages permit nonmonadic semantic types
<e,<e,<e,<e, t>>>> (instructions to fetch) tetradic concepts
<e,<e,<e, t>>> (instructions to fetch) triadic concepts
<e,<e, t>> (instructions to fetch) dyadic concepts
<e> (instructions to fetch) singular concepts

60 Absent Word Meanings
Brutus sald a car Caesar a dollar
sald → SOLD(x, $, z, y)
[sald [a car]] → SOLD(x, $, z, a car)
[[sald [a car]] Caesar] → SOLD(x, $, Caesar, a car)
[[[sald [a car]] Caesar] a dollar] → SOLD(x, a dollar, Caesar, a car)
_________________________________________________
Caesar bought a car
bought a car from Brutus for a dollar
bought Antony a car from Brutus for a dollar
x sold y to z (in exchange) for $

61 Absent Word Meanings
Brutus tweens Caesar Antony
tweens → BETWEEN(x, z, y)
[tweens Caesar] → BETWEEN(x, z, Caesar)
[[tweens Caesar] Antony] → BETWEEN(x, Antony, Caesar)
_______________________________________________________
Brutus sold Caesar a car
Brutus gave Caesar a car    *Brutus donated a charity a car
Brutus gave a car away      Brutus donated a car
Brutus gave at the office   Brutus donated anonymously

62 Absent Word Meanings
Alexander jimmed the lock a knife
jimmed → JIMMIED(x, z, y)
[jimmed [the lock]] → JIMMIED(x, z, the lock)
[[jimmed [the lock]] [a knife]] → JIMMIED(x, a knife, the lock)
_________________________________________________
Brutus froms Rome
froms → COMES-FROM(x, y)
[froms Rome] → COMES-FROM(x, Rome)

63 Absent Word Meanings
Brutus talls Caesar
talls → IS-TALLER-THAN(x, y)
[talls Caesar] → IS-TALLER-THAN(x, Caesar)
_________________________________________
*Julius Caesar
Julius → JULIUS
Caesar → CAESAR

64 Absent Word Meanings
Striking absence of certain (open-class) lexical meanings that would be permitted if I-Languages permit nonmonadic semantic types
<e,<e,<e,<e, t>>>> (instructions to fetch) tetradic concepts
<e,<e,<e, t>>> (instructions to fetch) triadic concepts
<e,<e, t>> (instructions to fetch) dyadic concepts
<e> (instructions to fetch) singular concepts
I’ll come back to this next week

65 What makes humans linguistically special?
(i) Lexicalization: capacity to acquire words
(ii) Combination: capacity to combine words
(iii) Lexicalization and Combination
(iv) Distinctive concepts that get paired with signals
(v) Something else entirely
FACT: human children are the world’s best lexicalizers

66 One of Aristotle’s Observations Some animals are born early, and take time to grow into their “second nature”


68 Experience and Growth
Language Acquisition Device in its Initial State ==> Language Acquisition Device in a Mature State (an I-Language): GRAMMAR, LEXICON
[diagram: phonological instructions link to the articulation and perception of signals; semantic instructions link to lexicalizable, lexicalized, and introduced concepts]

69 Weeks 3 and 4: Very Short Form
In acquiring words, kids use available concepts to introduce i-concepts, which can be “joined” to form conjunctive monadic concepts, which may or may not have Tarskian satisfiers.
'fast horses'             FAST(_)^HORSES(_)
'ride horses'             RIDE(_)^∃[Θ(_, _)^HORSES(_)]
'ride fast horses'        RIDE(_)^∃[Θ(_, _)^FAST(_)^HORSES(_)]
'ride fast horses fast'   RIDE(_)^∃[Θ(_, _)^FAST(_)^HORSES(_)]^FAST(_)
Some Implications
Verbs do not fetch genuinely relational concepts
Verbs are not saturated by grammatical arguments
The number of arguments that a verb can/must combine with is not determined by the concept that the verb fetches

70 Words, Concepts, and Conjoinability THANKS!

71 On this view, meanings are neither extensions nor concepts.
Familiar difficulties for the idea that lexical meanings are concepts:
polysemy            1 meaning, 1 cluster of concepts (in 1 mind)
intersubjectivity   1 meaning, 2 concepts (in 2 minds)
jabber(wocky)       1 meaning, 0 concepts (in 1 mind)
But a single instruction to fetch a concept from a certain address can be associated with more (or less) than one concept
Meaning constancy, at least for purposes of meaning composition

72 Lots of Conjoiners
P & Q                purely propositional
Fx &M Gx             purely monadic
??????
Rx1x2 &DF Sx1x2      purely dyadic, with fixed order
Rx1x2 &DA Sx2x1      purely dyadic, any order
Rx1x2 &PF Tx1x2x3x4  polyadic, with fixed order
Rx1x2 &PA Tx3x4x1x5  polyadic, any order
Rx1x2 &PA Tx3x4x5x6  the number of variables in the conjunction can exceed the number in either conjunct
NOT EXTENSIONALLY EQUIVALENT

73 Lots of Conjoiners, Semantics
If π and π* are propositions, then TRUE(π & π*) iff TRUE(π) and TRUE(π*)
If π and π* are monadic predicates, then for each entity x: APPLIES[(π &M π*), x] iff APPLIES[π, x] and APPLIES[π*, x]
If π and π* are dyadic predicates, then for each ordered pair o: APPLIES[(π &DA π*), o] iff APPLIES[π, o] and APPLIES[π*, o]
If π and π* are predicates, then for each sequence σ: SATISFIES[σ, (π &PA π*)] iff SATISFIES[σ, π] and SATISFIES[σ, π*]

74 Lots of Conjoiners
P & Q                purely propositional
Fx &M Gx             purely monadic
Fx^Gx ; Rex^Gx       a monad can “join” with a monad or a dyad (with order fixed)
Rx1x2 &DF Sx1x2      purely dyadic, with fixed order
Rx1x2 &DA Sx2x1      purely dyadic, any order
Rx1x2 &PF Tx1x2x3x4  polyadic, with fixed order
Rx1x2 &PA Tx3x4x1x5  polyadic, any order
Rx1x2 &PA Tx3x4x5x6  the number of variables in the conjunction can exceed the number in either conjunct

75 A Restricted Conjoiner and Closer, allowing for a smidgeon of dyadicity
If M is a monadic predicate and D is a dyadic predicate, then for each ordered pair <x, y>:
the junction D^M applies to <x, y> iff D applies to <x, y> and M applies to y
∃[D^M] applies to x iff for some y, D^M applies to <x, y>
                    iff for some y, D applies to <x, y> and M applies to y

76 A Restricted Conjoiner and Closer, allowing for a smidgeon of dyadicity
If M is a monadic predicate and D is a dyadic predicate, then for each ordered pair <x, y>:
the junction D^M applies to <x, y> iff D applies to <x, y> and M applies to y
∃[Into(_, _)^Barn(_)] applies to x iff for some y, Into(_, _)^Barn(_) applies to <x, y>
                                   iff for some y, Into(_, _) applies to <x, y> and Barn(_) applies to y
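The restricted conjoiner and closer admit a direct toy implementation: a junction operation conjoins a dyadic predicate with a monadic one on the second coordinate, and existential closure of that coordinate returns a monadic predicate. The function names and mini-model are illustrative assumptions:

```python
# D applies to ordered pairs; M applies to entities. junction(D, M)
# applies to <x, y> iff D applies to <x, y> and M applies to y;
# close(...) then existentially quantifies the y coordinate,
# yielding a monadic predicate, as in 'into a barn'.
def junction(D, M):
    return lambda x, y: D(x, y) and M(y)

def close(DM, domain):
    """Existentially close the second coordinate over a given domain."""
    return lambda x: any(DM(x, y) for y in domain)

# Hypothetical model: event e1 goes into barn b1; e2 goes into river r1
INTO = lambda e, y: (e, y) in {("e1", "b1"), ("e2", "r1")}
BARN = lambda y: y in {"b1"}
domain = {"b1", "r1"}

into_a_barn = close(junction(INTO, BARN), domain)  # monadic result
print(into_a_barn("e1"), into_a_barn("e2"))  # True False
```

Only a “smidgeon of dyadicity” is used: the dyadic predicate never survives composition, since closure immediately returns a monadic predicate.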
