
1 Episode 5b. Agree and movement 5.3-5.4 CAS LX 522 Syntax I

2 The Big Picture Now that we’ve gotten some idea of how the system works, let’s back up a bit to remind ourselves why we’re doing what we’re doing. People have (unconscious) knowledge of the grammar of their native language (at least): they can judge whether sentences are good examples of the language or not. Two questions: What is it that we know? How is it that we came to know what we know?

3 History In trying to model what we know (since it isn’t conscious knowledge), some of the first attempts looked like this (Chomsky 1957):

Phrase Structure Rules
S → NP (Aux) VP
VP → V (NP) (PP)
NP → (Det) (Adj+) N
PP → P NP
Aux → (Tns) (Modal) (Perf) (Prog)
N → Pat, lunch, …  P → at, in, to, …
Tns → Past, Present  Modal → can, should, …
Perf → have -en  Prog → be -ing

An S can be rewritten as an NP, optionally an Aux, and a VP. An NP can be rewritten as, optionally, a determiner, optionally one or more adjectives, and a noun. … What we know is that an S has an NP, a VP, and sometimes an Aux between them, and that NPs can have a determiner, some number of adjectives, and a noun.
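To make the rewrite-rule idea concrete, here is a minimal sketch (not part of the original slides) of how such phrase structure rules can be run as a generator. The rule set, the tiny lexicon, and the function names are simplifications assumed for illustration.

```python
import random

# A toy fragment of the 1957-style phrase structure grammar from the slide.
# Optional constituents are encoded by listing several possible expansions.
RULES = {
    "S":   [["NP", "VP"], ["NP", "Aux", "VP"]],
    "VP":  [["V"], ["V", "NP"], ["V", "NP", "PP"]],
    "NP":  [["N"], ["Det", "N"], ["Det", "Adj", "N"]],
    "PP":  [["P", "NP"]],
    "Aux": [["Modal"]],
}
LEXICON = {
    "N": ["Pat", "lunch"], "V": ["eat"], "P": ["at"],
    "Det": ["the"], "Adj": ["late"], "Modal": ["might"],
}

def expand(symbol):
    """Rewrite `symbol` until only words remain, recording the derivation as a tree."""
    if symbol in LEXICON:
        return (symbol, random.choice(LEXICON[symbol]))
    return (symbol, [expand(child) for child in random.choice(RULES[symbol])])

def words(tree):
    """Read the terminal string off a derivation tree."""
    label, content = tree
    return [content] if isinstance(content, str) else [w for c in content for w in words(c)]

print(" ".join(words(expand("S"))))   # e.g. "Pat might eat lunch"
```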

4 History
Phrase Structure Rules
S → NP (Aux) VP
VP → V (NP) (PP)
NP → (Det) (Adj+) N
PP → P NP
Aux → (Tns) (Modal) (Perf) (Prog)
N → Pat, lunch, …  P → at, in, to, …
Tns → Past, Present  Modal → can, should, …
Perf → have -en  Prog → be -ing

In this way, many sentences can be derived, starting from S. The tree-style structure is a way to record the history of the derivation from S to the words in the sentence. We model our knowledge of English as a machine that (ideally, when it’s finished) will generate all of the sentences of English and no others. [Tree: S dominating NP (Pat), Aux (Modal might), and VP (eat lunch)]

5 Pat might have been eating lunch If you can say Pat ate, you can say Pat had eaten, Pat was eating, or Pat had been eating. It looks like the verb can be past or present alone, but with have it takes on an -en (past participle) form, and with be it takes on an -ing (present participle) form. The first verb or auxiliary takes on the tense form.

6 Affix Hopping So, Chomsky proposed:
Aux → (Tns) (Modal) (Perf) (Prog)
Tns → Past, Present  Modal → can, should, …
Perf → have -en  Prog → be -ing  Past → -ed

Yielding something like this: If you build a sentence this way, things aren’t in the right order, but there’s a simple transformation that can be done to the structure to get it right. Empirically, tense, perfect have, and progressive be each control the form of the verbal element to their right. [Tree: Pat, Aux (Tns Past -ed, Perf have -en, Prog be -ing), VP eat lunch, before Affix Hopping]

7 Affix Hopping So, Chomsky proposed:
Aux → (Tns) (Modal) (Perf) (Prog)
Tns → Past, Present  Modal → can, should, …
Perf → have -en  Prog → be -ing  Past → -ed

Yielding something like this:
Affix Hopping  SD: afx verb  SC: verb+afx
The affixes all “hop to the right” and attach to the following word. This is an ancestor of the kinds of movement rules we’ve been exploring, and this phenomenon specifically is closely related to the Agree operation we’ll be talking about. [Tree: Pat have+ed be+en eat+ing lunch, after Affix Hopping]
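As a rough sketch of what Affix Hopping does to the string, the transformation can be modeled as each affix attaching to the word on its right; the token spellings and the small irregular-form table below are assumptions for illustration, not the original rule.

```python
AFFIXES = {"-ed", "-en", "-ing"}

# Toy morphology: how a verbal element is pronounced once an affix has hopped onto it.
FORMS = {
    ("have", "-ed"): "had",   ("be", "-en"): "been",
    ("be", "-ed"): "was",     ("be", "-ing"): "being",
    ("eat", "-ed"): "ate",    ("eat", "-en"): "eaten",  ("eat", "-ing"): "eating",
}

def affix_hop(tokens):
    """SD: afx verb  ->  SC: verb+afx.  Each affix attaches to the word to its right."""
    out, i = [], 0
    while i < len(tokens):
        if tokens[i] in AFFIXES and i + 1 < len(tokens):
            out.append(FORMS.get((tokens[i + 1], tokens[i]), tokens[i + 1] + tokens[i]))
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

# Pre-transformation order: Pat [Past have -en be -ing] eat lunch
print(" ".join(["Pat"] + affix_hop(["-ed", "have", "-en", "be", "-ing", "eat"]) + ["lunch"]))
# -> Pat had been eating lunch
```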

8 History continues Through the 60s there were good people working hard, figuring out what kinds of phrase structure rules and transformations are needed for a comprehensive description of English. As things developed, two things became clear: A lot of the PSRs look pretty similar. There’s no way a kid acquiring language can be learning these rules. Chomsky (1970) proposed that there is actually only a limited set of phrase structure rule types. For any categories X, Y, Z, W, there are only rules like:
XP → YP X′
X′ → X′ WP
X′ → X ZP

9 X-bar theory If drawn out as a tree, you may recognize the kind of structures this proposal entails. These are structures based on the “X-bar schema”.
XP → YP X′
X′ → X′ WP
X′ → X ZP
YP being the “specifier”, WP being an “adjunct”, ZP being the “complement”. Adjuncts were considered to have a slightly different configuration then. [Tree: XP dominating specifier YP and X′; X′ dominating X′ and adjunct WP; lower X′ dominating X and complement ZP]

10 GB Around 1981, the view shifted from thinking of the system as constructing all and only the grammatical structures with PSRs and transformations to a view in which structures and transformations could apply freely, but the grammatical structures were those that satisfied constraints on (various stages of) the representation. First, a “deep structure” (DS) tree is built, however you like, but: selectional restrictions must be satisfied, θ-roles must be assigned, etc. Then, adjustments are made to get the “surface structure” (SS): things more or less like Affix Hopping, or moving V to v, or moving the subject to SpecTP. Further constraints are verified here: Is there a subject in SpecTP? Etc. Finally, the result is assigned a pronunciation (PF) and, possibly after some further adjustments, an interpretation (LF).

11 Which brings us to 1993 The most recent change in viewpoint was to the system we’re working with now (arising from the Minimalist Program for Linguistic Theory). The constraints that applied to the structures in GB were getting to be rather esoteric and numerous, to the extent that it seemed we were missing generalizations. The goal of the MPLT was to “start over” in a sense, to try to make the constraints follow from more natural assumptions that we would need to make anyway. This new view has the computational system working at a very basic level, forcing structures to obey the constraints of GB by enforcing them locally as we assemble the structure from the bottom up.

12 Features and technology The use of features to drive the system (uninterpretable features force Merge, because if they are not checked, the resulting structure will itself be uninterpretable) is a way to encode the notion that lexical items need other lexical items. What the system is designed to do is assemble grammatical structures where possible, given a set of lexical items to start with. A comment about the technology here: the operations of Merge, Adjoin, Agree, and feature checking, and the idea that features can be interpretable or not (or, as we will see, strong or weak), are all formalizations of an underlying system, used so that we can describe the system precisely enough to understand its predictions about our language knowledge.

13 Features and the moon We can think of this initially as the same kind of model as the gravitational force equation, F = G·m₁m₂/r². The Earth and the Moon don’t compute this. But if we write it this way, we can predict where the Moon will be. Saying lexical items have uninterpretable features that need to be checked, and hypothesizing mechanisms (matching, valuing) by which they might be checked, is similarly a way to formalize the behavior of the computational system underlying language in a way that allows us a deeper understanding of the system and what it predicts about language.

14 The “Minimalist Program” The analogy with the gravitational force equation isn’t quite accurate, given the underlying philosophy of the MP. The Minimalist Program is in fact trying to do this: Suppose that we have a cognitive system for language, which has to interact with at least two other cognitive systems, the conceptual-intensional and the articulatory-perceptual. Whatever it produces needs to be interpretable (in the vernacular of each of these cognitive systems) for the representation to be of any use. Suppose that the properties of these external systems are your boundary conditions, your specifications. The hypothesis of the MPLT is that the computational system underlying language is an optimal solution to those design specifications. So everything is thought of in terms of the creation of interpretable representations.

15 And now, back to our Program With the history laid out in a little more detail now, on to the final major operation and concept in the computational system we are hypothesizing. Operations: (Select), Merge, Adjoin, Agree, Move. Hierarchy of Projections: T > v > V. We’ve covered Merge and Adjoin. Select is the name for picking a syntactic object off the workbench in preparation for doing something. Agree is what we’re about to discuss. We’ve seen examples of Move already, though we’ll try to further clarify its function and restrictions as a syntactic operation.

16 Pat might eat lunch. Pat[N, …]  v[uN, …]  eat[V, uN, …]  lunch[N, …]  might[…] Since [uN] needs to be checked on T, and since there are no NPs left to Merge, T looks down into the tree, finds the first NP it sees, and moves it up. “Moves it up” = makes a copy and Merges it with the root. We’ll continue exploring this in a bit. [Tree: TP with NP Pat moved to SpecTP, T might [T, uN, …], and vP containing v+V eat and NP lunch]
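A minimal sketch, under simplifying assumptions, of the “look down into the tree, find the first NP, copy it, and Merge the copy with the root” step described above. The nested-tuple tree encoding, the search order, and the function names are illustrative; a fuller model would also check [uN] on T and keep an unpronounced copy in the original position.

```python
# A node is (label, children); a leaf has an empty children list.
def first_np(node):
    """Depth-first search for the closest NP constituent below `node`."""
    label, children = node
    for child in children:
        if child[0] == "NP":
            return child
        found = first_np(child)
        if found is not None:
            return found
    return None

def move_to_spec_tp(t_bar):
    """Copy the closest NP and Merge the copy with the root, forming TP (this is Move)."""
    goal = first_np(t_bar)
    assert goal is not None, "T's [uN] cannot be checked: no NP in the structure"
    return ("TP", [goal, t_bar])

vp    = ("VP", [("V", [("eat", [])]), ("NP", [("lunch", [])])])
vP    = ("vP", [("NP", [("Pat", [])]), ("v", []), vp])
t_bar = ("T'", [("T", [("might", [])]), vP])

tp = move_to_spec_tp(t_bar)
print(tp[1][0])   # ('NP', [('Pat', [])]) -- Pat remerged in SpecTP
```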

17 Uninterpretable features Some lexical items come with uninterpretable selectional features. Kick has [uN]; it needs an NP. Merge creates a sisterhood relation. Uninterpretable features can be checked by matching features in a sisterhood relation. These features are privative: they’re either there or they’re not.

18 Uninterpretable features There is a second kind of uninterpretable feature, an unvalued feature. For example, tense features come in two types, past and present. We say that the tense feature either has a present value or a past value: [tense:past] or [tense:present]. An unvalued feature is a type of uninterpretable feature, but it is not checked by exactly matching a feature on its structural sister; instead it is checked by taking on a value from a higher feature of its type. T[tense:past] … v[utense:] becomes T[tense:past] … v[utense: past].

19 Uninterpretable features Unvalued features: [utype:] is checked on a lexical item Y when a c-commanding lexical item X has a feature [type: value], resulting in [utype: value] on Y. (Privative) uninterpretable features: [uF] is checked on a lexical item Y when its sister has a matching feature (either [F] or [uF]).
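A hedged sketch of these two checking configurations in code, using the [tense]/[utense:] pair from the previous slide; the dictionary representation of lexical items is an assumption made for illustration, not the book's notation.

```python
def check_privative(item, sister):
    """Privative [uF] on `item` is checked when its sister bears a matching [F] or [uF]."""
    return {f for f in item["ufeats"] if f[1:] in sister["feats"] or f in sister["ufeats"]}

def value_feature(probe, goal, ftype):
    """[utype:] on `goal` takes its value from [type: value] on the c-commanding `probe`."""
    if ftype in probe["valued"] and ftype in goal["unvalued"] and goal["unvalued"][ftype] is None:
        goal["unvalued"][ftype] = probe["valued"][ftype]   # now [utype: value], hence checked
    return goal["unvalued"].get(ftype)

kick  = {"feats": {"V"}, "ufeats": {"uN"}, "valued": {}, "unvalued": {}}
lunch = {"feats": {"N"}, "ufeats": set(),  "valued": {}, "unvalued": {}}
T     = {"feats": {"T"}, "ufeats": set(),  "valued": {"tense": "past"}, "unvalued": {}}
v     = {"feats": {"v"}, "ufeats": set(),  "valued": {}, "unvalued": {"tense": None}}

print(check_privative(kick, lunch))   # {'uN'}: checked under sisterhood after Merge(kick, lunch)
print(value_feature(T, v, "tense"))   # 'past': v's [utense:] becomes [utense: past]
```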

20 Feature classes The feature types that can take on values are the ones that we’ve been treating as being in classes up to now. There are tense features, like past and present. There are case features, like nom and acc. There are person features, like 1st and 2nd. There are gender features, like masculine and feminine. So, we can think of this as a feature category or feature type that has a value: [Tense: past], [Case: nom], [Person: 1st], [Gender: masculine].

21 Pat ate lunch So, back to Pat ate lunch. T here is just T with a past tense feature: [T, tense:past, …]. We need to make a connection between the tense feature chosen for T and the tense morphology we see on the verb. Here’s how we’ll do it: Little v has an uninterpretable (unvalued) inflectional feature [uInfl:]. It will be valued by [tense:past] on T. It’s “Infl” because we want to include tense, but also other kinds of features later on. But tense-type features can check and value unvalued Infl-type features.

22 Pat ate lunch. Pat[N, …]  v[uN, uInfl:, …]  eat[V, uN, …]  lunch[N, …]  T[T, tense:past, …] [Tree: T [tense:past, T, uN, …] merged above vP; NP Pat in SpecvP, v[uInfl:]+V eat, NP lunch]

23 Pat ate lunch. Pat[N, …]  v[uN, uInfl:, …]  eat[V, uN, …]  lunch[N, …]  T[T, tense:past, …] [Tree: same structure; T’s [tense:past] has now valued [uInfl:] on v, giving v[uInfl:past]+V eat]

24 Pat ate lunch. Pat[N, …]  v[uN, uInfl:, …]  eat[V, uN, …]  lunch[N, …]  T[T, tense:past, …] Last point: how does this come to be pronounced Pat ate lunch? T isn’t pronounced as anything. It was just a pure tense feature. The “past” pronunciation of eat is ate, so v+V is pronounced “ate” here. [Tree: TP with NP Pat in SpecTP, T [T, uN, tense:past, …], and vP containing v[uInfl:past]+V eat and NP lunch]

25 Pat ate lunch. Pat[N, …]  v[uN, uInfl:, …]  eat[V, uN, …]  lunch[N, …]  T[T, tense:past, …] The idea is that there is a somewhat separate list of pronunciation rules that tell us how to pronounce eat in the context of a valued uInfl feature: eat by [uInfl:past] = ate. [Tree: same structure as the previous slide]
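The pronunciation list can be sketched as a small lookup with a fallback for regular verbs; everything beyond the ate/eaten/eating entries is an illustrative assumption.

```python
# Spell-out: how to pronounce a verb root in the context of its valued [uInfl:] feature.
IRREGULAR = {
    ("eat", "past"): "ate",
    ("eat", "perf"): "eaten",
    ("eat", "prog"): "eating",
}
REGULAR_SUFFIX = {"past": "ed", "perf": "ed", "prog": "ing", "present": "s"}

def pronounce(root, infl):
    """Pronounce `root` in the context of [uInfl: infl]."""
    return IRREGULAR.get((root, infl), root + REGULAR_SUFFIX.get(infl, ""))

print(pronounce("eat", "past"))    # ate    (eat in the context of [uInfl:past])
print(pronounce("kick", "prog"))   # kicking
```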

26 Pat had eaten lunch The auxiliary verbs have and be are used in forming the perfect and progressive, respectively, which are additional forms that a verb can take on. Pat has eaten lunch. Pat is eating lunch. We can’t have two modals, but we can have a modal and an auxiliary: Pat should have eaten lunch. Pat might have been eating lunch. Conclusion: auxiliaries aren’t T; they’re their own thing. Let’s call have Perf and be Prog.

27 Pat had eaten lunch Suppose that Perf can value an Infl feature, so in Pat had eaten lunch, v+V has [uInfl:Perf], pronounced as “eaten”. But auxiliaries show tense distinctions too, so they must themselves have an unvalued Infl feature. Pat has eaten lunch. Pat had eaten lunch. Have: [Perf, uInfl: ].

28 Pat had eaten lunch. Pat[N, …]  v[uN, uInfl:, …]  have[Perf, uInfl:, …]  eat[V, uN, …]  lunch[N, …]  T[T, tense:past, …] NOTE: This tree and the next one are not quite in final form. We will soon discuss one further operation that happens to the highest Perf or Prog head. [Tree: TP with NP Pat in SpecTP, T [T, uN, tense:past, …], PerfP headed by Perf had [Perf, uInfl:past], and vP containing v[uInfl:perf]+V eaten and NP lunch]

29 Pat was eating lunch. Pat[N, …]  v[uN, uInfl:, …]  be[Prog, uInfl:, …]  eat[V, uN, …]  lunch[N, …]  T[T, tense:past, …] Notice that what we have here is a modern implementation of Affix Hopping: each of T, Prog, and Perf regulates the form of the subsequent verbal element by valuing the next Infl feature down. [Tree: TP with NP Pat in SpecTP, T [T, uN, tense:past, …], ProgP headed by Prog was [Prog, uInfl:past], and vP containing v[uInfl:prog]+V eating and NP lunch]

30 Hierarchy of Projections Both have and be (Perf and Prog) are possible, but just in that order. Pat had been eating lunch. *Pat was having eaten lunch. But neither is obligatory. Thus: Hierarchy of Projections: T > (Perf) > (Prog) > v > V

31 Negation Pat might not eat lunch. Given everything we have so far, it’s fairly clear where not must be in the structure. might is a T. Pat is in SpecTP. eat lunch is a vP (with a trace of Pat in SpecvP). So, assuming not is a head (of category Neg), we have a NegP between TP and vP.

32 Pat might not eat lunch. Pat[N, …]  v[uN, uInfl:, …]  not[Neg, …]  eat[V, uN, …]  lunch[N, …]  might[T, …] Pat might not have eaten lunch. Pat might not be eating lunch. Pat might not have been eating lunch. Hierarchy of Projections: T > (Neg) > (Perf) > (Prog) > v > V [Tree: TP with NP Pat in SpecTP, T might, NegP headed by not, and vP containing v+V eat and NP lunch]
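The ordering restriction can be checked mechanically: the heads of a clause, read top to bottom, must form a subsequence of the hierarchy, with T, v, and V obligatory. A minimal sketch under those assumptions:

```python
HIERARCHY = ["T", "Neg", "Perf", "Prog", "v", "V"]

def obeys_hierarchy(heads):
    """True if `heads` (top to bottom) respects T > (Neg) > (Perf) > (Prog) > v > V."""
    if not {"T", "v", "V"} <= set(heads):
        return False                                  # obligatory heads missing
    positions = [HIERARCHY.index(h) for h in heads]
    return positions == sorted(set(positions))        # in hierarchy order, no repeats

print(obeys_hierarchy(["T", "Perf", "Prog", "v", "V"]))  # True:  Pat had been eating lunch
print(obeys_hierarchy(["T", "Prog", "Perf", "v", "V"]))  # False: *Pat was having eaten lunch
print(obeys_hierarchy(["T", "Neg", "Prog", "v", "V"]))   # True:  Pat was not eating lunch
```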

33 Pat was not eating lunch. Now suppose that we tried to form Pat was not eating lunch. This is the result. The HoP says: T > (Neg) > (Perf) > (Prog) > v > V. But the words are not in the right order: be (was) should be before not. What did we do when we ran into this problem with give? [Tree: T [tense:past], NegP headed by not, ProgP headed by Prog be, and vP containing v+V eat and NP lunch]

34 Pat was not eating lunch. It seems that be (Prog) moves up to T: Pat is not eating lunch. The same thing happens with Perf: Pat has not eaten lunch. Only the top one moves over negation: Pat has not been eating lunch. [Tree: same structure as the previous slide, before Prog moves to T]

35 Pat was not eating lunch. What we will assume is that this generally happens (the closest Prog or Perf moves to T)… …having not in the sentence only revealed that it happened. If there’s no not, you can’t tell whether Prog has moved up to T or not. Just like V moving to v: you can tell for ditransitives like give because V moves past the Theme. In transitives like eat, you can’t hear it move, but we assume for uniformity that it does. [Tree: same structure as the previous slide]

36 Auxiliaries moving to T So, we have observed that, empirically, have and be seem to move to T. However: Non-auxiliary verbs do not move to T. Auxiliaries do not move to T if there is already a modal. This is something special about have and be. How can we make this follow from our grammar? Building on the idea that [uInfl:] on the auxiliaries is valued by T, let’s say that the [uInfl:] feature on the auxiliaries is special in that just being c-commanded by a checker isn’t good enough. It has to be close.

37 Strong features We will differentiate between two kinds of unvalued features, strong and weak. Strong unvalued features can only be valued/checked locally. This will generally require that the lower one be moved up so that it is local. Weak features can be valued/checked as we’ve been discussing so far. So, the [uInfl:] feature on Aux (Perf or Prog) is strong if (tense is) valued by T. Otherwise, [uInfl:] (including on v) is weak.

38 Pat was not eating lunch. So, T values [uInfl:] on Prog, but [uInfl:] isn’t checked until it is local to T. [Tree: T [tense:past], NegP headed by not, ProgP headed by Prog be [uInfl:*], and vP containing v+V eat and NP lunch]

39 Pat was not eating lunch. So, T values [uInfl:] on Prog, but [uInfl:] isn’t checked until it is local to T. Once Prog has moved up to fuse with T, [uInfl:past] is local to [tense:past] and [uInfl:past] can be checked. As we’ll see next time, this “fusion” is really a variation of Adjoin. [Tree: T+Prog [tense:past]+[uInfl:past*] above NegP not, ProgP, and vP containing v+V eat and NP lunch]
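A sketch of the strong/weak distinction at work in this derivation: every unvalued [uInfl:] gets a value from the next verbal head up, but only a strong one forces its bearer to raise to T. The data layout and names are assumptions for illustration, not the book's formalism.

```python
def value_infl_chain(t_head, lower_heads):
    """Value each [uInfl:] top-down; return the head (if any) that must raise to T."""
    valuer = t_head["tense"]             # T contributes its tense value, e.g. 'past'
    raised = None
    for head in lower_heads:             # closest head first: Perf/Prog before v
        if head["uInfl"] is None:        # unvalued, so it gets valued here
            head["uInfl"] = valuer
            if head["strong"] and raised is None:
                raised = head["name"]    # strong feature: must become local to T (head movement)
        valuer = head["name"].lower()    # this head values the next [uInfl:] down instead
    return raised

T    = {"tense": "past"}
prog = {"name": "Prog", "uInfl": None, "strong": True}    # auxiliary be
v    = {"name": "v",    "uInfl": None, "strong": False}   # little v above eat

print(value_infl_chain(T, [prog, v]))   # 'Prog' raises to T: "Pat was not eating lunch"
print(prog["uInfl"], v["uInfl"])        # past prog
```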

40 Closing comment It turns out that this movement of verbal elements to T is common crosslinguistically, and is a point at which languages vary. French raises all verbs and auxiliaries to T, Swedish raises neither verbs nor auxiliaries to T, English raises auxiliaries but not verbs to T, etc. In general, this is thought of as variation in feature “strength”; we’ll discuss this more next time.
