


1 From linear sequences to abstract structures: Distributional information in infant-directed speech
Hao Wang & Toby Mintz, Department of Psychology, University of Southern California
This research was supported in part by a grant from the National Science Foundation (BCS-0721328).
SoCal Workshop 2009 @ UCLA

2 Outline
– Introduction: learning word categories (e.g., noun and verb) is a crucial part of language acquisition
– The role of distributional information
– Frequent frames (FFs)
– Analyses 1 & 2: structures of FFs in child-directed speech
– Conclusion and implications

3 Speakers' Implicit Knowledge of Categories
Upon hearing: I saw him slich. Hypothesizing: They slich. He sliches. Johnny was sliching.
Upon hearing: The truff was in the bag. Hypothesizing: He has two truffs. She wants a truff. Some of the truffs are here.

4 Distributional Information
The contexts in which a word occurs:
– Words before and after the target word (e.g., "the cat is on the mat")
– Affixes, in morphologically rich languages
(Cartwright & Brent, 1997; Chemla et al., 2009; Maratsos & Chalkley, 1980; Mintz, 2002, 2003; Redington et al., 1998)
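As a rough illustration of gathering this kind of distributional information, the sketch below (a hypothetical helper, not part of the original analyses) counts, for each word in a toy tokenized corpus, the pair of words immediately before and after it.

```python
from collections import Counter, defaultdict

def left_right_contexts(sentences):
    """Count (preceding word, following word) pairs for every word.

    `sentences` is a list of already-tokenized utterances; boundary
    markers are added so edge words still receive a context.
    """
    contexts = defaultdict(Counter)
    for sent in sentences:
        padded = ["<s>"] + sent + ["</s>"]
        for i in range(1, len(padded) - 1):
            contexts[padded[i]][(padded[i - 1], padded[i + 1])] += 1
    return contexts

# Toy example: "the cat is on the mat"
ctx = left_right_contexts([["the", "cat", "is", "on", "the", "mat"]])
print(ctx["cat"])   # Counter({('the', 'is'): 1})
```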

5 Frequent Frames (Mintz, 2003)
Two words co-occurring frequently with one word intervening.

FRAME       FREQ.
you__it     433
you__to     265
you__the    257
what__you   234
to__it      220
want__to    219
...
the__is      79
...

The frame you__it in the Peter corpus (Bloom, 1970): 433 tokens, 93 types, 100% verbs.
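A minimal sketch of frame extraction over a toy list of tokenized utterances; the function `frequent_frames` and its `top_n` cutoff are illustrative stand-ins (Mintz, 2003, analyzed the 45 most frequent frames), not the original implementation.

```python
from collections import Counter, defaultdict

def frequent_frames(sentences, top_n=45):
    """Find A__B frames (two words with exactly one word between them)
    and return the top_n most frequent ones with their intervening words."""
    frame_counts = Counter()
    middles = defaultdict(list)              # frame -> intervening tokens
    for sent in sentences:
        for i in range(len(sent) - 2):
            frame = (sent[i], sent[i + 2])
            frame_counts[frame] += 1
            middles[frame].append(sent[i + 1])
    return [(f, n, middles[f]) for f, n in frame_counts.most_common(top_n)]

# Words that fall in the same frequent frame form one candidate category.
sents = [["you", "want", "it"], ["you", "see", "it"], ["you", "want", "to"]]
for (w1, w3), n, mids in frequent_frames(sents, top_n=2):
    print(f"{w1}__{w3}: {n} tokens, {len(set(mids))} types")
```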

6 Accuracy Results Averaged Over All Six Corpora (Mintz, 2003)

7 Structure of Natural Languages
In contemporary linguistics, sentences are analyzed as hierarchical structures, and word categories are defined by their positions in those structures.
But FFs are defined over linear sequences. How can they accurately capture abstract structural regularities?

8 Why are FFs so good at categorizing words?
Is there anything special about the structures associated with FFs?
FFs are manifestations of hierarchically coherent and consistent patterns that largely constrain the possible word categories in the target position.

9 Analysis 1
Corpora:
– The same six child-directed speech corpora from CHILDES (MacWhinney, 2000) as in Mintz (2003), labeled with dependency structures (Sagae et al., 2007)
– Speech to children before the age of 2;6
– Eve (Brown, 1973), Peter (Bloom, Hood, & Lightbown, 1974; Bloom, Lightbown, & Hood, 1975), Naomi (Sachs, 1983), Nina (Suppes, 1974), Anne (Theakston, Lieven, Pine, & Rowland, 2001), and Aran (Theakston et al., 2001)

10 Grammatical Relations
A dependency structure consists of grammatical relations (GRs) between words in a sentence. Like a phrase structure, it is a representation of structural information (Sagae et al., 2005).
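To make the GR representation concrete, here is a minimal sketch of one way a dependency-annotated utterance could be stored; the class name, field names, and the particular labels are illustrative assumptions, not the Sagae et al. annotation format itself.

```python
from dataclasses import dataclass

@dataclass
class WordGR:
    index: int   # 1-based position in the utterance
    form: str    # the word itself
    head: int    # index of the head word (0 = root)
    label: str   # grammatical relation to the head, e.g. SUBJ, OBJ, DET

# "you want it", with 'want' as the root; the labels are illustrative.
utterance = [
    WordGR(1, "you",  2, "SUBJ"),
    WordGR(2, "want", 0, "ROOT"),
    WordGR(3, "it",   2, "OBJ"),
]

# Relative to a target word (as in the slide 13 table), a GR can be
# re-expressed as (head position minus target position, label): in the
# frame you__it, "you" sits at relative position -1 with GR "0 SUBJ",
# because its head 'want' is the target word itself (relative position 0).
for w in utterance:
    print(w.form, w.label)
```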

11 Consistency of Structures of FFs
Method: each frame token spans three word positions, W1 W2 W3.
– Combinations of GRs used to represent structure: W1-W3, W1-W2, W2-W3, W1-W2-W3
– Measure: for each FF, the percentage of tokens accounted for by its 4 most frequent GR patterns (see the sketch below)
– Control: the 45 most frequent unigrams (FUs), e.g., the__
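A small sketch of the coverage measure, assuming each frame token has already been reduced to a GR-pattern value (e.g., the pair of GRs for W1 and W3); the function name and the example counts are hypothetical.

```python
from collections import Counter

def top_pattern_coverage(gr_patterns, k=4):
    """Fraction of a frame's tokens covered by its k most frequent GR patterns."""
    counts = Counter(gr_patterns)
    covered = sum(n for _, n in counts.most_common(k))
    return covered / len(gr_patterns)

# Hypothetical frame with five GR patterns across 100 tokens.
tokens = ["A"] * 50 + ["B"] * 30 + ["C"] * 10 + ["D"] * 6 + ["E"] * 4
print(top_pattern_coverage(tokens))   # 0.96 -- the top 4 patterns cover 96 tokens
```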

12 Results
* t(5) = 26.97, p < .001

13 Top 4 W1-W3 GR patterns
Frequent frame   GR of W1*   GR of W3*   Token count
what__you        2 OBJ       2 SUBJ      287
                 4 OBJ       2 SUBJ       46
                 5 OBJ       2 SUBJ       20
                 3 POBJ      2 SUBJ        5
you__to          0 SUBJ      2 INF       260
                 0 SUBJ      0 JCT        26
                 -2 SUBJ     2 INF         1
                 0 SUBJ      0 INF         1
what__that       0 PRED      0 SUBJ      216
                 0 PRED      2 DET        14
                 3 OBJ       2 DET         4
                 2 OBJ       2 SUBJ        4
you__it          0 SUBJ      0 OBJ       195
                 0 SUBJ      2 SUBJ        6
                 -2 OBJ      0 OBJ         2
                 -2 OBJ      2 SUBJ        1
*The word position and head position for GRs in this table are relative to the target word of the frame; W1's position is always -1 and W3's is always 1.

14 Analysis 1 Summary
Frequent frames in child-directed speech select very consistent structures, which helps them categorize words accurately.
Analysis 2 examines the internal organization of frequent frames.

15 Analysis 2
– Same corpora as Analysis 1
– GRs between a word in a frame and a word outside that frame (external links) vs. GRs between two words within a frame (internal links); links not involving a frame word are not counted
– For each FF type, the number of links per token was computed for each word position (a counting sketch follows below)
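A rough sketch of the link-counting step, under the simplifying assumptions that each word is given as an (index, head_index) pair and that only outgoing links (from a dependent to its head) are classified; the original analysis also distinguishes incoming links, and the example heads below are made up.

```python
def count_links(utterance, frame_start):
    """Classify outgoing dependency links for one frame token.

    `utterance` is a list of (index, head_index) pairs (1-based, head 0 = root);
    the frame occupies positions frame_start .. frame_start + 2.
    """
    frame = range(frame_start, frame_start + 3)
    internal = {p: 0 for p in frame}
    external = {p: 0 for p in frame}
    for index, head in utterance:
        if index in frame and head != 0:
            if head in frame:
                internal[index] += 1   # link stays inside the frame
            else:
                external[index] += 1   # link goes to a word outside the frame
    return internal, external

# "do you want it now", frame you__it at positions 2-4; heads are made up.
utt = [(1, 3), (2, 3), (3, 0), (4, 3), (5, 3)]
internal, external = count_links(utt, frame_start=2)
print(internal)   # {2: 1, 3: 0, 4: 1} -- W1 and W3 both attach to W2 ('want')
print(external)   # {2: 0, 3: 0, 4: 0}
```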

16 Links from/to W1

17 Conclusion & Implications
– Frequent frames, which are simple linear relations between words, achieve accurate categorization by selecting structurally consistent and coherent environments.
– The third word (W3) helps FFs focus on informative structures.
– This relation between a linear-order pattern and the internal structure of language may be a cue that lets children bootstrap into syntax.

18 References
– MacWhinney, B. (2000). The CHILDES Project: Tools for Analyzing Talk. Mahwah, NJ: Lawrence Erlbaum Associates.
– Mintz, T. H. (2003). Frequent frames as a cue for grammatical categories in child directed speech. Cognition, 90(1), 91-117.
– Sagae, K., Lavie, A., & MacWhinney, B. (2005). Automatic measurement of syntactic development in child language. In Proceedings of the ACL.
– Sagae, K., Davis, E., Lavie, A., MacWhinney, B., & Wintner, S. (2007). High-accuracy annotation and parsing of CHILDES transcripts. In Proceedings of the ACL-2007 Workshop on Cognitive Aspects of Computational Language Acquisition.
Thank you!

19 Pure frequent frames?

20 Analysis 2: mean token coverage
                   Eve    Peter   Nina   Naomi   Anne   Aran
Frequent frames
  W1-W3            0.96   0.87    0.94   0.93    0.92   0.89
  W1-W2            0.94   0.87    0.93   0.92           0.88
  W2-W3            0.92   0.85    0.91   0.88    0.90   0.82
  W1-W2-W3         0.89   0.80    0.89   0.86           0.79
Bigrams
  W1-W2            0.69   0.57    0.68   0.63    0.68   0.61

21 Analysis 2: FF external links
Table 3. Average number of links per token for frequent frames.
Corpus    Token count   to W1   to W2   to W3   from W1   from W2   from W3
Eve       3601          0.19    0.54    0.50    0.15      0.33      0.39
Peter     4541          0.28    0.71    0.44    0.25      0.30      0.52
Nina      6709          0.19    0.46    0.71    0.15      0.32      0.40
Naomi     1447          0.20    0.77    0.46    0.13      0.36      0.52
Anne      4435          0.24    0.50    0.54    0.18      0.32      0.43
Aran      5245          0.27    0.61    0.51    0.17      0.39      0.51
Average                 0.23    0.60    0.52    0.17      0.34      0.46

22 FF internal links
Internal links per token:
Corpus    Token count   W1->W2   W1->W3   W2->W1   W2->W3   W3->W1   W3->W2
Eve       3601          0.52     0.25     0.10     0.28     0.10     0.29
Peter     4541          0.44     0.21     0.16     0.27     0.13     0.20
Nina      6709          0.48     0.29     0.09     0.37     0.07     0.23
Naomi     1447          0.60     0.17     0.13     0.21     0.07     0.24
Anne      4435          0.41     0.29     0.17     0.34     0.12     0.17
Aran      5245          0.50     0.20     0.16     0.24     0.10     0.21
Average                 0.49     0.24     0.14     0.29     0.10     0.22

23 Analysis 2: FU links
                        External links                      Internal links
Corpus    Token count   to W1   to W2   from W1   from W2   W1->W2   W2->W1
Eve       28076         0.52    0.51              0.62      0.32     0.18
Peter     35723         0.65    0.48    0.53      0.66      0.28     0.20
Nina      37055         0.66    0.58    0.49      0.64      0.32     0.15
Naomi     12409         0.59    0.50    0.51      0.63      0.30     0.19
Anne      38681         0.52    0.48    0.44      0.62      0.36     0.16
Aran      49302         0.52    0.55    0.54      0.60      0.30     0.22
Average                 0.58    0.52    0.51      0.63      0.31     0.19

