1
Year Review Nancy Rader May 13, 2011
2
Research: Emotion and Working Memory, Temperament, Infant Perception, Attention and Early Language
3
Eye Tracking: A Window into the Mind of a Child
4
INFANT RESEARCH: Getting Their Attention through Gesture
The Problem: How do young infants discover that a segment of the sound stream refers to a particular aspect of the visual world around them?
6
Joint attention is accomplished through gaze following and point following. But these are not reliably available before 15 months of age.
7
Mother Introducing Object Name to Infant: 3-D Motion Analysis of Gesture (novel object name: “borky”)
Sahlstrom, A., Rader, N., & King, D. (2009). Measuring speech-gesture synchrony in mother-to-infant interactions. Presented at the biennial meeting of the Society for Research in Child Development, Denver, CO.
8
Tennessee… Tennessee… Look at the Borky…
9
Studying the Effects of Gesture on Early Word Learning in Infants 9-14 Months of Age*
Digital video technology was used to create the video segments for the two conditions so that they were identical except for the gesture.
Rader, N., & Zukow-Goldring, P. (2011, in press). Caregivers’ gestures direct infant attention during early word learning: The importance of dynamic synchrony. Language Sciences, 33(4).
*Research supported by a grant from NSF
10
Method
Participants: Infants from English-speaking homes in the Ithaca, NY area. Criterion for participation: able to look at “mommy” when asked.
– Study 1: 17 infants aged 9.8-13.9 months (M = 11.9)
– Study 2: 15 infants aged 9.6-14.7 months (M = 12.5)
Design: Within-subject design. Independent variable: type of gesture.
– Study 1: synchronous dynamic vs. static
– Study 2: synchronous dynamic vs. asynchronous dynamic
11
Hypotheses of Research
Study 1: Infants will pay more attention to an object at the time a word is said if a synchronous dynamic gesture, as compared with a static gesture, is used to highlight the object in coordination with the saying of the word. This directing of attention through a dynamic gesture will result in better word learning.
Study 2: Infants will pay more attention to an object at the time a word is said if the dynamic gesture is made in synchrony with the saying of the word, as compared with an asynchronous dynamic gesture. Use of the synchronous, dynamic gesture to direct attention will result in better word learning.
12
Apparatus
– 42” plasma screen for presentation of videos
– Car seat for infant
– Eye tracking hardware and software from Applied Science Laboratory
– Computational software from Eye Response Technology
– Sensor above the right eye to track head movements
A Teletubbies video clip was presented while the infant’s eye was captured by the eye tracking system.
13
Data Analysis: Look Zones for Learning Trials
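To make the look-zone analysis concrete, here is a minimal sketch of how gaze duration per zone could be computed from fixation data. The zone boundaries, field names, and sample fixations are hypothetical illustrations, not the coordinates or software actually used in the study.

```python
# Hypothetical sketch of a look-zone (area-of-interest) analysis:
# sum fixation time falling inside each rectangular zone.
# Zone bounds, sample data, and field names are illustrative only.

from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # horizontal gaze position (pixels)
    y: float          # vertical gaze position (pixels)
    duration: float   # fixation duration (seconds)

# Example rectangular look zones: (x_min, y_min, x_max, y_max)
LOOK_ZONES = {
    "object": (100, 300, 400, 600),
    "mouth":  (600, 350, 760, 450),
    "eyes":   (600, 200, 760, 300),
}

def gaze_duration_by_zone(fixations):
    """Total fixation time (seconds) inside each look zone."""
    totals = {zone: 0.0 for zone in LOOK_ZONES}
    for fix in fixations:
        for zone, (x0, y0, x1, y1) in LOOK_ZONES.items():
            if x0 <= fix.x <= x1 and y0 <= fix.y <= y1:
                totals[zone] += fix.duration
                break  # a fixation falls in at most one zone
    return totals

# Toy usage
fixations = [Fixation(250, 450, 0.8), Fixation(650, 400, 0.4), Fixation(50, 50, 0.3)]
print(gaze_duration_by_zone(fixations))  # {'object': 0.8, 'mouth': 0.4, 'eyes': 0.0}
```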
14
Study 1: Gaze Duration for Object during Naming
Infants’ gaze durations for the object while it was being named were significantly longer in the dynamic condition (M = 2.01, SD = 1.60) than in the static condition (M = .47, SD = .59), t(16) = 4.283, p < .001, η² = .53.
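For readers who want to reproduce this style of analysis, the sketch below shows a paired t-test with eta-squared computed as t²/(t² + df), which matches the reported .53 given the t and df above. The per-infant gaze durations in the arrays are placeholders, not the study’s data, and SciPy is only an assumed tool, not necessarily what the lab used.

```python
# Sketch of a paired t-test with an eta-squared effect size, in the style of
# the gaze-duration comparisons reported above. The per-infant gaze durations
# below are placeholder values, not the study's data.

import numpy as np
from scipy import stats

dynamic = np.array([2.4, 1.8, 0.9, 3.1, 2.2, 1.5])  # seconds looking at object (hypothetical)
static  = np.array([0.5, 0.3, 0.2, 1.1, 0.6, 0.4])

t_stat, p_value = stats.ttest_rel(dynamic, static)

# Eta-squared for a paired t-test: t^2 / (t^2 + df)
df = len(dynamic) - 1
eta_sq = t_stat**2 / (t_stat**2 + df)

print(f"t({df}) = {t_stat:.3f}, p = {p_value:.3f}, eta^2 = {eta_sq:.2f}")
```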
15
Study 1: Other Gaze Duration Data
Infants looked at the object significantly more when it was paired with a dynamic gesture (M = 7.08, SD = 3.44) than with a static gesture (M = 2.09, SD = 1.89), t(16) = 5.228, p < .001, η² = .63.
Infants’ total amount of time spent looking at the screen did not differ significantly between the synchronous dynamic condition (M = 16.28, SD = 5.21) and the static condition (M = 16.47, SD = 6.02).
16
Study 2: Gaze Duration for the Object during Naming
Infants’ total time looking at the object during the word was significantly longer in the synchronous condition (M = 1.76, SD = 1.17) than in the asynchronous condition (M = 1.11, SD = 1.02), t(14) = 1.915, p = .038, η² = .23.
17
Other Gaze Duration Data
Infants’ total amount of time spent looking at the screen did not differ significantly between the synchronous condition (M = 19.01, SD = 5.08) and the asynchronous condition (M = 19.95, SD = 5.95).
Infants’ total amount of time spent looking at the object did not differ significantly between the synchronous condition (M = 6.98, SD = 3.54) and the asynchronous condition (M = 6.97, SD = 3.89).
18
So, while movement directs attention toward the object, as seen in Study 1, it is the synchrony that directs looking at the object at the critical time for binding the sound of the word together with the sight of the object.
19
Test for Word Learning
20
Example of Looking during Test of Word Learning
Measure: After each request, the number of looks to each object was counted. The dependent measure for word learning was the ratio of total correct looks to total looks.
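As a small illustration of this dependent measure, the sketch below computes the ratio of correct looks to total looks for a single test trial; the look counts are hypothetical.

```python
# Sketch of the word-learning dependent measure described above: the ratio of
# looks to the correct (named) object over all looks. Look counts are hypothetical.

def correct_look_ratio(looks_to_target: int, looks_to_other: int) -> float:
    total = looks_to_target + looks_to_other
    return looks_to_target / total if total > 0 else float("nan")

# One infant, one test trial: 5 looks to the named object, 3 to the other object
print(correct_look_ratio(5, 3))  # 0.625
```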
21
Study 1: Word Learning Results
There was a significant difference between the two conditions, as shown by an ANCOVA with age as the covariate, F(1, 15) = 10.631, p = .003.
Younger infants benefited more than older infants from the dynamic gesture.
Infants 12.5 months or younger showed better word learning in the dynamic condition (M = .64, SD = .27) than in the static condition (M = .36, SD = .33), t(9) = 2.083, p = .034, η² = .16.
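One simple way to approximate an age-adjusted, within-subject comparison like this ANCOVA is to regress each infant’s dynamic-minus-static difference score on mean-centered age: the intercept then tests the condition effect at the mean age, and the age slope asks whether the dynamic-gesture benefit changes with age. The sketch below assumes statsmodels and made-up data values; it is not the analysis script actually used in the study.

```python
# Hypothetical sketch: age-adjusted test of the within-subject condition effect
# via a regression of the dynamic-minus-static difference score on centered age.
# Data values and column names are made up for illustration.

import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "age_months": [9.8, 10.5, 11.2, 12.0, 12.8, 13.5, 13.9],
    "dynamic":    [0.70, 0.66, 0.62, 0.55, 0.50, 0.48, 0.45],  # word-learning ratios
    "static":     [0.30, 0.35, 0.40, 0.45, 0.48, 0.50, 0.47],
})
data["benefit"] = data["dynamic"] - data["static"]             # dynamic advantage per infant
data["age_c"] = data["age_months"] - data["age_months"].mean()

model = smf.ols("benefit ~ age_c", data=data).fit()
print(model.summary())  # Intercept: condition effect at mean age; age_c: age moderation
```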
22
Study 2: Word Learning Results
A paired-samples t-test between the synchronous and asynchronous conditions showed significantly better word learning in the synchronous condition (M = .68, SD = .21) than in the asynchronous condition (M = .44, SD = .27), t(14) = 2.71, p = .01.
23
Attention to the Speaker’s Mouth
24
The Importance of Watching the Mouth for Early Language
Motor neurons? Developmental task: learning motor control to produce English phonemes.
25
Mean ratio of gaze duration for the speaker’s eyes to gaze duration for the speaker’s mouth = .286 (SD = .32). In other words, infants 9-14 months of age spent nearly 4 times as much time looking at the speaker’s mouth as looking at the speaker’s eyes.
Rader, N., & Zukow-Goldring, P. (2010). How the hands control attention during early word learning. Gesture, 10(2-3), 202-221.
26
Next Research Question
Does this attention to the mouth change with development? That is, is there a developmental shift from looking at the mouth to looking at the eyes? We hypothesized that there should be, given the necessity of looking at the speaker’s eyes both for gaze following and for the emotional information the eyes provide.
27
Preschool Study
Participants:
– Twenty typically developing children (10 males and 10 females) ranging in age from 19 to 49 months (M = 32.14, SD = 10.49)
– Fourteen typically developing infants (7 males and 7 females) ranging in age from 9 to 15 months (M = 12.01, SD = 1.54)
The procedure of the current study was the same as that of the previous infant study except:
– We used a different physical set-up to accommodate the size of the older children
– Eye fixations were measured using a system provided by the Mangold Corporation
Rader, N., Zukow-Goldring, P., Stuprich, E., & Rhoades, M. (2011, April). Looking away from the speaker’s mouth: A developmental shift from infancy to preschool. Presented at the biennial meeting of the Society for Research in Child Development, Montreal, Canada.
28
Hypothesis: While infants spent more time looking at the speaker’s mouth than at her eyes, preschoolers will spend more time looking at the speaker’s eyes than at her mouth.
Results: For the preschoolers, the mean ratio of gaze duration for the eyes to the mouth was 3.60 (SD = 4.96), while for the infants it was .286 (SD = .325); this difference was significant, t(32) = 2.485, p = .018.
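The group comparison reported here is an independent-samples t-test on the eyes-to-mouth gaze ratio; a minimal sketch follows, with placeholder ratio values rather than the study’s data. The pooled-variance option matches the df = n1 + n2 - 2 form of the reported t(32).

```python
# Sketch of the group comparison reported above: an independent-samples t-test
# on the eyes-to-mouth gaze-duration ratio for preschoolers vs. infants.
# The ratio values are placeholders, not the study's data.

import numpy as np
from scipy import stats

preschoolers = np.array([1.2, 5.5, 0.8, 3.0, 7.4, 2.1])  # eyes/mouth ratios (hypothetical)
infants      = np.array([0.2, 0.4, 0.1, 0.3, 0.5])

# equal_var=True gives the pooled-variance (Student) t-test, matching the
# df = n1 + n2 - 2 form of the reported result
t_stat, p_value = stats.ttest_ind(preschoolers, infants, equal_var=True)
print(f"t({len(preschoolers) + len(infants) - 2}) = {t_stat:.3f}, p = {p_value:.3f}")
```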
29
Current Eye Tracking Research
1. Filling in the developmental gap: testing infants 15-19 months of age
2. Conducting longitudinal research with infants beginning at 4 months of age
3. Testing atypically developing children, e.g., those with a diagnosis of ASD
30
Many thanks to Research Team 04!! Spring 2011