Presentation transcript:

Perceived Object Trajectory Is Influenced by Others’ Tracking Movements
Colin J. Palmer, Colin W.G. Clifford
Current Biology, Volume 27, Issue 14, Pages 2169-2176.e4 (July 2017). DOI: 10.1016/j.cub.2017.06.019. Copyright © 2017 Elsevier Ltd.

Figure 1 Effect of Gaze Cues on the Perception of Target Trajectory in Depth (A) A schematic view of the stimulus shown in Movie S1. The focal point of the avatar’s gaze follows the path of the black ellipse in either the clockwise or counterclockwise direction. The target object follows the path of the straight red line (in Movie S1), perpendicular to the observer’s direction of view. The target object does not change in depth relative to the viewer (as indicated by size cues), but despite this, it is typically perceived as moving in an elliptical path more consistent with the trajectory of gaze. (B) Quantifying the effect of gaze cues on perceived object trajectory. The path of the target object (as indicated by size cues) differed across trials between a straight line and a set of different elliptical paths that were up to 50% of the depth of the gaze trajectory, in either the congruent (positive) or incongruent (negative) direction of travel (see Movie S2). Participants indicated whether the target object was traveling in a clockwise or counterclockwise direction of elliptical motion. Logistic functions were fit to the data by minimizing the sum of squared errors. There was a shift toward more counterclockwise responses compared to baseline when the eyes were following a counterclockwise trajectory and a shift toward more clockwise responses compared to baseline when the eyes were following a clockwise trajectory. To quantify the magnitude of this effect, we calculated the difference between the midpoints of the psychometric functions for clockwise and counterclockwise gaze and halved this value to obtain the average effect of gaze trajectory on perceived target depth. Effect size = 12.02% (95% CI: 6.13, 17.90). (C) Within-subjects replication of (B), demonstrating that the illusion persists after extensive experience with the stimuli. Effect size = 15.13% (95% CI: 9.68, 22.13). At the individual level, all subjects differed between the clockwise and counterclockwise conditions in the expected direction at time 1 and time 2. (D) Between-subjects replication of (B), demonstrating that the illusion occurs in a wider pool of subjects. Effect size = 6.35% (95% CI: 2.12, 11.14). The smaller effect size in this sample may reflect differing degrees of task engagement between the sample in experiment 1 (experienced psychophysical observers) and the sample in experiment 2 (undergraduate students). At the individual level, 18 out of 23 subjects differed between the clockwise and counterclockwise conditions in the expected direction. A binomial test confirmed that this is a significantly greater proportion than that expected by chance, p < 0.05 (two-tailed).
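The effect size defined in (B) lends itself to a short computational illustration. The sketch below is not the authors' code: the response data, variable names, and the use of SciPy's curve_fit are all assumptions. It simply fits a logistic psychometric function by least squares for each gaze direction and takes half the difference between the fitted midpoints, as the caption describes.

```python
# Minimal illustrative sketch (not the authors' code): fit logistic psychometric
# functions to the proportion of "counterclockwise" responses at each ellipse
# depth, then take half the difference between the midpoints of the
# clockwise-gaze and counterclockwise-gaze curves as the effect size.
import numpy as np
from scipy.optimize import curve_fit

def logistic(depth, midpoint, slope):
    # Proportion of "counterclockwise" responses as a function of ellipse depth
    # (% of the gaze trajectory's depth; negative = incongruent direction).
    return 1.0 / (1.0 + np.exp(-slope * (depth - midpoint)))

def fit_midpoint(depths, p_ccw):
    # Least-squares fit (curve_fit minimizes the sum of squared errors);
    # returns the midpoint (50% point) of the fitted function.
    params, _ = curve_fit(logistic, depths, p_ccw, p0=[0.0, 0.1])
    return params[0]

# Hypothetical response data for illustration only.
depths = np.array([-50.0, -25.0, -10.0, 0.0, 10.0, 25.0, 50.0])
p_ccw_when_gaze_cw = np.array([0.02, 0.05, 0.15, 0.30, 0.55, 0.85, 0.98])
p_ccw_when_gaze_ccw = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97, 1.00])

pse_cw = fit_midpoint(depths, p_ccw_when_gaze_cw)
pse_ccw = fit_midpoint(depths, p_ccw_when_gaze_ccw)

# Half the difference between the two midpoints = average effect of gaze
# trajectory on perceived target depth (in % of the gaze trajectory's depth).
effect_size = (pse_cw - pse_ccw) / 2.0
print(f"Effect of gaze trajectory on perceived depth: {effect_size:.2f}%")
```

In this framing, a positive effect size means the perceived trajectory is pulled toward the direction of the avatar's gaze, matching the sign convention used for the values reported in (B)-(D).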

Figure 2 The Effect of Isolating the Stimulus Features that Were Tracking the Trajectory of Gaze (A) Animation stills for the full cues condition (containing both head and eye cues to gaze direction) and baseline condition (containing neither head nor eye cues to gaze direction) used in experiments 1 and 2. (B–F) Animation stills and sample data for the head only condition, eyes only condition, head-eyes opposed condition, non-face object condition, and local motion removed condition (experiment 3). n = 5 for each, within subjects. (G) The effect of gaze cues on perceived target trajectory across conditions. FC, full cues (experiment 1); HO, head only; EO, eyes only; HEO, head-eyes opposed; NFO, non-face object; LMR, local motion removed. Asterisk (∗) indicates a significant difference between conditions at the 95% level. Data are represented as the half-difference between the midpoints of the psychometric functions in clockwise and counterclockwise conditions, with 95% confidence intervals.

Figure 3 Distinguishing the Effect of Object-Level Motion and Low-Level Motion on Perceived Target Trajectory (A) Bird’s-eye view schematic of the stimuli shown in Movie S3. When concave and convex objects are rotated to track the movement of a target, the object-level motion occurs in the same direction for these different objects, while the low-level motion (i.e., local motion of the object’s textured surface) seen behind the target occurs in opposite directions. (B) Animation stills of the convex and concave non-face stimuli used in experiment 4. Animated examples of these stimuli are shown side by side for comparison in Movie S3. (C) Data for the convex and concave stimulus conditions (n = 5 for each). The effect of the inducer on perceived target trajectory occurred in the same direction across these two conditions, despite the opposite directions of low-level motion. (D) The effect of the inducer on perceived target trajectory across conditions. Data are represented as the half-difference between the midpoints of the psychometric functions for clockwise and counterclockwise conditions, with 95% confidence intervals. The effect size in the convex non-face object condition was significantly greater than in the concave non-face object condition (∗; difference = 7.31%, 95% CI: 1.42, 15.70), but the effect of the inducer was significantly different from zero in both conditions (and, most importantly, occurred in the same direction in both conditions).

Figure 4 Effect of Gaze Cues on the Perception of Apparent Motion and the Perception of Causal Interactions between Moving Objects (A) Apparent motion displays that incorporate gaze cues to the direction of apparent motion. The display alternated between the two upper frames or the two lower frames, corresponding to vertical and horizontal shifts in gaze direction, respectively. When gaze cues are absent, the spheres appear to move either vertically or horizontally between frames, and the perception of motion direction tends to be bistable. See Movie S4. (B) In experiment 5, participants indicated whether the spheres were moving vertically or horizontally between frames. The aspect ratio of the sphere positions differed across trials such that there was either greater horizontal separation between the sphere positions (positive log aspect ratio) or greater vertical separation between the sphere positions (negative log aspect ratio). This allowed us to quantify the effect of gaze cues on the apparent motion of the target spheres. In the baseline condition, the spheres were shown in the absence of gaze cues; specifically, the avatar was stationary across frames, and its eyes were occluded by sunglasses, similar to the display illustrated in Figure 2A, while the position of the spheres differed between frames as usual. (C) In experiment 6, the avatar’s gaze either followed the target dot to the opposite side of the screen (implying that the dots crossed) or back to the same side of the screen (implying that the dots bounced off one another). See Movie S5. The images shown are frames 25%, 50%, and 75% into the animation. (D) Participants reported whether the dots appeared to bounce off one another or cross past one another. The cues provided by the avatar differed across a series of conditions (see STAR Methods for details; see Figure S1 in the Supplemental Information for illustrations of each condition). B, baseline; FC, full cues; HO, head only; EO, eyes only; NFO, non-face object; FC-L, full cues with local motion removed; NF-L, non-face object with local motion removed; HEO, head-eyes opposed; MH, multiple heads; CAT, catch trials. In catch trials, the avatar provides no cues to target trajectory, but “bouncing” and “crossing” trajectories are indicated by dot color. Data are represented as mean ± 1 SEM. (E) The effect of gaze cues, quantified as the difference in mean responses between trials in which the eyes followed a bounce trajectory and trials in which they followed a crossing trajectory, was significant for all but the HEO condition. Error bars indicate 95% confidence intervals.
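The quantification described in (E) amounts to a simple difference of condition means. The sketch below is a hypothetical illustration, not taken from the paper: the per-subject proportions and the percentile-bootstrap confidence interval are assumptions about one plausible way to compute that difference and an accompanying 95% CI.

```python
# Minimal illustrative sketch (not the authors' analysis code): quantify the
# gaze-cue effect in the bounce/cross task as the difference in the mean
# proportion of "bounce" reports between bounce-implying and cross-implying
# gaze trajectories, with a simple percentile bootstrap for a 95% CI.
import numpy as np

rng = np.random.default_rng(seed=0)

def gaze_cue_effect(bounce_rate_bounce_gaze, bounce_rate_cross_gaze, n_boot=10000):
    # Inputs: per-subject proportions of "bounce" reports in each gaze condition.
    b = np.asarray(bounce_rate_bounce_gaze, dtype=float)
    c = np.asarray(bounce_rate_cross_gaze, dtype=float)
    effect = b.mean() - c.mean()
    # Percentile bootstrap over subjects (resampling with replacement).
    boot = np.array([
        rng.choice(b, size=b.size).mean() - rng.choice(c, size=c.size).mean()
        for _ in range(n_boot)
    ])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    return effect, (lo, hi)

# Hypothetical per-subject data for illustration only.
bounce_gaze = [0.85, 0.90, 0.70, 0.95, 0.80]
cross_gaze = [0.30, 0.25, 0.40, 0.20, 0.35]
effect, ci = gaze_cue_effect(bounce_gaze, cross_gaze)
print(f"Gaze-cue effect: {effect:.2f} (95% CI: {ci[0]:.2f}, {ci[1]:.2f})")
```

An effect near zero, as in a confidence interval that spans zero, would correspond to the non-significant HEO condition noted in (E).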