Learning Sensorimotor Contingencies James J. Clark Centre for Intelligent Machines McGill University

This work is being done in collaboration with: J. Kevin O’Regan (CNRS, Univ. Rene Descartes) and with doctoral students at McGill University: Fatima Drissi-Smaili Ziad Hafed Muhua Li

A mystery: Why do we perceive the same feature value (e.g. color) when viewing the feature foveally or peripherally? Why is this a mystery? The signal provided by retinal photoreceptors can be quite different when the image of the feature falls on different places on the retina. For example, the spectral sensitivity curves of retinal photoreceptors are shifted towards the blue in peripheral cells as compared with foveal cells.

A related mystery (perhaps…): Why do neurons in areas such as V4 and IT, which have large receptive fields, respond to the same feature value (e.g. color, orientation, complex shape) no matter where the feature lies in the receptive field? The activity of these neurons is usually reduced when the feature falls in the periphery of the receptive field as compared with the center, but the neuron’s selectivity, or tuning, is the same everywhere.

Perceptual Stability These mysteries can be more generally considered as related to the mystery of perceptual stability. Perceptual stability is the constancy of subjective experience across self-actions, even though these self-actions can cause large changes in sensory inputs.

Sensorimotor Contingencies One theory of perceptual stability, due to O’Regan and Noë, holds that what is perceived is the sensorimotor contingency associated with a given physical stimulus. A sensorimotor contingency is a law or set of laws that describes the relation between self-actions and resulting changes in sensory input. Since it is the presence of a lawful relationship between sensory input and motor activity that determines the perception of a physical stimulus, an appropriate change in sensory input is necessary for a perception to be stable!
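As a concrete illustration (ours, not the authors’), consider the contingency governing image intensity under a small eye rotation over a rigid, distant scene: the post-saccadic retinal image is, to a first approximation, the pre-saccadic image shifted by the eye-movement vector. A minimal sketch:

```python
import numpy as np

# Toy sensorimotor contingency (an illustrative assumption, not the authors'
# model): for a small eye rotation over a rigid, distant scene, the new
# retinal image is approximately the old image shifted by the eye movement.
def predicted_image_after_saccade(image, shift_rows, shift_cols):
    """Predict the post-saccadic image from the pre-saccadic image and the
    eye-movement vector (expressed in pixels of the retinal sampling grid)."""
    return np.roll(image, (shift_rows, shift_cols), axis=(0, 1))
```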

Conditioning using Temporal-Difference Learning We propose that Sensorimotor Contingencies associated with sensory changes due to eye movements can be learned using a variety of learning techniques. We propose the use of the Temporal-Difference Learning scheme of Sutton and Barto. This reinforcement learning technique can be thought of as a form of Conditioning where the Conditioned Stimulus is the sensory activity before the eye movement and the Unconditioned Stimulus is the sensory activity after the eye movement. After conditioning, presentation of the conditioned stimulus will produce the same behaviour as that produced by the unconditioned stimulus.

The Sutton-Barto Temporal-Difference Learning Rule

V is a matrix of association strengths between pre- and post-saccadic stimuli. The pre-motor stimulus X is held in a short-term memory, generating an eligibility trace, which will be used to enhance, in a Hebbian fashion, the association to the post-motor stimulus. The reinforcement signal, which is multiplied by the eligibility trace to yield the change in the association matrix, is the difference between two predictions of the foveal response: a weighted sum of the current and previous foveal responses, and the action of the current association matrix on the previous peripheral stimulus.
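A minimal numerical sketch of the update described above, in NumPy. The variable names, the learning rate, the trace decay, and the exact weighting of the current and previous foveal responses are our assumptions; the slide’s original equation is not reproduced here.

```python
import numpy as np

def update_eligibility(elig, periph_now, decay=0.8):
    """Short-term memory (eligibility trace) of the pre-motor peripheral stimulus X."""
    return decay * elig + periph_now

def td_update(V, periph_prev, elig, fovea_now, fovea_prev, alpha=0.1, gamma=0.9):
    """One Sutton-Barto-style temporal-difference step for the association matrix V.

    V           : association strengths (foveal dim x peripheral dim)
    periph_prev : previous peripheral (pre-saccadic) stimulus
    elig        : eligibility trace of the pre-motor stimulus
    fovea_now   : current foveal (post-saccadic) response
    fovea_prev  : previous foveal response
    """
    target = fovea_now + gamma * fovea_prev   # weighted sum of current and previous foveal responses
    predicted = V @ periph_prev               # current association matrix acting on the previous peripheral stimulus
    reinforcement = target - predicted        # reinforcement (TD error) signal
    # Hebbian-style change: reinforcement signal multiplied by the eligibility trace
    return V + alpha * np.outer(reinforcement, elig)
```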

Attention selects a peripheral target and enhances feature detector activity at that location. TRAINING PHASE

A short-term memory (eligibility trace) of this feature activity is generated. TRAINING PHASE

An eye movement is made, foveating the target. TRAINING PHASE

Attention shifts to the fovea, enhancing the feature detector activity there. TRAINING PHASE

The feature detector activity at the fovea is associated with the feature detector activity represented in the short-term memory, using an appropriate learning rule, e.g. the Sutton-Barto Temporal Difference Rule. TRAINING PHASE
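Putting the five training-phase steps together, a hedged sketch of one training iteration, reusing td_update and update_eligibility from the earlier sketch; get_peripheral_features, make_saccade and get_foveal_features are hypothetical placeholders for the attentional and oculomotor machinery, not functions from the authors’ implementation.

```python
def training_step(V, elig, F_prev, world, target_location):
    """One pass through the training phase (helper names are placeholders)."""
    X_pre = get_peripheral_features(world, target_location)  # attention enhances the peripheral detectors
    elig = update_eligibility(elig, X_pre)                    # short-term memory (eligibility trace)
    make_saccade(target_location)                             # eye movement foveates the target
    F_now = get_foveal_features(world)                        # attention shifts to the fovea
    V = td_update(V, X_pre, elig, F_now, F_prev)              # associate foveal with remembered peripheral activity
    return V, elig, F_now
```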

Once associations have been built up, the appearance of an attended-to target in the periphery can produce a response as though the target had actually been foveated. This response can be thought of as a mental image. This mental image might be represented by activity of neurons in areas with large receptive fields (V4, IT) and hence would be concerned only with feature type rather than feature location. This provides an explanation for the continuity in the quality of the subjective experience of a stimulus across the visual field. RECOGNITION PHASE

We have divided the processing into two separate phases, Training and Recognition. In practice, however, these can co-occur. The learning mechanism can be continuous, allowing adaptation to changes in the sensory and motor systems (e.g. aging of the photoreceptors, changes in the projective optics of the eye, …). STEADY-STATE OPERATION

Creation of “Mental Images” Once the association weights matrix, V, has been learned, it can be used to generate predictions, M, of what the foveal image or feature detector response will look like, based on the peripheral responses, P: M = V*P. It is expected that the association matrix should map foveal images into themselves; therefore, the eigenvectors of this matrix should be (linear combinations of) the foveal images: F = kV*F.
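In code, under the assumption that the foveal and peripheral feature vectors have the same dimension (so that V is square), the two relations above amount to the following; V_learned and P are names we introduce here, not the authors’.

```python
import numpy as np

# V_learned is the learned association matrix; P is a peripheral response.
M = V_learned @ P                           # M = V*P: predicted foveal response (the "mental image")

# If V maps foveal images (approximately) into themselves, foveal images should
# be recoverable as linear combinations of its eigenvectors (F = k V F):
eigenvalues, eigenvectors = np.linalg.eig(V_learned)
```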

AN EXAMPLE: STABILITY OF COLOR PERCEPTION Many factors, including absorption of light by the lens of the eye, cause a yellowing of the light falling on the fovea as compared with that falling on the periphery.

After training, the presentation of a given color feature in the periphery is associated with the color feature that would be observed after the feature is foveated with an eye movement. This can be seen in the structure of the association weights matrix, where peripheral and foveal color features map to the same color class.
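A toy numerical illustration of the idea (the numbers are made up, not data from the model): a single peripheral/foveal response pair and a rank-one association that maps the bluer peripheral response onto the yellowed foveal one, so both are assigned the same color class.

```python
import numpy as np

P_periph = np.array([0.2, 0.5, 0.9])   # hypothetical peripheral (bluer) cone-like responses
F_fovea  = np.array([0.3, 0.5, 0.6])   # hypothetical foveal (yellowed) responses to the same surface

# Rank-one association trained on this single pair:
V = np.outer(F_fovea, P_periph) / (P_periph @ P_periph)

# The peripheral response is now mapped to the foveal response:
print(np.allclose(V @ P_periph, F_fovea))   # True
```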

ANOTHER EXAMPLE: STABILITY OF STRAIGHT LINE PERCEPTION The retina is hemispherical, and this causes straight lines in space to be projected as 2-D arcs on the retinal surface, with radii of curvature that vary with eccentricity.
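A rough geometric sketch of this projection, assuming a simplified single-nodal-point eye model (our assumption): points on a straight line in space are projected through the nodal point onto a spherical retinal surface, giving a curved retinal image.

```python
import numpy as np

def project_to_retina(points, retina_radius=1.0):
    """Project 3-D points (N x 3, eye-centred coordinates, in front of the eye)
    through the nodal point at the origin onto a sphere of the given radius."""
    directions = points / np.linalg.norm(points, axis=1, keepdims=True)
    return retina_radius * directions

# A straight line in space, offset from the line of sight:
t = np.linspace(-1.0, 1.0, 50)
line = np.stack([t, np.full_like(t, 0.5), np.full_like(t, 2.0)], axis=1)
retinal_curve = project_to_retina(line)   # a curved arc on the retinal surface
```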

[Figure: images of straight lines at various eccentricities, projected onto the receptors]

It can be seen that the “mental images” are all very close to the foveal images, no matter where on the periphery the projection of the physical line falls. The eigenvectors are not equal to the foveal images, but the foveal images can be obtained from them through a linear sum.

Development of Position Invariance in Neural Responses: Standard View. Feature detectors with differing preferred stimuli (corresponding to the photoreceptor responses to a stable physical stimulus as the eye moves) feed the position-invariant cell; which feature detectors are connected to the cell must be learned (and continually adapted). It is unclear how this development would proceed without some sort of adaptation signal coming from the need for constancy of response across self-actions (e.g. eye movements).

Development of Position Invariance in Neural Responses: Alternate View. Feature detectors with differing preferred stimuli (corresponding to the photoreceptor responses to a stable physical stimulus as the eye moves) project to an association layer, whose output is the “mental image” (a prediction of the foveal response). The weightings of the lower-level units are continually updated through the associative learning mechanism. This mechanism requires an eye movement signal from the oculomotor system to know when an eye movement has taken place.
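A minimal sketch of this alternate view, reusing the td_update and update_eligibility sketches from earlier; the saccade_occurred flag stands in for the eye movement signal from the oculomotor system and is an assumption of this sketch.

```python
def associate_and_predict(V, elig, X_pre, F_now, F_prev, saccade_occurred):
    """Update the association layer only when an eye movement is signalled,
    and return the 'mental image' (prediction of the foveal response)."""
    if saccade_occurred:                       # eye movement signal gates the learning
        V = td_update(V, X_pre, elig, F_now, F_prev)
    mental_image = V @ X_pre                   # prediction of the foveal response
    return V, mental_image
```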

Conclusions Perceptual stability and the position invariance of higher-level cortical neurons may arise from the learning of sensorimotor contingencies. Such learning can be accomplished with a reinforcement learning network, which learns to predict the lower-level visual feature detector activity that would occur after foveation of a physical stimulus. In our view, a projection of a physical stimulus onto any peripheral retinal location will result in the same “mental image” of the feature as projection onto the fovea.

On-going and Future Research
* Recurrent feedback of predictions back down to low-level feature detectors - will allow small displacements of the foveal image
* Interpretation of the reinforcement signal - a small signal can be used to drive adaptation; a large signal can be used to indicate instability of the world or to indicate that a new class should be created
* Psychophysical studies of pre- and post-motor attention shifts
* Sensorimotor basis function representations of the association weights matrix