
1

2 Abstract
This talk summarizes our recent attempts to integrate action and perception within a single optimization framework. We start with a statistical formulation of Helmholtz's ideas about neural energy, to furnish a model of perceptual inference and learning that can explain a remarkable range of neurobiological facts. Using constructs from statistical physics, it can be shown that the problems of inferring the causes of our sensory inputs and of learning regularities in the sensorium can be resolved using exactly the same principles. Furthermore, inference and learning can proceed in a biologically plausible fashion. The ensuing scheme rests on empirical Bayes and hierarchical models of how sensory information is generated. The use of hierarchical models enables the brain to construct prior expectations in a dynamic and context-sensitive fashion. This scheme provides a principled way to understand many aspects of the brain's organization and responses. We will demonstrate the brain-like dynamics that this scheme entails using models of bird songs that are based on chaotic attractors with autonomous dynamics. This provides a nice example of how nonlinear dynamics can be exploited by the brain to represent and predict dynamics in the sensorium.
Attractors in Song: Cognitive Neuroimaging Seminar, Monday 18th

3 Overview
- Inference and learning under the free energy principle
- Hierarchical Bayesian inference
- Bird songs (inference)
- Perceptual categorisation
- Prediction and omission
- Bird songs (learning)
- Repetition suppression
- The mismatch negativity
- A simple experiment

4 Exchange with the environment
[Figure: an agent m and its environment, separated by a Markov blanket; external states influence internal states via sensation, and internal states influence external states via action.]

5 The free-energy principle
- Action to minimise a bound on surprise
- Perception to optimise the bound
- The conditional density and separation of scales
- Perceptual inference
- Perceptual learning
- Perceptual uncertainty

6 Mean-field partition
- Synaptic activity: perception and inference
- Synaptic efficacy: learning and memory (activity-dependent plasticity, functional specialization)
- Synaptic gain: attention and uncertainty (attentional gain, enables plasticity)

7 Hierarchical (deep) dynamic models
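The idea of a hierarchical dynamic model can be sketched numerically. Below is a minimal two-level toy (not the talk's actual generative model): a slow hidden state at the top level acts as a cause for fast dynamics at the level below, which in turn generate noisy sensory data. All equations and constants are illustrative assumptions.

```python
import numpy as np

def simulate_hierarchy(T=100, dt=0.1, seed=0):
    """Toy two-level hierarchical dynamic model.

    Level 2: slow hidden state v, evolving autonomously.
    Level 1: fast hidden state x, driven by the top-down cause v.
    Sensory data y are a noisy mapping of x.
    """
    rng = np.random.default_rng(seed)
    v = 1.0   # level-2 (slow) state
    x = 0.0   # level-1 (fast) state
    ys = []
    for _ in range(T):
        v += dt * (-0.05 * v)                 # slow decay at the top level
        x += dt * (-1.0 * x + v)              # level 1 driven by the cause v
        y = x + 0.01 * rng.standard_normal()  # noisy sensory sample
        ys.append(y)
    return np.array(ys)

y = simulate_hierarchy()
print(y.shape)  # (100,)
```

The separation of timescales (slow v, fast x) is what lets higher levels supply dynamic, context-sensitive priors for lower levels.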

8 Empirical Bayes and DEM
Recurrent message passing among neuronal populations (bottom-up, top-down and lateral), with top-down predictions changing to suppress bottom-up prediction error.
- D-Step, perceptual inference: recurrent message passing to suppress prediction error
- E-Step, perceptual learning: associative plasticity, modulated by precision
- M-Step, perceptual uncertainty: encoding of precision through classical neuromodulation or plasticity of lateral connections
Friston K, Kilner J, Harrison L. A free energy principle for the brain. J. Physiol. Paris, 2006.
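The D-step (inference) can be illustrated with a one-cause sketch: estimates change by gradient descent on precision-weighted squared prediction error, so top-down predictions come to suppress bottom-up error. The generative mapping y = 2v and all precisions here are assumptions for illustration, not the DEM scheme itself.

```python
def recognise(y, prior_v=0.0, pi_y=1.0, pi_v=1.0, lr=0.1, steps=200):
    """Infer a cause v from data y by descending precision-weighted
    squared prediction error -- predictive coding in miniature.
    Assumed model: y = 2*v + noise, prior v ~ N(prior_v, 1/pi_v)."""
    v = 0.0
    for _ in range(steps):
        eps_y = y - 2.0 * v   # sensory prediction error
        eps_v = v - prior_v   # prior prediction error
        # gradient step: predictions change to suppress prediction error
        v += lr * (pi_y * 2.0 * eps_y - pi_v * eps_v)
    return v

v_hat = recognise(y=4.0)
print(round(v_hat, 2))  # 1.6: a compromise between data (y/2 = 2) and prior (0)
```

Note how the fixed point balances the likelihood and the prior in proportion to their precisions, which is exactly the role the M-step's precision estimates play.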

9 Message passing in neuronal hierarchies
[Figure: cortical laminae (supragranular SG, layer 4 L4, infragranular IG), with backward connections conveying predictions and forward connections conveying prediction error.]

10 Overview
- Inference and learning under the free energy principle
- Hierarchical Bayesian inference
- Bird songs (inference)
- Perceptual categorisation
- Prediction and omission
- Bird songs (learning)
- Repetition suppression
- The mismatch negativity
- A simple experiment

11 Synthetic song-birds
[Figure: a vocal centre driving a syrinx, with the resulting sonogram (frequency over 0–1.5 seconds).]
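The synthetic songs are generated from a chaotic attractor with autonomous dynamics. A minimal sketch: integrate a Lorenz system and map its states to the frequency and amplitude driving a syrinx-like output. The particular mapping of states to frequency and amplitude, and all constants, are illustrative choices.

```python
import numpy as np

def lorenz_song(T=2000, dt=0.01):
    """Synthetic 'song': a Lorenz attractor whose states are mapped to
    the frequency (Hz) and amplitude of a syrinx-like output."""
    x, y, z = 1.0, 1.0, 20.0
    freqs, amps = [], []
    for _ in range(T):
        dx = 10.0 * (y - x)
        dy = x * (28.0 - z) - y
        dz = x * y - (8.0 / 3.0) * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz  # Euler step
        freqs.append(2000.0 + 60.0 * z)  # map z to frequency (roughly the sonogram band)
        amps.append(abs(x))              # map |x| to amplitude
    return np.array(freqs), np.array(amps)

f, a = lorenz_song()
```

Because the attractor is deterministic but chaotic, the song has structured, repeating motifs without ever repeating exactly, which is what makes it a useful stimulus for testing recognition.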

12 Hierarchical recognition
[Figure: stimulus sonogram (2000–5000 Hz over ~1 second) alongside the recovered hidden states, causal states, and predictions and prediction errors over time; backward connections convey predictions, forward connections convey prediction error.]

13 Perceptual categorization
[Figure: sonograms (2000–5000 Hz over ~1 second) of three songs, A, B and C.]

14 Overview
- Inference and learning under the free energy principle
- Hierarchical Bayesian inference
- Bird songs (inference)
- Perceptual categorisation
- Prediction and omission
- Bird songs (learning)
- Repetition suppression
- The mismatch negativity
- A simple experiment

15 Sequences of sequences
[Figure: a neuronal hierarchy driving a syrinx, with the resulting sonogram (frequency in kHz over 0–1.5 seconds).]
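"Sequences of sequences" can be sketched by coupling two attractors across timescales: a slowly evolving Lorenz system sets a control parameter of a fast one, so fast motifs are themselves sequenced by slower dynamics. The coupling (slow z sets the fast system's Rayleigh parameter) and the step sizes are illustrative assumptions.

```python
import numpy as np

def euler_lorenz(state, rho, dt):
    """One Euler step of a Lorenz system with Rayleigh parameter rho."""
    x, y, z = state
    return (x + dt * 10.0 * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - (8.0 / 3.0) * z))

def sequences_of_sequences(T=3000, dt_fast=0.005, dt_slow=0.0005):
    """A slow Lorenz system modulates the control parameter of a fast
    one: the fast attractor's 'chirps' are sequenced by slow dynamics."""
    slow = (1.0, 1.0, 20.0)
    fast = (1.0, 1.0, 20.0)
    out = []
    for _ in range(T):
        slow = euler_lorenz(slow, 28.0, dt_slow)  # slow timescale
        rho = 20.0 + 0.5 * slow[2]                # top-down control of the fast level
        fast = euler_lorenz(fast, rho, dt_fast)   # fast timescale
        out.append(fast[2])
    return np.array(out)

s = sequences_of_sequences()
```

This is the same separation of timescales as in the hierarchical model: the slow level supplies a dynamic, context-sensitive prior on what the fast level will do next.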

16 Hierarchical perception
[Figure: stimulus and percept sonograms (2000–5000 Hz over ~1.5 seconds), with predictions, prediction errors, causes and hidden states at each level over time.]
Prediction error is encoded by superficial pyramidal cells that generate ERPs.

17 … omitting the last chirps
[Figure: predictions, prediction errors, hidden states and level-2 causes over time when the last chirps of the song are omitted.]

18 Omission and violation of predictions
[Figure: stimulus sonogram and percept (2500–5000 Hz over ~1.5 s) with simulated LFPs (micro-volts, 0–2000 ms peristimulus time): the ERP (error) without the last syllable and with omission. The panels contrast a stimulus without a percept against a percept without a stimulus.]

19 Overview
- Inference and learning under the free energy principle
- Hierarchical Bayesian inference
- Bird songs (inference)
- Perceptual categorisation
- Prediction and omission
- Bird songs (learning)
- Repetition suppression
- The mismatch negativity
- A simple experiment

20 Repetition suppression and the MMN
Suppression of inferotemporal responses to repeated faces (main effect of faces; Henson et al., 2000).
The MMN is an enhanced negativity seen in response to any change (deviant), relative to the response to the standard.

21 Hierarchical learning
[Figure: sonogram (2000–5000 Hz over ~0.4 s) with hidden states, causes, and predictions and prediction errors over time, under synaptic adaptation and changing synaptic efficacy.]

22 Simulating ERPs to repeated chirps
- Perceptual inference: suppressing error over peristimulus time
- Perceptual learning: suppression over repetitions
[Figure: for each of six presentations, the percept sonogram (2000–5000 Hz), hidden states, and prediction error as a simulated LFP (micro-volts, 0–300 ms peristimulus time).]
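Suppression over repetitions can be reduced to a one-parameter sketch: a model parameter (the 'synaptic efficacy') is updated after each presentation of the same chirp, so the prediction error, the analogue of the evoked response, shrinks with every repetition. The noise-free stimulus and the learning rate are illustrative assumptions.

```python
def repeated_presentations(n_reps=5, lr=0.4):
    """Learn the parameter of a trivial generative model across repeated
    stimuli; squared prediction error (the ERP analogue) falls with each
    repetition -- repetition suppression in miniature."""
    true_theta = 3.0   # the regularity to be learned
    theta = 0.0        # 'synaptic efficacy' before learning
    errors = []
    for _ in range(n_reps):
        y = true_theta        # the repeated (noise-free) chirp
        eps = y - theta       # prediction error on this presentation
        errors.append(eps ** 2)
        theta += lr * eps     # associative plasticity after each trial
    return errors

e = repeated_presentations()
print(e[0] > e[-1])  # True: error is suppressed over repetitions
```

The same logic, run at two hierarchical levels with precision (gain) also adapting, is what yields the synthetic MMN on the next slide: a deviant after learning elicits a large error at the higher level.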

23 Synthetic MMN
[Figure: prediction error at the primary and secondary levels, from the first presentation (before learning) to the last (after learning) over five presentations; changes in parameters (synaptic efficacy) and hyperparameters (synaptic gain); difference waveforms at the primary level (N1/P1) and secondary level (MMN) over 0–400 ms peristimulus time.]

24 Synthetic and real ERPs
[Figure: repetition effects over five presentations on synaptic efficacy (changes in parameters) and synaptic gain (hyperparameters), mapped onto a network with subcortical input to A1 and STG; intrinsic and extrinsic connection strengths change with repetition.]

25 Overview
- Inference and learning under the free energy principle
- Hierarchical Bayesian inference
- Bird songs (inference)
- Perceptual categorisation
- Prediction and omission
- Bird songs (learning)
- Repetition suppression
- The mismatch negativity
- A simple experiment

26 A brain imaging experiment with sparse visual stimuli V2 V1 Angelucci et al Coherent and predicable Random and unpredictable top-down suppression of prediction error when coherent? CRF V1 ~1 o Horizontal V1 ~2 o Feedback V2 ~5 o Feedback V3 ~10 o Classical receptive field V1 Extra-classical receptive field Classical receptive field V2 ?

27 Suppression of prediction error with coherent stimuli
[Figure: regional responses (90% confidence intervals) in V1, V2 and V5/pCG to random, stationary and coherent stimuli, showing decreases and increases.]
Harrison et al., NeuroImage, 2006.

28 Predictions about the brain
This model of brain function explains a wide range of anatomical and physiological facts: for example, the hierarchical deployment of cortical areas, and recurrent architectures using forward and backward connections with their functional asymmetries (Angelucci et al., 2002). In terms of synaptic physiology, it predicts Hebbian or associative plasticity and, for dynamic models, spike-timing-dependent plasticity. In terms of electrophysiology, it accounts for classical and extra-classical receptive field effects and for long-latency or endogenous components of evoked cortical responses (Rao and Ballard, 1998). It predicts the attenuation of responses encoding prediction error with perceptual learning, and explains phenomena such as repetition suppression, the mismatch negativity and the P300 in electroencephalography. In psychophysical terms, it accounts for the behavioural correlates of these physiological phenomena, e.g. priming and global precedence.

29 Summary
- A free energy principle can account for several aspects of action and perception.
- The architecture of cortical systems speaks to hierarchical generative models.
- Estimation of hierarchical dynamic models corresponds to a generalised deconvolution of inputs to disclose their causes.
- This deconvolution can be implemented in a neuronally plausible fashion by constructing a dynamic system that self-organises, when exposed to inputs, to suppress its free energy.

30 Thank you
And thanks to collaborators: Jean Daunizeau, Lee Harrison, Stefan Kiebel, James Kilner, Klaas Stephan.
And colleagues: Peter Dayan, Jörn Diedrichsen, Paul Verschure, Florentin Wörgötter.

