
Dean's Lecture (Reception 5.30-6 PM, Lecture 6-7 PM, Lecture Theatre S1, Clayton Campus, Monash University). Models, maps and modalities in brain imaging. Karl Friston, Wellcome Centre for Neuroimaging, UCL.


1

2 Dean's Lecture (Reception 5.30-6 PM, Lecture 6-7 PM, Lecture Theatre S1, Clayton Campus, Monash University). Models, maps and modalities in brain imaging. Karl Friston, Wellcome Centre for Neuroimaging, UCL.
Abstract: This talk summarizes recent attempts to integrate action and perception within a single optimization framework. We start with a statistical formulation of Helmholtz's ideas about neural energy to furnish a model of perceptual inference and learning that can explain a remarkable range of neurobiological facts. The ensuing scheme rests on Empirical Bayes and hierarchical models of how sensory information is generated. The use of hierarchical models enables the brain to construct prior expectations in a dynamic and context-sensitive fashion. Crucially, this rests upon neuronal message passing that speaks to specific asymmetries and plasticity in forward (bottom-up) and backward (top-down) connections in the brain. We will then look at how advanced neuroimaging techniques have been used to confirm predictions about distributed processing (message passing) in the human brain and to disclose key aspects of its functional architecture. We will use fMRI to address a fundamental question about the relative role of backward (top-down) connections in visual processing, and the mismatch negativity paradigm to illustrate the utility of electromagnetic (MEG) brain signals for addressing questions about plasticity and sensory learning.

3 From the Helmholtz machine to the Bayesian brain and self-organization
“Objects are always imagined as being present in the field of vision as would have to be there in order to produce the same impression on the nervous mechanism” - Hermann Ludwig Ferdinand von Helmholtz
[Pictured on the slide: Thomas Bayes, Hermann von Helmholtz, Geoffrey Hinton, Richard Feynman, Hermann Haken]

4 Overview
What does the brain do? Entropy and equilibria; free-energy and surprise; the free-energy principle; action and perception; avoiding surprise.
Perception: birdsong and categorization; a simple fMRI experiment.
Learning: repetition suppression; mismatch negativity.

5 What is the difference between a snowflake and a bird? …a bird can act (to avoid surprises). [Figure: phase-boundary diagram as a function of temperature]

6 What is the difference between snowfall and a flock of birds? Ensemble dynamics, clumping and swarming: …birds (biological agents) stay in the same place. They resist the second law of thermodynamics, which says that their entropy should increase.

7 This means biological agents must self-organize to minimise surprise; in other words, to ensure they occupy a limited number of (attracting) states. But what is entropy? …entropy is just average surprise. [Figure: an ensemble density over states, with low surprise where the agent usually is ("I am usually here") and high surprise where it never is ("I am never here")]
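A worked restatement in standard notation may help here (a sketch in my notation, not the slide's; p(s | m) is the probability of sensory states under the agent's model m): entropy is just the long-run average of surprisal over the states the agent visits.

```latex
% Surprisal (self-information) of a sensory state s under the agent's model m
\mathcal{S}(s) = -\ln p(s \mid m)

% Entropy is the average of surprisal over the states the agent visits
H(S) = \mathbb{E}\big[-\ln p(s \mid m)\big]
     \approx \frac{1}{T}\int_0^T -\ln p\big(s(t) \mid m\big)\, dt \quad (T \text{ large})
```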

8 Self-organization that minimises the entropy of an ensemble density, to ensure a limited repertoire of states is occupied (i.e., ensuring states have a random attracting set; cf. homeostasis). [Figure: particle density contours showing a Kelvin-Helmholtz instability, forming beautiful breaking waves. In the self-sustained state of Kelvin-Helmholtz turbulence the particles are transported away from the mid-plane at the same rate as they fall, but the particle density is nevertheless very clumpy because of a clumping instability caused by the dependence of the particle velocity on the local solids-to-gas ratio (Johansen, Henning & Klahr 2006). Also shown: a physiological state space with temperature and pH axes, with falling and transport indicated.]

9 But there is a small problem… agents cannot measure their surprise. But they can measure their free-energy, which is always bigger than surprise. This means agents should minimize their free-energy. So what is free-energy?
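The bound can be written out explicitly (a standard decomposition, assuming a recognition density q(ϑ) over the hidden causes ϑ of sensations; the notation is mine, not the slide's):

```latex
% Free energy is surprise plus a non-negative Kullback-Leibler divergence,
% so it is always an upper bound on surprise
F(s, q) = -\ln p(s \mid m)
        + D_{\mathrm{KL}}\!\big[\, q(\vartheta) \,\|\, p(\vartheta \mid s, m) \,\big]
        \;\geq\; -\ln p(s \mid m)
```

Crucially, F can be evaluated from quantities the agent has access to (its sensations and its recognition density), whereas surprise itself cannot.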

10 What is free-energy? …free-energy is basically prediction error, where small errors mean low surprise: sensations – predictions = prediction error.
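Under Gaussian (Laplace) assumptions this slogan can be made precise (a sketch; g(μ) denotes the predictions generated from the current estimates μ and Π a precision matrix, which are my symbols, not the slide's): free energy is, up to constants, precision-weighted squared prediction error.

```latex
% Prediction error: sensations minus predictions generated from the estimates mu
\varepsilon = s - g(\mu)

% Under Gaussian assumptions, free energy is (up to additive constants)
% half the precision-weighted sum of squared prediction errors
F \approx \tfrac{1}{2}\, \varepsilon^{\mathsf{T}} \Pi\, \varepsilon - \tfrac{1}{2}\ln\lvert\Pi\rvert + \mathrm{const}
```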

11 How can we minimize prediction error (free-energy)? Either change predictions (perception) or change sensory input (action): sensations – predictions = prediction error. …prediction errors drive action and perception to suppress themselves.
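A toy numerical sketch of this idea (purely illustrative and not Friston's scheme: the quadratic error, the learning rates and the simple sensory mapping are all assumptions): perception (changing the prediction) and action (changing the input) both perform gradient descent on the same prediction error.

```python
import numpy as np

# Toy world: a hidden cause v generates a sensation s = v + action + noise.
# The agent predicts its sensation from an internal estimate mu.
rng = np.random.default_rng(0)
v_true = 2.0                      # hidden cause in the world
action = 0.0                      # how the agent perturbs its own input
mu = 0.0                          # internal estimate (the prediction)

def sensation(v, a):
    """Sensory input depends on the world and on the agent's action."""
    return v + a + 0.01 * rng.standard_normal()

k_percept, k_act = 0.2, 0.05      # learning rates for perception and action

for t in range(200):
    s = sensation(v_true, action)
    error = s - mu                # prediction error = sensations - predictions
    F = 0.5 * error ** 2          # 'free energy' here is just squared error (toy assumption)
    mu += k_percept * error       # perception: change predictions to reduce the error
    action -= k_act * error       # action: change sensory input to reduce the same error

print(f"mu = {mu:.2f}, action = {action:.2f}, final F = {F:.5f}")
```

Both updates push the same quantity toward zero: perception by revising the estimate, action by changing what is sampled.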

12 Action to minimise a bound on surprise; perception to optimise the bound. [Figure: schematic of an agent, showing external states in the world, sensations, internal states of the agent (m), and action closing the loop.] More formally:
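The equations on this slide did not survive the transcript; the following is a reconstruction in the standard notation of the free-energy principle (so the particular symbols are assumptions): action and perception minimize the same free-energy functional of sensations and the recognition density.

```latex
% Action changes the sensations that are sampled; perception changes the internal estimates
a^{*}   = \arg\min_{a} F\big(s(a), \mu\big)
\qquad
\mu^{*} = \arg\min_{\mu} F\big(s, \mu\big)

% where the free energy bounds surprise:
F(s, \mu) = -\ln p(s \mid m) + D_{\mathrm{KL}}\big[\, q(\vartheta \mid \mu) \,\|\, p(\vartheta \mid s, m) \,\big]
```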

13 The recognition density and its sufficient statistics:
Synaptic activity → perception and inference
Synaptic efficacy → activity-dependent plasticity, functional specialization → learning and memory
Synaptic gain → attentional gain, enabling of plasticity → attention and salience
Laplace approximation:
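The Laplace approximation referred to here can be written explicitly (standard form; the notation is assumed): the recognition density is a Gaussian whose sufficient statistics are a mean and a covariance, with the mean doing most of the work.

```latex
% Laplace approximation: the recognition density is Gaussian,
% so its sufficient statistics are a mean and a covariance
q(\vartheta) = \mathcal{N}(\mu, \Sigma)

% The covariance is fixed by the curvature of the log-joint at the mean,
% so free energy becomes a function of the mean alone (up to constants)
\Sigma = \mathbf{U}(\mu)^{-1}, \qquad
\mathbf{U}(\mu) = -\partial^{2}_{\mu\mu} \ln p(s, \mu \mid m), \qquad
F(s, \mu) \approx -\ln p(s, \mu \mid m) + \tfrac{1}{2}\ln\lvert \mathbf{U}(\mu) \rvert + \mathrm{const}
```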

14 So how do prediction errors change predictions? …by hierarchical message passing in the brain: forward connections convey prediction errors (from sensory input upwards) and backward connections return predictions, which adjust the hypotheses at each level.

15 Perception and message passing: backward connections carry predictions, forward connections carry prediction errors; synaptic plasticity and synaptic gain set the context. [Pictured: David Mumford] More formally:
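Again the slide's equations are not in the transcript; a reconstruction in the usual predictive-coding form (the exact parameterization is an assumption) conveys the message passing: each level's prediction error is the expectation below minus the prediction from above, and expectations are updated by a mixture of ascending errors and descending predictions.

```latex
% Prediction error at level i: the expectation below minus the prediction from above
% (mu^(0) = s, the sensory input); xi is the precision-weighted error
\varepsilon^{(i)} = \mu^{(i-1)} - g^{(i)}\big(\mu^{(i)}\big), \qquad
\xi^{(i)} = \Pi^{(i)} \varepsilon^{(i)}

% Expectations are driven by the ascending error they must explain at the level below,
% and constrained by the descending prediction from above (via the error at their own level)
\dot{\mu}^{(i)} = \big(\partial_{\mu^{(i)}} g^{(i)}\big)^{\mathsf{T}} \xi^{(i)} - \xi^{(i+1)}
```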

16 Summary
Biological agents resist the second law of thermodynamics
They must minimize their average surprise (entropy)
They minimize surprise by suppressing prediction error (free-energy)
Prediction error can be reduced by changing predictions (perception)
Prediction error can be reduced by changing sensations (action)
Perception entails recurrent message passing in the brain to optimise predictions
Action makes predictions come true (and minimises surprise)
Next: Perception (birdsong and categorization; a simple fMRI experiment) and Learning (repetition suppression; mismatch negativity).

17 A synthetic bird: a neuronal hierarchy drives a syrinx to produce song. [Figure: sonogram of the synthetic song, frequency (kHz) against time (0.5-1.5 s).]
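The simulations behind these slides use variational (generalized) filtering on a hierarchy of attractors; the following is only a toy sketch of the generative half of that idea (everything here, the Lorenz attractor, the mapping of its states to chirp frequency and amplitude, and the parameter values, is an assumption for illustration): slow attractor dynamics at a higher level set the frequency and amplitude of the chirps produced at a lower, syrinx-like level.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz attractor: the 'higher level' supplying slowly varying causes."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

dt, T = 0.01, 2.0                       # integration step and song duration (s)
n = int(T / dt)
state = np.array([1.0, 1.0, 25.0])
causes = np.zeros((n, 3))
for i in range(n):                      # simple Euler integration of the attractor
    state = state + dt * lorenz(state)
    causes[i] = state

# 'Syrinx' level: attractor states modulate chirp frequency and amplitude
freq = 3500.0 + 50.0 * causes[:, 0]     # Hz, roughly 2.5-4.5 kHz, like the sonogram
amp = np.clip(causes[:, 2] / 50.0, 0.0, 1.0)

fs = 8000                               # audio sample rate (Hz)
t_audio = np.arange(0, T, 1.0 / fs)
freq_a = np.interp(t_audio, np.arange(n) * dt, freq)
amp_a = np.interp(t_audio, np.arange(n) * dt, amp)
phase = 2.0 * np.pi * np.cumsum(freq_a) / fs
song = amp_a * np.sin(phase)            # the synthetic 'song' waveform

print(song.shape, float(freq.min()), float(freq.max()))
```

In the actual simulations, a second (listening) agent inverts a model of this kind to recognize the song; only the generation half is sketched here.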

18 Hierarchical perception of the song. [Figure: simulated time courses of prediction and error, causes, and hidden states at each level, together with the stimulus and percept sonograms (2-5 kHz, 0.5-1.5 s).] Prediction error is encoded by superficial pyramidal cells, which generate ERPs.

19 A brain imaging experiment with sparse visual stimuli: coherent and predictable versus random and unpredictable motion. Is there top-down suppression of prediction error when the stimulus is coherent? [Figure (after Angelucci et al.): classical and extra-classical receptive fields: classical receptive field of V1 (~1°), horizontal connections in V1 (~2°), feedback from V2 (~5°), feedback from V3 (~10°); classical receptive field of V2.]

20 Suppression of prediction error with coherent stimuli. [Figure: regional responses (90% confidence intervals) in V1, V2, V5 and pCG for random, stationary and coherent conditions, with decreases and increases shown separately.] Harrison et al., NeuroImage 2006.

21 Perception and message passing: backward connections carry predictions, forward connections carry prediction errors. High frequencies in lower areas should cause low frequencies in higher areas (cf. evidence accumulation).

22 DCM for induced responses: forward models and their inversion. [Figure: schematic in which the input drives a forward model (neuronal), which feeds a forward model (measurement) to produce the observed data; model inversion runs in the opposite direction.]

23 DCM for induced responses (CC Chen et al. 2008): a neuronal model for spectral features, with K frequencies in the j-th source. Coupling is partitioned into linear (within-frequency) and nonlinear (between-frequency) components, and into intrinsic (within-source) and extrinsic (between-source) components; data in channel space are related to the sources by inversion of the electromagnetic (lead-field, L) model, driven by the input.
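The slide's equations are not in the transcript; in the published formulation the model has roughly the following form (the symbols here are a reconstruction, so treat them as an approximation): the rate of change of spectral power at each frequency and source is a mixture of within- and between-frequency, within- and between-source influences.

```latex
% g_j(w, t): spectral response at frequency w in source j; u(t): exogenous input
\dot{g}_j(\omega, t) = \sum_{k} \sum_{\omega'} A_{jk}(\omega, \omega')\, g_k(\omega', t) + C_j(\omega)\, u(t)

% Within-frequency (omega = omega') entries of A encode linear coupling;
% between-frequency entries encode nonlinear coupling.
% Terms with j = k are intrinsic (within-source); j != k are extrinsic (between-source).
```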

24 Frequency-specific coupling during face processing (CC Chen et al. 2008). [Figure: a four-source network (LV, RV, LF, RF) driven by input, shown twice with the estimated coupling.]

25 Functional asymmetries in forward and backward connections (CC Chen et al. 2008): coupling from 32 Hz (gamma) to 10 Hz (alpha), t = 4.72, p = 0.002 (SPM{t}, df 72, FWHM 7.8 × 6.5 Hz). [Figure: forward and backward coupling matrices (4-44 Hz) for the left and right hemispheres over the LV, RV, LF, RF network, and a model comparison over linear/nonlinear forward and backward coupling (FL BL, FN BL, FL BN, FN BN) with log-evidences of about -59890, -16308, -16306 and -11895, favouring nonlinear coupling in both forward and backward connections.]

26 The recognition density and its sufficient statistics (Laplace approximation):
Synaptic activity → perception and inference
Synaptic efficacy → activity-dependent plasticity, functional specialization → learning and memory
Synaptic gain → attentional gain, enabling of plasticity → attention and salience

27 Repetition suppression and the MMN. [Figure: suppression of inferotemporal responses to repeated faces; main effect of faces (Henson et al. 2000).] The MMN is an enhanced negativity seen in response to any change (deviant) compared with the standard response.

28 Hierarchical learning. [Figure: simulated time courses of hidden states, causes, and prediction and error, together with a time-frequency plot (2-5 kHz, 0.1-0.4 s); changes attributed to synaptic adaptation and synaptic plasticity.]

29 Simulating ERPs to repeated chirps. Perceptual inference: suppressing error over peristimulus time. Perceptual learning: suppression over repetitions. [Figure: for each of six presentations, prediction error (LFP, microvolts) over peristimulus time (ms), hidden states over time (s), and the percept as a time-frequency image (2-5 kHz).]

30 Synthetic MMN: first presentation (before learning) versus last presentation (after learning). [Figure: changes in parameters (synaptic efficacy) and hyperparameters (synaptic gain) over five presentations; prediction error at the primary and secondary levels; difference waveforms at the primary level (N1/P1) and the secondary level (MMN) over peristimulus time (ms).]

31 The MMN and perceptual learning (Garrido et al. 2008). [Figure: ERPs to standards and deviants, and the deviant minus standard difference waveform (the MMN).]

32 Synthetic and real ERPs: DCM for evoked responses. [Figure: a network with subcortical input to A1 and onward connections to STG; repetition effects over five presentations expressed as changes in intrinsic and extrinsic connections, i.e. changes in parameters (synaptic efficacy) and hyperparameters (synaptic gain).]

33 Thank you. And thanks to collaborators: CC Chen, Jean Daunizeau, Harriet Feldman, Marta Garrido, Lee Harrison, Stefan Kiebel, James Kilner, Andre Marreiros, Jérémie Mattout, Rosalyn Moran, Will Penny, Klaas Stephan. And colleagues: Peter Dayan, Jörn Diedrichsen, Paul Verschure, Florentin Wörgötter. And many others.

