
1 SNN Machine learning Bert Kappen, Luc Janss, Wim Wiegerinck, Vicenc Gomez, Alberto Llera, Mohammad Azar, Bart van den Broek, 2 vacancies Ender Akay, Willem Burgers

2 Activities
– Approximate inference: graphical models, analytical methods, sampling methods
– Control theory: approximate inference, reinforcement learning, interaction modeling
– Neuroscience: adaptive BCI, ECoG, neural networks
– Bioinformatics: genetic linkage analysis, genome-wide association studies, missing person identification
– Smart Research: wine portal, petro-physical expert system, credit card fraud detection
– Promedas: UMCU, Promedu

3 Approximate inference, control theory, neural networks, adaptive BCI, ECoG, GWAS, Promedas

4 Modern AI uses probability for reasoning. Probabilities can be interpreted as frequencies (in an ensemble or over repeated trials) or as degrees of reasonable expectation. Among the many calculi proposed for reasoning under uncertainty, the probabilistic approach is the correct one (Cox, 1948): a consistent calculus for reasoning must obey the rules of probability, such as p(a,b) = p(a|b) p(b) and p(a) + p(not a) = 1.
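For reference, a minimal LaTeX statement of that calculus: the two rules quoted on the slide, plus Bayes' rule, which follows directly from them.

```latex
\begin{align}
p(a,b) &= p(a \mid b)\, p(b) && \text{(product rule)}\\
p(a) + p(\neg a) &= 1 && \text{(sum rule / normalization)}\\
p(b \mid a) &= \frac{p(a \mid b)\, p(b)}{\sum_{b'} p(a \mid b')\, p(b')} && \text{(Bayes' rule)}
\end{align}
```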

5 Graphical models. What are the probabilities of the variables given the evidence? Exact inference is intractable for a large number of variables: the cost is 2^n for n binary variables.
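Concretely, the query and its cost: a posterior marginal requires summing the joint distribution over all unobserved variables, which for n binary variables is a sum over O(2^n) terms.

```latex
p(x_i \mid e) \;=\; \frac{\sum_{x_{\setminus i}} p(x_1, \dots, x_n, e)}{\sum_{x} p(x_1, \dots, x_n, e)}
```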

6 Junction tree method. Complexity reduction: 2^n → 2^k (n = 8, k = 3 in the example). State of the art for intermediate-size problems; no solution for large problems.
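A rough statement of the reduction, assuming the junction tree has maximal clique size k: brute force sums over all 2^n joint states, while the junction tree sums over at most 2^k states per clique C.

```latex
\underbrace{2^{n}}_{\text{brute force}}
\;\longrightarrow\;
\sum_{C \in \mathcal{C}} 2^{|C|} \;\le\; |\mathcal{C}|\, 2^{k}
\qquad (n = 8,\; k = 3 \text{ in the example above})
```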

7

8 Approximate inference

9

10 Optimal control theory. Minimization of a cost function consisting of an error cost at the end time plus the cost to reach the target. The optimal solution is hard to compute.
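In the path-integral setting used here (Kappen, PRL 2005, cited on the next slide), this cost has the standard form below; the notation is a hedged reconstruction, with φ the end cost, the integral the cost to reach the target, and controlled noisy dynamics dx = (f + g u) dt + dξ.

```latex
C\big(x, u(\cdot)\big) \;=\;
\Big\langle\, \phi(x_T) \;+\; \int_{t}^{T} \Big( V(x_s, s) + \tfrac{1}{2}\, u_s^{\top} R\, u_s \Big)\, ds \,\Big\rangle
```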

11 Optimal control theory Optimal control as a sum over trajectories, Kappen PRL 2005

12 Linear Bellman equation:
– efficient computation of optimal controls
– linear superposition of solutions
– qualitatively different results for high and low noise
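The transformation behind these three points, following Kappen (PRL 2005): a log transform of the optimal cost-to-go J turns the Bellman equation into a linear PDE for ψ, whose solution is an expectation over uncontrolled trajectories (λ is fixed by the noise level). Because the equation for ψ is linear, solutions for different end costs superpose.

```latex
J(x,t) = -\lambda \log \psi(x,t)
\quad\Longrightarrow\quad
\psi(x,t) \;=\;
\Big\langle \exp\Big( -\tfrac{1}{\lambda} \Big( \phi(x_T) + \int_t^T V(x_s, s)\, ds \Big) \Big) \Big\rangle_{\text{uncontrolled dynamics}}
```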

13 Efficient computation (Theodorou, Schaal, USC 2009)

14 Linear superposition of solutions (Da Silva, Popovic, MIT 2009)

15 Qualitatively different results for high and low noise. Delayed choice: optimal control predicts when to act; more noise means more delay.

16 Results (figure): control signal and trajectory for small versus large noise.

17 Modelling neural networks with activity-dependent synapses (Marro, Torres, Mejias)
– Dynamic synapses
– Recurrent connectivity and dynamic synapses
– Associative memory and dynamical memories
– Storage capacity
– Sensitivity to external stimuli
– Relation to up-down states and power laws
– Discussion

18 Markram and Tsodyks, Nature 1996: dynamic synapses. (a) Electrophysiological preparation of layer-5 pyramidal neurons for a pairing experiment. (b) Pairing: several current pulses (during 200 ms) are injected into the pre- and postsynaptic neuron (4-8 action potentials, AP), 30 times, once every 20 s. (c) Before pairing: the response to stimuli is variable and noisy. (d) After pairing: optimal response to the first current pulse and a decreased response to the subsequent pulses. (e) The effect of pairing is robust and Hebbian.

19 Sanchez-Vives and McCormick, 2000. (a) Intracellular recording in the primary visual cortex of a halothane-anesthetized cat reveals a rhythmic sequence of depolarized and hyperpolarized membrane potentials. (b) Expansion of three of the depolarizing sequences for clarity. (c) The autocorrelogram of the intracellular recording reveals a marked periodicity of about one cycle per three seconds. (d) Simultaneous intracellular and extracellular recordings of the slow oscillation in ferret visual cortical slices maintained in vitro; note the marked synchrony between the two recordings. The intracellular recording is from a layer-5 intrinsically bursting neuron; the trigger level for the window discriminator of the extracellular multi-unit recording is indicated. (e) The depolarized state at three different membrane potentials. (f) The autocorrelogram of the intracellular recording in (d) shows a marked periodicity of approximately once per 4 seconds.

20 Phenomenological model: Tsodyks and Markram (1997)
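A hedged sketch of the model's standard 1997 form: x, y, and z are the fractions of recovered, active, and inactive synaptic resources, U is the release fraction, and the sums run over presynaptic spike times t_sp. The postsynaptic current is taken proportional to the active fraction y; facilitation is added by making U itself a dynamic variable.

```latex
\begin{align}
\dot{x} &= \frac{z}{\tau_{\mathrm{rec}}} - U\, x \sum_{t_{\mathrm{sp}}} \delta(t - t_{\mathrm{sp}})\\
\dot{y} &= -\frac{y}{\tau_{\mathrm{in}}} + U\, x \sum_{t_{\mathrm{sp}}} \delta(t - t_{\mathrm{sp}})\\
\dot{z} &= \frac{y}{\tau_{\mathrm{in}}} - \frac{z}{\tau_{\mathrm{rec}}}
\end{align}
```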

21 Attractor neural networks

22 Associative memory with "static synapses" (D_j = 1, F_j = 1): the Hopfield network
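In the notation of this slide, D_j and F_j are the depression and facilitation factors of the dynamic synapses; freezing both at 1 reduces the local field to the standard Hopfield form with Hebbian couplings over P stored patterns ξ^μ.

```latex
h_i = \sum_{j} w_{ij}\, D_j F_j\, s_j
\;\xrightarrow{\;D_j = F_j = 1\;}\;
h_i = \sum_{j} w_{ij}\, s_j,
\qquad
w_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu}
```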

23 Oscillations occur for P > 1 and for more realistic neuron models (Pantic et al., Neural Comp. 2002)

24 Phase portrait

25 Storage capacity

26 Sensitivity to external stimuli: a stimulus term is added to the local field h_i, and the stimulus grows.

27 Discussion
– Synapses show a high variability with diverse origins: the stochastic opening of vesicles, variations in the glutamate concentration across synapses, and the spatial heterogeneity of the synaptic response in the dendritic tree (Franks et al. 2003).
– Due to synapse dynamics, the neural activity loses stability, which increases the sensitivity to changing external stimuli: the concept of dynamical memories.
– Synaptic depression reduces memory capacity.
– Synaptic facilitation improves short-term memories.

28 Adaptive BCI (Llera, Gomez, van Gerven, Jensen). A BCI device is called adaptive if it is able to change during performance in order to improve it. Proposed approach: use error-related potentials to provide the device with feedback about its own performance.

29 General idea: use error-related potentials to provide the device with feedback about its own performance. Can error-related potentials be classified at the single-trial level?

30 Experimental design. An MEG experiment has been designed to gain insight into error-related fields in a BCI context. The protocol has been carefully chosen to avoid lateralization due to movement on the screen, and is intended to provide data containing error-related fields and minimal extra input.

31 Classification methods with the best results so far:
– transformation to 28 frequencies in the range 3-30 Hz
– normalization
– 273 channels, 6 time steps of 100 ms, 150 trials per subject
– linear support vector machine
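A minimal sketch of such a pipeline in Python with scikit-learn. The sampling rate, array shapes, and the Welch band-power step are illustrative assumptions; only the ingredients listed above (3-30 Hz features, normalization, linear SVM) come from the slide.

```python
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

def bandpower_features(trials, fs=600):
    """Spectral power per channel, restricted to the 3-30 Hz band."""
    freqs, psd = welch(trials, fs=fs, nperseg=trials.shape[-1], axis=-1)
    band = (freqs >= 3) & (freqs <= 30)
    return psd[..., band].reshape(len(trials), -1)

# Placeholder data in the slide's dimensions: 150 trials, 273 channels,
# 6 windows of 100 ms (here concatenated into one 600 ms segment).
X = bandpower_features(np.random.randn(150, 273, 360))
y = np.random.randint(0, 2, size=150)  # error / no-error labels

clf = make_pipeline(StandardScaler(), LinearSVC())
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```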

32 Illustration on toy data. One-dimensional feature space; two Gaussian distributions, one for each class. The boundary is learned with the delta rule each time an error potential is detected. Since error-potential classification is not 100% accurate, two undesirable effects can occur:
– learning when we should not (prob1)
– not learning when we should (prob2)
We assume the probability of errors is the same for both classes.
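A hedged simulation of this toy setup; the class means, learning rate, and update rule are illustrative choices, while prob1 and prob2 are the two failure modes named above.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = (-1.0, 1.0)                 # class means (assumed); common unit variance
prob1, prob2 = 0.1, 0.1          # spurious / missed error-potential detections
eta, b = 0.05, 0.8               # learning rate and initial decision boundary

for _ in range(5000):
    c = rng.integers(2)                    # true class of the trial
    x = rng.normal(mu[c], 1.0)             # one-dimensional feature
    wrong = int(x > b) != c                # classifier output vs. truth
    # The error-potential detector is imperfect: it misses real errors
    # with probability prob2 and fires spuriously with probability prob1.
    detected = rng.random() > prob2 if wrong else rng.random() < prob1
    if detected:
        b += eta * (x - b)                 # delta rule: move boundary toward x

print(f"learned boundary {b:.2f} (optimum {sum(mu) / 2:.2f})")
```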

33 ECoG connectivity patterns during a motor response task (Gomez, Ramsey). We have computed the brain connectivity patterns associated with a simple motor response task from ECoG recordings:
– Functional connectivity: Gaussian graphical models. Time domain; provides a symmetric independence matrix; does not capture time evolution; assumes normally distributed residuals.
– Effective connectivity: Directed Transfer Function (DTF), Granger causality. Frequency domain; provides a non-symmetric causal matrix; does capture time evolution; assumes a good fit of the MVAR model.
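As an illustration of the functional-connectivity side, a minimal Gaussian-graphical-model sketch using scikit-learn's graphical lasso. The data and electrode count are placeholders, and a sparse precision matrix is one standard way to obtain the symmetric independence matrix described above, not necessarily the estimator used by the group.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Placeholder: samples x electrodes matrix for one condition.
X = np.random.randn(1792, 101)

model = GraphicalLassoCV().fit(X)
precision = model.precision_           # sparse inverse covariance matrix
# A zero entry (i, j) means electrodes i and j are conditionally
# independent given all other electrodes: no functional edge.
edges = np.abs(precision) > 1e-8
np.fill_diagonal(edges, False)
print("functional edges:", int(edges.sum()) // 2)
```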

34 ECoG connectivity patterns during a motor response task
– 104 electrodes (101 after preprocessing), implanted on the left hemisphere
– two days, 40 trials per condition per day
– sampling rate 512 Hz: 1792 samples per trial
– 22 bits, bandpass filter 0.15- Hz
– inter-electrode distance: 1 cm

35 ECoG connectivity patterns during a motor response task. The Gaussian model reveals clusters of correlated activity and significant differences between stimulus and response states related to motor areas.

36 ECoG connectivity patterns during a motor response task. With Granger causality we can identify a set of source electrodes (red dots) that drive another subset of target electrodes. Sources are similar in both conditions, whereas targets differ between the stimulus and response conditions.

37 Bayesian variable selection: causal modeling or prediction? (Janss, Franke, Buitelaar)
– Stochastic search multiple-regression building (using a Gibbs sampling algorithm based on George & McCulloch 1995), efficient in large-p problems (500K predictors).
– Extended in a hierarchical model to estimate the shrinkage parameter from the data, which we have shown to avoid overfitting.
– Model averaging over half-certain associations was shown to improve prediction substantially.
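A toy spike-and-slab Gibbs sampler in the George & McCulloch spirit. This is a hedged sketch with fixed noise and slab variances and a point-mass spike, not the group's hierarchical implementation, which also estimates the shrinkage parameter from the data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 3 of 200 predictors truly associated (illustrative setup).
n, p = 100, 200
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = 1.0
y = X @ beta_true + 0.5 * rng.standard_normal(n)

s2, tau2, pi0 = 0.25, 1.0, 0.05   # noise var, slab var, prior inclusion prob
beta = np.zeros(p)
incl = np.zeros(p)
resid = y.copy()
n_iter = 500

for _ in range(n_iter):
    for j in range(p):
        resid += X[:, j] * beta[j]             # remove j's current contribution
        a = X[:, j] @ X[:, j] / s2 + 1.0 / tau2
        m = (X[:, j] @ resid / s2) / a
        # log Bayes factor for including predictor j vs. excluding it
        logbf = 0.5 * (np.log(1.0 / (tau2 * a)) + m * m * a)
        z = np.clip(np.log((1 - pi0) / pi0) - logbf, -30, 30)
        if rng.random() < 1.0 / (1.0 + np.exp(z)):
            beta[j] = rng.normal(m, 1.0 / np.sqrt(a))
        else:
            beta[j] = 0.0
        resid -= X[:, j] * beta[j]             # restore residual
    incl += beta != 0

print("posterior inclusion probabilities:", np.round(incl[:6] / n_iter, 2))
```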

38 Some extensions / research topics (Janss, Franke, Buitelaar)
– Use of prior information to help (constrain) finding interactions between predictors.
– Multi-phenotype modelling and prediction using an embedded eigenvector decomposition in a Bayesian hierarchical model.
– Multi-layered variable selection to model genetic effects on brain fMRI, covering cognitive tasks and psychiatric disorders.
(Diagram: predictors x1...xp grouped into pathways 1 and 2, with interactions selected within pathways; eigenvector latent vectors u1...u3 linking predictors to phenotypes y1...y5; fMRI data per voxel.)

39 PROMEDAS PRObabilistic Medical Diagnostic Advisory System

40 Why?
– Increasing complexity of diagnostics
– Failures in medical practice: patients in the US die every year as a result of medical errors; wrong diagnoses are frequent (8-40%)
– Increasing healthcare costs
– Availability of electronic data

41 Input: patient data, complaints, symptoms, lab results. Output: diagnoses, suggestions for follow-up examinations. Users: general practitioners, medical education, management, medical specialists.

42 Graphical models

43

44 Exponential complexity (table): runtime grows from about a second for small networks to many years as the number of variables increases.

45 Trees are networks without loops. On trees the computation is very fast. Approximate the Promedas graph as a tree.

46 Message passing
– exact on trees
– good on networks with few loops
– degrades with the number of loops and the connection strength
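For reference, the sum-product update behind message passing, in standard belief-propagation notation for a pairwise model (∂i denotes the neighbours of node i); on a tree these messages converge to the exact marginals.

```latex
m_{i \to j}(x_j) = \sum_{x_i} \psi_{ij}(x_i, x_j)\, \phi_i(x_i)
\prod_{k \in \partial i \setminus j} m_{k \to i}(x_i),
\qquad
p(x_i) \propto \phi_i(x_i) \prod_{k \in \partial i} m_{k \to i}(x_i)
```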

47 Medical content: internal medicine for specialists
– 4000 diagnoses, 4000 symptoms, and their relations
– informal clinical evaluation on 50 test patients, scoring correct diagnoses in the top 3: 6% all in the top 3, 26% two in the top 3, 54% one in the top 3, 14% not correct
– integrated in the UMCU since October 2008; about 1200 sessions per month

48

49 Future plans. Promedas is being commercialized by a new company, Promedas BV. Possible markets:
– web application or CD-ROM
– integration into a hospital information system
– telemedicine
– ...

50 Project team. Algorithms & software: SNN, Radboud University Nijmegen. Medical content: UMC Utrecht.

51 Teaching in N & S for this track. Bachelor: Neural networks and information theory; Neurophysics. Master: Machine Learning; Computational Neuroscience; SNN colloquia.

