1
Tutorial: Plasticity Revisited - Motivating New Algorithms Based On Recent Neuroscience Research Tsvi Achler MD/PhD Approximate Outline and References for Tutorial Department of Computer Science University of Illinois at Urbana-Champaign, Urbana, IL 61801, U.S.A.
2
Outline
1. Plasticity is observed in many forms. We review experiments and controversies:
–Intrinsic 'membrane' plasticity
–Synaptic plasticity
–Homeostatic 'feedback' plasticity
–Systems: in combination, membrane and feedback plasticity can imitate synaptic plasticity
2. What does this mean for NN algorithms?
3
Outline: NN Algorithms
Common computational issues:
–Explosion in connectivity
–Explosion in training
How can nature solve these problems with the plasticity mechanisms outlined?
–Synaptic plasticity
–Lateral inhibition
–Feedback inhibition
4
1. Plasticity
5
Intrinsic 'Membrane' Plasticity
Ion channels are responsible for activity and spikes
'Plastic' ion channels are found in the membrane
Voltage-sensitive channel types: Ca²⁺, Na⁺, K⁺
This plasticity is independent of synapse plasticity
Review: G. Daoudal & D. Debanne, Long-Term Plasticity of Intrinsic Excitability: Learning Rules and Mechanisms, Learn. Mem. 2003 10: 456-465
6
Synaptic Plasticity Hypothesis
The bulk of studies: the synapse changes with activation
Motivated by Hebb 1949
Supported by Long-Term Potentiation / Depression (LTP/LTD) experiments
Review: Malenka, R. C. and M. F. Bear (2004). "LTP and LTD: an embarrassment of riches." Neuron 44(1): 5-21.
7
LTP/LTD Experiment Protocol
Establish a 'pre-synaptic' cell and a 'post-synaptic' cell
Raise pre-synaptic activity to the amplitude A50 at which the post-synaptic cell fires 50% of the time
Induction: high-frequency, high-voltage spike train on both pre- & post-synaptic electrodes
Plasticity: any change in the response when A50 is re-applied
8
Plasticity: change in post-synaptic response to A50
LTP: increased activity with A50
LTD: decreased activity with A50
Can last minutes, hours, or days
–Limited by how long the recording remains viable
9
Strongest Evidence
Strong evidence in systems with minimal feedback:
–Motor systems, musculature & tetanic stimulation
–Sensory/muscle junction of the Aplysia gill-siphon reflex
–Early development: retina → ocular dominance columns
Variable evidence in areas with feedback:
–Cortex, thalamus, sensory systems, hippocampus
Basic mechanisms of synaptic plasticity are still controversial
Why so difficult? Connections change with activity
10
Variable Evidence
Cortex, thalamus, sensory systems & hippocampus
Basic mechanisms still controversial
–60 years and 13,000 papers in PubMed
It is difficult to establish and control when LTP or LTD occurs
11
LTP vs LTD Criteria are Variable
The criteria to induce LTP or LTD are a current subject of debate (Froemke et al. 2006). Some studies find that if the presynaptic neuron is activated within tens of milliseconds before the postsynaptic neuron (pre-post), LTP is induced, while the reversed order of firing (post-pre) results in LTD (e.g. Bi & Poo 1998; Markram et al. 1997). In other studies, the timing of the first spike (Froemke & Dan 2002) or the last spike (Wang et al. 2005) in each burst is found to dominate the sign and magnitude of plasticity. Yet other studies show that synaptic modification is frequency dependent and that high-frequency bursts of pre- and postsynaptic spikes lead to LTP regardless of relative spike timing (Sjöström et al. 2001; Tzounopoulos et al. 2004). Still other studies show that somatic spikes are not even necessary for the induction of LTP and LTD (Golding et al. 2002; Lisman & Spruston 2005). In addition, it is unclear whether these mechanisms drive single-synapse changes as predicted by synaptic plasticity, because physical changes in synapses show variability as well.
Synaptic Change
Activity-dependent changes in synaptic spine morphology have been reported in the hippocampus (see Yuste et al. 2001 for review), including localized changes to single synapses with caged-glutamate sub-spike stimulation (Mazrahi et al. 2004). However, changes in synapses can also occur with: the estrus cycle (Woolley et al. 1990), irradiation (Brizzee 1980), hibernation (Popov et al. 1992), exercise (e.g. Fordyce & Wehner 1993), epilepsy (Multani et al. 1994) and synaptic blockade using Mg²⁺ (Kirov & Harris 1999). Moreover, synaptic morphology may not coincide with synaptic/behavioral function (Yuste et al. 2001). Furthermore, brain regions responsible for recognition processing may display different characteristics: for example, synaptic changes with experience in the mouse barrel cortex appear to be more variable than in the hippocampus (Trachtenberg et al. 2002).
The strongest evidence supporting the synaptic plasticity hypothesis has been reported in the gill withdrawal reflex of the marine mollusk Aplysia (Antonov, Antonova & Kandel 2003). However, the changes occur between sensory and motor neurons, not between processes responsible for recognizing stimuli. It may be that motor learning occurs via synaptic plasticity while recognition processing occurs through recurrent feedback. Furthermore, even if synaptic plasticity is found in the sensory cortex, it may be specific to pre-motor processing; such pre-motor processes may co-exist with recognition circuits in the same regions.
Pre-post spike timing (Bi & Poo 1998; Markram et al. 1997):
–Pre-synaptic spike before post → LTP
–Post-synaptic spike before pre → LTD
First spike in burst most important (Froemke & Dan 2002)
Last spike most important (Wang et al. 2005)
Frequency most important: high frequency → LTP (Sjöström et al. 2001; Tzounopoulos et al. 2004)
Spikes are not necessary (Golding et al. 2002; Lisman & Spruston 2005)
12
Many factors affect LTP & LTD
Voltage-sensitive channels, e.g. NMDA
Cell-signaling channels, e.g. via Ca²⁺
Protein-dependent components, fast/slow
Synaptic tagging
Review: Malenka, R. C. and M. F. Bear (2004). "LTP and LTD: an embarrassment of riches." Neuron 44(1): 5-21.
13
Studies of Morphology Unclear
Synapse morphology and density studies: spine changes ≠ function changes
Many other causes of changes in spines:
–Estrus, exercise, hibernation, epilepsy, irradiation
Review: Yuste, R. and T. Bonhoeffer (2001). "Morphological changes in dendritic spines associated with long-term synaptic plasticity." Annu Rev Neurosci 24: 1071-89.
14
Many Components & Variability
Indicates the system is complex
–involving more than just the recorded pre-synaptic and post-synaptic cells
This makes NN learning algorithms difficult to justify
But the system regulates itself
Review of LTP & LTD variability: Froemke, Tsay, Raad, Long & Dan (2006) J Neurophysiol 95(3): 1620-9.
15
Homeostatic Plasticity
Self-regulating plasticity
Networks adapt to:
–Channel blockers
–Genetic expression of channels
16
Adaptation to Blockers
Establish a baseline recording
Bathe the culture in a channel blocker (2 types)
–Either ↑ or ↓ firing frequency
Observe system changes after ~1 day
Washing out the blocker causes the reverse phenomena
(Figure: culture dish with pre-synaptic and post-synaptic electrodes)
17
Homeostatic Adaptation to Blockers
Turrigiano & Nelson (2004)
↑ Frequency → ↓ synaptic strength
↓ Frequency → ↑ synaptic strength
Frequency × strength = baseline
Displays a feedback-inhibition response
18
Homeostatic Adaptation to Expression
Marder & Goaillard (2006)
Cells with different numbers & types of channels show the same electrical properties
(Table: channels involved for cells 1-3)
19
Homeostatic Summary Adapts networks to a homeostatic baseline Utilizes feedback-inhibition (regulation) Homeostatic
20
Feedback Inhibition
Found ubiquitously throughout the brain
(Figure: pre-synaptic cell → post-synaptic cell with feedback loop)
21
Feedback Throughout Brain
Thalamus & Cortex
LaBerge, D. (1997) "Attention, Awareness, and the Triangular Circuit". Consciousness and Cognition, 6, 149-181
22
Feedback & Pre-Synaptic Inhibition Evidence has Many Forms
–Feedback loops
–Tri-synaptic connections
–Antidromic activation
–NO (nitric oxide)
–Adjustment of pre-synaptic processes in homeostatic plasticity
Overwhelming amount of feedback inhibition
Figure from Aroniadou-Anderjaska, Zhou, Priest, Ennis & Shipley 2000; modified from Chen, Xiong & Shepherd (2000)
23
Summary Homeostatic Plasticity requires and maintains Feedback Inhibition Homeostatic
24
'Systems' Plasticity
Feedback inhibition combined with intrinsic plasticity can be indistinguishable from synaptic plasticity
25
Many cells are always present in plasticity experiments
Pre- & post-synaptic cells are never in isolation
Studies: in vivo, brain slices, cultures (only viable with 1000's of cells)
Changes in neuron resting activity are tolerated
(Figure: culture dish with pre-synaptic and post-synaptic electrodes)
26
Feedback Inhibition Network With Pre-Synaptic Inhibition
LTP protocol: find pre-synaptic and post-synaptic cells
Increase pre-synaptic cell activity until the recorded post-synaptic cell fires 50%
Then learning is induced artificially by activating both neurons together
Pre-synaptic cells connect to many post-synaptic cells, so induction can affect all connected post-synaptic cells — but this is rarely considered
Only the two recorded cells and the synapse between them are usually considered
Immeasurably small changes across all connected neurons can cause a big change in the recorded neuron
27
Simulation: Up to 26-Cell Interaction
LTP = bias the recorded cell; alternatively, LTD = negatively bias the recorded cell
Resting ∆ value: 0.01 on a normalized activity scale (0-1) across all neurons
Immeasurably small changes of all connected neurons cause a big change in the recorded neuron relative to baseline
28
Significance
Experiments cannot distinguish between synaptic plasticity and feedback inhibition
Membrane voltage Vm is allowed a ∆ of ~6 mV; a change of 0.01 corresponds to a ∆Vm of ~0.3 mV
Thus membrane effects are unlikely to be seen
Pre-synaptic cells connect to many more than 26 cells
–The effect is much more pronounced in real networks
29
Regulatory Feedback Plasticity
Feedback inhibition + intrinsic plasticity are, in current experiments, indistinguishable from synaptic plasticity theory
Why have 'apparent' synaptic plasticity?
Feedback inhibition is important for processing simultaneous patterns
30
Break
31
2. Algorithms Synaptic Plasticity Lateral Inhibition Feedback Inhibition
32
Challenges In Neural Network Understanding
Large network problems:
–Lateral connections: connectivity explosion
–Limited cognitive intuition
Regulatory feedback neural networks:
–Strong evidence of feedback
–Replace weights with binary bidirectional connections
–Could feedback dynamics be vital?
(Figure: input weight matrix w11…w43 plus lateral weights lw12, lw13, lw23, versus a regulatory feedback network with inputs x1…x4, feedback cells I1…I4, and outputs Y1…Y4)
33
Combinatorial Explosion in Connectivity
Lateral connectivity can lead to an implausible number of connections and variables
Millions of representations are possible → symbolic logic based on direct connections requires a connection to logically relate every pair of representations
What would a weight variable (e.g. 0.8) between two representations mean?
Every representation cannot be connected to all others in the brain
34
Challenges In Neural Network Understanding
Large network problems:
–Weights: combinatorial training
–Lateral connections: connectivity explosion
Regulatory feedback neural networks:
–Strong evidence of feedback
–Replace weights with binary bidirectional connections
–Could feedback dynamics be vital?
35
Weights: Training Difficulty Given Simultaneous Patterns
Superposition catastrophe: teach A, B, C … Z separately, then test multiple simultaneous letters
Not taught with simultaneous patterns → will not recognize simultaneous patterns
Teaching simultaneous patterns is a combinatorial problem
36
Weights: Training Difficulty Given Simultaneous Patterns
Teach A, B, C … Z separately; test multiple simultaneous letters
'Superposition Catastrophe' (Rosenblatt 1962)
Can try to avoid this by segmenting each pattern individually, but segmentation often itself requires recognition, or is not possible
37
Composites Common
Segmentation is not trivial, and not possible in most modalities:
–Natural scenarios (cluttered rainforest), scenes
–Noisy 'cocktail party' conversations
–Odorant or taste mixes (requires recognition?)
38
Segmenting Composites
Letters are learned individually; if the image can't be segmented, the composite must be interpreted
New scenario: Chick & Frog simultaneously (in front of a building)
Feature space: patterns learned separately, e.g. 0101 and 1001; their composite is the sum: 0101 + 1001 = 1102
Recognition given simultaneous A, B×2 and C: A = 01001, B = 11000, C = 01011 sum to 2 4 0 1 2
39
Y1Y1 Y2Y2 Y3Y3 Y4Y4 I4I4 I1I1 I2I2 I3I3 x1x1 x2x2 x3x3 x4x4 Y1Y1 Y2Y2 Y3Y3 x1x1 x2x2 x3x3 x4x4 w 22 w 23 Input Feedback Weights Regulatory Feedback Neural Networks Challenges In Neural Network Understanding Y1Y1 Y2Y2 Y3Y3 Y4Y4 x1x1 x2x2 x3x3 x4x4 w 11 w 12 w 13 w 14 w 21 w 22 w 23 w 24 w 31 w 32 w 33 w 34 w 41 w 42 w 43 w 44 Large Network Problems Weights: combinatorial training Lateral Connections: connectivity explosion Feedback Inhibition: avoids combinatorial issues interprets composites Strong evidence of feedback Replace with binary bidirectional connections. Could feedback dynamics be necessary vital? Strong evidence of feedback Replace with binary bidirectional connections. w 11 w 12 w 13 w 21 w 22 w 23 w 31 w 32 w 33 w 41 w 42 w 43
40
Self-Regulatory Feedback Inhibition
Every output inhibits only its own inputs
–A gain-control mechanism for each input
–Massive feedback to inputs
Iteratively evaluates input use
Avoids optimized weight parameters:
–Training establishes binary relationships
–Testing iteratively evaluates input use
(Figure: the same input-output feedback loop from a control theory and a neuroscience perspective)
41
Equations Used
Q_b: shunting inhibition at input b
C: collection of all output cells; C_a: cell 'a'
N_a: the set of input connections to cell C_a; n_a: the number of processes in set N_a of cell C_a
P: primary inputs (not affected by shunting inhibition)
I: collection of all inputs; I_b: input cell 'b'
M_b: the set of recurrent feedback connections to input I_b; m_b: the number of connections in set M_b
X_b: raw input activity
42
Equations
X_b: raw input activity
Q_b: feedback (shunting inhibition) at input b
I_b: input after feedback
I_b = X_b / Q_b
43
Output Equations
Y_a: output activity; n_a: number of connections of Y_a
Feedback at input b: Q_b = Σ_{j∈M_b} Y_j(t)
Input after feedback: I_b = X_b / Q_b
Output update: Y_a(t+∆t) = (Y_a(t)/n_a) Σ_{i∈N_a} I_i(t)
Example network: Q_1 = y_a + y_b, Q_2 = y_b
44
Output Equations
Feedback: Q_b = Σ_{j∈M_b} Y_j(t)
Input after feedback: I_b = X_b / Q_b
Output update: Y_a(t+∆t) = (Y_a(t)/n_a) Σ_{i∈N_a} I_i(t)
Repeat: no oscillations, no chaos
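The three equations above can be sketched as a short iteration in Python (a minimal sketch of the slides' update rule; the function and variable names are mine, not from the tutorial):

```python
import numpy as np

def regulatory_feedback(W, x, steps=200):
    """Iterate the regulatory feedback equations.

    W[a, b] = 1 if output cell a receives input b (binary weights).
    x[b]    = raw input activity X_b.
    Returns output activities Y after `steps` iterations.
    """
    n = W.sum(axis=1)            # n_a: number of inputs to each output cell
    Y = np.ones(W.shape[0])      # all outputs start at 1
    for _ in range(steps):
        Q = W.T @ Y              # Q_b: sum of Y_j feeding back onto input b
        I = np.where(Q > 0, x / np.maximum(Q, 1e-12), 0.0)  # I_b = X_b / Q_b
        Y = (Y / n) * (W @ I)    # Y_a <- (Y_a / n_a) * sum of its I_i
    return Y

# Two-cell example from the slides: y_a sees x1 only; y_b sees x1 and x2,
# so Q_1 = y_a + y_b and Q_2 = y_b.
W = np.array([[1, 0],
              [1, 1]])
print(regulatory_feedback(W, np.array([2.0, 1.0])))  # converges near (1, 1)
```

Note the update is purely multiplicative from positive initial values, which is why the iteration settles rather than oscillating.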
45
Simple Connectivity
A new node connects only to its inputs
All links have the same strength
–The weight matrix W was the source of the connectivity and training problems
Inputs have positive real values indicating intensity
46
Allows Modular Combinations
Outputs Y1 ('P') and Y2 ('R') share inputs I1, I2
Features: 'P' = 1 0; 'R' = 1 1
47
Interprets Composite Patterns; Supports Non-Binary Inputs
Network configuration: y1 ('P') receives x1; y2 ('R') receives x1 and x2 — inputs can simultaneously support both outputs
Steady state:
–x1 ≥ x2: (y1, y2) = (x1 − x2, x2)
–x1 ≤ x2: (y1, y2) = (0, (x1 + x2)/2)
Examples from the steady-state formula: inputs (1, 1) → outputs (0, 1); inputs (2, 1) → outputs (1, 1)
The network behaves as if there is an inhibitory connection between x2 and y1, yet there is no direct connection
48
How it Works: Iterative Evaluation
Forward connections are binary (all weight 1), so the forward step is y = W·x with uniform weights
49
How it Works: Back — outputs Y1, Y2 feed back to inhibit their inputs I1, I2
50
How it Works: Forward — the inhibited inputs re-excite the outputs
51
How it Works: Back — outputs again feed back to inhibit their inputs
52
How it Works: features 1 1 presented; nodes are active (1) or inactive (0)
53
How it Works: initially both outputs become active
54
How it Works: I1 gets twice as much inhibition as I2
55
How it Works: I1 gets twice as much inhibition as I2
56
Outputs Inputs Active (1) Inactive (0) Feedback Inhibition Algorithm How it Works
57
How it Works: this affects Y1 more than Y2
58
How it Works: this separation continues iteratively
59
How it Works: the separation continues iteratively
60
How it Works: Steady State
Until the most encompassing representation predominates
With features 1 1 1 1 presented: Y1 (pattern 1010) → 0, Y2 ('R', pattern 1111) → 1
(Graph of dynamics: activity of Y1 and Y2 over simulation time)
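This steady state can be replayed with a quick script (my own sketch of the slides' update rule, not the tutorial's code): Y1 covers features 1010, Y2 covers 1111, and the full input 1111 is presented.

```python
import numpy as np

# Binary connection patterns from the slide: Y1 = 1010, Y2 = 1111.
W = np.array([[1, 0, 1, 0],
              [1, 1, 1, 1]], dtype=float)
x = np.ones(4)               # all four input features active
n = W.sum(axis=1)            # number of inputs per output cell
Y = np.ones(2)               # both outputs start active
for _ in range(1000):
    Q = W.T @ Y              # feedback inhibition at each input
    I = x / Q                # shunted input values I_b = X_b / Q_b
    Y = (Y / n) * (W @ I)    # regulatory feedback update
print(Y)  # Y1 decays toward 0; the encompassing Y2 approaches 1
```

The less encompassing Y1 leaves features 2 and 4 unexplained, so its shared inputs are progressively claimed by Y2.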
61
Demonstration: Applying Learned Information to New Scenarios
Nonlinear, so mathematical analysis is difficult — demonstrated via examples
Teach patterns separately; test novel pattern combinations
Requires decomposition of the composite
Letter patterns are used for intuition
62
Applying Learned Information to New Scenarios
Superposition catastrophe: learn A, B, C … Z separately — 26 nodes, one per letter's binary feature vector
Teach single patterns only; test on composites
Feature space example: A = 01001, B = 11000; presented together they sum to A+B = 1 2 0 0 1
Modular combination: recognition given simultaneous A, B×2 and C means decoding the sum 2 4 0 1 2
63
This Defines the Network
Nothing is changed or re-learned further
Comparison networks are trained & tested with the same patterns:
–Neural Networks (NN)*, representing synaptic plasticity
–Lateral inhibition (winner-take-all with ranking of winners)
* Waikato Environment for Knowledge Analysis (WEKA) repository tool for the most recent and best algorithms
64
Tests: Increasingly Complex
26 patterns presented one at a time
–All methods recognize 100%
Choose 2 letters, present simultaneously (325 combinations)
–Either union (logical-'or') of features, or add features
Choose 4 letters, present simultaneously (14,950 combinations)
–Either add or 'or' features
–Include repeats in the add case (e.g. 'A+A+A+A'): 456,976 combinations
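The combination counts above follow from binomial coefficients (unordered distinct letters) and, for the add-with-repeats case, ordered selections; a quick check:

```python
from math import comb

assert comb(26, 2) == 325     # pairs of distinct letters
assert comb(26, 4) == 14950   # quadruples of distinct letters
assert 26**4 == 456976        # ordered quadruples allowing repeats
# 5- and 8-letter tests used later in the deck:
print(comb(26, 5), comb(26, 8))
```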
65
Two Patterns Simultaneously, e.g. (A B)
Train 26 nodes; test with 2 simultaneous patterns (325 combinations); do the 2 top nodes match (0/2, 1/2, 2/2)?
% combinations = (number of x-correct matches) / (number of combinations)
Methods compared: feedback inhibition, synaptic plasticity, lateral inhibition
(Figure: % of combinations vs letters correctly classified)
66
Simultaneous Patterns: Four-Pattern Union
Not taught with simultaneous patterns → will not recognize simultaneous patterns; teaching them is a combinatorial problem
The union ('or') of four patterns, e.g. A|C|D|E, is presented to the network
67
Union of Four Patterns, e.g. (A B C D)
Same 26 nodes; test with 4 patterns (14,950 combinations); do the 4 top nodes match (0/4 … 4/4)?
Methods compared: feedback inhibition, synaptic plasticity, lateral inhibition
(Figure: % of combinations vs letters correctly classified)
68
Union of Five Patterns, e.g. (A B C D E)
Same 26 nodes; test with 5 patterns (65,780 combinations); do the 5 top nodes match (0/5 … 5/5)?
Methods compared: feedback inhibition, synaptic plasticity, lateral inhibition
(Figure: % of combinations vs letters correctly classified)
69
Simultaneous Patterns: Pattern Addition
Instead of a union, feature vectors are added, e.g. A+C+D+E = 2 3 1 2 4
This improves feedback inhibition performance further
70
Addition of Four Patterns, e.g. (X C O M, K S A V)
Same 26 nodes; test with 4 added patterns (14,950 combinations); do the 4 top nodes match (0/4 … 4/4)?
Methods compared: pre-synaptic (feedback) inhibition, synaptic plasticity, lateral inhibition
(Figure: % of combinations vs letters correctly classified)
71
Addition of Eight Patterns, e.g. (A G B L C D X E)
Same 26 nodes; test with 8 patterns (1,562,275 combinations); do the 8 top nodes match (0/8 … 8/8)?
Methods compared: feedback inhibition, synaptic plasticity, lateral inhibition
(Figure: % of combinations vs letters correctly classified)
72
With Addition the Feedback Algorithm Can Count
Repeated patterns are reflected by the value of the corresponding nodes
Example: A+B+B+C = 2 4 0 1 2 → nodes: A=1, B=2, C=1, D→Z=0
100% correct over 456,976 combinations
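Using the slide's feature vectors (A = 01001, B = 11000, C = 01011), the summed input 2 4 0 1 2 should decode to A=1, B=2, C=1. A sketch, assuming the same update rule as the earlier equation slides:

```python
import numpy as np

# Binary feature vectors from the slide.
patterns = {'A': [0, 1, 0, 0, 1],
            'B': [1, 1, 0, 0, 0],
            'C': [0, 1, 0, 1, 1]}
W = np.array(list(patterns.values()), dtype=float)
x = np.array([2, 4, 0, 1, 2], dtype=float)  # A + B + B + C, summed feature-wise
n = W.sum(axis=1)
Y = np.ones(3)
for _ in range(500):
    Q = W.T @ Y                                         # feedback per input
    I = np.where(Q > 0, x / np.maximum(Q, 1e-12), 0.0)  # shunted inputs
    Y = (Y / n) * (W @ I)                               # output update
print(dict(zip(patterns, Y.round(2))))  # roughly A=1, B=2, C=1
```

The node values themselves carry the count: B settles near 2 because the input contains B twice.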
73
Tested on Random Patterns
50 randomly generated patterns from 512 features
4 presented at a time: 6,250,000 combinations (including repeats)
100% correct, including the count
(The computer starts getting slow)
74
What if Conventional Algorithms are Trained for this Task?
The vector A+B = 1 2 0 0 1 means 'A' & 'B' together
The vector A+C+D+E = 2 3 1 2 4 means 'A', 'C', 'D' & 'E' together
75
Teach pairs: 325 combinations; triples: 2,600; quadruples: 14,950 …
Training complexity increases combinatorially — with 26 letters, training is not practical
Furthermore, ABCD can be misinterpreted as AB & CD, or ABC & D
76
Training Difficulty Given Simultaneous Patterns
Known as the 'Superposition Catastrophe' (Rosenblatt 1962; Rachkovskij & Kussul 2001)
Feedback inhibition inference appears to avoid this problem
77
Simultaneous Representations
The binding problem; chunking features
Computer algorithms have similar problems with much simpler representations
78
Simultaneous Representations Cause The Binding Problem
Resolving pattern interactions: all inputs are matched to patterns — outputs 'Wheels', 'Barbell', 'Car Chassis'
Given a car, the wheel features also match 'Barbell' unless the network is explicitly trained otherwise — however it is a binding error to call this a barbell
79
Binding Comparison
Conventional approaches require training data to predict binding combinations
Outputs: y1 ('Wheels'), y2 ('Barbell'), y3 ('Car Chassis')
Methods compared: feedback inhibition, synaptic plasticity, lateral inhibition
(Figure: vector activity 0-1 for each output)
80
Binding: Network-Wide Solution
The most precise output configuration wins
Inputs (x1, x2, x3) tested: (1,0,0), (1,1,0), (0,1,0), (1,1,1), (1,0,1)
Outputs (y1, y2, y3) correspond to Wheels, Barbell, Car
81
Network Under Dynamic Control
Recognition is inseparable from attention
Feedback: an automatic way to access inputs
'Symbolic' control via bias
82
Symbolic Effect of Bias
Is a barbell present? Interested in y2: bias y2 = 0.15
Inputs (1, 1, 1) → outputs (0.02, 0.98, 0.71)
The most precise output configuration still emerges
83
Summary
Feedback inhibition combined with intrinsic plasticity generates a 'systems' plasticity that looks like synaptic plasticity
Feedback inhibition gives algorithms more flexibility with simultaneous patterns
Brain processing and learning are still unclear: a paradigm shift is likely needed
84
Acknowledgements Eyal Amir Cyrus Omar, Dervis Vural, Vivek Srikumar Intelligence Community Postdoc Program & National Geospatial-Intelligence Agency HM1582-06--BAA-0001