Design for a Genetic System Capable of Hebbian Learning
Chrisantha Fernando
Systems Biology Centre, Birmingham University
January 2006



Hebbian Learning
- Donald Hebb: "Let us assume that the persistence or repetition of a reverberatory activity (or 'trace') tends to induce lasting cellular changes that add to its stability. When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
- Long-term potentiation (LTP) in neurons was subsequently demonstrated, along with long-term depression (LTD).
- Hebbian learning can implement:
  - Associative learning, e.g. LTP in the hippocampus.
  - Auto-associative memory, e.g. cerebellar motor memories.
  - Self-organized map formation, e.g. ocular dominance columns.
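Hebb's postulate is often formalised as a correlational weight update, delta_w = eta * pre * post. A minimal rate-based sketch in Python (the function name and learning rate are illustrative, not taken from the talk):

```python
# Minimal rate-based Hebbian rule: delta_w = eta * pre * post.
# Names and the learning rate are illustrative, not taken from the talk.

def hebbian_update(w, pre, post, eta=0.1):
    """Strengthen w in proportion to correlated pre/post activity."""
    return w + eta * pre * post

w = 0.0
for _ in range(10):                 # repeated co-activation of cells A and B
    w = hebbian_update(w, pre=1.0, post=1.0)
assert abs(w - 1.0) < 1e-9          # the weight grows with each pairing
```

The update only changes the weight when pre and post are active together, which is exactly the "repeatedly or persistently takes part in firing it" condition in Hebb's wording.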

Examples of Hebbian Learning
Associative learning: classical conditioning
UCS = collision; CS = proximity

Hebbian Learning in Rabbits

Self-Organized Maps

How is Hebbian Learning Implemented in Brains?
[Diagram: presynaptic and postsynaptic cells; synaptic strength change +/-]

What about Learning in Single Cells?
- No conclusive evidence that unicellular organisms can do associative learning on their own.
- Bacteria can habituate and be sensitized, but they have not been shown to undertake associative learning.

[Figure: pairing of vibration and shock stimuli]

Slower-Replicating Eukaryotes in Complex Environments Might Benefit from Lifetime Learning
- Fine tuning of obstacle avoidance/motion.
- Learn that chemical A (or a spatio-temporal pattern of A) is associated with food or harm, rather than chemical B or C or D…
- Achieve robust development in the face of lifetime noise due to external perturbations and internal (mutational) perturbations.

Intra-cellular Molecular Networks
- Dennis Bray (2003): "Neural networks have a remarkable ability to learn different patterns of inputs by changing the strengths of their connections. They are widely used in a variety of tasks of machine recognition. From the standpoint of a living cell, the closest approximation to a neural network is probably found in the pathways of intracellular signals. Multiple receptors on the outside of a cell receive sets of stimuli from the environment and relay these through cascades of coupled molecular events to one or a number of target molecules (associated with DNA, for example, or the cytoskeleton). Because of the directed and highly interconnected nature of these reactions, the ensemble as a whole should perform many of the functions commonly seen in neural networks. Thus, in aggregate, the signaling pathways of a cell are capable of recognizing sets of inputs and responding appropriately, with their connection 'strengths' having been selected during evolution."
- Erik Winfree and Hopfield view gene transcription networks as CTRNNs (see next slide).

Incomplete promoter region
The sigmoidal activation function is sequence-programmable but hard-wired.
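A promoter's sigmoidal input-output behaviour is commonly modelled with a Hill function; a minimal sketch (the half-maximal constant K and Hill coefficient n are illustrative values, not the talk's):

```python
# Hill-function approximation of a promoter's sigmoidal response to a TF.
# K (half-maximal TF level) and n (cooperativity) are illustrative values.

def promoter_activity(tf, K=1.0, n=2):
    """Fraction of maximal transcription as a function of TF concentration."""
    return tf ** n / (K ** n + tf ** n)

assert promoter_activity(0.0) == 0.0
assert abs(promoter_activity(1.0) - 0.5) < 1e-12   # half-maximal at tf = K
assert promoter_activity(10.0) > 0.99              # saturates at high TF
```

Raising n sharpens the sigmoid; this is the "sequence-programmable but hard-wired" point: the shape is set by the promoter sequence rather than adjusted by experience.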

The New RNA World
- John Mattick, Sean Eddy, and others are investigating RNA networks in eukaryotes.
- 98% of the RNA in human cells is intronic or non-coding.
- 10x more of the RNA from chromosomes 21 and 22 is non-coding than exonic.
- Many types of novel RNA capable of regulating gene expression have been discovered.
- This may eventually explain the C-value paradox.


What if RNA networks are really like little neural networks?
- Current neural network metaphors for cellular networks have not included plasticity.
- How difficult would it be to design (or evolve) a network capable of Hebbian learning?
- What sorts of tasks could an intra-cellular Hebbian learning mechanism solve?

[Figure: reaction scheme at the promoter of a gene. Inputs bind their weights (u1 + W1* ⇌ u1W1*, u2 + W2* ⇌ u2W2*); weights interconvert between inactive and active forms (W ⇌ W1*, W ⇌ W2*, with rates k1, k2); and the RNA product binds the inputs (RNA + u1 ⇌ RNAu1, RNA + u2 ⇌ RNAu2).]
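The weight-activation step of a scheme like this can be illustrated with a forward-Euler, mass-action sketch of one weight switching between W and W* under an input u (function name, rates, and step size are illustrative, not the talk's parameters):

```python
# Forward-Euler sketch of one weight's activation kinetics, W <-> W*,
# driven mass-action style by the input u. Rates and step size are
# illustrative; they are not the talk's actual parameters.

def simulate(u, w_total=1.0, k1=0.5, k2=0.1, dt=0.01, steps=1000):
    w_star = 0.0
    for _ in range(steps):
        dw = k1 * u * (w_total - w_star) - k2 * w_star   # activation vs decay
        w_star += dt * dw
    return w_star

# Sustained input drives W* toward the steady state k1*u / (k1*u + k2):
assert abs(simulate(u=1.0) - 5.0 / 6.0) < 0.01
assert simulate(u=0.0) == 0.0                # no input, no activation
```

Because k2 is small relative to k1, the activated weight decays slowly once the input is removed, which is what lets W* act as a stored "weight" rather than a transient signal.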

Experiment 1: Sensitization
[Figure: time courses of [RNA], [TF1], [TF2], [RNATF1], and [RNATF2]; stimulate with unit TF1, then with unit TF2.]

'Weight' Change in Experiment 1
[Figure: time courses of [W1*] and [W2*].]

Associative Learning
- UCS (e.g. glucose) stimulates TF1, which binds strongly to the promoter and produces RNA (innately).
- CS (e.g. potassium permanganate, or NO, or something else) stimulates TF2, but these TFs bind only very weakly to the promoter.
- Paired exposure to UCS and CS strengthens the binding of TF2 to the promoter, giving a response to the 'smell' associated with glucose.
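This conditioning logic can be sketched as a toy model in which a thresholded 'expression' response gates a Hebbian update of the CS weight (all names, thresholds, and values are illustrative assumptions, not the talk's model):

```python
# Toy conditioning model: a thresholded 'expression' response gates a
# Hebbian update of the CS weight. All names and values are illustrative.

def respond(w_ucs, w_cs, ucs, cs):
    """Gene expresses (1.0) when total TF drive crosses a threshold."""
    return 1.0 if w_ucs * ucs + w_cs * cs > 0.5 else 0.0

def train(paired, trials=5, eta=0.2):
    w_ucs, w_cs = 1.0, 0.05          # innately strong UCS, weak CS binding
    for _ in range(trials):
        ucs, cs = (1.0, 1.0) if paired else (0.0, 1.0)
        r = respond(w_ucs, w_cs, ucs, cs)
        w_cs += eta * cs * r         # Hebbian: CS input x expression response
    return w_cs

# After paired training, the CS alone triggers expression:
assert respond(1.0, train(paired=True), ucs=0.0, cs=1.0) == 1.0
assert respond(1.0, train(paired=False), ucs=0.0, cs=1.0) == 0.0
```

The CS weight only grows when the gene is actually expressing, and unpaired CS presentations leave it below threshold, so the association is genuinely contingent on pairing.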

Self-Organized Maps
- Two TFs act on two genes.
- Now add lateral inhibition.
- The genes' 'receptive fields' change:
  - Gene A comes to represent the TF1 level.
  - Gene B comes to represent the TF2 level.
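A minimal competitive-learning sketch of this idea, with lateral inhibition reduced to winner-take-all between the two genes (initial binding strengths and learning rate are illustrative assumptions):

```python
# Toy receptive-field formation: two genes respond to two TFs; lateral
# inhibition (reduced here to winner-take-all) lets only the most activated
# gene update its weights, so each gene specialises. Values illustrative.

def train(patterns, epochs=30, eta=0.3):
    w = [[0.6, 0.4], [0.4, 0.6]]               # gene x TF binding strengths
    for _ in range(epochs):
        for x in patterns:
            acts = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
            win = acts.index(max(acts))         # lateral inhibition: one winner
            w[win] = [wi + eta * (xi - wi) for wi, xi in zip(w[win], x)]
    return w

w = train([[1.0, 0.0], [0.0, 1.0]])
# Gene A specialises to TF1, gene B to TF2:
assert w[0][0] > 0.9 and w[1][1] > 0.9
```

Without the winner-take-all step both genes would drift toward the average input; the competition is what forces the two receptive fields apart.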

Supervised Learning: Training a gene perceptron.
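A gene perceptron of the kind this slide refers to can be sketched with the classic perceptron rule, with expression as the thresholded weighted sum of TF inputs; the AND task and all parameter values here are illustrative assumptions:

```python
# Sketch of training a 'gene perceptron': expression is the thresholded
# weighted sum of TF inputs, and weights are nudged by the supervised
# error. The AND task and all values are illustrative.

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(data, epochs=20, eta=0.5):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            err = target - predict(w, b, x)
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
            b += eta * err
    return w, b

# Train the gene to express only when both TFs are present (AND):
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
assert all(predict(w, b, x) == t for x, t in data)
```

Here the supervision signal would come from the experimenter (reward or punish expression in each condition), unlike the purely activity-driven Hebbian cases on the earlier slides.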

Applications
- Engineering cells as biosensors in complex environments: put the cell in, then measure W* levels to read out the recorded weights.
- Training a cell to produce a protein under required conditions, without having to hard-wire the promoter-TF interaction perfectly.

Thanks to
- Dov Stekel
- Jon Rowe
- Bruce Shapiro
- Sally Milwidsky