Course instructor: Dr. Larry Manevitz. Topics: What is a neuron? Three generations of neuron models; the biological neuron; neuron simulation; IPSP/EPSP; LSM; survey of papers.

Presentation transcript:

Course instructor: Dr. Larry Manevitz

Topics
- What is a neuron?
- Three generations of neuron models
- The biological neuron
- Neuron simulation
- IPSP/EPSP
- LSM
- Survey of papers

Man versus Machine (hardware)

Numbers                   Human brain        Von Neumann computer
# elements                ~10^11 neurons     ~10^9 transistors
# connections / element   ~10^4              ~10
switching frequency       10^3 Hz            10^9 Hz
energy / operation        ~10^-16 Joule      10^-6 Joule
power consumption         10 Watt            ~10^2 Watt
reliability of elements   low                reasonable
reliability of system     high               reasonable

Man versus Machine (information processing)

Features              Human brain      Von Neumann computer
Data representation   analog           digital
Memory localization   distributed      localized
Control               distributed      localized
Processing            parallel         sequential
Skill acquisition     learning         programming

The brain needs no memory management and makes no hardware/software/data distinction.

Biologically Inspired
- Electro-chemical signals
- Threshold output firing

The Perceptron
- Binary classifier functions
- Threshold activation function

y_j: output from unit j
w_ij: weight on the connection from unit j to unit i
x_i: weighted sum of the inputs to unit i

x_i = Σ_j w_ij y_j
y_i = f(x_i - θ_i), where f is the threshold function
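The unit computation above (weighted sum followed by a threshold) can be sketched in a few lines of Python; the function name and the AND-gate parameters below are illustrative, not taken from the slides:

```python
def perceptron_unit(inputs, weights, theta):
    """Compute y = f(sum_j w_j * y_j - theta) with a binary threshold f."""
    x = sum(w * y for w, y in zip(weights, inputs))
    return 1 if x - theta >= 0 else 0

# An AND gate realized with weights (1, 1) and threshold 1.5
print(perceptron_unit([1, 1], [1, 1], 1.5))  # -> 1
print(perceptron_unit([1, 0], [1, 1], 1.5))  # -> 0
```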

Type 1. Perceptron
- Feedforward
- Structure: 1 input layer, 1 output layer
- Supervised learning
- Hebb learning rule
- Able: AND, OR
- Unable: XOR

Learning in a Simple Neuron
Perceptron Learning Algorithm:
1. Initialize weights
2. Present a pattern and target output
3. Compute output: y = f(Σ_i w_i x_i - θ)
4. Update weights: w_i ← w_i + η (t - y) x_i
Repeat from step 2 until an acceptable level of error is reached.

Computing other functions: the OR function
- Assume a binary threshold activation function.
- What should w_01, w_02 and w_0b be set to so that you get the right answers for y_0?

Truth table (inputs i_1, i_2; output y_0): 0,0 → 0; 0,1 → 1; 1,0 → 1; 1,1 → 1

Many answers would work
y = f(w_01 i_1 + w_02 i_2 + w_0b b)
Recall the threshold function: the separation happens when w_01 i_1 + w_02 i_2 + w_0b b = 0.
Moving things around gives the decision line i_2 = -(w_01 / w_02) i_1 - (w_0b b / w_02).
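Putting the perceptron learning algorithm and the OR example together, a minimal sketch in Python (assuming the standard update w ← w + η(t − y)x; the learning rate, epoch count and names are illustrative):

```python
def train_perceptron(samples, eta=0.1, epochs=20):
    """Perceptron learning: w <- w + eta * (target - output) * x.
    The weight vector is (w01, w02, w0b); the bias input b is fixed at 1."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (i1, i2), target in samples:
            x = (i1, i2, 1.0)
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0
            w = [wi + eta * (target - y) * xi for wi, xi in zip(w, x)]
    return w

or_samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = train_perceptron(or_samples)

def predict(i1, i2):
    return 1 if w[0] * i1 + w[1] * i2 + w[2] >= 0 else 0

print([predict(i1, i2) for (i1, i2), _ in or_samples])  # -> [0, 1, 1, 1]
```

Running the same loop on the XOR samples never converges, which is exactly the limitation the next slide illustrates.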

The XOR Function

X1 \ X2    X2 = 0    X2 = 1
X1 = 0       0         1
X1 = 1       1         0

Type 2. Multi-Layer Perceptron
- Feedforward
- 1 input layer, 1 or more hidden layers, 1 output layer
- Supervised learning
- Delta learning rule; backpropagation (mostly used)
- Able: every logical operation

The Perceptron (diagram)

Type 3. Backpropagation Net
- Feedforward
- 1 input layer, 1 or more hidden layers, 1 output layer
- Supervised learning
- Backpropagation
- Sigmoid activation
- Used for: complex logical operations, pattern classification, speech analysis

The Back-propagation Algorithm
On-line algorithm:
1. Initialize weights
2. Present a pattern and target output
3. Compute output: propagate the pattern forward through the layers
4. Update weights: propagate the error backward, adjusting each weight by the delta rule applied layer by layer
Repeat from step 2 until an acceptable level of error is reached.
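A minimal sketch of the on-line algorithm for a small sigmoid network, trained here on XOR (the function the single perceptron could not learn). The network size, learning rate, seed and epoch count are illustrative assumptions, not values from the slides:

```python
import math, random

def make_net(n_hidden, rng):
    """Random 2-n_hidden-1 network; each weight row includes a bias term."""
    Wh = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(n_hidden)]
    Wo = [rng.uniform(-1, 1) for _ in range(n_hidden + 1)]
    return Wh, Wo

def forward(Wh, Wo, x1, x2):
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    x = (x1, x2, 1.0)                                   # inputs plus bias
    h = [sig(sum(w * xi for w, xi in zip(ws, x))) for ws in Wh]
    y = sig(sum(w * hi for w, hi in zip(Wo, h + [1.0])))
    return h, y

def train_xor(epochs=5000, eta=0.5, seed=0):
    rng = random.Random(seed)
    Wh, Wo = make_net(4, rng)
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    for _ in range(epochs):
        for (x1, x2), t in data:
            x = (x1, x2, 1.0)
            h, y = forward(Wh, Wo, x1, x2)
            d_o = (t - y) * y * (1 - y)                 # output delta (sigmoid derivative)
            d_h = [d_o * Wo[j] * h[j] * (1 - h[j]) for j in range(len(h))]
            Wo = [w + eta * d_o * hi for w, hi in zip(Wo, h + [1.0])]
            Wh = [[w + eta * d_h[j] * xi for w, xi in zip(Wh[j], x)]
                  for j in range(len(Wh))]
    return Wh, Wo

Wh, Wo = train_xor()
err = sum((t - forward(Wh, Wo, a, b)[1]) ** 2
          for (a, b), t in [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)])
print(err)  # squared error on the four XOR patterns after training
```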

Pattern Separation and NN architecture

Neural network simulations
Generation I: McCulloch-Pitts threshold neurons, able to compute Boolean functions.
Generation II: feed-forward and recurrent neural networks with backward propagation. These can compute polynomial functions and are also called universal approximators, since they can imitate any analog function. Compared with the biological neuron, a generation-2 network can "speak" in rate coding (frequency coding), that is, in the firing frequency (the interval between one spike and the next).

Neural network simulations
Generation III: a further step in bringing the simulations closer to the biological neuron. The neurons can multiplex firing frequencies and "speak" in pulse coding instead of rate coding, thereby transmitting several "words" at the same time. Each unit has its own internal "clock".

Hodgkin-Huxley Model
(circuit diagram: a stimulus applied across the membrane, inside vs. outside, with ion channels and an ion pump; the membrane capacitance C in parallel with the leak conductance g_l, the potassium conductance g_K and the sodium conductance g_Na, driven by a current I; membrane potential swings of about 100 mV. K = potassium, Na = sodium.)

Hodgkin-Huxley Model
(diagram: the same circuit, annotated with the membrane potential u, the voltage-dependent gating functions m_0(u) and h_0(u), and a pulse input current I(t).)
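For reference, the circuit in the two slides above corresponds to the standard membrane equation of the Hodgkin-Huxley model (written here with the usual gating variables m, h, n; u is the membrane potential):

```latex
C \frac{du}{dt} = -\,g_{\mathrm{Na}}\, m^{3} h \,(u - E_{\mathrm{Na}})
                  -\,g_{\mathrm{K}}\, n^{4} \,(u - E_{\mathrm{K}})
                  -\,g_{l}\,(u - E_{l}) + I(t),
\qquad
\frac{dx}{dt} = -\frac{x - x_{0}(u)}{\tau_{x}(u)}, \quad x \in \{m, h, n\}
```

The functions x_0(u) are the steady-state gating curves, matching the m_0(u) and h_0(u) labels in the diagram.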

Examples
- Integrate & Fire Neural Network.htm
- actionpotential.swf
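The linked "Integrate & Fire" demo can be approximated in a few lines of Python. This is a minimal leaky integrate-and-fire sketch; all parameter values are illustrative, not taken from the demo:

```python
def lif_spike_times(I=1.5, tau=10.0, v_thresh=1.0, v_reset=0.0,
                    dt=0.1, t_max=100.0):
    """Leaky integrate-and-fire: tau * dv/dt = -v + I, spike when v >= v_thresh."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_max:
        v += dt / tau * (-v + I)     # Euler step of the membrane equation
        if v >= v_thresh:            # threshold crossing: emit a spike
            spikes.append(round(t, 1))
            v = v_reset              # reset (a crude refractory mechanism)
        t += dt
    return spikes

spikes = lif_spike_times()
print(len(spikes), spikes[:3])       # regular firing at a constant rate
```

Because the input current is constant, the spikes arrive at a fixed interval, which is exactly the "rate coding" picture of the generation-2 models above.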

General neurological structure
- The body has two kinds of systems: an exciting one (adrenaline) and a calming/slowing one (acetylcholine).
- There are at least two kinds of synapses:
  - Inhibitory: usually connect to the cell body
  - Excitatory: usually connect at the endings of the dendrites
- The synaptic gap is about 20 nanometers.
- The time dimension: associative learning is a use of the time dimension. Two events that occur at the same time are usually related to one another; for example, Pavlov's experiment with taste, stimulus and timing.

The second stage in strengthening the connection between cells results from the formation of new synapses; it persists over the long term and requires gene activation.

Typical input / output

Generation of multiple Action Potentials
- The rate is dependent on the depolarization
- Firing frequency: 1 spike per second is 1 Hz; the maximum is about 1000 Hz
- Absolute refractory period
- Relative refractory period
- I_ion = g_ion (V_m - E_ion)
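The last relation, I_ion = g_ion (V_m - E_ion), is a one-line computation. The conductance and potential values below are common textbook illustrations (sodium and potassium near rest), not values from the slides:

```python
def ionic_current(g_ion, v_m, e_ion):
    """I_ion = g_ion * (V_m - E_ion): the current is proportional to the
    conductance and to how far the membrane potential sits from the ion's
    reversal potential."""
    return g_ion * (v_m - e_ion)

# Illustrative values: V_m = -65 mV, E_Na = +55 mV, E_K = -77 mV
print(ionic_current(120.0, -65.0, 55.0))   # -> -14400.0 (inward Na+ current)
print(ionic_current(36.0, -65.0, -77.0))   # -> 432.0 (outward K+ current)
```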

Instinctive response

Mapping from biological neuron

Nervous system       Computational abstraction
Neuron               Node
Dendrites            Input link and propagation
Cell body            Combination function, threshold, activation function
Axon                 Output link
Spike rate           Output
Synaptic strength    Connection strength / weight

EPSP: excitatory postsynaptic potential
IPSP: inhibitory postsynaptic potential

Hebb's rule: the connection between cells that fire at the same time is strengthened.
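Hebb's rule is often written as Δw = η · (presynaptic activity) · (postsynaptic activity); a minimal sketch (names and learning rate are illustrative):

```python
def hebbian_update(w, pre, post, eta=0.1):
    """Strengthen each weight in proportion to coincident pre- and
    postsynaptic activity (no decay term in this minimal version)."""
    return [wi + eta * p * post for wi, p in zip(w, pre)]

w = [0.0, 0.0]
# Presynaptic unit 0 is repeatedly active together with the postsynaptic unit;
# unit 1 stays silent, so only the co-active connection grows.
for _ in range(5):
    w = hebbian_update(w, pre=[1.0, 0.0], post=1.0)
print(w)  # first weight grows to ~0.5, second stays at 0.0
```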

Liquid State Machine (LSM)

Liquid State Machine (LSM)
Maass' LSM is a spiking recurrent neural network which satisfies two properties:
- Separation property (liquid)
- Approximation property (readout)
LSM features:
- The only attractor is rest
- Temporal integration
- Memoryless linear readout map
- Universal computational power: can approximate any time-invariant filter with fading memory
- Requires no a-priori decision regarding the "neural code" by which information is represented within the circuit

Maass' Definitions
Separation property: the current state x(t) of the microcircuit at time t has to hold all information about preceding inputs.
Approximation property: the readout can approximate any continuous function f that maps current liquid states x(t) to outputs v(t).
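The liquid/readout split can be illustrated with a simplified, non-spiking reservoir sketch in the spirit of the echo-state approach cited in the paper survey below: a fixed random recurrent network provides the states x(t), and only a memoryless linear readout is trained, here with the delta rule on the toy task of recalling the input from a few steps back. Every parameter is an illustrative assumption, not Maass' spiking construction:

```python
import math, random

def liquid_readout_demo(n_res=30, t_len=2000, delay=3, eta=0.01, seed=1):
    """Fixed random 'liquid' + trained memoryless linear readout."""
    rng = random.Random(seed)
    # Recurrent weights kept small so the dynamics are stable (fading memory)
    W = [[rng.uniform(-0.25, 0.25) for _ in range(n_res)] for _ in range(n_res)]
    w_in = [rng.uniform(-1.0, 1.0) for _ in range(n_res)]
    w_out = [0.0] * n_res                      # the only part that is trained
    x = [0.0] * n_res
    history, sq_err, baseline = [], 0.0, 0.0
    for t in range(t_len):
        u = rng.uniform(-1.0, 1.0)             # random input stream
        history.append(u)
        x = [math.tanh(sum(W[i][j] * x[j] for j in range(n_res)) + w_in[i] * u)
             for i in range(n_res)]            # liquid state x(t)
        if t >= delay:
            target = history[t - delay]        # task: recall the input from `delay` steps ago
            y = sum(wo * xi for wo, xi in zip(w_out, x))
            w_out = [wo + eta * (target - y) * xi for wo, xi in zip(w_out, x)]
            if t >= t_len // 2:                # measure error on the second half only
                sq_err += (target - y) ** 2
                baseline += target ** 2
    return sq_err, baseline

err, base = liquid_readout_demo()
print(err < base)  # the trained readout beats the always-zero baseline
```

The point of the demo is the separation/approximation division of labor: the recurrent part is never trained, yet its state retains enough of the input history for a purely linear readout to extract it.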

2 motors; 1 minute of footage for each case; 3400 frames. Readouts could utilize wave interference patterns.

(figure: liquid responses for the two input classes, "Zero" and "One")

Survey of papers
- Spiking neural networks, an introduction.pdf: an introduction to and summary of neural networks and the structure of the generation-3 models
- is the integrate-and-fire model good enough – a review.pdf: a comparison between the I&F model and the HH model, including an extension of the I&F model that combines the two
- LSM (Liquid State Machine):
  - Liquid State Machines, a review.pdf
  - Liquid State Machine Built of Hodgkin–Huxley Neurons.pdf
  - The Echo State approach to analysing and training recurrent neural networks.pdf
- LSM → Turing Machine:
  - On the Computational Power of Circuits of Spiking neurons.pdf
  - The Echo State approach to analysing and training recurrent neural networks.pdf

Survey of papers
- The Tempotron: an LIF neuron model able to classify pulse sequences, with learning
- Spike Timing Dependent Plasticity Finds the Start of Repeating Patterns in Continuous Spike Trains2.PDF: an LIF model able to detect a repeating sequence of pulses without supervised learning, only by modifying the input weights
- Hebb's Rule:
  - Hebbian learning and spiking neurons.pdf
  - Competitive Hebbian learning through spike-timing dependent synaptic plasticity.pdf
  - Spike-Timing-Dependent Hebbian Plasticity as Temporal Difference Learning.pdf
- Pitch Perception Models.pdf: a model that tries to imitate and understand hearing by folding the frequencies