3.2 Neurons and their networks

3.2.1 Biological neurons
Tasks such as navigation, but also cognition, memory, etc., take place in the nervous system (more specifically, in the brain).

The nervous system is made up of several different types of cells:
- Neurons
- Astrocytes
- Microglia
- Schwann cells
Neurons do the computing; the rest is infrastructure.

Astrocytes
Star-shaped, abundant, and versatile:
- Guide the migration of developing neurons
- Act as K+ and neurotransmitter (NT) buffers
- Involved in the formation of the blood-brain barrier
- Function in nutrient transfer

Microglia Specialized immune cells that act as the macrophages of the central nervous system

Schwann cells and oligodendrocytes
- Produce the myelin sheath, which provides the electrical insulation for neurons and nerve fibers
- Important in neuronal regeneration

Myelination: electrically insulates the axon, which increases the conduction speed of the action potential.

Types of neurons (figure): sensory neurons, motor neurons, and, in the brain, lots of interneurons.

What they look like

...or schematically

In fact, things are a bit more crowded

Neurons communicate with each other; we will see later how this works. This is what forms the "neural network".

Thus, neurons need to be able to conduct information in two ways:
1. From one end of a neuron to the other. This is accomplished electrically, via action potentials.
2. Across the minute space separating one neuron from another. This is accomplished chemically, via neurotransmitters.

Resting potential of neurons
At rest, the cell membrane sits at about -70 mV (inside relative to outside). Potassium (K+) can pass through the membrane to equalize its concentration; sodium (Na+) and chloride (Cl-) cannot. Result: the inside is negative relative to the outside.
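As a worked illustration of why a K+-permeable membrane ends up negative inside, here is a minimal sketch that computes the Nernst (equilibrium) potential; the ion concentrations are typical textbook values assumed for illustration, not numbers from the slides.

```python
import math

def nernst_potential(c_out_mM, c_in_mM, z=1, temperature_K=310.0):
    """Nernst equilibrium potential in volts: E = (R*T)/(z*F) * ln(c_out/c_in)."""
    R = 8.314      # gas constant, J/(mol*K)
    F = 96485.0    # Faraday constant, C/mol
    return (R * temperature_K) / (z * F) * math.log(c_out_mM / c_in_mM)

# Assumed typical mammalian concentrations: K+ roughly 5 mM outside, 140 mM inside
E_K = nernst_potential(5.0, 140.0, z=1)
print(f"E_K ~ {E_K * 1000:.0f} mV")   # about -89 mV; the resting potential (~ -70 mV) is
                                      # less negative because other ions leak a little too
```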

Now let's open a Na+ channel in the membrane... If the initial amplitude of the resulting graded potential (GP) is sufficient, it will spread all the way to the axon hillock, where voltage-gated channels reside. At this point an action potential can be triggered if the voltage is high enough.

N.B. The gating properties of ion channels were determined from electrical measurements (the conductance of the squid axon to Na+ and K+) long before the channels themselves were known to exist. The same holds for the transport of K+: the different exponents in the fitted conductances imply the number of independent gating subunits...
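For reference, the classic Hodgkin-Huxley fits to the squid-axon measurements take the form below; this is the standard textbook formulation rather than something spelled out on the slide, and it is precisely the exponents (4 for K+, 3 and 1 for Na+) that suggested how many gating subunits each channel has:

$$g_{\mathrm{K}} = \bar{g}_{\mathrm{K}}\, n^{4}, \qquad g_{\mathrm{Na}} = \bar{g}_{\mathrm{Na}}\, m^{3} h,$$

where n, m, and h are gating variables that relax toward voltage-dependent steady states, e.g. $\dot{n} = \alpha_n(V)\,(1 - n) - \beta_n(V)\, n$.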

With modern crystallography, these effects have been observed directly...

Transport of the action potential: like a row of dominoes falling...

This goes a lot faster with myelinated axons: saltatory conduction...

Once at the synapse, the signal is transmitted chemically via neurotransmitters (e.g. acetylcholine). These then excite a new graded potential in the next neuron.

This graded potential can be either positive or negative (excitatory or inhibitory), depending on the environment.

The intensity of the signal is given by the firing frequency

These properties are caricatured in the McCulloch-Pitts neuron. Learning happens when the weights w_ij are changed in response to the environment; this needs an updating rule.
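A minimal sketch of a McCulloch-Pitts-style neuron, assuming a hard step activation and a fixed threshold; the weights and threshold below are illustrative choices (here implementing a logical AND), not values from the slides.

```python
import numpy as np

def mcculloch_pitts(inputs, weights, threshold):
    """Fire (output 1) if the weighted sum of the inputs reaches the threshold."""
    activation = np.dot(weights, inputs)
    return 1 if activation >= threshold else 0

# Illustrative example: a neuron implementing logical AND of two binary inputs
weights = np.array([1.0, 1.0])
threshold = 2.0
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, "->", mcculloch_pitts(np.array(x), weights, threshold))
```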

Common in informatics is iterative learning, which needs a teacher: the weights are adjusted so that in every learning step the distance to the correct answer is reduced. This is known as the perceptron.
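A minimal sketch of the perceptron learning rule under these assumptions: a step activation, a learning rate of 0.1, and a tiny AND-gate training set chosen purely for illustration.

```python
import numpy as np

def step(x):
    return 1 if x >= 0 else 0

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Supervised ("with a teacher") learning: nudge the weights toward the correct answer."""
    w = np.zeros(X.shape[1])   # weights
    b = 0.0                    # bias (negative threshold)
    for _ in range(epochs):
        for xi, target in zip(X, y):
            error = target - step(np.dot(w, xi) + b)
            w += lr * error * xi   # move the weights in the direction that reduces the error
            b += lr * error
    return w, b

# Illustrative training data: logical AND
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print(w, b, [step(np.dot(w, xi) + b) for xi in X])
```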

With the use of hidden layers, problems that are not linearly separable can also be learnt (a single-layer perceptron, for instance, cannot represent XOR)...
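As a sketch of why a hidden layer helps, the hand-picked weights below solve XOR, which no single-layer perceptron can; the specific weights and biases are assumptions chosen for illustration, not learned values.

```python
import numpy as np

def step(x):
    return (x >= 0).astype(int)

# Hidden layer: an OR-like unit and a NAND-like unit; the output unit ANDs them -> XOR
W_hidden = np.array([[1.0, 1.0],      # OR-like unit
                     [-1.0, -1.0]])   # NAND-like unit
b_hidden = np.array([-0.5, 1.5])
w_out = np.array([1.0, 1.0])
b_out = -1.5

for x in np.array([[0, 0], [0, 1], [1, 0], [1, 1]]):
    h = step(W_hidden @ x + b_hidden)      # hidden-layer activations
    print(x, "->", step(w_out @ h + b_out))
```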

An example: letter recognition

The problems that can be solved depend on the structure of the network

3.2.2 Hebbian learning
Hebbian learning means that a synapse gets stronger as the neighbouring cells become more correlated in their activity.

Hebb’s Law can be represented in the form of two rules:
1. If two neurons on either side of a connection are activated synchronously, then the weight of that connection is increased.
2. If two neurons on either side of a connection are activated asynchronously, then the weight of that connection is decreased.
Hebb’s Law provides the basis for learning without a teacher. Learning here is a local phenomenon, occurring without feedback from the environment.
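A minimal sketch of the simplest Hebbian weight update, assuming binary activities and an illustrative learning rate alpha = 0.1; note that this plain form only implements rule 1 (strengthening), while the weakening of rule 2 is usually handled by variants such as the forgetting factor introduced further below.

```python
import numpy as np

def hebbian_update(W, x, y, alpha=0.1):
    """Plain Hebb rule: dW[j, i] = alpha * y[j] * x[i], strengthening co-active connections."""
    return W + alpha * np.outer(y, x)

# Illustrative example: input units x and output units y that fire together
x = np.array([1, 0, 1])
y = np.array([1, 1, 0])
W = np.zeros((3, 3))          # W[j, i]: weight from input i to output j
W = hebbian_update(W, x, y)
print(W)
```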

Hebbian learning in a neural network

A Hebbian Cell Assembly By means of the Hebbian Learning Rule, a circuit of continuously firing neurons could be learned by the network. The continuing activation in this cell assembly does not require external input. The activation of the neurons in this circuit would correspond to the perception of a concept.

A Cell Assembly Input from the environment

A Cell Assembly Note that the input from the environment is gone...

Hebbian learning implies that weights can only increase. To resolve this problem, we might impose a limit on the growth of synaptic weights. This can be done by introducing a non-linear forgetting factor into Hebb’s Law:

$$\Delta w_{ij}(p) = \alpha\, y_j(p)\, x_i(p) - \varphi\, y_j(p)\, w_{ij}(p),$$

where $\varphi$ is the forgetting factor. The forgetting factor usually falls in the interval between 0 and 1, typically between 0.01 and 0.1, to allow only a little “forgetting” while limiting the weight growth.
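A minimal sketch of this update with a forgetting term, assuming an illustrative learning rate alpha = 0.1 and forgetting factor phi = 0.05.

```python
import numpy as np

def hebbian_with_forgetting(W, x, y, alpha=0.1, phi=0.05):
    """Hebb rule plus forgetting: dW[j, i] = alpha*y[j]*x[i] - phi*y[j]*W[j, i]."""
    return W + alpha * np.outer(y, x) - phi * y[:, None] * W

# With forgetting, repeated presentation no longer makes the weights grow without bound
x = np.array([1.0, 0.0, 1.0])
y = np.array([1.0, 1.0, 0.0])
W = np.zeros((3, 3))
for _ in range(200):
    W = hebbian_with_forgetting(W, x, y)
print(W.round(2))   # co-active weights saturate near alpha/phi = 2 instead of diverging
```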

First simulation of Hebbian learning Rochester et al. attempted to simulate the emergence of cell assemblies in a small network of 69 neurons. They found that everything became active in their network. They decided that they needed to include inhibitory synapses. This worked and cell assemblies did, indeed, form. This was later confirmed in real brain circuitry.

In fact, these inhibitory connections are distance-dependent, and as such give rise to spatial structure.

Excitation happens within columns and inhibition further away.

Long-range inhibition and short-range activation give rise to patterns; see also the excursion into pattern formation in Sec. 3.6.

Feature mapping: the Kohonen model

Competitive learning
1. Set initial synaptic weights to small random values, say in an interval [0, 1], and assign a small positive value to the learning rate parameter α.
2. Update the weights of the winning neuron and of its neighbours, where Λ_j(p) is the neighbourhood function centred around the winning neuron j_X (see the update rule below).
3. Iterate...
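The weight update referred to in step 2 is commonly written as follows (this is the standard competitive-learning rule as given in textbook treatments such as Negnevitsky's; the formula itself does not appear in the text above):

$$w_{ij}(p+1) = \begin{cases} w_{ij}(p) + \alpha\,[x_i(p) - w_{ij}(p)], & j \in \Lambda_{j_X}(p) \\ w_{ij}(p), & j \notin \Lambda_{j_X}(p) \end{cases}$$

so only the winner and its neighbours move their weight vectors toward the current input.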

To illustrate competitive learning, consider the Kohonen network with 100 neurons arranged in the form of a two-dimensional lattice with 10 rows and 10 columns. The network is required to classify two-dimensional input vectors: each neuron in the network should respond only to the input vectors occurring in its region. The network is trained with 1000 two-dimensional input vectors generated randomly in a square region in the interval between –1 and +1. The learning rate parameter α is equal to 0.1.
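A minimal sketch of this example with a 10 x 10 lattice, 1000 random 2-D inputs and a learning rate of 0.1 as stated above; the square, shrinking neighbourhood schedule is an illustrative assumption, not something specified on the slide.

```python
import numpy as np

rng = np.random.default_rng(0)

grid = np.array([(r, c) for r in range(10) for c in range(10)])   # lattice coordinates of the 100 neurons
W = rng.uniform(-0.1, 0.1, size=(100, 2))                         # small random initial weights
X = rng.uniform(-1.0, 1.0, size=(1000, 2))                        # random inputs in the square [-1, 1]^2
alpha = 0.1

for step, x in enumerate(X):
    winner = np.argmin(np.sum((W - x) ** 2, axis=1))              # best-matching (winning) neuron
    radius = max(3 * (1 - step / len(X)), 1)                      # neighbourhood shrinks over time (assumed schedule)
    in_hood = np.max(np.abs(grid - grid[winner]), axis=1) <= radius
    W[in_hood] += alpha * (x - W[in_hood])                        # move the neighbourhood toward the input

print(W.min(axis=0).round(2), W.max(axis=0).round(2))             # the weights spread out to cover the square
```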

Initial random network

After 100 steps

After 1000 steps

After 10000 steps

Or for letter recognition

In the cortex, this gives rise to the homunculus, the spatial distribution of nerve cells responsible for senses

Similar for other features in the cortex

3.2.3 Associative networks
In a Hopfield network, every neuron is connected to every other neuron.

Topological state analysis for a three-neuron Hopfield network

The stable state-vertex is determined by the weight matrix W, the current input vector X, and the threshold matrix θ. If the input vector is partially incorrect or incomplete, the initial state will converge into the stable state-vertex after a few iterations.
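A minimal sketch of such a network, assuming bipolar (+1/-1) states, Hebbian outer-product storage, and zero thresholds; the six-element pattern is an arbitrary illustrative choice.

```python
import numpy as np

def train_hopfield(patterns):
    """Store patterns with the Hebbian outer-product rule; no self-connections."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)
    return W

def recall(W, state, steps=10):
    """Asynchronous updates: each neuron aligns with the sign of its local field."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store one pattern, then recover it from a partially incorrect input
pattern = np.array([1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
corrupted = pattern.copy()
corrupted[:2] *= -1                      # flip two of the six elements
print(recall(W, corrupted))              # converges back to the stored pattern
```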

Energy function of Hopfield net: multidimensional landscape
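For reference, the standard Hopfield energy function behind this landscape (the slide shows the landscape itself, not the formula) is

$$E = -\tfrac{1}{2}\sum_{i}\sum_{j \neq i} w_{ij}\, s_i\, s_j + \sum_i \theta_i\, s_i,$$

and each asynchronous update can only lower (or keep) E, so the network settles into a local minimum corresponding to a stored pattern.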

Example: restoring corrupted memory patterns (figure panels: the original T, 20% of T corrupted, half corrupted).

Recap Sec. 3.2
- The brain is a network of neurons, whose properties are important in how we learn.
- Within neurons, signals are transported electrically; between neurons, chemically.
- This can be abstracted in a McCulloch-Pitts neuron.
- Hebbian learning strengthens connections between correlated neurons (and can lead to pattern formation).
- This is taken further in Kohonen networks and competitive learning.