COGNITIVE NEUROSCIENCE


Note: Please read the book to review the major brain structures and their functions, and to review brain imaging techniques. See also the additional slides available on the class website.

Cognitive Neuroscience: the study of the relation between cognitive processes and brain activity. It offers the potential to measure some "hidden" processes that are part of cognitive theories (e.g., memory activation, attention, "insight") by measuring when and where brain activity is happening. Different techniques have different strengths: there is a tradeoff between spatial and temporal resolution.

Techniques for Studying Brain Functioning: single-unit recordings (Hubel and Wiesel, 1962, 1979); event-related potentials (ERPs); positron emission tomography (PET); magnetic resonance imaging (MRI and fMRI); magnetoencephalography (MEG); transcranial magnetic stimulation (TMS).

The spatial and temporal ranges of some techniques used to study brain functioning.

Single-cell recording (usually in animal studies): neural activity is measured with probes, e.g., in the research of Hubel and Wiesel:

Hubel and Wiesel (1962) studied the LGN and primary visual cortex in the cat. They found cells with different receptive fields, i.e., different ways of responding to light in certain areas of the visual field: LGN on cells, LGN off cells, and directional cells. (Figure: action potential frequency of a cell associated with a specific receptive field in a monkey's field of vision; the frequency increases as a light stimulus is brought closer to the receptive field.)

COMPUTATIONAL COGNITIVE SCIENCE

Computer Models. Artificial intelligence: constructing computer systems that produce intelligent outcomes. Computational modeling: programming computers to model or mimic some aspects of human cognitive functioning; modeling natural intelligence; simulations of behavior.

Why do we need computational models? They provide the precision needed to specify complex theories, making vague verbal terms specific. They provide explanations and yield quantitative predictions: just as meteorologists use computer models to predict tomorrow's weather, the goal of modeling human behavior is to predict performance in novel settings.

Neural Networks: an alternative to traditional information-processing models, also known as PDP (the parallel distributed processing approach) or connectionist models. Neural networks are networks of simple processors that operate simultaneously, and they have some biological plausibility.

Idealized neurons (units): an abstract, simplified description of a neuron. (Figure: inputs are summed (Σ) by a processor, which produces an output.)

Different ways to represent information with neural networks: localist representation. Each unit represents just one item, i.e., "grandmother" cells. (Figure: activations of six units for three concepts; 0 = off, 1 = on.)

Coarse Coding / Distributed Representations. Each unit is involved in the representation of multiple items. (Figure: activations of six units for three concepts; 0 = off, 1 = on.)

Advantages of Distributed Representations. Efficiency: they address the combinatorial explosion problem; with n binary units, 2^n different representations are possible (compare how many English words can be formed from combinations of just 26 letters). Damage resistance: even if some units stop working, information is still preserved. Because information is distributed across the network, performance degrades gradually as a function of damage (aka robustness, fault tolerance, graceful degradation).
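The efficiency claim above is easy to check numerically. A minimal sketch (the choice of n = 6 units simply matches the six-unit diagrams on these slides):

```python
# Capacity of localist vs. distributed coding over the same n binary units.
# Localist: one dedicated "grandmother cell" per concept, so n concepts.
# Distributed: every on/off pattern is a distinct code, so 2**n concepts.
n = 6
localist_capacity = n
distributed_capacity = 2 ** n

print(localist_capacity)     # 6
print(distributed_capacity)  # 64
```

The gap widens exponentially: at n = 20, a localist scheme still codes only 20 items while a distributed scheme can in principle distinguish over a million patterns.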

Suppose we lost unit 6. (Figure: activations of the remaining units for the three concepts; 0 = off, 1 = on.) Can the three concepts still be discriminated?
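Graceful degradation can be demonstrated directly. The following sketch uses made-up six-unit codes for the three concepts (the slide's actual activation values are not reproduced here); it drops unit 6 and checks whether the remaining five units still keep the codes distinct:

```python
# Hypothetical distributed codes: 6 binary units per concept.
concepts = {
    "concept 1": (1, 0, 1, 0, 1, 0),
    "concept 2": (0, 1, 1, 0, 0, 1),
    "concept 3": (1, 1, 0, 1, 0, 0),
}

def still_discriminable(patterns, lost_unit):
    """Remove one unit from every pattern; are the reduced codes still distinct?"""
    reduced = {name: tuple(p[i] for i in range(len(p)) if i != lost_unit)
               for name, p in patterns.items()}
    return len(set(reduced.values())) == len(reduced)

# Losing unit 6 (index 5) still leaves three distinct 5-unit patterns.
print(still_discriminable(concepts, lost_unit=5))  # True
```

Because no single unit carries a whole concept, the answer stays True for any single lost unit in this example; a localist code, by contrast, loses an entire concept with its dedicated unit.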

An example calculation for a single neuron. Diagram showing how the inputs from a number of units are combined to determine the overall input to unit i. Unit i has a threshold of 1, so if its net input exceeds 1 it will respond with +1, but if the net input is less than 1 it will respond with –1.
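The calculation just described can be sketched in a few lines. The weights and inputs below are illustrative, not the slide's figure values; only the rule (net input = weighted sum, threshold of 1, outputs of +1/–1) comes from the slide:

```python
# A threshold unit: net input is the weighted sum of the inputs;
# output is +1 if the net input exceeds the threshold, otherwise -1.
def threshold_unit(inputs, weights, threshold=1.0):
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net > threshold else -1

print(threshold_unit([1, 0, 1], [0.8, 0.5, 0.4]))  # net = 1.2 > 1, so +1
print(threshold_unit([0, 1, 0], [0.8, 0.5, 0.4]))  # net = 0.5 < 1, so -1
```

This is the McCulloch-Pitts-style idealized unit underlying the networks discussed in the following slides.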

Neural-Network Models The simplest models include three layers of units: (1) The input layer is a set of units that receives stimulation from the external environment. (2) The units in the input layer are connected to units in a hidden layer, so named because these units have no direct contact with the environment. (3) The units in the hidden layer in turn are connected to those in the output layer. Each connection from an input unit either excites or inhibits a hidden unit. Furthermore, each connection has a weight, a measure of the strength of its influence on the receiving unit. Some networks include feedback loops, for example, with connections from hidden units to input units. Here is a crucial point: the pattern of weights in the entire network serves to represent associations between input and output. Neural networks not only use parallel processing, they rely on distributed parallel processing, in which a representation is a pattern of weights, not a single weight, node, or connection. (p. 42)

Multi-layered Networks. Activation flows from a layer of input units through a set of hidden units to output units. Weights determine how input patterns are mapped to output patterns. The network can learn to associate output patterns with input patterns by adjusting weights. Hidden units tend to develop internal representations of the input-output associations. Backpropagation is a common weight-adjustment algorithm. (Figure: input units → hidden units → output units.)
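A toy version of such a network, trained by backpropagation, fits in a short script. The architecture (2 input units, 2 hidden units, 1 output unit), the sigmoid activation, the learning rate, and the XOR-style training set are all illustrative choices, not specifics from the slides:

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Weights: each hidden unit has [w_in1, w_in2, bias]; output has [w_h1, w_h2, bias].
w_ih = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_ho = [random.uniform(-1, 1) for _ in range(3)]
lr = 0.5

def forward(x):
    # Activation flows input -> hidden -> output.
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_ih]
    o = sigmoid(w_ho[0] * h[0] + w_ho[1] * h[1] + w_ho[2])
    return h, o

# Input-output associations for the network to learn (XOR).
patterns = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def total_error():
    return sum((t - forward(x)[1]) ** 2 for x, t in patterns)

before = total_error()
for _ in range(5000):
    for x, t in patterns:
        h, o = forward(x)
        # Error signals: output layer first, then propagated back to hidden layer.
        delta_o = (t - o) * o * (1 - o)
        delta_h = [delta_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Weight updates proportional to (learning rate) x (error signal) x (input).
        w_ho[0] += lr * delta_o * h[0]
        w_ho[1] += lr * delta_o * h[1]
        w_ho[2] += lr * delta_o
        for j in range(2):
            w_ih[j][0] += lr * delta_h[j] * x[0]
            w_ih[j][1] += lr * delta_h[j] * x[1]
            w_ih[j][2] += lr * delta_h[j]
after = total_error()

print(after < before)  # training reduced the squared error
```

The hidden units end up carrying internal representations of the input-output mapping, exactly the point made above: the knowledge lives in the pattern of weights, not in any single connection.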

Example of Learning Networks http://www.cs.ubc.ca/labs/lci/CIspace/Version3/neural/index.html

Another example: NETtalk. A connectionist network that learns to pronounce English words, i.e., learns spelling-to-sound relationships. (Listen to the audio demo.) The architecture has 7 groups of 29 input units, 80 hidden units, and 26 output units (after Hinton, 1989). The net was presented with seven consecutive letters (e.g., "a_cat_") simultaneously as input, and learned to pronounce the phoneme associated with the central letter ("c" in this example), with the teacher supplying the target output (e.g., /k/). NETtalk achieved a 90% success rate during training. When tested on a set of novel inputs that it had not seen during training, its performance remained steady at 80%-87%.

Other demos. Hopfield network: http://www.cbu.edu/~pong/ai/hopfield/hopfieldapplet.html Backpropagation algorithm and competitive learning: http://www.cs.ubc.ca/labs/lci/CIspace/Version4/neural/ and http://www.psychology.mcmaster.ca/4i03/demos/demos.html Competitive learning: http://www.neuroinformatik.ruhr-uni-bochum.de/ini/VDM/research/gsn/DemoGNG/GNG.html Various networks: http://diwww.epfl.ch/mantra/tutorial/english/ Optical character recognition: http://sund.de/netze/applets/BPN/bpn2/ochre.html Brain-wave simulator: http://www.itee.uq.edu.au/%7Ecogs2010/cmc/home.html

Neural Network Models: inspired by real neurons and brain organization, but highly idealized. They can spontaneously generalize beyond the information explicitly given to the network, and can retrieve information even when the network is damaged (graceful degradation). Networks can be taught: learning is possible by changing the weighted connections between nodes.