Neural Networks (NN) Part 1
1. NN: Basic Ideas
2. Computational Principles
3. Examples of Neural Computation

1. NN: Basic Ideas
Neural network: an example

A neural network has been shown to be a universal Turing machine: it can compute anything that is computable (Siegelmann & Sontag, 1991). Further, if equipped with appropriate algorithms, a neural network can be made into an intelligent computing machine that solves problems in finite time. Such algorithms have been developed over the years:
– Backpropagation learning rule (1985)
– Hebbian learning rule (1949)
– Kohonen's self-organizing feature map (1982)
– Hopfield nets (1982)
– Boltzmann machine (1986)

Biological neural network (BNN) vs. artificial neural network (ANN)

McCulloch-Pitts Networks (1943)
A model of artificial neurons that computes Boolean logical functions:

Y = 1 if W1·X1 + W2·X2 ≥ Θ, and Y = 0 otherwise,

where Y, X1, X2 take on binary values of 0 or 1, and W1, W2, Θ take on continuous values.

Example: for W1 = 0.3, W2 = 0.5, Θ = 0.6, the unit computes Boolean AND:

X1 X2 | Y
0  0  | 0
0  1  | 0
1  0  | 0
1  1  | 1
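A minimal Python sketch of the unit above; the function name and structure are our own, and only the weights and threshold come from the slide:

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) iff the weighted sum of binary inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# The slide's AND example: W1 = 0.3, W2 = 0.5, threshold = 0.6.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", mp_neuron((x1, x2), (0.3, 0.5), 0.6))
# Fires only for input (1, 1), i.e., Boolean AND.
```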

McCulloch & Pitts (1943, Fig. 1)

Knowledge representation in M-P network
Hiring Rule #1: "A job candidate who has either good grades or prior job experience, and who also has strong letters and receives a positive mark in the interview, tends to make a desirable employee and therefore should be hired."
"Knowledge is in the connection weights (and the threshold)."
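Rule #1 is the Boolean function (grades OR experience) AND letters AND interview, which a single M-P unit can realize. A sketch using the mp_neuron function above; these particular weights and threshold are an illustrative choice, not taken from the slides:

```python
from itertools import product

def mp_neuron(inputs, weights, threshold):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Inputs: grades, experience, letters, interview (each 0 or 1).
# Illustrative weights encoding (grades OR experience) AND letters AND interview.
WEIGHTS, THRESHOLD = (0.5, 0.5, 1.0, 1.0), 2.5

for x in product((0, 1), repeat=4):
    grades, experience, letters, interview = x
    assert mp_neuron(x, WEIGHTS, THRESHOLD) == int((grades or experience) and letters and interview)
```

The rule's knowledge lives entirely in the four weights and the threshold; no symbolic rule is stored anywhere.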

Learning in M-P network
Hiring Rule #1A: "A job candidate who has either good grades or prior job experience, and who also has strong letters or receives a positive mark in the interview, tends to make a desirable employee and therefore should be hired."
"Learning through weight modification (e.g., Hebb rule)."
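One way to see learning as weight modification: with a hidden unit that computes (grades OR experience), raising a single output weight converts Rule #1 into Rule #1A. The two-layer architecture and the numbers below are illustrative assumptions, not taken from the slides:

```python
from itertools import product

def mp_neuron(inputs, weights, threshold):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def hire(x, output_weights):
    h = mp_neuron(x[:2], (1, 1), 1)                 # hidden unit: grades OR experience
    return mp_neuron((h, x[2], x[3]), output_weights, 3)

for x in product((0, 1), repeat=4):
    grades, experience, letters, interview = x
    # Before learning: h AND letters AND interview (Rule #1).
    assert hire(x, (1, 1, 1)) == int((grades or experience) and letters and interview)
    # After learning raises the hidden unit's weight: h AND (letters OR interview) (Rule #1A).
    assert hire(x, (2, 1, 1)) == int((grades or experience) and (letters or interview))
```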

Acquisition of new knowledge in M-P network
Hiring Rule #2: "In addition to Rule #1A, a job candidate who has no prior job experience and receives a negative mark in the interview should not be hired."
"Acquisition of new knowledge through the creation of new neurons (i.e., synaptogenesis)."

2. Computational Principles
1. Distributed representation
2. Lateral inhibition
3. Bi-directional interaction
4. Error-correction learning
5. Hebbian learning

1. Distributed Representation (vs. localist representation)
An object is represented by multiple units; the same unit participates in multiple representations.

Why distributed representation?
1. Efficiency: it solves the combinatorial explosion problem. With n binary units, 2^n different representations are possible. (e.g., how many English words can be built from combinations of the 26 letters of the alphabet?)
2. Robustness (fault tolerance): loss of one or a few units does not destroy the representation (cf. holographic images, the Fourier transform).
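A toy illustration of these two points; the patterns below are arbitrary choices, not from the slides:

```python
import numpy as np

# Each object is a pattern over many shared units (distributed representation).
patterns = {
    "cat": np.array([1, 0, 1, 1, 0, 0, 1, 0]),
    "dog": np.array([1, 1, 0, 1, 0, 1, 0, 0]),
}

# Robustness: knock out one unit; the damaged pattern still matches "cat" best.
damaged = patterns["cat"].copy()
damaged[0] = 0
overlaps = {name: int((p == damaged).sum()) for name, p in patterns.items()}
print(overlaps)  # {'cat': 7, 'dog': 3}
```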

2. Lateral Inhibition
Selective activation through activation-dependent inhibitory competition (i.e., winner-take-all, WTA).
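A minimal winner-take-all sketch in the style of MAXNET (Lippmann, 1987), in which each unit inhibits all the others in proportion to its own activity; the constants are illustrative and not taken from the slides:

```python
import numpy as np

def winner_take_all(activations, epsilon=0.2, steps=30):
    a = np.array(activations, dtype=float)
    for _ in range(steps):
        # Each unit is suppressed by the summed activity of its competitors.
        a = np.maximum(0.0, a - epsilon * (a.sum() - a))
    return a

print(winner_take_all([0.50, 0.55, 0.40]))
# Only the initially strongest unit stays active, e.g. [0.   0.16 0.  ]
```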

3. Bi-directional Interaction (interactive, recurrent connections)
This combined top-down and bottom-up processing computes a constraint optimization, and is generally faster than uni-directional computation (e.g., in word recognition).

Bi-directional interactions in Speech Perception

4. Error-correction Learning (e.g., backpropagation)

dW_ik = ε·A_i·(T_k - B_k)

where A_i is the activation of input unit i, B_k the actual output of unit k, T_k its target output, and ε the learning rate.
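A minimal sketch of this rule applied to a single thresholded output unit; the task (Boolean OR) and all constants are illustrative choices, and the variable names follow the slide's symbols:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(2, 1))   # weights from 2 input units to 1 output unit
epsilon = 0.5                            # learning rate

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input patterns
T = np.array([[0], [1], [1], [1]], dtype=float)              # targets: Boolean OR

for _ in range(100):
    for A, t in zip(X, T):
        B = (A @ W >= 0.5).astype(float)    # thresholded output
        W += epsilon * np.outer(A, t - B)   # dW_ik = e * A_i * (T_k - B_k)

for A in X:
    print(A, "->", int((A @ W >= 0.5)[0]))  # reproduces Boolean OR
```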

A three-layer feed-forward network with backpropagation learning can approximate any measurable function to any desired degree of accuracy by increasing the number of hidden units (Hornik et al., 1989, 1990; Hecht-Nielsen, 1989). However, the biological plausibility of backpropagation learning is yet to be confirmed.

5. Hebbian Learning (unsupervised/self-organizing)

dW_ik = ε·B_k·(A_i - W_ik)

Hebbian Rule Encodes Correlational Information
Asymptotically, W_ik = P(A_i = 'fire' | B_k = 'fire'). In other words, the weight W_ik stores information about the correlation (i.e., the co-firing activity) between the input and output units.
Q1: How could the other half of the correlational information, P(B_k = 'fire' | A_i = 'fire'), be encoded?
Q2: Discuss the implications of an anti-Hebbian learning rule such as dW_ik = -ε·A_i·(B_k - W_ik).
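A small simulation of the rule above, checking the asymptotic claim numerically; the firing probabilities are illustrative choices, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(1)
epsilon, W = 0.01, 0.5

# Two correlated binary units: when B fires, A fires with probability 0.8;
# when B is silent, A fires with probability 0.3.
for _ in range(20000):
    B = int(rng.random() < 0.5)
    A = int(rng.random() < (0.8 if B else 0.3))
    W += epsilon * B * (A - W)   # dW_ik = e * B_k * (A_i - W_ik); updates only when B fires

print(round(W, 2))  # close to 0.8 = P(A = 'fire' | B = 'fire')
```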

Biological Plausibility of Hebbian Learning The neurophysiological plausibility is well documented (e.g., Levy & Stewards, 1980), including sub-cellular mechanisms (NMDA-mediated long-term potentiation (LTP)).

3. Examples of Neural Computation
– Noise-tolerant memory
– Pattern completion
– Content-addressable memory
– Language learning
– Sensory-motor control
– Visual perception
– Speech perception
– Word recognition
– …

NETtalk (a program that learns to pronounce English words)

Submarine Sonar