Artificial Neural Network


Artificial Neural Network Lecture # 8 Dr. Abdul Basit Siddiqui

Covered so far (Revision)
1943 McCulloch and Pitts proposed the McCulloch-Pitts neuron model.
1949 Hebb published his book The Organization of Behavior, in which the Hebbian learning rule was proposed.
1958 Rosenblatt introduced the simple single-layer networks now called Perceptrons.
1969 Minsky and Papert's book Perceptrons demonstrated the limitations of single-layer perceptrons, and almost the whole field went into hibernation.
1982 Hopfield published a series of papers on Hopfield networks.

Covered so far (Revision)
1982 Kohonen developed the Self-Organising Maps that now bear his name.
1986 The Back-Propagation learning algorithm for Multi-Layer Perceptrons was rediscovered and the whole field took off again.
1990s The sub-field of Radial Basis Function Networks was developed.
2000s The power of Ensembles of Neural Networks and Support Vector Machines became apparent.

Covered so far (Revision)
Introduction to biological neurons and their components.
The McCulloch-Pitts neuron: a vastly simplified model of real neurons (the Threshold Logic Unit).

Covered so far (Revision)
Activation functions:
Step function (threshold or sign function)
Sigmoid function

Covered so far (Revision)
The McCulloch-Pitts neuron equation, relating output to inputs, can be written as

out = step(Σk wk ink − Θ)

where Θ is the threshold used to squash the neuron's output to 0 or 1.
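The equation above can be sketched in a few lines of Python (the function name is my own, chosen for illustration):

```python
def mp_neuron(inputs, weights, theta):
    """McCulloch-Pitts neuron: output 1 when the weighted sum of the
    inputs reaches the threshold theta, otherwise output 0."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= theta else 0

# Example: two inputs with unit weights and threshold 2 -> fires only
# when both inputs are active.
print(mp_neuron([1, 1], [1, 1], theta=2))
print(mp_neuron([1, 0], [1, 1], theta=2))
```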

Networks of McCulloch-Pitts Neurons
We may have many neurons, labelled by indices k, i, j, with activation flowing between them via synapses of strengths wki and wij: the output of neuron k feeds neuron i through weight wki, and the output of neuron i feeds neuron j through weight wij.

The Perceptron Any number of McCulloch-Pitts neurons can be connected together in any way. An arrangement of one input layer of McCulloch-Pitts neurons feeding forward to one output layer of McCulloch-Pitts neurons is known as a Perceptron.

Implementing Logic Gates using McCulloch-Pitts Neurons
The logical operators AND, OR and NOT can be implemented using MP neurons. We can easily find suitable weights and thresholds by inspection.
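As a sketch, one set of weights and thresholds found by inspection (other choices work equally well):

```python
def mp_neuron(inputs, weights, theta):
    """Fires (1) when the weighted input sum reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= theta else 0

def AND(a, b):
    return mp_neuron([a, b], [1, 1], theta=2)   # needs both inputs

def OR(a, b):
    return mp_neuron([a, b], [1, 1], theta=1)   # needs at least one

def NOT(a):
    return mp_neuron([a], [-1], theta=0)        # inhibitory weight

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
```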

Need to Find Weights Analytically
For harder functions such as XOR, we want to calculate suitable parameters rather than find a solution by trial and error: the weights and thresholds should be derived systematically.

Finding Weights Analytically for the AND Network
We have two weights w1 and w2 and the threshold Θ, and for each training pattern the perceptron output must match the AND output. This gives four inequalities:

pattern (0, 0) → out 0: 0·w1 + 0·w2 − Θ < 0
pattern (0, 1) → out 0: 0·w1 + 1·w2 − Θ < 0
pattern (1, 0) → out 0: 1·w1 + 0·w2 − Θ < 0
pattern (1, 1) → out 1: 1·w1 + 1·w2 − Θ ≥ 0

There are an infinite number of solutions for the AND, OR and NOT networks.

Perceptron's Limitations
The XOR function: what is the problem? No choice of weights and threshold satisfies all four of its inequalities, so:
We need more complex networks, or
We need different activation/threshold/transfer functions, and
We need to learn the parameters rather than derive them by hand.
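To see the problem concretely, a brute-force sketch (function names are my own) can search a grid of weight/threshold combinations and show that none reproduces XOR:

```python
def perceptron_out(w1, w2, theta, x1, x2):
    """Single-layer perceptron output for two inputs."""
    return 1 if w1 * x1 + w2 * x2 - theta >= 0 else 0

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# Exhaustively try weight/threshold combinations on a grid: none of them
# reproduces the XOR truth table, because XOR is not linearly separable.
found = False
grid = [i / 2 for i in range(-8, 9)]      # -4.0 .. 4.0 in steps of 0.5
for w1 in grid:
    for w2 in grid:
        for theta in grid:
            if all(perceptron_out(w1, w2, theta, x1, x2) == y
                   for (x1, x2), y in XOR.items()):
                found = True
print(found)
```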

Structures of ANN
An ANN is a weighted directed graph in which activation flows between the processing units through one-way connections. There are three types of ANNs:
Single-Layer Feed-forward NNs: one input layer and one output layer, with no feedback connections (a simple perceptron).
Multi-Layer Feed-forward NNs: one input layer, one output layer, and one or more hidden layers, with no feedback connections (a multilayer perceptron).
Recurrent NNs: the network has at least one feedback connection, and may or may not have hidden units (a simple recurrent network).
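As a sketch of why hidden layers matter, a hand-wired multi-layer feed-forward net (weights chosen by hand; names are my own) can compute XOR, which no single-layer perceptron can:

```python
def step(x):
    return 1 if x >= 0 else 0

def xor_mlp(x1, x2):
    """Two-layer feed-forward network: one hidden unit computes OR,
    the other computes NAND, and the output unit ANDs them together,
    which yields XOR."""
    h1 = step(1 * x1 + 1 * x2 - 1)      # OR:   fires if either input is on
    h2 = step(-1 * x1 - 1 * x2 + 1.5)   # NAND: fires unless both are on
    return step(1 * h1 + 1 * h2 - 2)    # AND of the two hidden units

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_mlp(a, b))
```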

Examples of Network Structures

More Activation Functions
Sigmoid functions, e.g. the logistic function σ(x) = 1 / (1 + e^−x). These are smooth (differentiable) and monotonically increasing.
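A minimal sketch of the logistic sigmoid and its derivative (the derivative has the convenient closed form σ(x)(1 − σ(x)), which backpropagation exploits):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: smooth, monotonically increasing, output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_deriv(x):
    """Derivative of the sigmoid, expressed via the function itself."""
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid(0))         # 0.5: the midpoint
print(sigmoid_deriv(0))   # 0.25: the slope is steepest at the midpoint
```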

Bias: A Special Kind of Weight (Instead of Θ)
To simplify the mathematics, treat the threshold as a weight: set w0 = −Θ. Suppose there is a constant extra input in0 = 1; then w0·in0 = −Θ, and the perceptron equation simplifies to

out = step(Σk wk ink) for k = 0, 1, …, n

with the step function now thresholded at zero.
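The equivalence can be sketched directly (function names are my own): the threshold form and the bias form agree on every input pattern.

```python
def out_with_theta(weights, inputs, theta):
    """Original form: compare the weighted sum against threshold theta."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) - theta >= 0 else 0

def out_with_bias(weights, inputs):
    """Bias form: the threshold is folded in as weight w0 = -theta acting
    on a constant extra input in0 = 1, so the comparison is against zero."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= 0 else 0

w, theta = [1.0, 1.0], 1.5                       # an AND unit
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    a = out_with_theta(w, x, theta)
    b = out_with_bias([-theta] + w, (1,) + x)    # prepend bias weight + input
    assert a == b
print("both formulations agree on all patterns")
```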

Problems with the Perceptron
How do we classify data points using a perceptron? We have to find the weights and thresholds without calculating a large number of inequalities.

Decision Boundaries in Two Dimensions
For simple problems, the perceptron forms decision boundaries between the classes. The decision boundary (between out = 0 and out = 1) lies at

w1·in1 + w2·in2 − Θ = 0

i.e. along the line in2 = −(w1/w2)·in1 + Θ/w2 (for w2 ≠ 0). In two dimensions, the boundaries are always straight lines.
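Rearranging the boundary equation for in2 can be sketched as follows (the helper name is my own); for the AND unit with w1 = w2 = 1 and Θ = 1.5 the boundary is the line in2 = −in1 + 1.5:

```python
def boundary_in2(w1, w2, theta, in1):
    """Solve w1*in1 + w2*in2 - theta = 0 for in2 (requires w2 != 0):
    in2 = -(w1/w2)*in1 + theta/w2, a straight line in the input plane."""
    return -(w1 / w2) * in1 + theta / w2

# AND unit with w1 = w2 = 1, theta = 1.5: the line separates (1,1) from
# the other three input patterns.
print(boundary_in2(1, 1, 1.5, 0.0))   # in2-intercept
print(boundary_in2(1, 1, 1.5, 1.5))   # in1-intercept
```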