1 Machine Learning The Perceptron

2 Heuristic Search Knowledge Based Systems (KBS) Genetic Algorithms (GAs)

3 In a Knowledge Based Representation the solution is a set of rules:

RULE Prescribes_Erythromycin
IF Diseases = strep_throat
AND Allergy <> erythromycin
THEN Prescribes = erythromycin
CNF 95
BECAUSE "Erythromycin is an effective treatment for a strep throat if the patient is not allergic to it";

In a Genetic Algorithm Based Representation the solution is a chromosome.

4 Genetic algorithm

5 n The neuron as a simple computing element n The perceptron n Multilayer artificial neural networks n Accelerated learning in multilayer ANNs Topics

6 Machine Learning Machine learning involves adaptive mechanisms that enable computers to learn from experience, learn by example and learn by analogy. Learning capabilities can improve the performance of an intelligent system over time. The most popular approaches to machine learning are artificial neural networks and genetic algorithms. This lecture is dedicated to neural networks.

7 n Our brain can be considered as a highly complex, non-linear and parallel information-processing system. n Information is stored and processed in a neural network simultaneously throughout the whole network, rather than at specific locations. In other words, in neural networks, both data and its processing are global rather than local. n Learning is a fundamental and essential characteristic of biological neural networks. The ease with which they can learn led to attempts to emulate a biological neural network in a computer.

8 Biological neural network

9 n The neurons are connected by weighted links passing signals from one neuron to another. n The human brain incorporates nearly 10 billion neurons and 60 trillion connections, synapses, between them. By using multiple neurons simultaneously, the brain can perform its functions much faster than the fastest computers in existence today.

10 n An artificial neural network can be defined as a model of reasoning based on the human brain. The brain consists of a densely interconnected set of nerve cells, or basic information-processing units, called neurons. n An artificial neural network consists of a number of very simple processors, also called neurons, which are analogous to the biological neurons in the brain.

11 Architecture of a typical artificial neural network

12 Analogy between biological and artificial neural networks

13 n Each neuron has a simple structure, but an army of such elements constitutes tremendous processing power. n A neuron consists of a cell body (the soma), a number of fibers called dendrites, and a single long fiber called the axon.

14 The neuron as a simple computing element Diagram of a neuron

15 The Perceptron n The operation of Rosenblatt’s perceptron is based on the McCulloch and Pitts neuron model. The model consists of a linear combiner followed by a hard limiter. n The weighted sum of the inputs is applied to the hard limiter (a step or sign function).

16 Single-layer two-input perceptron

17 n The neuron computes the weighted sum of the input signals and compares the result with a threshold value, θ. If the net input is less than the threshold, the neuron output is –1. But if the net input is greater than or equal to the threshold, the neuron becomes activated and its output attains a value +1. n The neuron uses the following transfer or activation function:

X = x1w1 + x2w2 + ... + xnwn    Y = +1 if X ≥ θ, Y = –1 if X < θ

n This type of activation function is called a sign function.

18 In code:

Y = sign(x1 * w1 + x2 * w2 - threshold);

static int sign(double x) {
    return (x < 0) ? -1 : 1;
}

19 Possible Activation functions

20 Exercise A neuron uses a step function as its activation function, has threshold 0.2 and has w1 = 0.1, w2 = 0.1. What is the output with the following values of x1 and x2? Do you recognize this?

x1 | x2 | Y
0  | 0  |
0  | 1  |
1  | 0  |
1  | 1  |

Can you find weights that will give an OR function? An XOR function?

21 Can a single neuron learn a task? n In 1958, Frank Rosenblatt introduced a training algorithm that provided the first procedure for training a simple ANN: a perceptron. n The perceptron is the simplest form of a neural network. It consists of a single neuron with adjustable synaptic weights and a hard limiter.

22 This is done by making small adjustments in the weights to reduce the difference between the actual and desired outputs of the perceptron. The initial weights are randomly assigned, usually in the range [–0.5, 0.5], and then updated to obtain the output consistent with the training examples. How does the perceptron learn its classification tasks?

23 n If at iteration p, the actual output is Y(p) and the desired output is Yd(p), then the error is given by:

e(p) = Yd(p) – Y(p)    where p = 1, 2, 3,...

Iteration p here refers to the pth training example presented to the perceptron. n If the error, e(p), is positive, we need to increase perceptron output Y(p), but if it is negative, we need to decrease Y(p).

24 The perceptron learning rule

wi(p+1) = wi(p) + α × xi(p) × e(p)    where p = 1, 2, 3,...

α is the learning rate, a positive constant less than unity. The perceptron learning rule was first proposed by Rosenblatt in 1960. Using this rule we can derive the perceptron training algorithm for classification tasks.

25 Step 1: Initialisation
Set initial weights w1, w2, …, wn and threshold θ to random numbers in the range [–0.5, 0.5].
Perceptron’s training algorithm

26 Step 2: Activation
Activate the perceptron by applying inputs x1(p), x2(p), …, xn(p) and desired output Yd(p). Calculate the actual output at iteration p = 1:

Y(p) = step[ x1(p)w1(p) + x2(p)w2(p) + ... + xn(p)wn(p) – θ ]

where n is the number of the perceptron inputs, and step is a step activation function.
Perceptron’s training algorithm (continued)

27 Step 3: Weight training
Update the weights of the perceptron:

wi(p+1) = wi(p) + Δwi(p)

where Δwi(p) is the weight correction at iteration p. The weight correction is computed by the delta rule:

Δwi(p) = α × xi(p) × e(p)

Step 4: Iteration
Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
Perceptron’s training algorithm (continued)

28 Exercise: Complete the first two epochs of training the perceptron to learn the logical AND function. Assume θ = 0.2, α = 0.1.

Epoch | x1 | x2 | Yd | w1 | w2 | Y | e | Δw1 | Δw2

29 n The perceptron in effect classifies the inputs, x1, x2, ..., xn, into one of two classes, say A1 and A2. n In the case of an elementary perceptron, the n-dimensional space is divided by a hyperplane into two decision regions. The hyperplane is defined by the linearly separable function:

x1w1 + x2w2 + ... + xnwn – θ = 0

30 If you proceed with the algorithm described above, termination occurs with w1 = 0.1 and w2 = 0.1. This means that the line is

0.1x1 + 0.1x2 = 0.2, or x1 + x2 – 2 = 0

The region below the line is where x1 + x2 – 2 < 0, and above the line is where x1 + x2 – 2 ≥ 0.

31 Linear separability in the perceptron

32 Perceptron Capability Independence of the activation function (Shynk, 1990; Shynk and Bernard, 1992)

33 General Network Structure
The output signal is transmitted through the neuron’s outgoing connection. The outgoing connection splits into a number of branches that transmit the same signal. The outgoing branches terminate at the incoming connections of other neurons in the network.
Training by the back-propagation algorithm
Overcame the proven limitations of the perceptron

34 Architecture of a typical artificial neural network