Perceptron Networks and Vector Notation
CS/PY 231 Lab Presentation #3
January 31, 2005
Mount Union College



A Multiple Perceptron Network for computing the XOR function
- We found that a single perceptron could not compute the XOR function.
- Solution: set up one perceptron to detect whether x1 = 1 and x2 = 0.
- Set up another perceptron to detect whether x1 = 0 and x2 = 1.
- Feed the outputs of these two perceptrons into a third one that produces an output of 1 if either input is 1 (a code sketch of this network follows below).
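To make this concrete, here is a minimal sketch in Python of the three-perceptron network just described. The specific weights and thresholds are one hand-picked set that happens to work; the slides do not prescribe particular values, and the "fire when the weighted sum reaches the threshold" convention is assumed.

def perceptron(inputs, weights, threshold):
    # Fire (output 1) when the weighted sum of the inputs reaches the threshold.
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def xor_network(x1, x2):
    a = perceptron([x1, x2], [1, -1], 1)   # detects x1 = 1 and x2 = 0
    b = perceptron([x1, x2], [-1, 1], 1)   # detects x1 = 0 and x2 = 1
    return perceptron([a, b], [1, 1], 1)   # fires if either hidden unit fired

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor_network(x1, x2))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0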

A Nightmare!
- Even for this simple example, choosing the weights that cause a network to compute the desired output takes skill and lots of patience.
- This is much more difficult than programming a conventional computer, where the rules are simple to state:
  - OR function: if x1 + x2 >= 1, output 1; otherwise output 0
  - XOR function: if x1 + x2 = 1, output 1; otherwise output 0
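Those two rules are easy to verify over all four binary inputs; a quick illustrative check (not part of the original slides):

for x1 in (0, 1):
    for x2 in (0, 1):
        or_out = 1 if x1 + x2 >= 1 else 0   # OR: at least one input is 1
        xor_out = 1 if x1 + x2 == 1 else 0  # XOR: exactly one input is 1
        print(x1, x2, or_out, xor_out)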

There must be a better way…
- These labs and demos were designed to show that manually adjusting weights is tedious and difficult.
- This is not what happens in nature: no creature says, "Hmmm, what weight should I choose for this neural connection?"
- Formal training methods exist that allow networks to learn by updating weights automatically (explored next week).

Expanding to More Inputs
- Artificial neurons may have many more than two input connections.
- The calculation performed is the same: multiply each input by the weight of its connection, and find the sum of all of these products.
- The notation can become unwieldy: sum = x1·w1 + x2·w2 + x3·w3 + … + x100·w100
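In code the calculation stays the same no matter how many connections there are; a minimal sketch with 100 inputs (the input and weight values here are arbitrary placeholders):

import random

x = [random.random() for _ in range(100)]  # 100 input values (placeholders)
w = [random.random() for _ in range(100)]  # one weight per connection (placeholders)

# Multiply each input by its connection's weight and sum all the products.
weighted_sum = sum(xk * wk for xk, wk in zip(x, w))
print(weighted_sum)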

Some Mathematical Notation
- Most references (e.g., the Plunkett & Elman text) use mathematical summation notation.
- Sums of large numbers of terms are represented by the symbol Sigma (Σ).
- The previous sum is denoted: sum = Σ (k = 1 to 100) of xk·wk

Summation Notation Basics
- The terms are described once, in general form.
- The index variable shows the range of values it takes on.
- Example: Σ (k = 3 to 5) of k / (k - 1) = 3/2 + 4/3 + 5/4
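Sigma sums translate directly into code, which makes them easy to check; a small verification of the example above:

# Evaluate k / (k - 1) for k = 3, 4, 5 and add the terms.
total = sum(k / (k - 1) for k in range(3, 6))
print(total)            # 4.0833...
print(3/2 + 4/3 + 5/4)  # the same value, written term by term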

Summation Notation Example
- Write the following sum using Sigma notation: 3·x0 + 4·x1 + 5·x2 + 6·x3 + 7·x4 + 8·x5 + 9·x6 + 10·x7
- Answer: Σ (k = 0 to 7) of (k + 3)·xk
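A quick check that the Sigma form expands to the same terms, using arbitrary sample values for x0 through x7 (the values are illustrative only):

x = [2, 0, 1, 5, 3, 1, 4, 2]  # arbitrary sample values for x0..x7

long_form = 3*x[0] + 4*x[1] + 5*x[2] + 6*x[3] + 7*x[4] + 8*x[5] + 9*x[6] + 10*x[7]
sigma_form = sum((k + 3) * x[k] for k in range(8))
print(long_form == sigma_form)  # True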

Vector Notation
- The most compact way to specify values for inputs and weights when we have many connections.
- The ORDER in which values are specified is important.
- Example: if w1 = 3.5, w2 = 1.74, and w3 = 18.2, we say that the weight vector is w = (3.5, 1.74, 18.2).

Vector Operations
- Vector addition: adding two vectors means adding the values from the same position in each vector; the result is a new vector.
- Example: (9.2, 0, 17) + (1, 2, 3) = (10.2, 2, 20)
- Vector subtraction: subtract corresponding values.
- Example: (9.2, 0, 17) - (1, 2, 3) = (8.2, -2, 14)
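Both operations work position by position, which Python's zip expresses directly; a small sketch using the slide's example vectors:

a = (9.2, 0, 17)
b = (1, 2, 3)

# Pair up corresponding positions and combine them element by element.
vec_sum = tuple(x + y for x, y in zip(a, b))   # (10.2, 2, 20)
vec_diff = tuple(x - y for x, y in zip(a, b))  # (8.2, -2, 14)
print(vec_sum)   # matches the slide's answer (up to floating-point rounding)
print(vec_diff)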

Vector Operations
- Dot product: the mathematical name for what a perceptron does.
- x · m = x1·m1 + x2·m2 + x3·m3 + … + xlast·mlast
- The result of a dot product is a single number.
- Example: (9.2, 0, 17) · (1, 2, 3) = 9.2·1 + 0·2 + 17·3 = 9.2 + 0 + 51 = 60.2
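The dot product is the same zip-and-sum pattern as the perceptron's weighted sum from earlier; a minimal check of the example:

x = (9.2, 0, 17)
m = (1, 2, 3)

# Multiply corresponding components and add the products.
dot = sum(xi * mi for xi, mi in zip(x, m))
print(dot)  # 60.2 (up to floating-point rounding)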
