AI – CS289 Machine Learning – Labs: Machine Learning – Lab 4, 2nd November 2006, Dr Bogdan L. Vrusias



AI – CS289 Machine Learning – Labs, 2nd November 2006, Bogdan L. Vrusias ©

Instructions

The following slides demonstrate the capabilities of Machine Learning. The examples are taken from:
– Negnevitsky, M., "Artificial Intelligence: A Guide to Intelligent Systems", 2nd edn., Addison Wesley, Harlow, England.

Download and unzip the following file, which contains all the examples: or alternatively:

To run the examples, use Matlab's Current Directory window (on the left-hand side) to navigate to the directory where you downloaded and unzipped the files, and then simply type (case sensitive) the name of the file without the ".m" extension.
– E.g. to run Perceptron_AND.m, type Perceptron_AND in Matlab's command window.

The perceptron: learning linearly separable functions

Filename: Perceptron_AND.m
Matlab command: Perceptron_AND
Problem: a two-input perceptron is required to perform the logical operation AND.

Run the file, follow the instructions, READ THE COMMENTS on each step, and observe the following:
– The graph should show you how well the perceptron separates the two categories (category 0 and category 1).
– The input pairs ([0,0], [0,1], [1,0], [1,1]) are classified correctly after the training.
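The behaviour the demo plots can be reproduced in a few lines of ordinary code. The following is a minimal Python sketch of the perceptron learning rule on AND. It is not the lab's Perceptron_AND.m (which uses the Matlab Neural Network Toolbox); the variable names and learning rate here are illustrative.

```python
inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 0, 0, 1]           # logical AND

w  = [0.0, 0.0]                  # weights
b  = 0.0                         # bias
lr = 0.1                         # learning rate

for epoch in range(20):
    errors = 0
    for (x1, x2), t in zip(inputs, targets):
        y = 1 if w[0] * x1 + w[1] * x2 + b >= 0 else 0   # hard-limit activation
        e = t - y
        if e != 0:
            errors += 1
            w[0] += lr * e * x1                          # perceptron learning rule:
            w[1] += lr * e * x2                          # w <- w + lr * error * input
            b    += lr * e
    if errors == 0:                                      # all four pairs correct
        break

for (x1, x2), t in zip(inputs, targets):
    print((x1, x2), '->', 1 if w[0] * x1 + w[1] * x2 + b >= 0 else 0)
```

Because AND is linearly separable, the loop reaches an error-free epoch after a handful of passes, and the learned line w1·x1 + w2·x2 + b = 0 is the category boundary the Matlab graph draws.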

The perceptron: learning linearly separable functions

Filename: Perceptron_OR.m
Matlab command: Perceptron_OR
Problem: a two-input perceptron is required to perform the logical operation OR.

Run the file, follow the instructions, READ THE COMMENTS on each step, and observe the following:
– The graph should show you how well the perceptron separates the two categories (category 0 and category 1).
– Observe how the categorisation line changes at every step of the training.
– The input pairs ([0,0], [0,1], [1,0], [1,1]) are classified correctly after the training.
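The moving categorisation line is just the boundary w1·x1 + w2·x2 + b = 0: each weight update tilts or shifts it. An illustrative Python sketch (again not the lab's Perceptron_OR.m) prints the weights after every epoch so the movement is visible as numbers rather than as a plot:

```python
inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 1, 1, 1]           # logical OR
w, b, lr = [0.0, 0.0], 0.0, 0.2

for epoch in range(10):
    errors = 0
    for (x1, x2), t in zip(inputs, targets):
        y = 1 if w[0] * x1 + w[1] * x2 + b >= 0 else 0
        e = t - y
        w[0] += lr * e * x1      # a no-op when e == 0
        w[1] += lr * e * x2
        b    += lr * e
        errors += abs(e)
    # the separating line is w[0]*x1 + w[1]*x2 + b = 0; watch it shift epoch by epoch
    print(f"epoch {epoch}: w = {w}, b = {b:+.1f}, errors = {errors}")
    if errors == 0:
        break
```

Each printed (w, b) triple corresponds to one position of the line the Matlab demo redraws during training.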

The perceptron: an attempt to learn linearly non-separable functions

Filename: Perceptron_XOR.m
Matlab command: Perceptron_XOR
Problem: a two-input perceptron is required to perform the logical operation XOR.

Run the file, follow the instructions, READ THE COMMENTS on each step, and observe the following:
– The graph should show you how well the perceptron separates the two categories (category 0 and category 1).
– Observe how the categorisation line changes for every step of the training. Does the system learn?
– The input pairs ([0,0], [0,1], [1,0], [1,1]) are NOT classified correctly after the training.
– Can you tell why?
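The failure can be demonstrated directly: XOR places its two categories in opposite corners of the input square, so no single straight line separates them, and the learning rule never finds an error-free epoch. A small illustrative Python sketch (not the lab's Perceptron_XOR.m) records the error count per epoch:

```python
inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 1, 1, 0]           # logical XOR: not linearly separable
w, b, lr = [0.0, 0.0], 0.0, 0.1

history = []
for epoch in range(50):
    errors = 0
    for (x1, x2), t in zip(inputs, targets):
        y = 1 if w[0] * x1 + w[1] * x2 + b >= 0 else 0
        e = t - y
        w[0] += lr * e * x1
        w[1] += lr * e * x2
        b    += lr * e
        errors += abs(e)
    history.append(errors)

# an error-free epoch would mean the weights stopped changing and classified
# all four pairs correctly -- impossible for XOR, so the minimum stays above 0
print("fewest errors in any epoch:", min(history))
```

The weights simply cycle; this is exactly why the next demo replaces the single perceptron with a multilayer network.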

Back-propagation algorithm

Filename: XOR_bp.m
Matlab command: XOR_bp
Problem: a three-layer back-propagation network is required to perform the logical operation Exclusive-OR.

Run the file, follow the instructions, READ THE COMMENTS on each step, and observe the following:
– The graph should show you how the training is performed. The blue line represents the error, which drops over time (iterations).
– Observe how there are now two lines separating the categories (one from each of the two hidden neurons).
– The input pairs ([0,0], [0,1], [1,0], [1,1]) are classified correctly after the training.
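The structure behind the demo can be sketched as a 2-2-1 network of sigmoid neurons trained by back-propagation. This is an illustrative Python version, not the lab's XOR_bp.m: the initial weights are arbitrary small values chosen only to break symmetry, and the epoch count and learning rate are assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]   # XOR

wh = [[0.5, 0.9], [0.4, 1.0]]    # wh[i][j]: input i -> hidden neuron j
bh = [-0.8, 0.1]                 # hidden biases
wo = [-1.2, 1.1]                 # hidden -> output weights
bo = -0.3                        # output bias
lr = 0.1

losses = []
for epoch in range(2000):
    sse = 0.0
    for (x1, x2), t in data:
        # forward pass: each hidden neuron draws one separating line
        h = [sigmoid(x1 * wh[0][j] + x2 * wh[1][j] + bh[j]) for j in range(2)]
        y = sigmoid(h[0] * wo[0] + h[1] * wo[1] + bo)
        e = t - y
        sse += e * e
        # backward pass: output delta, then hidden deltas via the chain rule
        do = e * y * (1 - y)
        dh = [do * wo[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            wo[j]    += lr * do * h[j]
            wh[0][j] += lr * dh[j] * x1
            wh[1][j] += lr * dh[j] * x2
            bh[j]    += lr * dh[j]
        bo += lr * do
    losses.append(sse)

print(f"sum-squared error: {losses[0]:.3f} -> {losses[-1]:.5f}")
```

The falling `losses` sequence is the blue error curve in the Matlab graph, and the two hidden neurons' weight vectors are the two separating lines the slide mentions.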

Competitive learning

Filename: Competitive.m
Matlab command: Competitive
Problem: a single-layer competitive network is required to classify a set of two-element input vectors into four natural classes.

Run the file, follow the instructions, READ THE COMMENTS on each step, and observe the following:
– The graph should show you the coordinates of the input data vectors (red) together with the weight vectors (blue).
– Observe the labels given to each cluster.
– The test vectors (green) are then labelled according to how close they fall to a formed, labelled cluster.
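The rule itself is simple winner-take-all: only the neuron whose weight vector is closest to the current input is updated, and it moves a fraction of the way toward that input. The Python sketch below illustrates the idea on invented data (the four cluster centres, spreads, and learning rate are all assumptions; the lab's Competitive.m relies on the Matlab toolbox rather than code like this):

```python
import random

random.seed(0)

# four "natural classes": points scattered around four assumed centres
centres = [(0.1, 0.1), (0.1, 0.9), (0.9, 0.1), (0.9, 0.9)]
points = [(cx + random.uniform(-0.05, 0.05), cy + random.uniform(-0.05, 0.05))
          for cx, cy in centres for _ in range(25)]
random.shuffle(points)

# one weight vector per output neuron, started near the middle of the square
weights = [[0.5 + random.uniform(-0.01, 0.01), 0.5 + random.uniform(-0.01, 0.01)]
           for _ in range(4)]
lr = 0.1

for _ in range(50):                      # epochs
    for px, py in points:
        # the winner is the neuron whose weight vector is closest to the input
        k = min(range(4),
                key=lambda j: (weights[j][0] - px) ** 2 + (weights[j][1] - py) ** 2)
        # Kohonen-style update: move only the winner toward the input
        weights[k][0] += lr * (px - weights[k][0])
        weights[k][1] += lr * (py - weights[k][1])

print([[round(c, 2) for c in wv] for wv in weights])
```

The winning weight vectors drift into the clusters, which is what the blue markers settling among the red input points show in the Matlab graph; labelling a green test vector then amounts to finding its nearest weight vector, as in the `min(...)` line above.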