Ranga Rodrigo, April 5, 2014. Most of the slides are from the Matlab tutorial.

Matlab Neural Network Toolbox provides tools for designing, implementing, visualizing, and simulating neural networks. It supports feedforward networks, radial basis networks, dynamic networks, self-organizing maps, and other proven network paradigms. We will follow Matlab's examples to learn to use four graphical tools for training neural networks, working through function fitting and pattern recognition (clustering and time series are left for you to try on your own).

To start the master GUI, type nnstart. This gives access to the GUIs for the following tasks:
– Function fitting
– Pattern recognition
– Data clustering
– Time series analysis
The second way of using the toolbox is through command-line operation, which we will not cover in detail.
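
Each wizard can also be opened directly from the command line; a quick reference (command names as in Neural Network Toolbox releases of this era):

    nnstart   % master GUI linking to all four wizards
    nftool    % function fitting
    nprtool   % pattern recognition (classification)
    nctool    % data clustering (self-organizing maps)
    ntstool   % time series analysis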

Standard steps in designing a neural network in Matlab:
1. Collect data
2. Create the network
3. Configure the network
4. Initialize the weights and biases
5. Train the network
6. Validate the network
7. Use the network
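
Although these slides use the GUIs throughout, the same seven steps map directly onto toolbox commands. A minimal command-line sketch, using the toolbox's simplefit_dataset sample data (function names as in Neural Network Toolbox releases of this era):

    [x, t] = simplefit_dataset;     % 1. collect data (toolbox sample set)
    net = fitnet(10);               % 2. create a fitting network, 10 hidden neurons
    net = configure(net, x, t);     % 3. size the inputs and outputs to the data
    net = init(net);                % 4. initialize weights and biases
    [net, tr] = train(net, x, t);   % 5. train (opens the training window)
    y = net(x);                     % 6. validate: simulate and measure error
    perf = perform(net, t, y);      %    mean squared error by default
    y_new = net(x(:, 1));           % 7. use the trained network on new inputs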

Suppose, for instance, that you have data from a housing application. You want to design a network that can predict the value of a house (in $1000s), given 13 pieces of geographical and real estate information. You have a total of 506 example homes for which you have those 13 items of data and their associated market values.
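
This is the house_dataset that ships with the toolbox; a loading sketch (variable names as in the toolbox's sample data sets):

    load house_dataset    % provides houseInputs and houseTargets
    size(houseInputs)     % 13 x 506: thirteen attributes for 506 homes
    size(houseTargets)    % 1 x 506: median home value in $1000s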

Click Fitting Tool to open the Neural Network Fitting Tool (nftool).

Click Load Example Data Set in the Select Data window. The Fitting Data Set Chooser window opens.

Select House Pricing, and click Import. This returns you to the Select Data window.

Click Next to display the Validation and Test Data window. By default, the validation and test data sets are each set to 15% of the original data.

With these settings, the input vectors and target vectors will be randomly divided into three sets as follows:
– 70% will be used for training.
– 15% will be used to validate that the network is generalizing, and to stop training before overfitting.
– The last 15% will be used as a completely independent test of network generalization.
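
The same random division can be reproduced at the command line through the network's data-division properties; a sketch, assuming a network net created as in the earlier command-line example:

    net.divideFcn = 'dividerand';        % divide the samples at random
    net.divideParam.trainRatio = 0.70;   % 70% training
    net.divideParam.valRatio   = 0.15;   % 15% validation (early stopping)
    net.divideParam.testRatio  = 0.15;   % 15% independent test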

The standard network used for function fitting is a two-layer feedforward network, with a sigmoid transfer function in the hidden layer and a linear transfer function in the output layer. The default number of hidden neurons is set to 10.
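
In command-line terms, this default architecture is what fitnet constructs; a quick check (property names per the toolbox's network object):

    net = fitnet(10);           % default: 10 hidden neurons
    net.layers{1}.transferFcn   % 'tansig'  -> sigmoid hidden layer
    net.layers{2}.transferFcn   % 'purelin' -> linear output layer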

Training continues until the validation error fails to decrease for six consecutive iterations (validation stop).
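
The six-iteration patience corresponds to the max_fail training parameter; a sketch, assuming net, x, and t as above:

    net.trainParam.max_fail = 6;    % allow 6 consecutive validation-error increases
    [net, tr] = train(net, x, t);
    tr.stop                         % reason training stopped, e.g. 'Validation stop.'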

Under Plots, click Regression. This plot is used to validate the network performance.

The regression plots display the network outputs with respect to targets for the training, validation, and test sets. For a perfect fit, the data should fall along a 45-degree line, where the network outputs are equal to the targets. Here, the fit is reasonably good for all data sets, with R values of 0.92 or above in each case. If even more accurate results were required, you could retrain the network by clicking Retrain in nftool. This changes the initial weights and biases of the network, and may produce an improved network after retraining. Other options are provided on the following pane.
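
The same regression analysis is available outside the GUI; a sketch, assuming net, x, and t from the training above:

    y = net(x);
    plotregression(t, y, 'All Data')   % outputs vs. targets with fitted line
    [r, m, b] = regression(t, y);      % R value, slope, and intercept of the fit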

View the error histogram to obtain additional verification of network performance. Under the Plots pane, click Error Histogram.
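
A command-line equivalent, assuming the same net, x, and t:

    y = net(x);
    e = gsubtract(t, y);   % errors = targets - outputs
    ploterrhist(e)         % histogram of errors over all samples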

In addition to function fitting, neural networks are also good at recognizing patterns. For example, suppose you want to classify a tumor as benign or malignant, based on uniformity of cell size, clump thickness, mitosis, and so on. You have 699 example cases for which you have 9 items of data and the correct classification as benign or malignant. Type nnstart and click Pattern Recognition Tool, or open the tool directly with nprtool.
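
This example corresponds to the cancer_dataset shipped with the toolbox; a loading sketch (variable names as in the toolbox's sample data sets):

    load cancer_dataset    % provides cancerInputs and cancerTargets
    size(cancerInputs)     % 9 x 699: nine cell-level attributes per case
    size(cancerTargets)    % 2 x 699: one row per class (benign, malignant)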

Click Load Example Data Set. The Pattern Recognition Data Set Chooser window opens.

Select Breast Cancer and click Import. You return to the Select Data window.

The standard network used for pattern recognition is a two-layer feedforward network, with sigmoid transfer functions in both the hidden layer and the output layer.
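
This is the network patternnet creates; a sketch (the output transfer function shown matches toolbox releases of this era; newer releases switched the output layer to softmax):

    net = patternnet(10);        % two-layer pattern recognition network
    net.layers{1}.transferFcn    % 'tansig' -> sigmoid hidden layer
    net.layers{2}.transferFcn    % 'tansig' -> sigmoid output layer
                                 % (later releases: 'softmax')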

Under the Plots pane, click Confusion in the Neural Network Pattern Recognition Tool. This displays the confusion matrices for training, testing, and validation, and for the three kinds of data combined. The network outputs are very accurate, as you can see from the high numbers of correct responses in the green squares and the low numbers of incorrect responses in the red squares. The lower-right blue squares show the overall accuracies.
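
A command-line sketch of the same check, assuming a trained network net with inputs x and targets t:

    y = net(x);
    plotconfusion(t, y)           % confusion matrix plot
    [c, cm] = confusion(t, y);    % c: fraction misclassified, cm: count matrix
    fprintf('Overall accuracy: %.1f%%\n', 100 * (1 - c));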

ROC Curve. The colored lines in each axis represent the ROC curves. The ROC curve is a plot of the true positive rate (sensitivity) versus the false positive rate (1 - specificity) as the threshold is varied. A perfect test would show points in the upper-left corner, with 100% sensitivity and 100% specificity. For this problem, the network performs very well.
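
The same curves can be produced directly; a sketch, assuming t and y as above:

    plotroc(t, y)                          % one ROC curve per class
    [tpr, fpr, thresholds] = roc(t, y);    % rates at each threshold setting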
