Artificial Neural Networks


Artificial Neural Networks Lab demonstration (2)

Python Modules A module is a file containing Python definitions and statements intended for use in other Python programs. There are many Python modules that come with Python as part of the standard library. Once we import the module, we can use things that are defined inside.

To use elements of a module: import the module, then use the dot operator to refer to the elements of the module.
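
For example, a minimal sketch using the standard math module:

import math            # import the module
print(math.sqrt(16))   # use the dot to reach an element of the module: 4.0
print(math.pi)         # modules can also expose constants: 3.14159...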

Example: The turtle module Source: http://interactivepython.org/runestone/static/thinkcspy/PythonModules/modules.html
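
The turtle code itself is not preserved in the transcript; a minimal sketch in the spirit of the linked tutorial (the window and movements are illustrative):

import turtle                # import the turtle module
wn = turtle.Screen()         # create a window to draw in
alex = turtle.Turtle()       # create a turtle named alex
alex.forward(150)            # use the dot to call the turtle's methods
alex.left(90)                # turn alex 90 degrees to the left
alex.forward(75)
wn.exitonclick()             # close the window when it is clicked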

What modules are available in Python? A list of modules that are part of the standard library is available in the Python documentation at: https://docs.python.org/3/py-modindex.html

In your file “network.py”
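
The code for this slide is lost in the transcript; presumably it shows the imports at the top of network.py, which in Nielsen's version are:

import random       # standard library module, used later for shuffling
import numpy as np  # third-party module used for the weight and bias arrays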

The random module Example applications in which we need to generate random numbers: To play a game of chance where the computer needs to throw some dice, pick a number, or flip a coin, To shuffle a deck of playing cards randomly, To randomly allow a new enemy spaceship to appear and shoot at you, For encrypting your banking session on the Internet.

The random module
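
The slide's examples are not preserved; a few representative calls (a sketch, not the slide's exact code):

import random
print(random.random())        # a random float in [0.0, 1.0)
print(random.randint(1, 6))   # throw a die: a random integer from 1 to 6
deck = list(range(1, 53))     # a deck of 52 cards
random.shuffle(deck)          # shuffle the deck in place
print(random.choice(deck))    # pick one card at random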

The numpy module Used to create multidimensional arrays In numpy, dimensions of an array are called axes The number of axes is called the rank of the array Example: What are the rank and axes of the following numpy array?

Answer: rank = 1, i.e. one axis, of length 3 (so the array on the slide is something like np.array([1, 2, 3])).
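
A quick check in code (using an array consistent with the slide's answer):

import numpy as np
a = np.array([1, 2, 3])
print(a.ndim)    # rank, i.e. number of axes: 1
print(a.shape)   # length along each axis: (3,)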

How to create a numpy array?
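
The slide's code is lost; the usual way is np.array on a Python list:

import numpy as np
a = np.array([1, 2, 3])   # create a numpy array from a list
print(a)                  # [1 2 3]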

Create a multidimensional numpy array
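
A representative sketch (assuming numpy is imported as np):

m = np.array([[1, 2, 3], [4, 5, 6]])  # a list of lists gives 2 axes
print(m.ndim, m.shape)                # 2 (2, 3)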

Initializing the content of an array
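
Representative initializers (again a sketch; the slide's exact code is lost):

z = np.zeros((3, 4))    # 3x4 array filled with zeros
o = np.ones((2, 3))     # 2x3 array filled with ones
f = np.full((2, 2), 7)  # 2x2 array filled with the value 7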

Lab Exercise Create a 1-dimensional array (1 axis) containing five ones Create a 2-dimensional array (2 axes) containing 4 x 5 zeros Create a 3-dimensional array (3 axes) containing 4x3x2 ones

Lab Exercise: solution Create a 1-dimensional array (1 axis) containing five ones Create a 2-dimensional array (2 axes) containing 4 x 5 zeros Create a 3-dimensional array (3 axes) containing 4x3x2 ones
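
The solution code itself is lost in the transcript; one way to write it:

import numpy as np
a1 = np.ones(5)           # 1 axis: five ones
a2 = np.zeros((4, 5))     # 2 axes: 4 x 5 zeros
a3 = np.ones((4, 3, 2))   # 3 axes: 4 x 3 x 2 ones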

Initializing a random array from a normal distribution
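
The call behind this slide is np.random.randn, which samples from the standard normal distribution (the 3x2 shape is illustrative):

import numpy as np
b = np.random.randn(3, 2)  # 3x2 array of normally distributed samples
print(b)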

Initializing multiple arrays from a normal distribution 4 arrays: each one is 3x2
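
One way to produce them, with a list comprehension (a sketch; the slide's exact code is lost):

arrays = [np.random.randn(3, 2) for _ in range(4)]  # 4 arrays, each 3x2
for a in arrays:
    print(a.shape)                                  # (3, 2) four times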

Exercise: Generate 3 arrays of random numbers The first array is 3 x 1 The second array is 5 x 1 The third array is 2 x 1

Solution
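
The solution slide's code is lost; a version that follows directly from the exercise:

import numpy as np
shapes = [3, 5, 2]
arrays = [np.random.randn(y, 1) for y in shapes]
print([a.shape for a in arrays])  # [(3, 1), (5, 1), (2, 1)]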

Exercise: Given a list of layers for a neural network, generate random bias vectors for each layer. Example: for the figure (from the mid-term exam), the bias vectors can be \(\begin{bmatrix} b_1^1 \\ b_2^1 \end{bmatrix} = \begin{bmatrix} 0.16548184 \\ 0.72878268 \end{bmatrix}\) and \(b_1^2 = 0.7278303\).

Solution
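
The solution code is lost; the standard construction (it reappears in Nielsen's network.py) is a list comprehension over every layer except the input layer:

import numpy as np
sizes = [3, 2, 1]                                    # hypothetical layer sizes matching the figure
biases = [np.random.randn(y, 1) for y in sizes[1:]]  # no biases for the input layer
print([b.shape for b in biases])                     # [(2, 1), (1, 1)]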

Specifying a neural network. Input: a vector with the number of neurons in each layer. The first number in the input vector is the number of input variables. Ex: sizes = [3, 2, 1] describes a network with 3 inputs, a hidden layer of 2 neurons, and 1 output neuron.

Initializing biases

Initializing weights. With sizes = [3, 2, 1], the first weight array is 2x3, \(\begin{bmatrix} w_{11}^1 & w_{12}^1 & w_{13}^1 \\ w_{21}^1 & w_{22}^1 & w_{23}^1 \end{bmatrix}\), and the second weight array is 1x2, \(\begin{bmatrix} w_{11}^2 & w_{12}^2 \end{bmatrix}\).

Initializing weight arrays
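
The code is lost in the transcript; Nielsen's network.py pairs consecutive layer sizes with zip to get one (y, x) weight array per layer:

import numpy as np
sizes = [3, 2, 1]
weights = [np.random.randn(y, x) for x, y in zip(sizes[:-1], sizes[1:])]
print([w.shape for w in weights])  # [(2, 3), (1, 2)]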

Exercise: Create a Neural Network Class Create an __init__ function for the network class Initialize self.biases Initialize self.weights
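
A sketch of one possible solution, following the structure of Nielsen's network.py:

import numpy as np

class Network:
    def __init__(self, sizes):
        self.num_layers = len(sizes)
        self.sizes = sizes
        # one bias column vector per non-input layer
        self.biases = [np.random.randn(y, 1) for y in sizes[1:]]
        # one (y, x) weight array between each pair of consecutive layers
        self.weights = [np.random.randn(y, x)
                        for x, y in zip(sizes[:-1], sizes[1:])]

net = Network([3, 2, 1])
print([b.shape for b in net.biases])   # [(2, 1), (1, 1)]
print([w.shape for w in net.weights])  # [(2, 3), (1, 2)]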

Getting code and data git clone https://github.com/mnielsen/neural-networks-and-deep-learning.git

For Python 3.4  we need to make some changes to the mnist_loader. Open the file mnist_loader.py. Change cPickle to pickle on lines 13 and 43. On line 43, change the call to: training_data, validation_data, test_data = pickle.load(f, encoding='latin1')

For Python 3.4  we need to make some changes to the mnist_loader Open the file mnist_loader.py Wrap the zip() calls with list() calls
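
For example, in load_data_wrapper the pairing of inputs and labels becomes (zip() returns a lazy iterator in Python 3, so every zip() in the file gets the same treatment):

training_data = list(zip(training_inputs, training_results))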

In file network.py # Add this line

In file network.py: find all occurrences of xrange and replace them with range (xrange no longer exists in Python 3).

The MNIST data set: a large number of scanned images of handwritten digits. Each image is 28 x 28 = 784 pixels. We need to create a neural network that accepts 784 input values [x1, …, x784].

Load data and create network
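
The slide's code is lost; the standard invocation from Nielsen's book (the hidden-layer size 30, 30 epochs, mini-batch size 10, and learning rate 3.0 are the book's defaults, not fixed by the slide):

import mnist_loader
import network

training_data, validation_data, test_data = mnist_loader.load_data_wrapper()
net = network.Network([784, 30, 10])  # 784 inputs, 30 hidden neurons, 10 outputs
net.SGD(training_data, 30, 10, 3.0, test_data=test_data)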

Compatibility with Python 3.4

Output: after each epoch, the network prints how many of the test images it classifies correctly.

What is this output? Recall the Least Mean Square (LMS) algorithm: it calculates the error based on one input pattern x(n), and it updates the weights based on that one input pattern x(n).

Backpropagation Algorithm
Start with randomly chosen weights \(w_{jk}^l\)
While the error is unsatisfactory:
  for each input pattern x:
    Feedforward: for each l = 1, 2, …, L compute \(z_k^l\) and \(a_k^l\)
    Compute the error at the output layer: \(\delta_k^L = (d_k - a_k^L)\,\sigma'(z_k^L)\)
    Backpropagate the error: for l = L-1, L-2, …, 2 compute \(\delta_j^l = \big(\sum_k \delta_k^{l+1} w_{kj}^{l+1}\big)\,\sigma'(z_j^l)\)
    Calculate the gradients: \(\partial E / \partial w_{jk}^l = \delta_j^l\, a_k^{l-1}\) and \(\partial E / \partial b_j^l = \delta_j^l\)
  end for
end while
Like LMS, this version updates the weights based on one input pattern x(n). © Mai Elshehaly
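
A minimal numpy sketch of this per-pattern pseudocode (the (d - a) sign convention follows the slide; the negative-index loop mirrors the structure of Nielsen's network.py):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1 - s)

def backprop(x, d, weights, biases):
    # feedforward, storing every net input z and activation a
    a = x
    activations = [x]
    zs = []
    for w, b in zip(weights, biases):
        z = np.dot(w, a) + b
        zs.append(z)
        a = sigmoid(z)
        activations.append(a)
    # error at the output layer
    delta = (d - activations[-1]) * sigmoid_prime(zs[-1])
    grad_w = [None] * len(weights)
    grad_b = [None] * len(biases)
    grad_b[-1] = delta
    grad_w[-1] = np.dot(delta, activations[-2].T)
    # backpropagate the error through the earlier layers
    for l in range(2, len(weights) + 1):
        delta = np.dot(weights[-l + 1].T, delta) * sigmoid_prime(zs[-l])
        grad_b[-l] = delta
        grad_w[-l] = np.dot(delta, activations[-l - 1].T)
    return grad_w, grad_b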

Three strategies to update the weights: Update after the network sees every single input pattern Update after the network sees a mini_batch of input patterns Update after the network sees the entire batch of input patterns The difference between the three strategies will be discussed in the lecture.

The mini_batch strategy: Ex.: mini_batch_size = 5

The mini_batch strategy: 1. Input one mini-batch to the network 2. Adjust the weights 3. Move to the next mini-batch 4. Repeat until there are no more batches in the training data set. This is one epoch.
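
A sketch of one epoch under this strategy (update_mini_batch is a hypothetical helper standing in for the weight adjustment):

def one_epoch(training_data, mini_batch_size):
    # slice the training data into consecutive mini batches
    mini_batches = [training_data[k:k + mini_batch_size]
                    for k in range(0, len(training_data), mini_batch_size)]
    for mini_batch in mini_batches:
        update_mini_batch(mini_batch)  # hypothetical: adjust the weights from this batch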

To increase the accuracy ("repetition teaches the clever", i.e. practice makes perfect): Repeat the previous process for a number of epochs. Don't input the mini batches in the same order (random.shuffle). With each new epoch, you can see that the accuracy increases, where accuracy = correctly classified samples / total number of test samples.

To see the effect of parameters on accuracy Try passing different values for epochs, mini_batch_size, and eta

How to implement this shuffling and batching strategy? Example: Say you have a deck of 30 cards with labels 1… 30 You want to take 10 cards in each draw You want to keep drawing until no more cards You want to shuffle the cards then repeat 8 times

Shuffling cards
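
The slide's code is lost; a representative version:

import random
cards = list(range(1, 31))  # a deck of 30 cards labelled 1..30
random.shuffle(cards)       # shuffle the deck in place
print(cards)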

Shuffle for 8 epochs
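
Continuing the sketch above, reshuffling once per epoch:

for epoch in range(8):     # 8 epochs
    random.shuffle(cards)  # reshuffle at the start of each epoch
    print(cards)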

Draw 10 cards at a time in each epoch:
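
Putting the pieces together (a sketch; the slide's exact code is lost):

for epoch in range(8):
    random.shuffle(cards)
    draws = [cards[k:k + 10] for k in range(0, len(cards), 10)]  # 3 draws of 10 cards
    for draw in draws:
        print(draw)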

Exercise: Write a function sum_mini_batches(training_data, epochs, mini_batch_size) that does the following for epochs times: Shuffles the cards in training_data Creates a number of mini batches each of which is of size mini_batch_size Prints the sum of the numbers in each mini batch
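
No solution slide follows in the transcript; a sketch of one possible solution:

import random

def sum_mini_batches(training_data, epochs, mini_batch_size):
    for epoch in range(epochs):
        random.shuffle(training_data)  # shuffle the "cards"
        mini_batches = [training_data[k:k + mini_batch_size]
                        for k in range(0, len(training_data), mini_batch_size)]
        for mini_batch in mini_batches:
            print(sum(mini_batch))     # the sum of the numbers in this mini batch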

Lab Demo: Second Round

Review items: numpy's dot() function, the weights and biases of an ANN, the zip() function, negative indices in Python, matrix shape, the backpropagation pseudocode.

dot() function
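
The slide's examples are lost; representative uses of np.dot for a vector dot product and a matrix-vector product:

import numpy as np
v = np.array([1, 2, 3])
u = np.array([4, 5, 6])
print(np.dot(v, u))        # dot product: 1*4 + 2*5 + 3*6 = 32
W = np.array([[1, 0, 1],
              [0, 1, 0]])
a = np.array([[1], [2], [3]])
print(np.dot(W, a))        # matrix-vector product: [[4], [2]]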

zip() function Example: initializing weights and biases
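
This is how network.py pairs consecutive layer sizes (reconstructed; the slide's exact code is lost):

import numpy as np
sizes = [3, 2, 1]
print(list(zip(sizes[:-1], sizes[1:])))  # [(3, 2), (2, 1)]
biases = [np.random.randn(y, 1) for y in sizes[1:]]
weights = [np.random.randn(y, x) for x, y in zip(sizes[:-1], sizes[1:])]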

zip() function Example: to iterate over layers of weights and biases
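
A sketch of a feedforward pass, pairing each layer's weight array with its bias vector (this reuses sizes, weights, biases, and sigmoid from the sketches above):

a = np.random.randn(sizes[0], 1)  # hypothetical input column vector
for w, b in zip(weights, biases):
    z = np.dot(w, a) + b          # net input of this layer
    a = sigmoid(z)                # activation passed on to the next layer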

Exercise Reuse the mini_batches code that we wrote earlier to generate inputs. Iterate over layers of weights and biases to calculate the z values of different layers. Assume that actual output = net input (a=z) for simplicity. Print z at each iteration.
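
A sketch of one possible solution, assuming mini_batch_size equals the number of network inputs (3 here) so that each mini batch can serve as one input vector; it reuses mini_batches, weights, and biases from above:

for mini_batch in mini_batches:
    a = np.array(mini_batch).reshape(-1, 1)  # turn the mini batch into a column vector
    for w, b in zip(weights, biases):
        z = np.dot(w, a) + b
        print(z)                             # the z values of this layer
        a = z                                # assume actual output = net input (a = z)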

Negative indices in Python: Try the following

Solution
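
The snippet and its solution are not preserved; a representative example of negative indexing, using the sizes list from network.py:

sizes = [3, 2, 1]
print(sizes[-1])    # 1: the last element
print(sizes[-2])    # 2: the second element from the end
print(sizes[:-1])   # [3, 2]: everything except the last element
print(sizes[1:])    # [2, 1]: everything except the first element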