Neural Network Basics
ANNs are analytical systems that address problems whose solutions have not been explicitly formulated. They are structures in which multiple nodes communicate with each other through the synapses that interconnect them, imitating the structure of biological nervous systems. ANNs are more accurately described as a class of parallel algorithms.

Knowledge in ANNs
Long-term knowledge is stored in the network in the states of the synaptic interconnections; in ANNs these are the weights between nodes. Short-term knowledge is temporarily stored in the on/off states of the nodes. Both kinds of stored information determine how the network will respond to inputs.

Training of ANNs
Networks are organized by automated training methods, which simplifies the development of specific applications. This is a big advantage in all situations where no clear set of logical rules is given. The inherent fault tolerance of NNs is another big advantage. NNs can also be made tolerant of noise in the input: as noise increases, the quality of the output degrades only slowly (graceful degradation).

Training of Networks
A network begins with no memory of the input space. It must go through a training phase in which it learns to classify input vectors.

One of the major advantages of NNs is their ability to generalize: a trained net can classify data from the same class as the learning data even though it has never seen it before. Three data sets are used:
Training set – used to train the net
Validation set – used to determine the performance of the net on patterns not presented during the learning phase
Test set – used for a final check of the overall performance of the NN
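As an aside, a minimal sketch of such a three-way split in Python (the 70/15/15 ratios, function name, and seed are illustrative assumptions, not from the slides):

```python
import random

def three_way_split(data, train_frac=0.7, val_frac=0.15, seed=0):
    """Shuffle data and split it into training, validation, and test sets."""
    items = list(data)
    random.Random(seed).shuffle(items)
    n_train = int(len(items) * train_frac)
    n_val = int(len(items) * val_frac)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

train_set, val_set, test_set = three_way_split(range(100))
```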

McCulloch-Pitts Model
Acts as a feature detector. It has N inputs, N weights, M outputs, and a threshold θ.

McCulloch-Pitts Model
Each input is modulated by the weight of its connection. The weighted inputs are then integrated (summed) by the unit to produce the stimulation signal; this becomes the activation. If the activation is >= θ, the neuron fires. Inhibitory input is absolute: an active inhibitory input keeps the neuron off regardless of the other inputs.
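A minimal sketch of such a unit in Python (the function name and the AND example are our own illustration, not from the slides):

```python
def mcculloch_pitts(inputs, weights, theta, inhibitory=()):
    """Binary McCulloch-Pitts unit: fires (returns 1) iff the weighted sum
    of the inputs reaches the threshold theta and no inhibitory input is
    active -- inhibition is absolute."""
    if any(inhibitory):
        return 0
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation >= theta else 0

# AND of two binary inputs: both weights 1, threshold 2
print(mcculloch_pitts([1, 1], [1, 1], theta=2))  # -> 1
print(mcculloch_pitts([1, 0], [1, 1], theta=2))  # -> 0
```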

McCulloch-Pitts Model
The McCulloch-Pitts model is severely limited: it can only categorize linearly separable domains, and it has no training regime.

Rosenblatt’s Perceptron
A single-layer network that, again, can only categorize linearly separable patterns. However, a training algorithm exists to adjust the weights within the network, causing it to ‘learn’. A Hebbian learning algorithm is used.

Hebbian Learning
Donald Hebb proposed a learning theory: if a neuron X in a nervous system repeatedly stimulates a second neuron Y to fire, then the pathway from X to Y becomes increasingly efficient at conveying that stimulus. The perceptron is essentially the McCulloch-Pitts model with a Hebbian-style training rule.
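A minimal sketch of the perceptron's error-correction rule in Python (the learning rate, epoch count, and OR example are illustrative assumptions):

```python
def train_perceptron(samples, n_inputs, lr=0.1, epochs=20):
    """Perceptron learning: nudge each weight in proportion to the error
    (target - output) after every training sample."""
    w = [0.0] * n_inputs
    bias = 0.0
    for _ in range(epochs):
        for x, target in samples:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + bias
            output = 1 if activation >= 0 else 0
            error = target - output
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            bias += lr * error
    return w, bias

# Learn logical OR
or_samples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
weights, bias = train_perceptron(or_samples, n_inputs=2)
```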

Problem with the Perceptron
Minsky & Papert proved that the perceptron could not categorize a problem as simple as XOR. XOR is not linearly separable: no single line can separate the inputs {(0,1), (1,0)} from {(0,0), (1,1)}.

Introduction of a Hidden Layer
A hidden layer solves the issue of linear separability. This introduces the idea of a multi-layer network; the multi-layer perceptron is born.

The Dawn of New Networks
Back-propagation networks, Hopfield networks, and Kohonen SOMs (self-organizing maps).

The Back-propagation Algorithm
Essentially a multi-layer perceptron with a different threshold (activation) function and a more robust, more capable learning rule. Backprop acts as a classifier/predictor, e.g. for loan evaluation. A trained network can uncover relationships between the inputs and outputs presented to it.

Neural networks are universal approximators. Backprop, however, is not guaranteed to find the right model: gradient descent converges, but possibly only to a local minimum of the error rather than the global one.

Training a Back-Propagation Network
An input pattern is propagated forward through the net until activation reaches the output layer. The output at the output layer is compared with the teaching input (the target). The error δj (if there is one) is propagated back through the network, from the output layer through the hidden layers, and the weights are adjusted so that all the nodes that contributed to the error are corrected.
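A compact sketch of these forward and backward passes for a tiny 2-2-1 sigmoid network learning XOR in Python (the architecture, learning rate, epoch count, and seed are illustrative assumptions):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# 2 inputs -> 2 hidden units -> 1 output; last entry of each row is the bias
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]
lr = 0.5

xor = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
for _ in range(10000):
    for x, target in xor:
        # Forward pass: propagate activation to the output layer
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
        out = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
        # Backward pass: compare with the teaching input, compute deltas
        d_out = (target - out) * out * (1 - out)
        d_h = [d_out * w_o[i] * h[i] * (1 - h[i]) for i in range(2)]
        # Adjust every weight that contributed to the error
        for i in range(2):
            w_o[i] += lr * d_out * h[i]
            for j in range(2):
                w_h[i][j] += lr * d_h[i] * x[j]
            w_h[i][2] += lr * d_h[i]
        w_o[2] += lr * d_out
```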

Backprop
In online learning, the weight changes are applied to the network after each training pattern, i.e. after each forward and backward pass. In offline or batch learning, the weight changes are accumulated over all patterns in the input set and applied after one full cycle (epoch) through the training pattern file. With the back-propagation algorithm, online training is usually faster than batch training, especially for large training sets with many similar examples.
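The difference is only in when the update is applied. A runnable one-weight illustration in Python (the toy data, learning rate, and epoch count are entirely our own example):

```python
# Toy model: a single weight w fitting y = 2x by gradient descent
# on the squared error 0.5 * (w*x - y)**2, whose gradient is (w*x - y) * x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
lr = 0.01

# Online (stochastic) learning: update w after every pattern
w = 0.0
for _ in range(100):
    for x, y in data:
        w -= lr * (w * x - y) * x

# Batch learning: accumulate the gradient over the epoch, update once
w_batch = 0.0
for _ in range(100):
    total_grad = sum((w_batch * x - y) * x for x, y in data)
    w_batch -= lr * total_grad
```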

Limitations of Neural Networks
Scalability: neural networks can become unstable when applied to larger problems.

A simple example - OR (single layer; see the diagrams below)
We want a network to learn the OR input pattern:

Input    Output
0 0      0
0 1      1
1 0      1
1 1      1

The network has 2 inputs and one output, all binary. The output is determined by:
output = 1 if W0*I0 + W1*I1 > 0
output = 0 if W0*I0 + W1*I1 <= 0
We want it to learn simple OR: output a 1 if either I0 or I1 is 1.
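Plugging in concrete weights shows the rule at work; here both weights are set to 1 (our choice; any positive weights satisfy this rule for OR):

```python
W0, W1 = 1, 1  # any positive weights reproduce OR under this rule

def or_unit(I0, I1):
    return 1 if W0 * I0 + W1 * I1 > 0 else 0

for I0, I1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(I0, I1, '->', or_unit(I0, I1))  # prints the OR truth table
```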

[Diagram: a single-layer network; Input 1 and Input 2 feed a single output unit]

[Diagram: inside the neuron, stage 1; each input is multiplied by its weight (W1 * input1, W2 * input2) and the products are summed to give the activation]

[Diagram: inside the neuron, stage 2; if activation a >= threshold θ, then output = 1, else output = 0]

Neuron fundamentals
activation: a = Σi wi xi
threshold: θ
output = 1 if a >= θ
output = 0 if a < θ
weights are bounded: -1 <= wi <= 1