-Artificial Neural Network- Chapter 2 Basic Model


-Artificial Neural Network- Chapter 2 Basic Model
Chaoyang University of Technology, Department of Information Management, Prof. 李麗華

Introduction to ANN Basic Model
- Input layer
- Hidden layer
- Output layer
- Weights
- Processing Element (PE)
- Learning
- Recalling
- Energy function

ANN Components (1/4)
1. Input layer: [X1, X2, ..., Xn]^t, where t denotes vector transpose.
2. Hidden layer: I_j => net_j => Y_j
3. Output layer: Y_j. There are three ways of generating output: normalized output, competitive output, and competitive learning.
4. Weights: W_ij is the connection weight between unit i of one layer and unit j of the next.
[Figure: network diagram with inputs X1..Xn, weights W_ij, and outputs Y1..Yj]
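The layer/weight structure above can be sketched in a few lines. This is a minimal illustration with arbitrary made-up numbers (the 3x2 weight matrix and the input vector are assumptions, not values from the slides), showing how one net value per output unit is formed and how normalized versus competitive (winner-take-all) outputs differ:

```python
import numpy as np

# Hypothetical 3-input, 2-output single-layer net; W[i][j] connects
# input X_i to output unit Y_j, matching the W_ij notation above.
X = np.array([0.5, 1.0, 0.2])          # input layer [X1, X2, X3]^t
W = np.array([[0.2, 0.8],
              [0.4, 0.1],
              [0.9, 0.3]])             # W_ij: 3 inputs x 2 output units

net = X @ W                            # net_j = sum_i W_ij * X_i

# Two of the output-generation styles named above:
normalized = net / net.sum()           # normalized output (sums to 1)
winner = int(np.argmax(net))           # competitive: winner-take-all
competitive = np.zeros_like(net)
competitive[winner] = 1.0
```

With these numbers, net = [0.68, 0.56], so unit 0 wins the competition and the competitive output is [1, 0].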

ANN Components (2/4)
5. Processing Element (PE)
(A) Summation function: net_j = Σ_i W_ij·X_i − θ_j (supervised) or net_j = Σ_i W_ij·X_i (unsupervised)
(B) Activity function: determines the unit's activity from net_j, e.g. a_j = f(net_j)
(C) Transfer function: discrete type, linear type, or non-linear type
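A single processing element can be sketched directly from the definitions above. This assumes the common supervised summation net_j = Σ_i W_ij·X_i − θ_j followed by a discrete step transfer; the input, weight, and threshold values are illustrative only:

```python
import numpy as np

# One processing element (PE): summation with threshold, then step transfer.
def process(x, w, theta):
    net = float(np.dot(w, x)) - theta   # summation: sum_i w_i * x_i - theta
    return 1 if net > 0 else 0          # discrete (step) transfer function

x = [1, 0, 1]
w = [0.5, -0.3, 0.4]
y = process(x, w, theta=0.6)            # net = 0.9 - 0.6 = 0.3 > 0, so y = 1
```

The same skeleton works for the other transfer functions on the following slides; only the final line changes.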

ANN Components (3/4)
6. Learning: based on the ANN model used, learning adjusts the weights so that the network accommodates a set of training patterns.
7. Recalling: based on the ANN model used, recalling applies real data patterns to the trained network so that outputs are generated and examined.
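Learning and recalling can be made concrete with one small example. The sketch below uses the classic perceptron rule as the illustrative learning rule (the actual rule depends on the ANN model chosen) and trains a single PE on the AND function, then recalls it on the same patterns:

```python
import numpy as np

def step(net):                                  # step transfer function
    return 1 if net > 0 else 0

def train(patterns, targets, lr=0.1, epochs=20):
    # Learning: repeatedly adjust weights toward the training patterns.
    w = np.zeros(len(patterns[0]) + 1)          # weights plus a bias term
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            xb = np.append(x, 1.0)              # append bias input
            y = step(w @ xb)
            w += lr * (t - y) * xb              # perceptron weight update
    return w

def recall(w, x):
    # Recalling: apply a data pattern to the trained network.
    return step(w @ np.append(x, 1.0))

# AND function: linearly separable, so a single PE can learn it.
w = train([(0, 0), (0, 1), (1, 0), (1, 1)], [0, 0, 0, 1])
```

After training, recall(w, (1, 1)) returns 1 and the other three patterns return 0.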

ANN Components (4/4)
8. Energy function: the energy function is a verification function that determines whether the network's energy has converged to its minimum. As the energy function approaches zero, the network approaches its optimal solution.

Transfer Functions (1/3)
Discrete type transfer functions:
- Step (perceptron) function: Y_j = 1 if net_j > 0; Y_j = 0 if net_j <= 0
- Hopfield-Tank function: Y_j^(n) = 1 if net_j > 0; Y_j^(n) = 0 if net_j < 0; Y_j^(n) = Y_j^(n-1) if net_j = 0
- Signum function: Y_j = 1 if net_j > 0; Y_j = -1 if net_j <= 0

Transfer Functions (2/3)
Discrete type transfer functions (continued):
- Signum0 function: Y_j = 1 if net_j > 0; Y_j = -1 if net_j < 0; Y_j = 0 if net_j = 0
- BAM function: Y_j^(n) = 1 if net_j > 0; Y_j^(n) = -1 if net_j < 0; Y_j^(n) = Y_j^(n-1) if net_j = 0
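The five discrete transfer functions can be written out directly from their case definitions; y_prev stands for the previous output Y_j^(n-1) that the Hopfield-Tank and BAM rules keep when net_j is exactly zero:

```python
def step_fc(net):                      # step / perceptron function
    return 1 if net > 0 else 0

def signum_fc(net):                    # signum: -1 when net <= 0
    return 1 if net > 0 else -1

def signum0_fc(net):                   # signum0: output 0 exactly at net == 0
    if net > 0:
        return 1
    if net < 0:
        return -1
    return 0

def hopfield_tank_fc(net, y_prev):     # binary; keep previous output at 0
    if net > 0:
        return 1
    if net < 0:
        return 0
    return y_prev

def bam_fc(net, y_prev):               # bipolar; keep previous output at 0
    if net > 0:
        return 1
    if net < 0:
        return -1
    return y_prev
```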

Transfer Functions (3/3)
Linear type: Y_j = net_j
Nonlinear type transfer functions:
- Ramp function: Y_j = net_j if net_j > 0; Y_j = 0 if net_j <= 0
- Sigmoid function: Y_j = 1 / (1 + e^(-net_j))
- Hyperbolic tangent function: Y_j = (e^(net_j) − e^(−net_j)) / (e^(net_j) + e^(−net_j))
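The linear and nonlinear transfer functions above translate one-for-one into code:

```python
import math

def linear_fc(net):                    # linear type: Y_j = net_j
    return net

def ramp_fc(net):                      # ramp: linear above 0, clamped below
    return net if net > 0 else 0.0

def sigmoid_fc(net):                   # sigmoid: Y_j = 1 / (1 + e^(-net_j))
    return 1.0 / (1.0 + math.exp(-net))

def tanh_fc(net):                      # hyperbolic tangent transfer
    return math.tanh(net)
```

Note that sigmoid_fc(0) = 0.5 and tanh_fc(0) = 0; both saturate smoothly for large |net_j|, which is why they appear later in the back-propagation chapter.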

Energy Function
(a) The energy function for supervised network learning:
E = (1/2) Σ_j (T_j − Y_j)^2, where E is the energy value and T_j is the target output.
ΔW_ij = −η ∂E/∂W_ij: this is the value for adjusting weight W_ij (η is the learning rate).
(b) The energy function for unsupervised network learning is model-dependent; for a Hopfield-style network it takes the form
E = −(1/2) Σ_i Σ_j W_ij X_i X_j
ΔW_ij = η X_i X_j: this is the value for adjusting weight W_ij.
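The supervised case above can be sketched numerically. This assumes the standard sum-of-squared-errors form E = (1/2) Σ_j (T_j − Y_j)^2 and a gradient-descent weight adjustment for a linear output unit; the target/output values are illustrative:

```python
import numpy as np

def energy(targets, outputs):
    # Supervised energy: E = 1/2 * sum_j (T_j - Y_j)^2
    t = np.asarray(targets, float)
    y = np.asarray(outputs, float)
    return 0.5 * np.sum((t - y) ** 2)

def delta_w(x, target, output, eta=0.1):
    # Delta W_ij = eta * (T_j - Y_j) * X_i for a linear output unit,
    # i.e. gradient descent on E with learning rate eta.
    return eta * (target - output) * np.asarray(x, float)

E = energy([1, 0], [0.8, 0.3])         # 0.5 * (0.04 + 0.09) = 0.065
```

As the outputs approach the targets, E approaches zero, matching the convergence criterion stated for the energy function.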