PSY105 Neural Networks 2/5 2. “A universe of numbers”


Lecture 1 recap
We can describe patterns at one level of description that emerge from rules followed at a lower level of description. Neural network modellers hope that we can understand behaviour by creating models of networks of artificial neurons.

The first artificial neuron model
Warren McCulloch (neurophysiologist) and Walter Pitts (mathematician), 1943.

A simple artificial neuron: the threshold logic unit (TLU)
[Diagram: inputs are multiplied by weights and summed to give the activation, which is compared against a threshold.]
Multiply each input by its weight and add the products. If the sum reaches the threshold, output 1; otherwise output 0.

TLU: the output relation
[Graph: output (0 or 1) plotted against activation; the output steps from 0 to 1 at the threshold.]
The relation is non-linear: small changes in activation give different changes in the output depending on the initial activation.

Model neuron function, reminders…
- Inputs vary; each can be 0 or 1
- Weights effectively 'interpret' the inputs; there is a weight for each input
  - A weight can be a positive number (excitation) or a negative number (inhibition)
  - Weights do not change when inputs change
- Activation = weighted sum of inputs
  - Activation = input1 × weight1 + input2 × weight2, etc.
- If activation ≥ threshold, output = 1; otherwise output = 0
  - Here, threshold = 1
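The rules above can be sketched as a short function (a minimal sketch for illustration; the function name is mine, not from the lecture):

```python
def tlu_output(inputs, weights, threshold=1.0):
    """Threshold logic unit: output 1 if the weighted sum of the
    inputs reaches the threshold, otherwise output 0."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Positive weights excite, negative weights inhibit
print(tlu_output([1, 1], [0.5, 0.5]))  # activation 1.0, reaches threshold -> 1
print(tlu_output([1, 0], [0.5, 0.5]))  # activation 0.5, below threshold -> 0
```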

States, weights & functions
- States: all the possible combinations of inputs
- Weights: how much each input is multiplied by before contributing to the activation of the unit
- Functions: the way inputs are combined to produce outputs

Computing with neurons: identity (1)
Threshold = 1, Weight = 0.7

State   Input   Activation   Output
1       0       0            0
2       1       0.7          0  ✗ (should be 1)

Computing with neurons: identity (2)
Threshold = 1, Weight = 1

State   Input   Activation   Output
1       0       0            0
2       1       1            1  ✓
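The contrast between these two slides can be checked directly (a minimal sketch; `tlu` is an illustrative helper, not lecture code):

```python
def tlu(inputs, weights, threshold=1.0):
    # Weighted sum of inputs, then hard threshold
    act = sum(i * w for i, w in zip(inputs, weights))
    return 1 if act >= threshold else 0

# Weight 0.7: input 1 gives activation 0.7, below the threshold,
# so the unit outputs 0 in both states and fails to copy its input
print([tlu([x], [0.7]) for x in (0, 1)])  # [0, 0]

# Weight 1: the activation equals the input, so the output copies it
print([tlu([x], [1.0]) for x in (0, 1)])  # [0, 1]
```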

Question: How could you use these simple neurons (TLUs) to compute the AND function?

Input 1   Input 2   Output
0         0         0
0         1         0
1         0         0
1         1         1

Computing with neurons: AND
Threshold = 1, Weight 1 = 0.5, Weight 2 = 0.5

State   Input 1   Input 2   Activation   Output
1       0         0         0            0
2       0         1         0.5          0
3       1         0         0.5          0
4       1         1         1            1
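Using the slide's parameters, the four input states can be enumerated (a minimal sketch):

```python
threshold, w1, w2 = 1.0, 0.5, 0.5

# Only when both inputs are 1 does the activation reach the
# threshold, so the unit's output is the AND of its inputs
for x1 in (0, 1):
    for x2 in (0, 1):
        activation = x1 * w1 + x2 * w2
        output = 1 if activation >= threshold else 0
        print(x1, x2, activation, output)
```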

Networks of such neurons are Turing complete
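As a small illustration of their logical power: a single TLU can compute NAND (the weights −1, −1 and threshold −1.5 below are my own illustrative choice, not from the lecture), and any Boolean circuit can be built from NAND gates alone:

```python
def tlu(inputs, weights, threshold):
    # Weighted sum of inputs, then hard threshold
    act = sum(i * w for i, w in zip(inputs, weights))
    return 1 if act >= threshold else 0

# NAND: output 0 only when both inputs are 1
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, tlu([x1, x2], [-1, -1], -1.5))
```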

Semilinear node
[Diagram: as for the TLU, inputs are multiplied by weights and summed to give the activation, but the hard threshold is replaced by a smooth squashing function.]

Semilinear node: the output relation (squashing function)
[Graph: output plotted against activation; instead of a step at the threshold, the output rises smoothly from 0 to 1, passing through its midpoint at the threshold.]
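The slides do not give a formula for the squashing function; one standard choice (an assumption here, not stated in the lecture) is the logistic sigmoid, shifted so that its midpoint sits at the threshold:

```python
import math

def semilinear_output(activation, threshold=1.0, slope=4.0):
    """Logistic sigmoid centred on the threshold: the output rises
    smoothly from near 0 to near 1 rather than stepping abruptly."""
    return 1.0 / (1.0 + math.exp(-slope * (activation - threshold)))

print(semilinear_output(1.0))  # exactly at the threshold -> 0.5
```

The `slope` parameter (also an assumption) controls how sharp the transition is; as it grows, the curve approaches the TLU's hard step.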