Chapter Seven: The Network Approach: Mind as a Web

Presentation transcript:

Chapter Seven The Network Approach: Mind as a Web

Connectionism The major field of the network approach. Connectionists construct Artificial Neural Networks (ANNs), which are computer simulations of how groups of neurons might perform some task.

Information processing ANNs utilize a processing strategy in which large numbers of computing units perform their calculations simultaneously. This is known as parallel distributed processing. In contrast, traditional computers are serial processors, performing one computation at a time.

Serial and parallel processing architectures

Serial vs. Parallel Computing There is no difference in computing power: parallel computing can be simulated by general-purpose computers, and modern general-purpose computers are not strictly serial anyway.

Approaches The traditional approach in cognition and AI is to solve problems with an algorithm in which every processing step is planned (or nearly so). It relies on symbols and on operators applied to symbols; this is the knowledge-based approach. Connectionists instead let the ANN perform the computation on its own with little explicit planning and are concerned with the behavior of the network; this is the behavior-based approach.

Knowledge representation Information in an ANN exists as a collection of nodes and the connections between them; this is a distributed representation. Information in semantic networks, however, can be stored in a single node; this is a form of local representation.

Characteristics of ANNs A node is a basic computing unit. A link is the connection between one node and the next. Weights specify the strength of connections. A node fires if it receives activation above threshold.

Characteristics of ANNs A basis function determines the amount of stimulation a node receives. An activation function maps the strength of the inputs onto the node's output. [Figure: a sigmoidal activation function]
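A minimal sketch in Python of how these two functions combine in a single node. The function names, inputs, and weights here are illustrative choices, not from the chapter; the basis function is the common weighted sum, and the activation function is the sigmoid pictured above.

```python
import math

def basis(inputs, weights):
    """Basis function: the weighted sum of a node's inputs."""
    return sum(x * w for x, w in zip(inputs, weights))

def sigmoid(net):
    """Sigmoidal activation function: maps net stimulation onto an output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-net))

# A node receiving three weighted inputs:
inputs = [1.0, 0.0, 1.0]
weights = [0.5, -0.3, 0.8]
output = sigmoid(basis(inputs, weights))   # net stimulation is 1.3
```

A threshold rule ("the node fires if activation is above threshold") would simply replace the sigmoid with a step function.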

Early neural networks Hebb (1949) described two types of cell groupings. A cell assembly is a small group of neurons that repeatedly stimulate themselves. A phase sequence is a set of cell assemblies that activate each other. Hebb Rule: when one cell repeatedly activates another, the strength of the connection between them increases.
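The Hebb Rule can be sketched in a few lines of Python. The learning rate and starting weight below are arbitrary illustrative values:

```python
def hebb_update(w, pre, post, lr=0.1):
    """Hebb rule: co-activation of the pre- and postsynaptic cells
    strengthens the connection between them."""
    return w + lr * pre * post

w = 0.2                          # initial connection strength
for _ in range(5):               # one cell repeatedly activates the other
    w = hebb_update(w, pre=1.0, post=1.0)
# w has grown from 0.2 to roughly 0.7
```

Note that the weight only grows when both cells are active together; if either activation is zero, the connection is left unchanged.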

Early neural networks Perceptrons were simple networks that could detect and recognize visual patterns. Early perceptrons had only two layers, an input layer and an output layer.
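A two-layer perceptron can be sketched as follows; the learning rate, epoch count, and the choice of logical AND as the training pattern are illustrative, not from the chapter. Output units apply a simple threshold, and the perceptron rule nudges each weight by the error on every example:

```python
def step(net):
    """Threshold activation: the output unit fires if net input >= 0."""
    return 1 if net >= 0 else 0

def train_perceptron(samples, lr=1, epochs=10):
    """Perceptron rule: adjust weights in proportion to the error."""
    w, b = [0, 0], 0
    for _ in range(epochs):
        for x, target in samples:
            out = step(w[0] * x[0] + w[1] * x[1] + b)
            err = target - out
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Logical AND is linearly separable, so a two-layer perceptron can learn it
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

Patterns that are not linearly separable are beyond a two-layer perceptron, which is one motivation for the hidden layers of the modern ANNs described next.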

Modern ANNs More recent ANNs contain three layers: an input, a hidden, and an output layer. Input units activate hidden units, which then activate the output units.
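The forward flow of activation through such a three-layer network can be sketched as below. The weight values are arbitrary placeholders chosen for illustration:

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def layer(inputs, weights):
    """Each unit applies the activation function to a weighted sum
    of the previous layer's activations."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)))
            for row in weights]

# Illustrative weights: 2 input units -> 2 hidden units -> 1 output unit
w_hidden = [[0.4, -0.2], [0.3, 0.7]]
w_output = [[0.5, -0.6]]

hidden = layer([1.0, 0.0], w_hidden)   # input units activate hidden units
output = layer(hidden, w_output)       # hidden units activate the output unit
```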

Backpropagation learning in ANNs An ANN can learn to make a correct response to a particular stimulus input. The initial response is compared to a desired response represented by a teacher. The difference between the two, an error signal, is sent back through the network. This changes the weights so that the actual response moves closer to the desired one.
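The steps above can be sketched for a tiny 2-2-1 network. The initial weights, learning rate, and training pattern are illustrative; the point is the loop structure of compare, send the error back, adjust the weights:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

w_h = [[0.1, 0.2], [0.3, 0.4]]    # input -> hidden weights (arbitrary)
w_o = [0.5, -0.5]                 # hidden -> output weights (arbitrary)
lr = 0.5
x, target = [1.0, 0.0], 1.0       # the "teacher" supplies the desired response

for _ in range(300):
    # forward pass: input activates hidden, hidden activates output
    h = [sigmoid(sum(xi * wi for xi, wi in zip(x, row))) for row in w_h]
    y = sigmoid(sum(hi * wi for hi, wi in zip(h, w_o)))
    # error signal: difference between desired and actual response
    delta_o = (target - y) * y * (1 - y)
    # send the error back: each hidden unit's share of the blame
    delta_h = [delta_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
    # weight changes move the actual response toward the desired one
    w_o = [w_o[j] + lr * delta_o * h[j] for j in range(2)]
    w_h = [[w_h[j][i] + lr * delta_h[j] * x[i] for i in range(2)]
           for j in range(2)]
# y has climbed toward the desired response of 1.0
```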

Criteria of different ANNs Supervised networks have a teacher; unsupervised networks do not. Networks can be either single-layer or multilayer. Information can flow forward only, a feed-forward network, or it can flow back and forth between layers, a recurrent network.

Network typologies Hopfield-Tank networks: supervised, single-layer, and laterally connected; good at recovering "clean" versions of noisy patterns. Kohonen networks: two-layer and unsupervised; able to create topological maps of features present in the input. Adaptive Resonance Theory (ART) networks: unsupervised, multilayer, and recurrent; classify input patterns.
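The pattern-cleanup behavior of a Hopfield-style network can be sketched as below. The stored pattern and noisy cue are made-up examples; a single pattern is stored Hebbian-fashion in the lateral weights, and repeated updates pull a noisy state back to it:

```python
# Store one pattern in a Hopfield-style network, then recover it from a noisy cue
pattern = [1, -1, 1, -1, 1, -1]
n = len(pattern)

# Hebbian storage: the weight between two units reflects their correlation
W = [[0 if i == j else pattern[i] * pattern[j] for j in range(n)]
     for i in range(n)]

def recall(state, sweeps=5):
    """Repeatedly update each unit from its weighted lateral input."""
    state = list(state)
    for _ in range(sweeps):
        for i in range(n):
            net = sum(W[i][j] * state[j] for j in range(n))
            state[i] = 1 if net >= 0 else -1
    return state

noisy = [1, -1, -1, -1, 1, -1]    # one unit flipped
clean = recall(noisy)             # settles back to the stored pattern
```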

Evaluating connectionism Advantages: (1) biological plausibility, (2) graceful degradation, (3) interference, (4) generalization. Disadvantages: (1) no massive parallelism, (2) convergent dynamic, (3) the stability-plasticity dilemma, (4) catastrophic interference.

Semantic networks Share some features in common with ANNs, but individual nodes represent meaningful concepts. Used to explain the organization and retrieval of information from long-term memory (LTM).

Characteristics of semantic networks Spreading activation: activity spreads outward from nodes along links and activates other nodes. Retrieval cues: nodes associated with others can activate them indirectly. Priming: residual activation can facilitate later responding.
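A minimal sketch of spreading activation over a toy semantic network. The concepts, link structure, decay rate, and threshold here are all illustrative choices, not from the chapter:

```python
# A toy semantic network: nodes are concepts, links carry activation outward
links = {
    "robin":  ["bird", "red"],
    "bird":   ["animal", "wings", "robin"],
    "animal": ["bird"],
}

def spread(start, energy=1.0, decay=0.5, threshold=0.1):
    """Activation spreads outward along links, weakening with distance,
    until it falls below threshold."""
    activation = {start: energy}
    frontier = [start]
    while frontier:
        nxt = []
        for node in frontier:
            passed = activation[node] * decay
            if passed < threshold:
                continue
            for neighbor in links.get(node, []):
                if neighbor not in activation:   # activate each node once
                    activation[neighbor] = passed
                    nxt.append(neighbor)
        frontier = nxt
    return activation

act = spread("robin")
# nearby nodes retain more residual activation, which models priming
```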

A hierarchical semantic network Sentence verification tasks suggest a hierarchical organization of concepts in semantic memory (Collins and Quillian, 1969). Meanings for concepts such as animals may be arranged into superordinate, ordinate, and subordinate categories. Vertical distance in the network corresponds to category membership; horizontal distance corresponds to property information.
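The sentence-verification prediction can be sketched with a small hierarchy. The particular concepts below are illustrative; the point is that verifying membership means climbing levels, so more vertical distance predicts longer verification times:

```python
# Each concept points to its immediate superordinate category
isa = {"canary": "bird", "ostrich": "bird", "bird": "animal", "fish": "animal"}

def verification_steps(concept, category):
    """Count the levels traversed to verify category membership."""
    steps = 0
    while concept is not None:
        if concept == category:
            return steps
        concept = isa.get(concept)
        steps += 1
    return None   # membership cannot be verified from this hierarchy

# "A canary is a bird" crosses one level; "a canary is an animal" crosses two,
# so the model predicts the second sentence takes longer to verify.
```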

Example of a Hierarchical Semantic Network From S. C. Shapiro, Knowledge Representation. In L. Nadel (Ed.), Encyclopedia of Cognitive Science, Macmillan, 2003.

Propositional networks Can represent propositional or sentence-like information, for example "The man threw the ball." Allow for more complex relationships between concepts, such as agents, objects, and relations. Can also code for episodic knowledge.

Example of a Propositional Semantic Network From S. C. Shapiro, Knowledge Representation. In L. Nadel (Ed.), Encyclopedia of Cognitive Science, Macmillan, 2003.

Episodic Memory in Cassie, a SNePS-Based Agent *NOW contains a SNePS term representing the current time. *NOW moves when Cassie acts or perceives a change of state.

Representation of Time [Diagram: a SNePS network linking an event node's agent, act, object, and time to *NOW through before/after arcs.]

Movement of Time [Diagram: as time moves, *NOW shifts from t1 to t2 to t3 along before/after arcs.]

Performing a Punctual Act [Diagram: a punctual act's event is tied to a single time point as *NOW moves from t1 through t2 to t3.]

Performing a Durative Act [Diagram: a durative act's event spans a time interval; *NOW moves through subintervals (subint) of a superinterval (supint).]