1 Chapter Seven The Network Approach: Mind as a Web

2 Connectionism The major field of the network approach. Connectionists construct Artificial Neural Networks (ANNs), which are computer simulations of how groups of neurons might perform some task.

3 Information processing ANNs utilize a processing strategy in which large numbers of computing units perform their calculations simultaneously. This is known as parallel distributed processing. In contrast, traditional computers are serial processors, performing one computation at a time.
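
The contrast can be made concrete in code. A minimal sketch (the layer size and weight values are invented for illustration): a serial loop computes one unit at a time, while a single vectorized operation computes every unit at once, in the spirit of parallel distributed processing.

```python
import numpy as np

# Hypothetical layer of 5 units receiving 3 inputs (illustrative values only).
weights = np.array([[0.2, -0.5, 0.1],
                    [0.4,  0.3, -0.2],
                    [-0.1, 0.6,  0.5],
                    [0.7, -0.3,  0.2],
                    [0.0,  0.1, -0.4]])
inputs = np.array([1.0, 0.5, -1.0])

# Serial style: compute each unit's net input one at a time.
serial_out = []
for w in weights:
    serial_out.append(sum(wi * xi for wi, xi in zip(w, inputs)))

# Parallel (distributed) style: all units computed in one matrix operation.
parallel_out = weights @ inputs

print(serial_out)
print(parallel_out)   # same numbers, computed "all at once"
```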

4 Serial and parallel processing architectures

5 Serial vs. Parallel Computing Serial and parallel architectures do not differ in computing power: parallel computing can be simulated by a general-purpose computer. Modern general-purpose computers are, in any case, not strictly serial.

6 Approaches The traditional approach in cognition and AI to solving problems is to use an algorithm in which every processing step is planned. (Not really.) It relies on symbols and operators applied to symbols. This is the knowledge-based approach. Connectionists instead let the ANN perform the computation on its own without any (with less) planning. They are concerned with the behavior of the network. This is the behavior-based approach. (Not really.)

7 Knowledge representation Information in an ANN exists as a collection of nodes and the connections between them. This is a distributed representation. Information in semantic networks, however, can be stored in a single node. This is a form of local representation.
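
A minimal sketch of the contrast (the concepts and activation values are invented): in a local scheme one node stands for one concept, while in a distributed scheme a concept is a pattern of activation spread over many nodes.

```python
# Local representation: one node (dictionary key) stands for one concept.
local = {"dog": 1.0, "cat": 0.0, "bird": 0.0}

# Distributed representation: the same concept is a pattern of activation
# across many nodes; no single node means "dog" on its own.
#                  node1 node2 node3 node4
distributed_dog = [0.9,  0.1,  0.7,  0.2]
distributed_cat = [0.8,  0.2,  0.1,  0.6]   # overlaps with "dog" on node1

print(local["dog"], distributed_dog)
```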

8 Characteristics of ANNs A node is a basic computing unit. A link is the connection between one node and the next. Weights specify the strength of connections. A node fires if it receives activation above threshold.

9 Characteristics of ANNs A basis function determines the amount of stimulation a node receives. An activation function maps the strength of the inputs onto the node's output. (Figure: a sigmoidal activation function.)
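
The two functions can be written out directly. A minimal sketch, assuming the common weighted-sum basis function and the sigmoidal activation shown on the slide; the weights and inputs are illustrative.

```python
import math

def basis(weights, inputs):
    # Basis function: how much stimulation the node receives
    # (here, the usual weighted sum of its inputs).
    return sum(w * x for w, x in zip(weights, inputs))

def sigmoid(net, threshold=0.0):
    # Sigmoidal activation function: maps net input onto the node's output,
    # rising smoothly around the threshold.
    return 1.0 / (1.0 + math.exp(-(net - threshold)))

weights = [0.8, -0.4, 0.3]    # illustrative connection weights
inputs  = [1.0,  1.0, 0.0]    # activations arriving over the links

net = basis(weights, inputs)
print(net, sigmoid(net))      # stimulation received, and the node's output
```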

10 Early neural networks Hebb (1949) describes two types of cell groupings. A cell assembly is a small group of neurons that repeatedly stimulate themselves. A phase sequence is a set of cell assemblies that activate each other. Hebb Rule: when one cell repeatedly activates another, the strength of the connection increases.
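
The Hebb rule can be sketched as a single weight update, read here as change in weight = rate × (presynaptic activity) × (postsynaptic activity); the learning rate and activity values are illustrative.

```python
def hebb_update(w, pre, post, rate=0.1):
    # Hebb rule: if the presynaptic cell repeatedly takes part in firing
    # the postsynaptic cell, strengthen the connection between them.
    return w + rate * pre * post

w = 0.05
for _ in range(10):              # repeated co-activation of the two cells
    w = hebb_update(w, pre=1.0, post=1.0)
print(w)                         # connection strength has grown
```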

11 Early neural networks Perceptrons were simple networks that could detect and recognize visual patterns. Early perceptrons had only two layers, an input and an output layer.
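
A two-layer perceptron of this kind fits in a few lines. A sketch (the task, learning rate, and starting weights are chosen only for illustration) that learns the logical AND pattern using the classic perceptron learning rule.

```python
# Two-layer perceptron: input units connect directly to one output unit.
inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 0, 0, 1]              # logical AND of the two inputs

w = [0.0, 0.0]
bias = 0.0
rate = 0.1

for _ in range(20):                  # a few passes over the patterns
    for (x1, x2), t in zip(inputs, targets):
        out = 1 if (w[0] * x1 + w[1] * x2 + bias) > 0 else 0
        err = t - out                # perceptron learning rule
        w[0] += rate * err * x1
        w[1] += rate * err * x2
        bias += rate * err

print(w, bias)                       # weights that implement AND
for x1, x2 in inputs:
    print(x1, x2, 1 if (w[0] * x1 + w[1] * x2 + bias) > 0 else 0)
```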

12 Modern ANNs More recent ANNs contain three layers: an input, a hidden, and an output layer. Input units activate hidden units, which then activate the output units.

13 Backpropagation learning in ANNs An ANN can learn to make a correct response to a particular stimulus input. The initial response is compared to a desired response represented by a teacher. The difference between the two, an error signal, is sent back to the network. This changes the weights so that the actual response is now closer to the desired response.
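
A minimal three-layer network trained this way; the task (XOR), layer sizes, learning rate, and iteration count are chosen only for illustration, and convergence can depend on the random starting weights. The error signal computed at the output is propagated back to adjust both layers of weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Stimulus patterns and the teacher's desired responses (XOR, illustrative).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Input->hidden and hidden->output weights plus biases (small random start).
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)
rate = 0.5

for _ in range(10000):
    # Forward pass: input units activate hidden units, which activate output.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)

    # Error signal: difference between the desired and actual response.
    error = T - Y

    # Backpropagation: send the error back and nudge both layers of weights.
    d_out = error * Y * (1 - Y)
    d_hid = (d_out @ W2.T) * H * (1 - H)
    W2 += rate * H.T @ d_out
    b2 += rate * d_out.sum(axis=0)
    W1 += rate * X.T @ d_hid
    b1 += rate * d_hid.sum(axis=0)

print(np.round(Y, 2))   # should be close to 0, 1, 1, 0 once training converges
```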

14 Criteria of different ANNs Supervised networks have a teacher; unsupervised networks do not. Networks can be either single-layer or multilayer. Information in a network can flow forward only, a feed-forward network, or it can flow back and forth between layers, a recurrent network.

15 Network typologies Hopfield-Tank networks: supervised, single-layer, and laterally connected; good at recovering "clean" versions of noisy patterns. Kohonen networks: an example of a two-layer, unsupervised network; able to create topological maps of features present in the input. Adaptive Resonance Theory (ART) networks: unsupervised multilayer recurrent networks that classify input patterns.
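
A small Hopfield-style sketch of the "clean-up" idea described above (the stored pattern, network size, and update scheme are illustrative): a noisy copy of a stored pattern settles back to the clean version.

```python
import numpy as np

# Store one binary (+1/-1) pattern in a Hopfield-style weight matrix.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0)                  # no self-connections

# Present a noisy version of the stored pattern (two units flipped).
noisy = pattern.copy()
noisy[0] *= -1
noisy[3] *= -1

# Repeatedly update units from their net input until the state settles.
state = noisy.copy()
for _ in range(5):
    state = np.where(W @ state >= 0, 1, -1)

print(np.array_equal(state, pattern))   # True: the clean pattern is recovered
```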

16 Evaluating connectionism Advantages: (1) biological plausibility, (2) graceful degradation, (3) interference, (4) generalization. Disadvantages: (1) no massive parallelism, (2) convergent dynamic, (3) the stability-plasticity dilemma, (4) catastrophic interference.

17 Semantic networks Share some features with ANNs. Individual nodes represent meaningful concepts. Used to explain the organization and retrieval of information from LTM.

18 Characteristics of semantic networks Spreading activation. Activity spreads outward from nodes along links and activates other nodes. Retrieval cues. Nodes associated with others can activate them indirectly. Priming. Residual activation can facilitate responding.
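
A minimal sketch of spreading activation over a toy semantic network (the concepts, links, decay value, and number of steps are invented for illustration); residual activation at related nodes is what priming trades on.

```python
# Toy semantic network: each concept node lists its associated nodes.
links = {
    "doctor":   ["nurse", "hospital"],
    "nurse":    ["doctor", "hospital"],
    "hospital": ["doctor", "nurse", "building"],
    "building": ["hospital"],
    "bread":    ["butter"],
    "butter":   ["bread"],
}

def spread(source, steps=2, decay=0.5):
    # Activity starts at the source node and spreads outward along links,
    # weakening with each step; the leftover activation supports priming.
    activation = {node: 0.0 for node in links}
    activation[source] = 1.0
    for _ in range(steps):
        new = dict(activation)
        for node, amount in activation.items():
            for neighbor in links[node]:
                new[neighbor] += decay * amount
        activation = new
    return activation

act = spread("doctor")
print(act["nurse"] > act["bread"])   # True: related concepts are pre-activated
```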

19 A hierarchical semantic network Sentence verification tasks suggest a hierarchical organization of concepts in semantic memory (Collins and Quillian, 1969). Meaning for concepts such as animals may be arranged into superordinate, ordinate, and subordinate categories. Vertical distance in the network corresponds to category membership. Horizontal distance corresponds to property information.
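
A sketch of the Collins and Quillian idea using a miniature hierarchy (the specific concepts and the "one step per level" timing assumption are illustrative): properties stored higher in the hierarchy take more links to verify, mirroring longer sentence verification times.

```python
# Miniature hierarchy: each category stores its parent and its own properties.
network = {
    "animal": {"isa": None,     "properties": {"breathes", "eats"}},
    "bird":   {"isa": "animal", "properties": {"has wings", "can fly"}},
    "canary": {"isa": "bird",   "properties": {"is yellow", "can sing"}},
}

def verify(concept, prop):
    # Search upward through superordinate categories; the number of links
    # traversed stands in for verification time.
    steps = 0
    node = concept
    while node is not None:
        if prop in network[node]["properties"]:
            return True, steps
        node = network[node]["isa"]
        steps += 1
    return False, steps

print(verify("canary", "can sing"))   # (True, 0): stored at the concept itself
print(verify("canary", "can fly"))    # (True, 1): inherited from "bird"
print(verify("canary", "breathes"))   # (True, 2): inherited from "animal"
```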

20 Example of A Hierarchical Semantic Network From S. C. Shapiro, Knowledge Representation. In L. Nadel, Ed., Encyclopedia of Cognitive Science, Macmillan, 2003.

21 Propositional networks Can represent propositional or sentence-like information. Example: "The man threw the ball." Allow for more complex relationships between concepts such as agents, objects, and relations. Can also code for episodic knowledge.
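
A minimal sketch of how "The man threw the ball." might be coded as a proposition with labeled agent, relation, and object links, plus episodic tags; the dictionary layout is only illustrative, not the textbook's notation.

```python
# Proposition node with labeled links to its parts.
proposition = {
    "agent":    "man",
    "relation": "threw",
    "object":   "ball",
}

# Episodic knowledge can be coded by linking the same structure to a context.
episode = {
    "proposition": proposition,
    "time":  "yesterday",   # illustrative episodic tags
    "place": "park",
}

print(episode["proposition"]["agent"], episode["proposition"]["relation"],
      episode["proposition"]["object"])
```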

22 Example of A Propositional Semantic Network From S. C. Shapiro, Knowledge Representation. In L. Nadel, Ed., Encyclopedia of Cognitive Science, Macmillan, 2003.

23 Episodic Memory in Cassie, a SNePS-Based Agent *NOW contains the SNePS term representing the current time. *NOW moves when Cassie acts or perceives a change of state.

24 Representation of Time (SNePS network diagram: an event node with agent, act, object, and time links, situated by before/after links relative to *NOW.)

25 Movement of Time (diagram: time terms t1, t2, t3 chained by before/after links as *NOW moves forward.)

26 Performing a Punctual Act (diagram: the act's event is tied to a single time between before/after links as *NOW advances.)

27 Performing a Durative Act (diagram: the act's event spans an interval with subinterval/superinterval links while *NOW advances.)
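
A rough sketch of the idea behind these diagrams; this is not the SNePS API, and the node names and data structure are invented. *NOW points at a time term, and acting or perceiving a change of state creates a new time linked to the old one by before/after relations and moves *NOW forward.

```python
# Hypothetical stand-in for the time chain sketched on the slides.
times = {"t1": {"after": None}}      # t1 is the initial current time
NOW = "t1"

def advance(new_time):
    # Acting or perceiving a change creates a new time term that comes
    # "after" the old *NOW, and *NOW moves to it.
    global NOW
    times[new_time] = {"after": NOW}
    NOW = new_time

advance("t2")   # Cassie acts: NOW moves from t1 to t2
advance("t3")   # another change of state: NOW moves to t3
print(NOW, times["t3"]["after"], times["t2"]["after"])   # t3 t2 t1
```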

