Modeling Cross-Episodic Migration of Memory Using Neural Networks
by Adam Britt

Definitions
Episodic Memory – memory of a specific event; the combination of experiences into a schema.
Cross-Episodic Migration of Memory – a memory error in which different schemas are confused with one another.
Neural Network – a biologically inspired model of computation that can be trained to respond to a given input, to store memory in a way similar to how we think the brain does, or both.

Sample Neural Network Learning
[Figures: a neural network with no learning, which starts with randomized weights on its edges, and the same network after learning two different patterns.]

Project
This project was designed to use neural networks to mimic the results of a study done by one of St. Lawrence's psychology professors, Dr. Sharon Hannigan. The study focused very specifically on episodic memory, discerning between two types of episodic memory errors that all humans make; modeling this is something that has not been done before. It is important because, once an accurate model has been made, similar studies can be run on a computer, which is cheaper and faster than performing another empirical study. If data from the model indicates new knowledge, a new empirical study can then be designed using the parameters found with the model, furthering our knowledge.

Challenges
Since something of this specific nature had not been done before, there were difficulties in deciding exactly how to implement it. There are several different types of neural networks, and deciding which would be best took trial and error. Once an acceptable neural network structure was created, interpreting the results became the most difficult task. Without any precedent, it was hard to design an interpretation algorithm that could accurately explain the data in terms of the desired results. The question that kept coming up was, "These results look good, but what do they mean?" The problem was that all of our testing criteria were arbitrary, and it was through trial and error that we came up with better ways to explain the data.

Results
[Figure: really big neural networks.] The output interpretation algorithm I came up with did not yield the same results as those seen in Dr. Hannigan's study, although it showed some similarities that give hope that, given the right interpretation algorithm, neural networks can accurately model episodic memory migration.

Data Used
The data was taken from different restaurants and groceries around the area, with each restaurant or grocery representing an episode. Each episode consisted of 27 different attributes that could be either true or false, so each episode can be encoded as a vector of 27 booleans (as in the sketch below). Some of the attributes were specific only to groceries (does it have a produce section?), some only to restaurants (does it have a bar?), and some overlapped (is it large or small?). This data did not change throughout the testing phase of the program.

How does a Neural Network work?
One can easily calculate the values of a neural network. The first layer is the input, which is given, and the weights on each edge start out randomized, so those are given too. On a pass through the network, start at the input layer, pick a node, and follow an edge to a node in the middle layer. That edge contributes the value of the node feeding it times the weight of the edge; the whole value of the middle-layer node is the sum of the contributions of all edges coming into it. Do this for every node in the middle layer, then do the same thing for the output layer. Once you have done this for all the output nodes, compare the actual results to the desired results; if the results meet a predetermined threshold, you accept the network. Retrieval changes nothing in the network; it just saves the values of the nodes for any given input.
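As a concrete illustration, here is a minimal Python sketch of the pass just described, with an episode encoded as the 27 true/false attributes from the Data Used slide. The middle-layer and output sizes, the weight range, and the mean-absolute-difference error measure are assumptions made for illustration; the slides specify only randomized weights, weighted sums, and a predetermined acceptance threshold.

```python
import random

N_ATTRIBUTES = 27  # each episode has 27 true/false attributes (Data Used slide)
N_HIDDEN = 10      # middle-layer size: an assumption, the slides give no number
N_OUTPUTS = 2      # output-layer size: also an assumption

def random_weights(n_from, n_to):
    # Randomized starting weights on the edges, as the slides describe.
    return [[random.uniform(-1.0, 1.0) for _ in range(n_from)]
            for _ in range(n_to)]

def layer_values(values, weights):
    # Each node's whole value is the sum, over its incoming edges, of
    # (value of the node feeding the edge) * (weight of the edge).
    return [sum(v * w for v, w in zip(values, row)) for row in weights]

def forward(episode, w_hidden, w_output):
    # One pass: input layer -> middle layer -> output layer.
    middle = layer_values(episode, w_hidden)
    return layer_values(middle, w_output)

def accept(actual, desired, threshold=0.1):
    # Accept the network if the outputs are close enough to the desired
    # results. Mean absolute difference is an assumed error measure; the
    # slides say only that a predetermined threshold must be met.
    error = sum(abs(a - d) for a, d in zip(actual, desired)) / len(actual)
    return error <= threshold

if __name__ == "__main__":
    episode = [random.choice([0.0, 1.0]) for _ in range(N_ATTRIBUTES)]
    w_hidden = random_weights(N_ATTRIBUTES, N_HIDDEN)
    w_output = random_weights(N_HIDDEN, N_OUTPUTS)
    outputs = forward(episode, w_hidden, w_output)
    print(outputs, accept(outputs, desired=[1.0, 0.0]))
```

Note that, following the walkthrough above, no activation function is applied between layers; a typical network would squash each node's value (for example, with a sigmoid), but the slides describe plain weighted sums, so the sketch stays with those.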
Mentor: Dr. Ed Harcourt
Cognition Advisor: Dr. Sharon Hannigan