AI & Machine Learning Libraries By Logan Kearsley

Purpose
The purpose of this project is to design a system that combines the capabilities of multiple types of AI and machine learning systems, such as neural networks and subsumption architectures, to produce a more flexible and versatile hybrid system.

Goals
The end goal is to produce a set of basic library functions and architecture descriptions for easy manipulation of the AI/ML subsystems (particularly neural networks), and to use those to build an AI system that can teach itself to complete tasks specified by a human-defined heuristic, and that can alter learned behaviors to cope with changes in its operational environment with minimal human intervention.

Other Projects
No other directly similar projects are known. This project builds on previous work on multilayer perceptrons and subsumption architecture; it differs in trying to find ways to combine the different approaches to AI.

Design & Programming
Modular / black-box design: the end user should be able to put together a working AI system with minimal knowledge of how the internals work. Programming is done in C.
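Since the libraries are meant to be used as black boxes, here is a minimal sketch of what such a public interface might look like in C. All type and function names below are illustrative assumptions, not the project's actual API.

    /* perceptron.h -- hypothetical black-box interface (all names are assumptions) */

    typedef struct perceptron Perceptron;  /* opaque: internals hidden from the user */

    Perceptron *perceptron_create(int n_inputs, int n_outputs);
    void        perceptron_free(Perceptron *net);

    /* Run the net on one input vector, writing n_outputs values into out. */
    void perceptron_run(const Perceptron *net, const double *in, double *out);

    /* One delta-rule update toward the target output vector. */
    void perceptron_train(Perceptron *net, const double *in,
                          const double *target, double rate);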

Testing
Perceptron neural nets
 - Forced learning: make sure the net will learn arbitrary input-output mappings after a certain number of exposures.
Subsumption architecture
 - Simple test problems: does it run the right code for each sub-problem?
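As an illustration, a forced-learning test along these lines might look like the following sketch, reusing the hypothetical interface above; the logical-OR mapping, exposure count, learning rate, and pass threshold are all assumptions.

    #include <assert.h>
    #include <math.h>
    #include "perceptron.h"   /* the hypothetical interface sketched above */

    /* Forced learning: expose the net to a fixed input-output mapping a set
       number of times, then check that it reproduces the mapping. */
    void test_forced_learning(void)
    {
        double in[4][2]  = {{0,0}, {0,1}, {1,0}, {1,1}};
        double target[4] = {0, 1, 1, 1};     /* logical OR: linearly separable */
        Perceptron *net  = perceptron_create(2, 1);

        for (int epoch = 0; epoch < 1000; epoch++)      /* fixed exposure count */
            for (int i = 0; i < 4; i++)
                perceptron_train(net, in[i], &target[i], 0.1);

        for (int i = 0; i < 4; i++) {
            double out;
            perceptron_run(net, in[i], &out);
            assert(fabs(out - target[i]) < 0.5);        /* mapping was learned */
        }
        perceptron_free(net);
    }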

Algorithms
Perceptrons:
 - Delta-rule learning: weights are adjusted based on the distance between the net's current output and the optimal output.
 - Matrix simulation: weights are stored in an I (# of inputs) by O (# of outputs) matrix for each layer, rather than simulating each neuron individually.
Subsumption architecture (a scheduler sketch follows the subsumption slide below):
 - The scheduler takes a list of function pointers to task-specific functions.
 - Task functions return an output or null.
 - The highest-priority non-null task has its output executed each iteration.
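A sketch of what one delta-rule pass over such a weight matrix might look like; the threshold activation, the row-major output-by-input layout, and the function name are assumptions, not the project's actual code.

    /* One delta-rule pass over a layer whose weights live in a single matrix W
       (n_out rows x n_in columns, row-major) instead of per-neuron structures.
       Each weight moves in proportion to (target - output) times its input. */
    void delta_rule_step(double *W, int n_in, int n_out,
                         const double *in, const double *target, double rate)
    {
        for (int j = 0; j < n_out; j++) {
            double sum = 0.0;
            for (int i = 0; i < n_in; i++)
                sum += W[j * n_in + i] * in[i];         /* weighted input sum */
            double out = (sum > 0.0) ? 1.0 : 0.0;       /* assumed threshold unit */
            double err = target[j] - out;               /* distance from optimum */
            for (int i = 0; i < n_in; i++)
                W[j * n_in + i] += rate * err * in[i];  /* delta-rule update */
        }
    }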

Algorithms
Perceptron structure: individual neurons vs. a weight matrix.
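In the matrix view, a whole layer is evaluated with a single multiply: y = f(Wx), i.e. y_j = f(sum_i w_ji * x_i) with f applied element-wise, whereas the individual-neuron view computes each weighted sum separately. (Notation assumed here, not taken from the slides.)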

Algorithms
Subsumption architecture:
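A sketch of the scheduler loop described on the earlier algorithms slide, assuming a simple array of task function pointers ordered by priority; the Output type and all names are illustrative.

    #include <stddef.h>

    typedef struct { double value; } Output;  /* assumed output type */
    typedef Output *(*TaskFn)(void);          /* a task returns an output or NULL */

    /* One scheduler iteration: poll tasks from highest priority (index 0) down
       and return the first non-NULL output; lower-priority tasks are suppressed.
       Returns NULL if every task abstained this iteration. */
    Output *schedule_step(TaskFn *tasks, int n_tasks)
    {
        for (int p = 0; p < n_tasks; p++) {
            Output *out = tasks[p]();         /* ask the task for an action */
            if (out != NULL)
                return out;                   /* highest-priority non-null wins */
        }
        return NULL;
    }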

Problems
Back-propagation is really confusing!

Results & Conclusions
The single-layer perceptron works well: it is capable of learning arbitrary mappings, but not arbitrary combinations of them. Multi-layer nets should be able to learn arbitrary combinations, but the learning algorithm for the hidden layers is confusing, and not all of the single-layer functions can be reused.
Plan change: originally, the goal was to create a complete working system; now, the project goal is to produce useful function libraries, with working systems built only to test the code.
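For reference, the standard back-propagation answer to the hidden-layer problem (not implemented in this project) is to propagate error terms backward: each hidden unit gets delta_j = f'(net_j) * sum_k (w_kj * delta_k), the derivative of its own activation times the weighted error terms of the layer above, after which the usual delta-style update dw_ij = rate * delta_j * x_i applies. This is also why the single-layer functions cannot all be reused: hidden units have no directly given target outputs to measure distance from.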