Artificial Neural Network Theory and Application. Ashish Venugopal, Sriram Gollapalli, Ulas Bardak.

Presentation Overview
- Artificial Neural Networks
- NASA profiles as a pattern classification task
- Implementing pattern classification on the Silicon Recognition Neuron Board

ANN General Overview
- Inspired by biological neuron models
- Decisions are distributed throughout the system
- Components interact at several stages and work in unison to solve one problem
- ANNs are well suited to problems that have no closed-form algorithmic solution
- Black-box model: hard to interpret

ANNs Applied
- Driving a car: the task is to determine the direction the road is curving from an image
- Image pixels are the input; the curve angle is the output
- The learned non-linear relationship between pixels and road direction makes the model hard to understand

Theory
- Input layer, hidden layer, output layer
- Input: one "neuron" for each pixel
- Hidden: a set of neurons that store what was learned
- Output: one neuron for each of the 5 turn directions, e.g. straight, slight left, hard right
- But what is a neuron?
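The three layers above can be sketched as a toy forward pass. The weights and layer sizes here are purely illustrative (a real driving network would have thousands of pixel inputs), and the sigmoid activation is one common choice:

```python
import math

def sigmoid(x):
    # Squash any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def forward(pixels, w_hidden, w_output):
    """One forward pass: pixels -> hidden layer -> output layer."""
    hidden = [sigmoid(sum(w * p for w, p in zip(ws, pixels))) for ws in w_hidden]
    return [sigmoid(sum(w * h for w, h in zip(ws, hidden))) for ws in w_output]

# Toy network: 3 input "pixels", 2 hidden neurons, 2 output directions.
w_hidden = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]]  # made-up weights
w_output = [[1.0, -1.0], [-1.0, 1.0]]
out = forward([0.2, 0.7, 0.1], w_hidden, w_output)
direction = max(range(len(out)), key=lambda i: out[i])  # most active output wins
```

The predicted turn direction is simply whichever output neuron fires most strongly.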

Neurons
- In math terms: binary decision makers
- In plain terms: they take in some input, decide whether it is "high" or "low", and pass this value on
- Similar to neurons in the brain propagating signals
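A minimal sketch of such a binary decision maker: a threshold unit in the McCulloch-Pitts spirit. The weights and threshold below are illustrative, chosen so the unit behaves as a logical AND:

```python
def neuron(inputs, weights, threshold):
    """Fire "high" (1) if the weighted input exceeds the threshold,
    otherwise stay "low" (0)."""
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation > threshold else 0

# With these made-up weights the unit only fires when both inputs are on.
print(neuron([1, 1], [0.6, 0.6], 1.0))  # fires: 0.6 + 0.6 > 1.0
print(neuron([1, 0], [0.6, 0.6], 1.0))  # stays low: 0.6 < 1.0
```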

How is the model learned?
- Start with training examples: pairs of (feature set, classification)
- Run the backpropagation algorithm
- It learns the connection weights between neurons, as well as the transfer weights within each neuron
- Changes to the network topology affect learning behavior!
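The weight-learning step can be sketched as a minimal backpropagation loop: one hidden layer, one sigmoid output, trained on a toy AND task. The learning rate, epoch count, network size, and bias-as-extra-input trick are all assumptions for the sketch, not details from the slides:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(examples, n_hidden=2, lr=0.5, epochs=2000, seed=0):
    """Minimal backpropagation: learn hidden and output weights."""
    rng = random.Random(seed)
    n_in = len(examples[0][0])
    w_h = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
    w_o = [rng.uniform(-1, 1) for _ in range(n_hidden)]
    for _ in range(epochs):
        for x, y in examples:
            # Forward pass.
            h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_h]
            o = sigmoid(sum(w * hi for w, hi in zip(w_o, h)))
            # Backward pass: propagate the output error to every weight.
            d_o = (o - y) * o * (1 - o)
            d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(n_hidden)]
            for j in range(n_hidden):
                w_o[j] -= lr * d_o * h[j]
                for i in range(n_in):
                    w_h[j][i] -= lr * d_h[j] * x[i]
    return w_h, w_o

def predict(x, w_h, w_o):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_h]
    return sigmoid(sum(w * hi for w, hi in zip(w_o, h)))

# Toy task: learn AND (the last input is a constant 1.0 acting as a bias).
examples = [([0, 0, 1], 0), ([0, 1, 1], 0), ([1, 0, 1], 0), ([1, 1, 1], 1)]
w_h, w_o = train(examples)
```

After training, `predict([1, 1, 1], w_h, w_o)` should be close to 1 and the other cases close to 0.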

Unsupervised Networks
- So far we have used labeled training examples to learn the relationship between features and a response variable
- What if we just want to explore patterns that exist in the features?
- Unsupervised learning / clustering…

Unsupervised Goal
- Find prototypical points within the data set that can be used to approximate the distribution of the whole data set
- Useful for compression and visualization
- Typical method: define a measure of similarity between data points, select initial prototypes, then move the prototypes to best fit the data (minimize error)
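The select/move loop above can be sketched in one dimension. This uses a k-means-style update as a stand-in for the generic prototype-fitting idea; the data points, k, and step count are made up:

```python
def fit_prototypes(points, k=2, steps=10):
    """Move k prototypes to minimize distance to the points they represent."""
    protos = points[:k]  # select initial prototypes from the data
    for _ in range(steps):
        # Assign each point to its most similar (nearest) prototype.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - protos[j]))
            clusters[nearest].append(p)
        # Move each prototype to the center of its cluster (minimizes error).
        protos = [sum(c) / len(c) if c else protos[j]
                  for j, c in enumerate(clusters)]
    return protos

# Two obvious groups: values near 1 and values near 9.
data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.5]
protos = sorted(fit_prototypes(data))
```

The two prototypes settle near the centers of the two groups, so each point can be summarized by the id of its nearest prototype.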

Kohonen Net
- An unsupervised technique
- Each output neuron represents a prototype point; the input layer is presented with the data

Self Organizing Map
- Each prototype is pulled towards the data, and brings its closest neighboring prototypes a little closer as well…
- SOM Demo
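One SOM update step might look like this: a 1-D map where the winning prototype moves toward the data point and its immediate neighbors move half as far. The learning rates and the one-neighbor radius are assumptions for the sketch; real SOMs typically decay both over time:

```python
def som_step(protos, x, lr=0.5, neighbor_lr=0.25):
    """Pull the winning prototype toward x, and its map neighbors a little too."""
    win = min(range(len(protos)), key=lambda j: abs(x - protos[j]))
    for j in range(len(protos)):
        if j == win:
            rate = lr            # winner moves the most
        elif abs(j - win) == 1:
            rate = neighbor_lr   # neighbors on the 1-D map follow along
        else:
            continue             # distant prototypes stay put
        protos[j] += rate * (x - protos[j])
    return protos

protos = [0.0, 5.0, 10.0]
som_step(protos, 4.0)  # prototype at 5.0 wins; 0.0 and 10.0 are its neighbors
```

After one step all three prototypes have shifted toward 4.0, the winner most of all, which is exactly the "pulled towards the data" behavior described above.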

How is it relevant to our problem? We will go through:
- A definition of what we are facing
- How we can use neural nets
- How we can improve on this method

LIDAR in Space
- Data from the LITE project
- Send a laser beam down from a shuttle and record the reflection readings
- Data gathered for 53 hours, amounting to gigabytes of data

Input

Problem
- There is too much data to send and process
- Each pixel is a number to be transmitted, and each reading (one column) contains 3000 pixels
- Need a way to transmit the information in a more compact fashion

Applying Neural Nets
- Some readings are very similar to each other
- Define classes that each contain multiple readings
- Define a representative for each class that is close enough to all the class members

Applying Neural Nets (cont.)
- To obtain the classes, train Kohonen nets on the data for a specific number of classes
- Once trained, just pass in a new reading (3000 data points) and get back its class id
- Transfer only the class id
- Go from 3000x3000 numbers transferred down to 3000 numbers
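The transfer-only-the-class-id idea can be sketched as follows. Tiny made-up prototypes and 3-point readings stand in for the trained Kohonen net and the real 3000-point LITE readings:

```python
def compress(readings, prototypes):
    """Replace each reading (a long vector) with the id of its nearest
    prototype, so only one integer per reading needs to be transmitted."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(range(len(prototypes)), key=lambda j: dist(r, prototypes[j]))
            for r in readings]

# Two made-up class representatives and two noisy readings near them.
protos = [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]]
ids = compress([[0.1, 0.0, 0.2], [0.9, 1.1, 1.0]], protos)
# The receiver reconstructs each reading approximately as protos[id].
```

The saving is exactly the one on the slide: a vector of 3000 numbers per reading collapses to a single class id.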

Example

Result – 8 Classes

Result – 16 Classes

Result – 64 Classes

Result – 128 Classes

Result – 256 Classes

Problems
- Training takes a while: 256 classes took 44 hours on a dual P3 1 GHz computer with 1.5 GB of RAM
- Not optimal: classifying by the complete reading is wasteful
- Need one number for each reading

Improving NN

Improving NN (cont.)
- Easier version: look at blocks of 50x50 pixels instead of just 1 pixel, and classify each block (sky, etc.)
- Can preprocess with kNN and basic filters to remove noise from the data
- Classes can then be formed from the sequence of 50-pixel blocks
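The block-splitting preprocessing might look like this. A toy 4x4 grid with 2x2 blocks stands in for the proposed 50x50 blocks, and the per-block classification step is omitted:

```python
def split_blocks(image, size=2):
    """Split a 2-D grid of pixels into size x size blocks.
    The slide proposes 50x50; size=2 keeps the toy example readable."""
    return [[[row[c:c + size] for row in image[r:r + size]]
             for c in range(0, len(image[0]), size)]
            for r in range(0, len(image), size)]

image = [[1, 2, 3, 4],
         [5, 6, 7, 8],
         [9, 10, 11, 12],
         [13, 14, 15, 16]]
b = split_blocks(image)
# b[0][0] is the top-left 2x2 block, b[1][1] the bottom-right one.
```

Each block would then be classified (sky, cloud, etc.) instead of classifying whole 3000-pixel readings at once.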

Improving NN (cont.)
- Harder version: use a variable-sized mesh
- In parts that are just sky, use a big block; in detailed parts, use smaller blocks
- Something to think about…

Muren Board
- Silicon Recognition, Inc.
- 2 ZISC078 chips (156 neurons in parallel)
- 1 Mb of memory
- Up to 1 million recognitions/second

ZISC
- ZISC, the cornerstone of the Muren board, overcomes three major limitations of software-based pattern recognition:
- It fully handles non-linearity and fuzziness
- It is a massively parallel processing system: recognition speed is consistent regardless of the number of stored patterns
- It is not programmed with an algorithm; ZISC is taught

RBF Space mapping

Example
- Source picture (circuit board)
- Result after processing (shows good and bad points)

Process
- Learning / training
- Classifying / testing with new data

NASA Data

Further investigation
- Can we write directly to the board?
- Do we have the ability to preprocess data?

Questions?