Example: Voice Recognition

Slides from: Doug Gray, David Poole

Presentation transcript:

Example: Voice Recognition
Task: learn to discriminate between two different voices saying "Hello".
Data sources: Gray Davis, Arnold Schwarzenegger.
Format: frequency distribution (60 bins).
Analogy: the cochlea, which likewise decomposes sound into frequency bands.

Network architecture
Feed-forward network:
60 inputs (one per frequency bin)
6 hidden units
2 outputs (0-1 codes "Gray", 1-0 codes "Arnold")
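The 60-6-2 architecture above can be sketched as a forward pass in NumPy. This is a minimal illustration, not the original implementation: the weight values, sigmoid activation, and the random stand-in input are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes from the slide: 60 frequency bins -> 6 hidden -> 2 outputs
n_in, n_hid, n_out = 60, 6, 2

# Random initial weights and biases (an untrained network)
W1 = rng.normal(0, 0.1, (n_hid, n_in))
b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, (n_out, n_hid))
b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Forward pass: one 60-bin frequency vector -> 2 output activations."""
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)

x = rng.random(n_in)   # stand-in for one "Hello" spectrum
y = forward(x)
print(y)               # two activations, each strictly between 0 and 1
```

With sigmoid outputs, both activations always land in (0, 1), which is why the target codings 0-1 and 1-0 are reachable only in the limit.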

Presenting the data
[figure: input frequency spectra for Gray and Arnold]

Presenting the data (untrained network)
Gray: outputs 0.43, 0.26
Arnold: outputs 0.73, 0.55

Calculate error (magnitude of output minus target)
Gray: |0.43 - 0| = 0.43, |0.26 - 1| = 0.74
Arnold: |0.73 - 1| = 0.27, |0.55 - 0| = 0.55
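The error calculation on this slide is just the per-output distance from the target coding. A small check, using the untrained outputs and the target codings from the earlier slides:

```python
# Target codings from the architecture slide: Gray -> (0, 1), Arnold -> (1, 0)
targets = {"Gray": [0.0, 1.0], "Arnold": [1.0, 0.0]}

# Untrained-network outputs from the previous slide
outputs = {"Gray": [0.43, 0.26], "Arnold": [0.73, 0.55]}

errors = {}
for name in ("Gray", "Arnold"):
    # Absolute difference between each output and its target
    errors[name] = [round(abs(o - t), 2)
                    for o, t in zip(outputs[name], targets[name])]
    print(name, errors[name])
# Gray [0.43, 0.74]
# Arnold [0.27, 0.55]
```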

Back-propagate error and adjust weights
Gray: 0.43 + 0.74 = 1.17 (total error)
Arnold: 0.27 + 0.55 = 0.82 (total error)
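One back-propagation step for a sigmoid network of this shape can be sketched as follows. This is the standard delta rule for squared-error loss, not necessarily the exact update used in the original deck; the learning rate, initial weights, and input are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Small random 60-6-2 network (sizes from the slides; values illustrative)
W1 = rng.normal(0, 0.1, (6, 60)); b1 = np.zeros(6)
W2 = rng.normal(0, 0.1, (2, 6));  b2 = np.zeros(2)

x = rng.random(60)            # one "Hello" spectrum (stand-in data)
t = np.array([0.0, 1.0])      # target coding for "Gray"
eta = 0.1                     # learning rate (assumed)

def run():
    h = sigmoid(W1 @ x + b1)
    y = sigmoid(W2 @ h + b2)
    return h, y, np.sum(np.abs(y - t))   # total error, as on the slide

h, y, before = run()

# One back-propagation step (delta rule for sigmoid units)
d_out = (y - t) * y * (1 - y)            # output-layer deltas
d_hid = (W2.T @ d_out) * h * (1 - h)     # deltas propagated back through W2
W2 -= eta * np.outer(d_out, h); b2 -= eta * d_out
W1 -= eta * np.outer(d_hid, x); b1 -= eta * d_hid

_, _, after = run()
print(after < before)    # the total error should shrink after one step
```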

Repeat the process (one sweep) for all training pairs:
Present data
Calculate error
Back-propagate error
Adjust weights
Then repeat the whole sweep many times.
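The four-step sweep above can be put together into a complete training loop. Everything here is a toy sketch: the eight random spectra stand in for real "Hello" recordings, and the learning rate, epoch count, and initialization are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

# Stand-in data: 8 random 60-bin spectra, half "Gray" (0,1), half "Arnold" (1,0)
X = rng.random((8, 60))
T = np.array([[0, 1]] * 4 + [[1, 0]] * 4, dtype=float)

W1 = rng.normal(0, 0.1, (6, 60)); b1 = np.zeros(6)
W2 = rng.normal(0, 0.1, (2, 6));  b2 = np.zeros(2)
eta = 0.5

for epoch in range(2000):                        # repeat sweeps many times
    for x, t in zip(X, T):                       # one sweep over all pairs
        h = sig(W1 @ x + b1)                     # present data
        y = sig(W2 @ h + b2)
        d_out = (y - t) * y * (1 - y)            # calculate error (deltas)
        d_hid = (W2.T @ d_out) * h * (1 - h)     # back-propagate error
        W2 -= eta * np.outer(d_out, h); b2 -= eta * d_out   # adjust weights
        W1 -= eta * np.outer(d_hid, x); b1 -= eta * d_hid

preds = np.array([sig(W2 @ sig(W1 @ x + b1) + b2) for x in X])
print(np.abs(preds - T).mean())   # should be small after training
```

After enough sweeps the outputs approach the 0-1 and 1-0 target codings, which is exactly the behavior the trained-network slide below reports.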

Presenting the data (trained network)
Gray: outputs 0.01, 0.99
Arnold: outputs 0.99, 0.01

Results – Voice Recognition
Performance of the trained network:
Discrimination accuracy between known "Hello"s: 100%
Discrimination accuracy between new "Hello"s: