Text Independent Speaker Recognition with Added Noise Jason Cardillo & Raihan Ali Bashir April 11, 2005.

Problem Definition Many methods exist for Text-Independent Speaker Recognition (MFCC features, Gaussian mixture models, hidden Markov models, etc.), but few perform well on noisy speech samples.

Project Goal Implement a Text-Independent Speaker Recognition system that is robust to noise. The suggested implementation method is Recurrent Neural Networks (RNNs).

Definition of RNN Recurrent networks (RNs) are models with bi-directional data flow. While a feed-forward network propagates data linearly from input to output, RNs also propagate data from later processing stages back to earlier stages. In a fully recurrent network, every neuron receives inputs from every other neuron in the network; such networks are not arranged in layers. Usually only a subset of the neurons receives external input in addition to the inputs from all the other neurons, and another disjoint subset of neurons reports its output externally as well as sending it to all the neurons. These distinguished inputs and outputs perform the function of the input and output layers of a feed-forward or simple recurrent network, while also joining all the other neurons in the recurrent processing.
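The structure described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the system from the slides: all names, sizes, and the choice of tanh activation are assumptions. Every unit receives the previous activations of all units; a designated subset also receives external input, and a disjoint subset is read out as output.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units = 8
input_units = [0, 1]    # units that also receive external input (illustrative)
output_units = [6, 7]   # units whose activations are read out (illustrative)

# Fully recurrent: every unit is connected to every unit.
W = rng.normal(scale=0.1, size=(n_units, n_units))

def step(state, external):
    """One update: each unit sums the weighted previous activations of
    all units, input units additionally receive the external signal,
    then a tanh nonlinearity is applied."""
    net = W @ state
    net[input_units] += external
    return np.tanh(net)

state = np.zeros(n_units)
for t in range(5):                       # drive with a constant input
    state = step(state, external=np.array([1.0, -1.0]))
output = state[output_units]
```

Because the same weight matrix is applied at every time step, the state after several steps depends on the whole input history, which is what lets a recurrent net carry temporal context.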

Why RNN for our Purpose? An RNN captures long-term contextual effects over time, and can therefore use temporal context to compensate for missing data. It also allows a single net to perform both imputation and classification.

Corrupted Data Solution Notation: x = missing data at time t; y = learning rate; V_jm = recurrent link from hidden unit j to missing input m; hid_j(t-1) = activation of hidden unit j at time t-1. After a feed-forward pass, the missing input values for the next frame are filled in through the recurrent links.

Corrupted Data Solution (cont'd) In plain English: – Initially fill in missing data with the average of all the non-corrupted frames. – Compute the sum-squared error between the correct targets and the RNN output for each frame. – Backpropagate this error through time to refine the corrupted inputs.
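The steps above can be illustrated with a deliberately simplified sketch: a single frame, a fixed one-layer tanh network, and no recurrence, so the "backpropagation through time" of the slides is reduced to plain gradient descent on the missing inputs. All weights, sizes, and targets are made-up placeholders; the point is only the mechanism of initializing missing values with the clean-frame mean and then descending the output error with respect to those values alone.

```python
import numpy as np

rng = np.random.default_rng(1)

n_in, n_out = 4, 3
W = rng.normal(size=(n_out, n_in))           # fixed, pre-trained stand-in weights

clean_frames = rng.normal(size=(10, n_in))   # frames with no corruption
frame = rng.normal(size=n_in)
missing = [1, 3]                             # indices of corrupted inputs
# Step 1: initial fill with the mean of the non-corrupted frames.
frame[missing] = clean_frames.mean(axis=0)[missing]

target = np.zeros(n_out)                     # placeholder correct target
lr = 0.05
losses = []
for _ in range(300):
    h = np.tanh(W @ frame)
    err = h - target
    # Step 2: sum-squared error between target and network output.
    losses.append(0.5 * float(np.sum(err ** 2)))
    # Step 3: gradient of the error w.r.t. the inputs, applied only
    # to the missing entries (the clean entries are trusted data).
    grad = W.T @ (err * (1 - h ** 2))
    frame[missing] -= lr * grad[missing]
```

In the full method the same gradient flows back through the recurrent connections across frames, so each imputed value is informed by temporal context rather than a single frame.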

System Architecture

Performance Testing Measured by comparing the original error of the signal against the error remaining after the signal passes through the system.
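That comparison amounts to a before/after error ratio. A small sketch with made-up signal values (none of these numbers come from the slides):

```python
import numpy as np

clean = np.array([0.0, 1.0, 0.5, -0.5])          # reference signal
corrupted = np.array([0.0, 0.2, 0.5, 0.1])       # noisy input to the system
reconstructed = np.array([0.0, 0.9, 0.5, -0.4])  # hypothetical system output

def sse(a, b):
    """Sum-squared error between two signals."""
    return float(np.sum((a - b) ** 2))

error_before = sse(clean, corrupted)      # error of the original signal
error_after = sse(clean, reconstructed)   # error remaining after the system
reduction = 1 - error_after / error_before  # fraction of error removed
```

A reduction close to 1 means the system removed nearly all of the corruption; a reduction near 0 means it made no improvement.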

References
[1] Parveen, S., Green, P. D. "Speech Recognition with Missing Data using Recurrent Neural Nets." University of Sheffield, Dept. of Computer Science.
[2]
[3] Recurrent Neural Networks
[4] Jain, B. J., Wysotzki, F. "Learning with Neural Networks in the Domain of Graphs." Technical University of Berlin.

Questions