Preliminary Experiments in Speech Signal Processing (語音訊號處理之初步實驗) — NTU Speech Lab. Advisor: 李琳山. TA: 熊信寬


ANN Tutorial — Preliminary Experiments in Speech Signal Processing (語音訊號處理之初步實驗), NTU Speech Lab. Advisor: 李琳山. TA: 熊信寬

Speech Recognition — [pipeline diagram: input speech "ㄊㄞ ㄨㄢ ㄉㄚ ㄒㄩ ㄝ" → Acoustic Model → Language Model → recognized text "台 灣 大 學" (National Taiwan University)]

Our Task — How do we build this machine? A classifier that maps speech input to one of the Mandarin phonetic symbols: ㄅ ㄆ ㄇ ㄈ ㄉ ㄊ ㄋ ㄌ ㄍ ㄎ ㄏ ㄐ ㄑ ㄒ ㄓ ㄔ ㄕ ㄖ ㄗ ㄘ ㄙ ㄧ ㄨ ㄩ ㄚ ㄛ ㄜ ㄝ ㄞ ㄟ ㄠ ㄡ ㄢ ㄣ ㄤ ㄥ ㄦ

Machine Learning — "Field of study that gives computers the ability to learn without being explicitly programmed." Instead of writing a program by hand for each specific task, we collect lots of examples that specify the correct output for a given input. A machine learning algorithm then takes these examples and produces a program that does the job. The program produced by the learning algorithm may look very different from a typical hand-written program: it may contain millions of numbers. If we do it right, the program works for new cases as well as the ones we trained it on. If the data changes, the program can change too, by training on the new data. Massive amounts of computation are now cheaper than paying someone to write a task-specific program. (Reference: Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera)

The Classification Task — a classifier maps features to classes. For example, to classify the people in this room by gender: features such as hair length and make-up feed a classifier whose output classes are Male, Female, and Others.

2D Feature Space — [scatter plot: Male and Female examples plotted over the axes Hair Length and Make-Up; Voice Pitch would add a further dimension]

Multi-D Feature Space — We need some type of non-linear function!

Neurons — Each neuron receives inputs from other neurons. The effect of each input line on the neuron is controlled by a synaptic weight; the weights can be positive or negative. The synaptic weights adapt so that the whole network learns to perform useful computations: recognizing objects, understanding language, making plans, controlling the body. (Reference: Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera)

Artificial Neural Network — Feed-Forward Net. [diagram: inputs x1…x4 enter with weights w1…w4 plus a bias b; the unit applies an activation function to the weighted sum, y = f(Σᵢ wᵢxᵢ + b), where i indexes the input connections.] A lot of simple non-linearities → a complex non-linearity.

Activation Function — the Sigmoid Function, a.k.a. the Logistic Function: σ(z) = 1 / (1 + e^(−z))
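
As a concrete illustration (not from the original slides), here is a minimal MATLAB sketch of one sigmoid neuron; the input, weight, and bias values are made up for the example:

% A minimal sketch of one sigmoid neuron (illustrative values, not from the demo)
x = [0.5; -1.2; 0.3; 2.0];   % four inputs x1..x4
w = [0.8; -0.4; 0.1; 0.6];   % synaptic weights w1..w4
b = -0.5;                    % bias
z = w' * x + b;              % weighted sum of inputs plus bias
y = 1 / (1 + exp(-z));       % sigmoid (logistic) activation, output in (0,1)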

How Do Neural Nets Learn? Intuition: 1. Start with random weights. 2. Compare the outputs of the net to the targets. 3. Try to adjust the weights to match the outputs with the targets. [diagram: example outputs yj compared against targets tj, with a weight w to be adjusted]

Gradient Descent — [diagram: the error E as a function of the weights w1, w2; each weight is updated downhill along the gradient, w ← w − η ∂E/∂w, where η is the learning rate.] (Reference: Machine Learning course by Andrew Ng, Coursera)
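
To make the update rule concrete, a hedged one-variable sketch (the error surface and all values are illustrative, not part of the demo):

% Gradient descent on a toy error surface E(w) = (w - 3)^2 (illustrative only)
eta = 0.1;                   % learning rate
w = randn;                   % start from a random weight
for step = 1:50
    grad = 2 * (w - 3);      % dE/dw for E(w) = (w - 3)^2
    w = w - eta * grad;      % move downhill along the gradient
end
disp(w)                      % converges toward the minimum at w = 3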

Back Propagation — [diagram: an output unit j with output yj and total input zj receives connections from units i with outputs yi; error derivatives are propagated backwards through these connections.] (Reference: Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera)
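
A minimal sketch of one backpropagation step for a single-hidden-layer sigmoid net with squared error, in the same y = σ(z) notation as the slide; the network sizes and values are illustrative:

% One backprop step on a tiny 4-3-2 sigmoid network (illustrative sketch)
sigm = @(z) 1 ./ (1 + exp(-z));
x  = rand(4,1);  t = [1; 0];          % one training example and its target
W1 = randn(3,4); b1 = zeros(3,1);     % hidden layer weights and biases
W2 = randn(2,3); b2 = zeros(2,1);     % output layer weights and biases
eta = 0.5;                            % learning rate

% Forward pass
z1 = W1*x + b1;  y1 = sigm(z1);       % hidden activations
z2 = W2*y1 + b2; y2 = sigm(z2);       % output activations

% Backward pass: dE/dz for squared error E = 0.5*sum((y2 - t).^2)
d2 = (y2 - t) .* y2 .* (1 - y2);      % output-layer deltas
d1 = (W2' * d2) .* y1 .* (1 - y1);    % hidden-layer deltas (chain rule)

% Gradient descent updates
W2 = W2 - eta * d2 * y1';  b2 = b2 - eta * d2;
W1 = W1 - eta * d1 * x';   b1 = b1 - eta * d1;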

Overfitting — Which model do you trust? The complicated model fits the data better, but it is not economical. A model is convincing when it fits a lot of data surprisingly well, not just the training data. [plot: data points (input x, output y) with a simple and a complicated fitted curve; which output value should you predict for a new test input?] (Reference: Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera)

Training and Testing — Divide the dataset into a training set and a validation set. Use the training set for training and the validation set as an indicator of overfitting. Use another independent test set to evaluate performance.
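
In the Matlab NN toolkit this split can be configured on the network object before training; a sketch assuming the toolkit's standard divide parameters (the ratios here are just a common choice, not a recommendation):

% Configure the train/validation/test split before calling train()
net.divideFcn = 'dividerand';          % split samples at random
net.divideParam.trainRatio = 0.70;     % 70% for training
net.divideParam.valRatio   = 0.15;     % 15% for early-stopping validation
net.divideParam.testRatio  = 0.15;     % 15% held out for testing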

Speech Recognition (recap) — [pipeline diagram: input speech "ㄊㄞ ㄨㄢ ㄉㄚ ㄒㄩ ㄝ" → Acoustic Model → Language Model → recognized text "台 灣 大 學" (National Taiwan University)]

Our Task — classify each input into one of the Mandarin phonetic symbols: ㄅ ㄆ ㄇ ㄈ ㄉ ㄊ ㄋ ㄌ ㄍ ㄎ ㄏ ㄐ ㄑ ㄒ ㄓ ㄔ ㄕ ㄖ ㄗ ㄘ ㄙ ㄧ ㄨ ㄩ ㄚ ㄛ ㄜ ㄝ ㄞ ㄟ ㄠ ㄡ ㄢ ㄣ ㄤ ㄥ ㄦ

Example --- Inputs. MFCC: a 13-dim vector per frame. We can 'splice' the 4 MFCC frames to the left and the 4 to the right of the current frame to form a 13 x 9 = 117-dim vector.
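
A hedged sketch of this splicing, assuming the MFCCs are stored as a 13 x T matrix named mfcc (the variable name and the repeat-padding at the edges are illustrative choices):

% Splice each frame with its 4 left and 4 right neighbours (13 x 9 = 117 dims)
[D, T] = size(mfcc);                   % D = 13 dims, T frames
ctx = 4;                               % context frames on each side
padded = [repmat(mfcc(:,1),1,ctx), mfcc, repmat(mfcc(:,end),1,ctx)];
spliced = zeros(D*(2*ctx+1), T);       % 117 x T output
for t = 1:T
    win = padded(:, t : t + 2*ctx);    % 13 x 9 window centred on frame t
    spliced(:, t) = win(:);            % stack into one 117-dim column
end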

Example --- Output. Right-context-dependent initial/final phones: 'a', 'ai', 'an', 'ang', 'au', 'b_a', 'b_e', 'b_ee', 'b_i', ... [diagram: each 117-dim MFCC input is paired with a target vector that is 1 at the correct phone ('a', 'ai', 'an', ...) and 0 elsewhere]
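
One common way to build such 1-of-K target vectors in MATLAB is with ind2vec; a short sketch where the labels variable and its values are illustrative:

% Build 1-of-K target vectors from integer class labels
labels = [1 3 2 3 1];                  % illustrative class indices (K = 3)
Label  = full(ind2vec(labels));        % K x N matrix, one 1 per column
% Label(:,1) is [1;0;0], Label(:,2) is [0;0;1], and so on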

Matlab NN toolkit---Net

net = newff(Feature,Label,10);
% given Feature and Label, this command generates an NN with 10 hidden units

net = newff(Feature,Label,[20 10],{'logsig' 'tansig'});
% this command generates an NN with two layers: one with 20 units and the
% logistic sigmoid as activation function, one with 10 units and tanh

Matlab NN Toolkit ---Train

[net,tr] = train(net,Feature,Label);
% this command trains the network, dividing your dataset into training,
% validation, and test sets

Matlab NN Toolkit ---Test

out = sim(net,testInputs);
% this command runs the network you have trained on a test data set to
% obtain its outputs
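
Putting the three calls together, a hedged end-to-end sketch; testFeature and testLabel are illustrative names assumed to follow the same shapes as Feature and Label:

% End-to-end sketch: build, train, and score a net (shapes as above)
net = newff(Feature, Label, 10);           % 117 x N inputs, K x N targets
[net, tr] = train(net, Feature, Label);    % train with automatic data split
out = sim(net, testFeature);               % K x M output activations
[~, pred] = max(out, [], 1);               % index of most active output unit
[~, act]  = max(testLabel, [], 1);         % true class index per example
acc = mean(pred == act);                   % fraction classified correctly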

Matlab NN Toolkit ---NNTool

nntool
% this command calls a GUI for NN visualization

Demo Program — Location: speech.ee.ntu.edu.tw/~poetek/nnkit. Copy the folder to your computer, and run the file 'run.m' for a quick demo.

DEMO

Demo Program --- Results

Example (1): Classify the inputs into 4 classes: {'a', 'e', 'i', 'u'}

ans =
correct: 1139 out of 1205
accuracy = 0.9452
classification =
   214     3     0     3
    13   195     9     6
     0     5   377    12
     1    11     3   353
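
For reference, a hedged sketch of how such a confusion matrix can be computed from the network outputs (variable names as in the earlier end-to-end sketch; rows are assumed to be true classes and columns predicted classes):

% Confusion matrix from predictions (illustrative; K = number of classes)
[~, pred] = max(out, [], 1);               % predicted class per example
[~, act]  = max(testLabel, [], 1);         % true class per example
K = size(testLabel, 1);
C = accumarray([act(:) pred(:)], 1, [K K]);% C(i,j): true class i, predicted j
correct  = sum(diag(C));                   % diagonal = correct classifications
accuracy = correct / sum(C(:));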

Demo Program --- Results (2)

Example (2): Classify the inputs into 8 classes: {'a', 'e', 'i', 'u', 'b_a', 'p_a', 'm_a', 'f_a'}

correct: 1293 out of 1446
accuracy = 0.8942
classification =
   207     7     0     0     0     0     0     0
     5   204     8    16     1     0     0     0
     2     9   397     8     0     0     0     2
     2    10     2   348     0     0     0     0
     3     1     1     1    63     0     0    14
     0     0     0     2     5     0     0     4
     2     5     2     6     4     0     0     0
     8     5     2     5    11     0     0    74

Demo Program --- Results (3)

Example (3): Classify the inputs into all 147 classes
correct: 92 out of 506
accuracy = 0.1818

Assignments

Please run examples 1–3 with the demo program and record your performance, run time, and observations.
Please play around with different settings to achieve better accuracy. Some parameters you can change (see the sketch below):
Net structure
Training algorithm
Input features
net.trainParam (please type 'help newff' for more info)
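
As a hedged example of such settings (standard toolkit fields; the specific values are arbitrary starting points, not recommendations):

% Example training settings to experiment with (values are arbitrary)
net = newff(Feature, Label, [30 20], {'logsig' 'logsig'});
net.trainFcn = 'traingdx';            % gradient descent w/ momentum + adaptive lr
net.trainParam.epochs = 200;          % maximum number of training epochs
net.trainParam.goal   = 1e-3;         % stop early if the error reaches this goal
net.trainParam.lr     = 0.05;         % initial learning rate
[net, tr] = train(net, Feature, Label);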

References

Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera
Machine Learning course by Andrew Ng, Coursera
Matlab NN toolkit documentation

Thank You! Feel free to ask questions! My Email: b98901024@ntu.edu.tw