
1 Recurrent Neural Networks ECE 398BD Instructor: Shobha Vasudevan

2 Neural Networks [Diagram: input x -> weight matrix W -> hidden layer h -> weight matrix U -> output y]
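The diagram corresponds to a one-hidden-layer feedforward network; written out (the activation functions f and g are assumed, not given on the slide):

h = f(W x)
y = g(U h)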

3 Recurrent Neural Network (RNN) Recurrent structure: the hidden layer depends not only on the current input but also on the hidden layer from the previous time step. [Diagram: input x(t) -> W -> hidden layer h(t) -> U -> output y(t), with h(t-1) fed back through V]

4 Recurrent Neural Network (RNN) [Diagram repeated: x(t) -> W -> h(t) -> U -> y(t); recurrent connection h(t-1) -> V -> h(t)]
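The update rule implied by the diagram is presumably the standard one; a plausible reconstruction (the choice of activation functions f and g is an assumption, not stated on the slide):

h(t) = f(W x(t) + V h(t-1))
y(t) = g(U h(t))

Here W maps the input into the hidden layer, V carries the previous hidden state forward, and U maps the hidden state to the output.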

5 Application Natural language is a good example of sequential input. A trained RNN can decide whether a sentence is meaningful: "I saw him yesterday." (√) vs. "Saw yesterday I him." (x). RNNs are also used in speech processing to distinguish words with the same or similar pronunciation: "I saw him yesterday." (√) vs. "Eye soul him yesterday." (x). Language models deal with such problems.

6 Language Model (LM)
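Only the title of this slide survives; the standard definition it refers to (stated here as an assumed reconstruction) is that a language model assigns a probability to a word sequence by factoring it into per-word conditionals:

P(w_1, ..., w_T) = prod_{t=1..T} P(w_t | w_1, ..., w_(t-1))

An RNNLM estimates each conditional P(w_t | w_1, ..., w_(t-1)) with the recurrent network's output layer.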

7 RNNLM Recurrent Neural Network Language Model. Remembers every previous word, with influence fading the farther back a word is. Needs storage for only three weight matrices (W: input-to-hidden, V: hidden-to-hidden, U: hidden-to-output). Weakness: training and inference are complicated and time-consuming.

8 RNNLM The input and output vectors each have the size of the vocabulary; every element corresponds to one word in the vocabulary. At each time step the input is a word: set the corresponding element to "1" and all other elements to "0". [Diagram: x(t) -> W -> h(t) -> U -> y(t); recurrent connection h(t-1) -> V -> h(t)]
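A minimal sketch of the one-hot encoding described above (the toy vocabulary, NumPy representation, and function name are illustrative assumptions):

import numpy as np

vocab = ["i", "saw", "him", "yesterday", "."]          # toy vocabulary (assumed)
word_to_idx = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    """Encode a word as a vocabulary-sized vector with a single 1."""
    x = np.zeros(len(vocab))
    x[word_to_idx[word]] = 1.0
    return x

print(one_hot("saw"))    # [0. 1. 0. 0. 0.]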

9 RNNLM [Diagram repeated: x(t) -> W -> h(t) -> U -> y(t); recurrent connection h(t-1) -> V -> h(t)]
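A minimal NumPy sketch of one forward step of the network in the diagram, using a sigmoid hidden layer and a softmax output as in Mikolov's RNNLM formulation (the dimensions and random weights are placeholders, not values from the slides):

import numpy as np

V_SIZE, H_SIZE = 5, 8                                # vocabulary / hidden sizes (assumed)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(H_SIZE, V_SIZE))     # input-to-hidden weights
V = rng.normal(scale=0.1, size=(H_SIZE, H_SIZE))     # hidden-to-hidden (recurrent) weights
U = rng.normal(scale=0.1, size=(V_SIZE, H_SIZE))     # hidden-to-output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn_step(x_t, h_prev):
    """h(t) = sigmoid(W x(t) + V h(t-1)); y(t) = softmax(U h(t))."""
    h_t = sigmoid(W @ x_t + V @ h_prev)
    y_t = softmax(U @ h_t)                           # distribution over the next word
    return h_t, y_t

# Usage: feed a one-hot word vector and the previous hidden state.
x = np.zeros(V_SIZE); x[1] = 1.0
h, y = rnn_step(x, np.zeros(H_SIZE))
print(y.sum())                                       # ~1.0: a probability distribution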

10 RNNLM

11 RNNLM Application Decide whether a sentence is meaningful: "I saw him yesterday." (high score) vs. "Saw yesterday I him." (low score). Decide the correct word among words with the same or similar pronunciation: "I saw him yesterday." (high score) vs. "Eye soul him yesterday." (low score). From a voice recording, a recognition system generates all plausible word combinations and sends them to the RNNLM; the sentence with the highest score is chosen.
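A minimal sketch of how an RNNLM scores a whole sentence by summing log P(w_t | w_1..w_(t-1)) over its words (the toy vocabulary, the <s> start token, and the random weights standing in for a trained model are all assumptions for illustration):

import numpy as np

vocab = ["<s>", "i", "saw", "him", "yesterday", "."]   # toy vocabulary with start token (assumed)
idx = {w: i for i, w in enumerate(vocab)}
V_SIZE, H_SIZE = len(vocab), 8

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(H_SIZE, V_SIZE))       # untrained stand-in weights
V = rng.normal(scale=0.1, size=(H_SIZE, H_SIZE))
U = rng.normal(scale=0.1, size=(V_SIZE, H_SIZE))

def step(x_t, h_prev):
    h_t = 1.0 / (1.0 + np.exp(-(W @ x_t + V @ h_prev)))
    logits = U @ h_t
    y_t = np.exp(logits - logits.max())
    return h_t, y_t / y_t.sum()

def sentence_log_prob(words):
    """Sum of log P(w_t | w_1..w_(t-1)); higher means more plausible to the model."""
    h = np.zeros(H_SIZE)
    prev, total = "<s>", 0.0
    for w in words:
        x = np.zeros(V_SIZE)
        x[idx[prev]] = 1.0
        h, y = step(x, h)                              # y predicts the next word
        total += np.log(y[idx[w]])
        prev = w
    return total

# With a trained model, the grammatical word order would receive the higher score.
print(sentence_log_prob(["i", "saw", "him", "yesterday", "."]))
print(sentence_log_prob(["saw", "yesterday", "i", "him", "."]))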

12 RNNLM Application Microsoft Research Sentence Completion Challenge: "I have seen it on him, and could _____ to it." (a) write (b) migrate (c) climb (d) swear (e) contribute. Fill in the five candidate words to obtain five sentences; the sentence completed with (d) "swear" receives the highest score.

13 References
Boden, M. (2001). A guide to recurrent neural networks and backpropagation.
Mikolov, T., Kombrink, S., Burget, L., Cernocky, J. H., & Khudanpur, S. (2011). Extensions of recurrent neural network language model. In Proceedings of ICASSP 2011 (pp. 5528-5531). IEEE.
Sutskever, I., Martens, J., & Hinton, G. E. (2011). Generating text with recurrent neural networks. In Proceedings of the 28th International Conference on Machine Learning (ICML-11) (pp. 1017-1024).

