Introduction to Deep Neural Networks and Word Embedding
Amirkabir University, Computer Engineering and Information Technology Department
Iman Khani Jazani, PhD Student at Amirkabir University, CEIT, LIMP (Laboratory for Intelligence Multimedia Processing)
Outline
- Motivation
- Multi Layer Perceptron (MLP)
- Convolutional Neural Network (CNN)
- Recurrent Neural Network (RNN)
- AutoEncoder (AE)
- Word Embedding
Motivation
Motivation
- Automatically writing Wikipedia articles, math papers, computer code and even Shakespeare
- Restoring colors in B&W photos and videos
- Photo captioning
- Changing gazes of people in photos
- Translation of image to image
- Creating new images
- Reading and searching of text in image and video
- Atari games
- Self-driving cars
- Sound to video
- Voice generation
- Music composition
- Text to speech
- Restoring sound in videos
- Handwriting
- Deep Learning networks creating Deep Learning networks
- Predicting earthquakes
- Deep Dream
- Converting words to meaningful vectors
- …
Google Deepmind Atari game
Deep Learning networks creating Deep Learning networks
How to define a neural network
- Choose a suitable configuration
- Choose good initial values for the weights
- Choose a suitable learning rule
Multi Layer Perceptron
Perceptron
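The perceptron itself can be written in a few lines. A hedged sketch (not taken from the slide): a single perceptron computes a weighted sum followed by a step activation, and the classic perceptron learning rule adjusts the weights whenever the output disagrees with the target.

```python
import numpy as np

def perceptron(x, w, b):
    """Single perceptron: weighted sum of inputs followed by a step activation."""
    return 1 if np.dot(w, x) + b > 0 else 0

def train(samples, labels, lr=0.1, epochs=25):
    """Classic perceptron learning rule: on a mistake, w += lr * (target - output) * x."""
    w, b = np.zeros(len(samples[0])), 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            y = perceptron(x, w, b)
            w += lr * (t - y) * np.asarray(x, dtype=float)
            b += lr * (t - y)
    return w, b

# Learn the linearly separable AND function.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train(X, y)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this training loop finds a separating line.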
Some Popular Activation Functions
- Sigmoid
- Tanh
- ReLU
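A minimal sketch of the three activation functions named above; the definitions are standard, and the printed values are approximate.

```python
import numpy as np

def sigmoid(x):
    """Sigmoid: squashes input to (0, 1); saturates for large |x|."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Tanh: squashes input to (-1, 1); zero-centered."""
    return np.tanh(x)

def relu(x):
    """ReLU: passes positive inputs through, zeros out negatives."""
    return np.maximum(0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # approx [0.119 0.5 0.881]
print(tanh(x))     # approx [-0.964 0. 0.964]
print(relu(x))     # [0. 0. 2.]
```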
Multi Layer Perceptron (MLP)
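A hedged sketch of an MLP forward pass, assuming the usual layout of a hidden layer with a nonlinearity feeding a linear output layer; the layer sizes here are illustrative, not from the slide.

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """Two-layer MLP: hidden layer with tanh, linear output layer."""
    h = np.tanh(W1 @ x + b1)  # hidden activations
    return W2 @ h + b2        # output layer (linear here)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 3))  # 3 inputs -> 4 hidden units
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(2, 4))  # 4 hidden -> 2 outputs
b2 = np.zeros(2)

y = mlp_forward(np.array([1.0, -1.0, 0.5]), W1, b1, W2, b2)
print(y.shape)  # (2,)
```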
Backpropagation

Error = Σᵢ₌₁ᴹ (yᵢ − ŷᵢ)²

Weights are updated using the gradient ∂Error/∂wᵢ.
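The squared-error gradient can be checked numerically. A sketch for a single linear neuron (an illustrative special case, not the slide's full network), comparing the analytic gradient against finite differences:

```python
import numpy as np

# One linear neuron y_hat = w . x, squared error summed over M samples.
X = np.array([[1.0, 2.0], [0.5, -1.0], [2.0, 0.0]])
y = np.array([3.0, 1.0, 2.0])
w = np.array([0.3, -0.2])

def error(w):
    return np.sum((y - X @ w) ** 2)

# Analytic gradient of the summed squared error: dError/dw = -2 X^T (y - X w)
grad = -2 * X.T @ (y - X @ w)

# Numerical check via central finite differences
eps = 1e-6
num = np.array([(error(w + eps * e) - error(w - eps * e)) / (2 * eps)
                for e in np.eye(2)])
print(np.allclose(grad, num, atol=1e-4))  # True
```

Backpropagation is exactly this chain-rule gradient computed layer by layer, reusing intermediate results.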
Vanishing or exploding gradient

As the error gradient is backpropagated through many layers, it is repeatedly multiplied by each layer's local derivatives; when these factors are mostly smaller than one the gradient vanishes, and when they are mostly larger than one it explodes.
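A small numeric illustration of the vanishing case (a sketch, not from the slides): the sigmoid's derivative never exceeds 0.25, so even in the best case a chain of sigmoid layers shrinks the gradient geometrically with depth.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# sigmoid'(x) = s(1-s) <= 0.25, so each sigmoid layer multiplies the
# backpropagated gradient by at most 0.25.
grad = 1.0
x = 0.0  # where sigmoid'(x) is largest (0.25)
for layer in range(10):
    s = sigmoid(x)
    grad *= s * (1 - s)
print(grad)  # about 9.5e-07 -- after only 10 layers the gradient has all but vanished
```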
Convolutional Neural Network
Why Convolutional Neural Network
CNN layers
- Convolution
- Pooling or subsampling
- Fully connected
Convolution
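A hedged sketch of the convolution layer's core operation, implemented as the "valid" cross-correlation that CNNs actually compute; the kernel here is an illustrative horizontal edge detector, not one from the slide.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution (strictly, cross-correlation, as in CNNs):
    slide the kernel over the image and take a weighted sum at each position."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.array([[1, 2, 3, 0],
                  [4, 5, 6, 1],
                  [7, 8, 9, 2],
                  [1, 1, 1, 1]], dtype=float)
edge = np.array([[1, -1]], dtype=float)  # responds to horizontal intensity changes
out = conv2d(image, edge)
print(out.shape)  # (4, 3): the 'valid' output is smaller than the input
```

In a CNN the kernel weights are not hand-designed like this one; they are learned by backpropagation.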
Pooling or Subsampling
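A sketch of non-overlapping max pooling, the most common subsampling choice; the 2x2 window size is an assumption, not from the slide.

```python
import numpy as np

def max_pool(x, size=2):
    """Non-overlapping max pooling: keep the largest value in each
    size x size window, shrinking the feature map by `size` per axis."""
    h, w = x.shape[0] // size, x.shape[1] // size
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = x[i*size:(i+1)*size, j*size:(j+1)*size].max()
    return out

fmap = np.array([[1, 3, 2, 1],
                 [4, 6, 5, 0],
                 [7, 2, 9, 8],
                 [1, 0, 3, 4]], dtype=float)
pooled = max_pool(fmap)
print(pooled)  # [[6. 5.]
               #  [7. 9.]]
```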
Recurrent Neural Network
Why recurrent neural network
RNN configuration
Bidirectional RNN
Stacked RNN
RNN Formulation
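The slide's exact notation is not reproduced here; assuming the standard vanilla-RNN recurrence h_t = tanh(Wx·x_t + Wh·h_{t-1} + b), with the same weights shared across all time steps, a sketch:

```python
import numpy as np

def rnn_forward(xs, Wx, Wh, b):
    """Vanilla RNN: h_t = tanh(Wx x_t + Wh h_{t-1} + b), weights shared over time."""
    h = np.zeros(Wh.shape[0])  # initial hidden state
    states = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h + b)
        states.append(h)
    return states

rng = np.random.default_rng(1)
Wx = rng.normal(scale=0.5, size=(3, 2))  # input size 2 -> hidden size 3
Wh = rng.normal(scale=0.5, size=(3, 3))  # hidden-to-hidden recurrence
b = np.zeros(3)

sequence = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
states = rnn_forward(sequence, Wx, Wh, b)
print(len(states), states[-1].shape)  # 3 (3,)
```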
Long Short-Term Memory (LSTM)
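A hedged sketch of a single LSTM step with the usual four gates; stacking all four gate weight matrices into one matrix over the concatenated [x; h] is an illustrative layout choice, not necessarily the slide's.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    """One LSTM step. W maps [x; h] to four gate pre-activations;
    the cell state c carries long-term memory past the tanh squashing."""
    z = W @ np.concatenate([x, h]) + b
    n = len(h)
    i = sigmoid(z[0*n:1*n])   # input gate: how much new info to write
    f = sigmoid(z[1*n:2*n])   # forget gate: how much old cell state to keep
    o = sigmoid(z[2*n:3*n])   # output gate: how much cell state to expose
    g = np.tanh(z[3*n:4*n])   # candidate cell update
    c = f * c + i * g         # update long-term cell state
    h = o * np.tanh(c)        # new hidden state
    return h, c

rng = np.random.default_rng(2)
n_in, n_hid = 2, 3
W = rng.normal(scale=0.5, size=(4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)

h = c = np.zeros(n_hid)
for x in [np.array([1.0, 0.0]), np.array([0.0, 1.0])]:
    h, c = lstm_step(x, h, c, W, b)
print(h.shape, c.shape)  # (3,) (3,)
```

The additive update `c = f * c + i * g` is what lets gradients flow over long spans without the vanishing seen in vanilla RNNs.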
Types of RNN Structure
AutoEncoder
Why AutoEncoder
- Unsupervised learning
- Visualization
- Compression
- Representation learning
- Use as a preprocessing step in supervised learning
AutoEncoder Structure and Loss Function
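A minimal sketch of the structure and loss (all sizes illustrative): a linear encoder into a 2-dimensional bottleneck, a linear decoder back out, trained by gradient descent on the mean squared reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(3)

# Tiny linear autoencoder: 4-dim input -> 2-dim code -> 4-dim reconstruction.
W_enc = rng.normal(scale=0.1, size=(2, 4))  # encoder weights
W_dec = rng.normal(scale=0.1, size=(4, 2))  # decoder weights

X = rng.normal(size=(100, 4))

def reconstruction_loss(X, W_enc, W_dec):
    """Loss: mean squared error between input and its reconstruction."""
    return np.mean((X @ W_enc.T @ W_dec.T - X) ** 2)

initial_loss = reconstruction_loss(X, W_enc, W_dec)
lr = 0.01
for _ in range(500):
    code = X @ W_enc.T           # encode into the bottleneck
    err = code @ W_dec.T - X     # reconstruction error
    # Gradient descent on both weight matrices
    W_dec -= lr * 2 * err.T @ code / len(X)
    W_enc -= lr * 2 * (err @ W_dec).T @ X / len(X)

final_loss = reconstruction_loss(X, W_enc, W_dec)
print(final_loss < initial_loss)  # True: the bottleneck learns to compress
```

Real autoencoders add nonlinearities (and often depth); this linear case keeps the gradients short enough to write by hand.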
Word Embedding

(Speaker note: you can mention the different approaches here, then move on to Word2Vec.)
Why Word2Vec
- A building block of deep NLP

(Speaker note: point out that Word2Vec is a predictive model, unlike count-based and co-occurrence models.)
Structure

(Speaker note: mention the softmax, and comment on whether this network is deep.)
Its Challenges and Some Solutions

Challenges:
- Number of weights
- Number of words in the corpus

Solutions:
- Treating common word pairs or phrases as single "words" in the model
- Subsampling frequent words to decrease the number of training examples
- Modifying the optimization objective with a technique called "Negative Sampling", which causes each training sample to update only a small percentage of the model's weights
How to use it
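A hedged sketch of typical downstream use: look up each word's row in the trained embedding matrix and compare words by cosine similarity. The vectors below are made up for illustration; in practice they come from the hidden-layer weights of a trained Word2Vec model.

```python
import numpy as np

# Hypothetical trained embeddings (illustrative values only).
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.85, 0.75, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    """Cosine similarity: the standard measure of word-vector closeness."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(embeddings["king"], embeddings["queen"]))  # high (near 1)
print(cosine(embeddings["king"], embeddings["apple"]))  # low
```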
…2vec
Char2vec, Sentence2vec, Paragraph2vec, Doc2vec, Tweet2vec, Lda2vec, Wiki2vec, Topic2vec, Entity2vec, Node2vec, Sense2vec, …
Some challenges
- Multi-sense words
- Similar is not synonymous
- Inability to handle unknown or out-of-vocabulary (OOV) words
- Scaling to new languages requires new embedding matrices
- No shared representations at sub-word levels
Conclusion
- Deep neural networks and their applications
- How to define an artificial neural network
- Main structures in deep neural networks
- Word embedding and its challenges
Thanks for your attention