A critical review of RNN for sequence learning, Zachary C. Lipton


1 A critical review of RNN for sequence learning Zachary C. Lipton

2 Time series Definition: a time series is a series of data points indexed (or listed, or graphed) in time order; it is a sequence of discrete-time data. Feature: the data points are discrete samples of a continuous real-world process. Examples: the still images that comprise the frames of a video, clinical medical data, natural language.

3 Neural Networks Activation function: adds non-linearity to the network


5 Neural Networks Training process: backpropagation algorithm
Gradient descent + chain rule. E.g., the partial derivatives of e = (a + b) * (b + 1) with respect to a and b.
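The slide's chain-rule example can be worked through directly. A minimal sketch, decomposing e = (a + b) * (b + 1) into the intermediate nodes c = a + b and d = b + 1 and backpropagating through them (the function name `partials` is illustrative):

```python
# Chain rule on e = (a + b) * (b + 1), as in the slide's example.
# Intermediate nodes: c = a + b, d = b + 1, e = c * d.
def partials(a, b):
    c = a + b              # forward pass
    d = b + 1
    # backward pass (chain rule):
    de_dc = d              # de/dc = d, since e = c * d
    de_dd = c              # de/dd = c
    de_da = de_dc * 1          # de/da = de/dc * dc/da
    de_db = de_dc * 1 + de_dd * 1  # b feeds both c and d, so sum both paths
    return de_da, de_db

print(partials(2.0, 1.0))  # → (2.0, 5.0): de/da = b+1, de/db = (b+1)+(a+b)
```

Note how the gradient with respect to b sums contributions over both paths through the graph; this summing over paths is exactly what backpropagation automates.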


11 What is RNN? A feedforward neural network with the inclusion of edges that span adjacent time steps. The input at each time step consists of the current input together with the output (hidden state) from the previous time step.
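The recurrence described above can be sketched in a few lines. A minimal vanilla RNN forward pass, assuming the common formulation h_t = tanh(W_hh h_{t-1} + W_xh x_t + b); the dimensions and weight names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
W_xh = rng.standard_normal((n_hidden, n_in)) * 0.1    # input-to-hidden weights
W_hh = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # recurrent (time-spanning) edges
b = np.zeros(n_hidden)

def rnn_forward(xs):
    h = np.zeros(n_hidden)              # initial hidden state
    states = []
    for x in xs:                        # one update per time step
        # current input x combines with the previous step's state h
        h = np.tanh(W_hh @ h + W_xh @ x + b)
        states.append(h)
    return states

states = rnn_forward([rng.standard_normal(n_in) for _ in range(5)])
print(len(states), states[-1].shape)    # 5 hidden states, each of size n_hidden
```

The `W_hh @ h` term is the edge spanning adjacent time steps; removing it recovers an ordinary feedforward layer applied independently at each step.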

12 What is RNN? Training method: backpropagation (through time), gradient descent.
Limitations: vanishing gradients.

13 Vanishing gradient Derivation outline (equations shown on the slide): the loss function; the partial derivative of the output; the partial derivative at layer t-1; the partial derivative at layer t-q; the relationship between the gradients at layers t-q and t.
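The slide's equations did not survive the transcript. A standard form of the derivation it outlines, assuming the recurrence h_t = σ(W h_{t-1} + U x_t), is:

```latex
% Jacobian between adjacent steps, with z_t = W h_{t-1} + U x_t:
\frac{\partial h_t}{\partial h_{t-1}}
  = \operatorname{diag}\!\big(\sigma'(z_t)\big)\, W
% Relationship between layers t-q and t (product of q Jacobians):
\frac{\partial h_t}{\partial h_{t-q}}
  = \prod_{k=t-q+1}^{t} \operatorname{diag}\!\big(\sigma'(z_k)\big)\, W
% With |\sigma'| \le \gamma, the norm is bounded geometrically:
\left\lVert \frac{\partial h_t}{\partial h_{t-q}} \right\rVert
  \le \big(\gamma\, \lVert W \rVert\big)^{q}
  \longrightarrow 0 \quad \text{as } q \to \infty
  \text{ when } \gamma\, \lVert W \rVert < 1
```

The geometric factor (γ‖W‖)^q is what makes gradients vanish (or, when γ‖W‖ > 1, explode) over long time spans.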

14 LSTM (long short-term memory)
Designed to address the vanishing gradient problem.
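A single LSTM step can be sketched as follows; the key point is the additive cell-state update c_t = f * c_{t-1} + i * g, which gives gradients a path that is not repeatedly squashed as in the vanilla RNN above. The gate layout and shapes here are one common convention, chosen for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    # compute all four gate pre-activations in one matrix multiply
    z = W @ np.concatenate([h_prev, x]) + b
    n = h_prev.size
    i = sigmoid(z[0:n])          # input gate
    f = sigmoid(z[n:2*n])        # forget gate
    o = sigmoid(z[2*n:3*n])      # output gate
    g = np.tanh(z[3*n:4*n])      # candidate cell state
    c = f * c_prev + i * g       # additive update: the long-term memory path
    h = o * np.tanh(c)           # hidden state exposed to the next layer
    return h, c

n_in, n_hid = 3, 4
rng = np.random.default_rng(1)
W = rng.standard_normal((4 * n_hid, n_hid + n_in)) * 0.1
b = np.zeros(4 * n_hid)
h, c = lstm_step(rng.standard_normal(n_in), np.zeros(n_hid), np.zeros(n_hid), W, b)
print(h.shape, c.shape)
```

When the forget gate f stays near 1, the cell state passes through almost unchanged, so the Jacobian along the c-path stays near the identity rather than shrinking geometrically.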

15 RNNs for Outlier Detection
A classification problem. Train the RNN weights to minimise the reconstruction error on normal data. Since the RNN attempts to reproduce its input patterns at the output, outliers are reproduced less faithfully by the trained RNN and therefore have a higher reconstruction error.
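The scoring step described above can be sketched independently of the model. A minimal sketch, assuming a trained replicator network is available; here `reconstruct` is a hypothetical stand-in (it just predicts the per-feature mean), and the mean + k·std threshold is one simple choice, not the slide's:

```python
import numpy as np

def reconstruction_errors(X, reconstruct):
    # per-sample mean squared error between input and reconstruction
    R = np.array([reconstruct(x) for x in X])
    return np.mean((X - R) ** 2, axis=1)

def flag_outliers(errors, k=3.0):
    # flag samples whose error exceeds mean + k * std of the batch
    mu, sd = errors.mean(), errors.std()
    return errors > mu + k * sd

rng = np.random.default_rng(2)
X = rng.normal(0.0, 1.0, size=(100, 5))
X[0] += 10.0                          # inject one obvious outlier
mean_profile = X.mean(axis=0)
reconstruct = lambda x: mean_profile  # crude stand-in for a trained RNN
errs = reconstruction_errors(X, reconstruct)
flags = flag_outliers(errs)
print(flags.nonzero()[0])             # indices of flagged samples
```

With a real replicator RNN, `reconstruct` would run the sequence through the trained network; the scoring and thresholding logic stays the same.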

16 Conclusion RNNs can remember previous inputs.
On problems that involve continuous, sequential data where prior knowledge matters, they show strong capability. An RNN is a data-driven inference method that can learn the probability distribution function mapping x(t) to y(t), i.e., the relationship between two time series.

