Recurrent Encoder-Decoder Networks for Time-Varying Dense Predictions

Presentation transcript:

Recurrent Encoder-Decoder Networks for Time-Varying Dense Predictions
Presented by: Chanon Chantaduly

Time-Varying Dense Predictions
- Dense prediction: we predict a label for each input unit rather than one label for the entire input
  - Example: predicting which pixels represent a cat
- Time-varying dataset: has both spatial and temporal dimensions
  - Examples: video analysis, medical imaging
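To make the shapes concrete, here is a minimal illustration (hypothetical sizes, in PyTorch): a dense prediction keeps the input's spatial and temporal layout, with one label per pixel per frame.

```python
import torch

# Hypothetical shapes: a 10-frame RGB video clip and a per-pixel,
# per-frame binary mask (e.g. "is this pixel part of a cat?").
video = torch.randn(1, 10, 3, 128, 128)         # (batch, time, channels, H, W)
masks = torch.randint(0, 2, (1, 10, 128, 128))  # one label per pixel per frame
```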

Recurrent Neural Networks (RNN)
- Primarily used on sequence data, where preservation of temporal information is needed.
[Slide diagram: an RNN reads "The chicken butt t-shirt is Anthony's" word by word, passing hidden states a<0> ... a<5> forward and emitting a per-word label: no, no, no, no, no, yes.]
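As a minimal sketch (hypothetical dimensions, in PyTorch), the per-word labeling in the diagram can be written as an RNN that emits one yes/no score at every step:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=50, hidden_size=32, batch_first=True)
head = nn.Linear(32, 1)        # per-step yes/no logit

words = torch.randn(1, 6, 50)  # 6 word embeddings, e.g. the slide's sentence
states, _ = rnn(words)         # hidden states a<1> ... a<6>
logits = head(states)          # one score per word: shape (1, 6, 1)
```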

Problems with RNNs
- Not very good at capturing long-term dependencies
- Vanishing/exploding gradient problem (demonstrated in the small demo below)
- Each step's hidden state only sees information from earlier steps in the sequence. What if we need information from later in the sequence?
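A small demo of the vanishing-gradient point (toy sizes, default PyTorch initialization; exact magnitudes vary by random seed): the gradient of a last-step loss with respect to the first input of a long sequence usually comes out orders of magnitude smaller than with respect to the last input.

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=4, hidden_size=4)       # vanilla tanh RNN
x = torch.randn(200, 1, 4, requires_grad=True)  # (time, batch, features)
out, _ = rnn(x)
out[-1].sum().backward()                        # "loss" at the final step only
# Gradient reaching the first step is typically far smaller than at the last:
print(x.grad[0].abs().max(), x.grad[-1].abs().max())
```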

Modified RNNs – Two Main Types
- Gated Recurrent Unit (GRU)
  - Two gates
  - Simpler model, which allows for building bigger networks
- Long Short-Term Memory (LSTM)
  - Three gates
  - Historically more powerful, but at a higher computational cost
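The parameter-count difference is easy to check: a GRU layer has three weight blocks (update gate, reset gate, candidate state) versus four for an LSTM (input, forget, and output gates plus candidate), so at the same hidden size a GRU has roughly 3/4 the parameters. A quick check with hypothetical sizes:

```python
import torch.nn as nn

gru = nn.GRU(input_size=32, hidden_size=64)
lstm = nn.LSTM(input_size=32, hidden_size=64)

n_params = lambda m: sum(p.numel() for p in m.parameters())
print(n_params(gru), n_params(lstm))  # GRU has about 3/4 the parameters of LSTM
```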

Information From the Future
- Bidirectional RNNs!
[Slide diagram: two RNNs process "Pat loves Doge memes": RNN #1 reads the sentence forward and RNN #2 reads it backward over states a<1> ... a<4>, and their hidden states are combined to produce a per-word label: yes, no, no, no.]
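In code (a minimal sketch with hypothetical sizes), a bidirectional RNN is just a flag: one pass runs forward and one backward, and each step's output concatenates both, so every word sees past and future context.

```python
import torch
import torch.nn as nn

birnn = nn.LSTM(input_size=50, hidden_size=32,
                batch_first=True, bidirectional=True)
x = torch.randn(1, 4, 50)  # 4 word embeddings, e.g. "Pat loves Doge memes"
out, _ = birnn(x)
print(out.shape)           # (1, 4, 64): forward and backward states concatenated
```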

Spatial vs. Temporal
- Convolutional Neural Networks (CNNs) are very good at preserving spatial information.
- Recurrent Neural Networks (RNNs) are very good at processing temporal information.
- So how do we process datasets that require preserving both spatial and temporal information?

Convolutional Recurrent Neural Networks (CRNN)
- Combine CNNs and RNNs by replacing the fully connected layers inside the RNN with convolutional connections.
- The same change transforms a GRU/LSTM into a CGRU/CLSTM (sketched below).
- Limitation: very high computational cost and memory consumption.
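Below is a minimal CLSTM cell sketch in PyTorch (following the common ConvLSTM formulation of Shi et al., without peephole connections; not necessarily the exact variant used in the talk). The only change from a standard LSTM is that the gate computations are convolutions over hidden and cell states that keep their spatial layout:

```python
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    def __init__(self, in_ch, hid_ch, kernel=3):
        super().__init__()
        # One convolution produces all four gates at once; the hidden
        # state h and cell state c stay spatial: (batch, hid_ch, H, W).
        self.conv = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch,
                              kernel, padding=kernel // 2)

    def forward(self, x, state):
        h, c = state
        gates = self.conv(torch.cat([x, h], dim=1))
        i, f, o, g = gates.chunk(4, dim=1)  # input, forget, output, candidate
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

# One time step on a feature map:
cell = ConvLSTMCell(in_ch=16, hid_ch=32)
x = torch.randn(1, 16, 64, 64)      # features for one frame
h = c = torch.zeros(1, 32, 64, 64)  # spatial hidden/cell state
h, c = cell(x, (h, c))
```

The slide's limitation is visible here: the hidden and cell states are full feature maps, so memory grows with spatial resolution, and every time step costs a convolution that produces all four gates.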

Incorporating CRNN Units into FCNs
- U-Net example, with variants compared:
  - U-Net (baseline)
  - U-Net + 1 CLSTM/CGRU
  - U-Net + 3 CLSTM/CGRU
  - U-Net + 5 CLSTM/CGRU
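A toy sketch of one possible wiring (the TinyEncoder/TinyDecoder names and all sizes are illustrative, not from the talk): a single CLSTM cell, reusing the ConvLSTMCell from the previous sketch, sits at the U-Net bottleneck and carries state across frames, while the convolutional encoder and decoder run once per frame.

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.c1 = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU())
        self.c2 = nn.Sequential(nn.Conv2d(8, 16, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        skip = self.c1(x)                   # full-resolution skip features
        return self.c2(self.pool(skip)), skip

class TinyDecoder(nn.Module):
    def __init__(self, hid_ch=32):
        super().__init__()
        self.up = nn.ConvTranspose2d(hid_ch, 8, 2, stride=2)
        self.out = nn.Conv2d(16, 1, 1)      # per-pixel logits

    def forward(self, h, skip):
        return self.out(torch.cat([self.up(h), skip], dim=1))

encoder, decoder = TinyEncoder(), TinyDecoder()
cell = ConvLSTMCell(in_ch=16, hid_ch=32)  # from the previous sketch
video = torch.randn(2, 5, 1, 64, 64)      # (batch, time, ch, H, W)
h = c = torch.zeros(2, 32, 32, 32)
preds = []
for t in range(video.shape[1]):
    feats, skip = encoder(video[:, t])    # spatial encoding, per frame
    h, c = cell(feats, (h, c))            # temporal mixing at the bottleneck
    preds.append(decoder(h, skip))        # dense prediction for this frame
preds = torch.stack(preds, dim=1)         # (batch, time, 1, 64, 64)
```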

Results