Environment Generation with GANs


Environment Generation with GANs: Procedural Caves
Murray Dunne, 2017

Outline
- Objective
- DCGAN Review
- Approach
  - GAN Phase
  - Connection Phase
- Results
- Other Approaches
- Future Work

Procedural Cave Generation
- Aim: generate infinitely wide environments for simulations and games
- Learn the characteristics of an input image and generate larger matching environments
- Given an image "prefix", extend it indefinitely
- Focus on extending only a single dimension
- Generate new pixels indefinitely without repetition

Simple Cave Environment
Aaron Moen, 2017

DCGAN Review
- Generator and discriminator networks
- Discriminator is composed of convolutions, then fully connected layers
- Generator is fully connected layers, then deconvolutions
- Append the discriminator onto the generator to produce the adversarial network (sketched below)
- Discriminator weights are fixed in the adversarial network
- Alternate training the discriminator and the generator
- Use cross entropy as the loss function
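
A minimal Keras sketch of that assembly, assuming `generator` and `discriminator` are already-built models (concrete sketches of both follow with the network slides). The usual trick is to compile the discriminator before freezing it, so its weights still update when it is trained directly but stay fixed inside the stacked adversarial model:

```python
from tensorflow import keras

# Sketch only: `generator` and `discriminator` are assumed to be
# already-built Keras models (see the network slides below).
discriminator.compile(optimizer="rmsprop", loss="binary_crossentropy",
                      metrics=["accuracy"])
discriminator.trainable = False   # weights fixed inside the adversarial stack
adversarial = keras.Sequential([generator, discriminator])
adversarial.compile(optimizer="rmsprop", loss="binary_crossentropy")
```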

DCGAN Review: Adversarial Network
Diagram: noise vector → generator → image → discriminator → realness [1,0]

DCGAN Scaling
- DCGAN network sizes do not scale well:
  - 28x28 network has ~2.3M trainable parameters
  - 28x100 network has ~5.0M
  - 28x480 network has ~23.2M
  - 1400x480 network has ~550M
  (1 color channel, 4 convolution layers at 32, 64, 128, 256 filters)
- Trainable parameters scale poorly with pixel count; large images quickly become non-viable
- Cannot use pooling layers: they are not reversible
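
To see where the blow-up comes from: in a generator shaped like the one described below (noise → dense layer → an (H/4) x (W/4) x 128 volume → deconvolutions), the first dense layer alone scales linearly with image area. A minimal counting sketch; conv/deconv and discriminator parameters are left out, so these are lower bounds on the slide's totals rather than reproductions of them:

```python
# Parameters in the generator's first dense layer, which maps a 100-d
# noise vector to an (H/4) x (W/4) x 128 volume (shapes taken from the
# generator slide below). This one layer dominates the largest network.
def gen_dense_params(h, w, noise_dim=100, depth=128, down=4):
    units = (h // down) * (w // down) * depth
    return (noise_dim + 1) * units          # weights + biases

for shape in [(28, 28), (28, 100), (28, 480), (1400, 480)]:
    print(shape, f"{gen_dense_params(*shape):,}")
# (1400, 480) alone gives ~543M of the slide's ~550M total
```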

DCGAN Fixed Size
- DCGAN was originally designed for fixed-size images
- SeqGAN focuses on discrete samples, not a continuous environment
- GANs for infinite real-valued generation are relatively unexplored

Approach Outline
- Train a DCGAN on vertical slices of the input
- Train a second network to operate the DCGAN
- Generate sequential matching elements
- Append them indefinitely in one dimension
- The cave continues indefinitely without repetition

DCGAN Preparation
- Slice the input environment image into vertical slices
- 3173 slices, 28x480 pixels each, grayscale
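
A minimal preparation sketch, assuming the environment is one long grayscale image loaded as a NumPy array. The filename and the slice stride are assumptions; the slide gives only the slice count and size (a stride of 1 pixel would yield 3173 overlapping slices from an image about 3200 pixels wide):

```python
import numpy as np
from PIL import Image

# Load the environment as a 480-pixel-tall grayscale array in [0, 1]
# ("cave.png" is a hypothetical filename).
env = np.asarray(Image.open("cave.png").convert("L"), dtype=np.float32) / 255.0

# Cut into 28-pixel-wide vertical slices; stride 1 (overlapping) is assumed.
width = 28
slices = np.stack([env[:, x:x + width]
                   for x in range(0, env.shape[1] - width + 1)])
slices = slices[..., np.newaxis]   # shape: (num_slices, 480, 28, 1)
```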

Generator Network
- Fully connected layer from noise
- Dropout for training
- 4 deconvolution layers
- ReLU activation
- Normalization: adjust activations to standard normal (μ = 0, σ = 1)
- Layer shapes: 100 (noise) → 120x7x128 → 240x14x64 → 480x28x32 → 480x28x16 → 480x28x1
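
A minimal Keras sketch following the layer shapes above. Batch normalization (as the mechanism for the standard-normal adjustment), 5x5 kernels, the 0.4 dropout rate, and the sigmoid output are all assumptions not stated on the slide:

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_generator(noise_dim=100):
    # 100 -> 120x7x128 -> 240x14x64 -> 480x28x32 -> 480x28x16 -> 480x28x1
    return keras.Sequential([
        keras.Input(shape=(noise_dim,)),
        layers.Dense(120 * 7 * 128),
        layers.BatchNormalization(),      # assumed mechanism for the
        layers.Activation("relu"),        # standard-normal adjustment
        layers.Reshape((120, 7, 128)),
        layers.Dropout(0.4),              # dropout rate is assumed
        layers.Conv2DTranspose(64, 5, strides=2, padding="same"),
        layers.BatchNormalization(),
        layers.Activation("relu"),
        layers.Conv2DTranspose(32, 5, strides=2, padding="same"),
        layers.BatchNormalization(),
        layers.Activation("relu"),
        layers.Conv2DTranspose(16, 5, strides=1, padding="same"),
        layers.BatchNormalization(),
        layers.Activation("relu"),
        layers.Conv2DTranspose(1, 5, strides=1, padding="same",
                               activation="sigmoid"),  # grayscale slice
    ])
```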

Discriminator Network
- 4 convolution layers
- Leaky ReLU (y = 0.2x for x < 0): avoids the sparse gradient problem
- Dropout for training
- Fully connected layer, ending in a single sigmoid activation
- Layer shapes: 480x28x1 → 480x28x16 → 480x28x32 → 240x14x64 → 120x7x128 → 1
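
The matching discriminator as a Keras sketch. Strides are read off the shape progression above (the first two convolutions keep the spatial size, the last two halve it); the 5x5 kernels and the 0.4 dropout rate are assumptions:

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_discriminator():
    # 480x28x1 -> 480x28x16 -> 480x28x32 -> 240x14x64 -> 120x7x128 -> 1
    model = keras.Sequential([keras.Input(shape=(480, 28, 1))])
    for filters, stride in [(16, 1), (32, 1), (64, 2), (128, 2)]:
        model.add(layers.Conv2D(filters, 5, strides=stride, padding="same"))
        model.add(layers.LeakyReLU(0.2))   # y = 0.2x for x < 0
        model.add(layers.Dropout(0.4))     # dropout rate is assumed
    model.add(layers.Flatten())
    model.add(layers.Dense(1, activation="sigmoid"))  # realness score
    return model
```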

Training
- RMSProp optimizer with decay
  - Divides the learning rate by a running average of recent gradient magnitudes
  - Discriminator has more decay but a higher initial learning rate
- 10,000 update steps
- ~250 samples randomly selected per step
- Cross entropy loss function
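
A sketch of one alternating update step under these settings, reusing `gen = build_generator()`, `disc = build_discriminator()`, and the frozen-discriminator `adversarial` stack from the earlier sketches. The uniform noise distribution and the fake-side batch size are assumptions; the slide pins down only the ~250 randomly selected real samples per step and the cross entropy loss:

```python
import numpy as np

# Assumes gen, disc, and adversarial from the earlier sketches, all
# compiled with RMSProp and binary cross entropy.
def train_step(real_slices, batch=250, noise_dim=100):
    # discriminator step: real slices -> 1, generated slices -> 0
    real = real_slices[np.random.randint(0, len(real_slices), batch)]
    noise = np.random.uniform(-1.0, 1.0, (batch, noise_dim))
    fake = gen.predict(noise, verbose=0)
    x = np.concatenate([real, fake])
    y = np.concatenate([np.ones((batch, 1)), np.zeros((batch, 1))])
    d_loss = disc.train_on_batch(x, y)
    # generator step: push D(G(z)) toward 1 through the frozen discriminator
    noise = np.random.uniform(-1.0, 1.0, (batch, noise_dim))
    g_loss = adversarial.train_on_batch(noise, np.ones((batch, 1)))
    return d_loss, g_loss
```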

Training Accuracy and Loss
- 30,000 gradient steps
- 32 hours on a 12GB Tesla K80

DCGAN Results
30,000 gradient updates

Connection Phase
- Train a network to connect samples generated by the DCGAN (sketched below)
- Minimize the difference between:
  - the rightmost column of the existing environment
  - the leftmost column of the generator output
- Four fully connected layers
- ReLU activation, dropout for training
- Diagram: 480x1 column → dense 480 → 480 → 240 → 100 (noise) → generator → 480x28 slice → leftmost 480x1 column
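
A sketch of the connector: it maps the rightmost environment column to a noise vector for the frozen generator, so the generated slice's leftmost column can be matched against that input column. Layer widths follow the diagram; the ReLU activations and dropout rate are assumptions, and `gen` is the trained generator from the earlier sketches:

```python
from tensorflow import keras
from tensorflow.keras import layers

connector = keras.Sequential([
    keras.Input(shape=(480,)),        # rightmost column of environment
    layers.Dense(480, activation="relu"),
    layers.Dropout(0.4),
    layers.Dense(480, activation="relu"),
    layers.Dropout(0.4),
    layers.Dense(240, activation="relu"),
    layers.Dropout(0.4),
    layers.Dense(100),                # noise vector for the generator
])

gen.trainable = False                 # only the connector is trained
col_in = keras.Input(shape=(480,))
gen_slice = gen(connector(col_in))                     # (480, 28, 1)
left_col = layers.Lambda(lambda t: t[:, :, 0, 0])(gen_slice)
joiner = keras.Model(col_in, left_col)
```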

Connection Phase - Training
- RMSProp optimizer
- 20 epochs
- 10,000 training samples
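
An assumed training call matching these settings. The key inference, not stated on the slide, is that the regression target for each sampled boundary column is the column itself, so minimizing MSE drives the seam difference toward zero. `cols` is a hypothetical (10000, 480) array of rightmost columns drawn from the environment, and the batch size is arbitrary:

```python
# Sketch: train the joiner from the previous sketch so its output
# column reproduces its input column.
joiner.compile(optimizer=keras.optimizers.RMSprop(), loss="mse")
joiner.fit(cols, cols, epochs=20, batch_size=64)
```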

Connection Phase Results
50 samples (1400x480)

Connection Phase - Issues
- Optimizing for smoothness yields overly smooth results
- Methods to improve variation:
  - Add noise to the rightmost column vector
  - LSTM-based methods

Other Approaches
- LSTM
- DCGAN with extra input
- DCGAN with simple comparison

Other Approach - LSTM
- Sample space is too large: 480 units of output
- Requires large memory
- Collapses quickly
- Difficult to teach not to be perfectly smooth

Other Approach – DCGAN Extra
- Add an input to the generator along with the noise: the last column of the previous slice (sketched below)
- Quasi-RNN discriminator
- Diagram: noise vector + previous slice → generator → image → acceptability [1,0]
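
A sketch of how this variant's generator might take the extra input. Concatenating the previous column with the noise vector is an assumed conditioning mechanism; the slide does not say how the two inputs are combined, and the deconvolution stack mirrors the main generator sketch:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Generator with an extra input: last column of the previous slice.
noise_in = keras.Input(shape=(100,))
prev_col = keras.Input(shape=(480,))
h = layers.Concatenate()([noise_in, prev_col])   # assumed conditioning
h = layers.Dense(120 * 7 * 128, activation="relu")(h)
h = layers.Reshape((120, 7, 128))(h)
h = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(h)
h = layers.Conv2DTranspose(32, 5, strides=2, padding="same", activation="relu")(h)
h = layers.Conv2DTranspose(16, 5, strides=1, padding="same", activation="relu")(h)
out = layers.Conv2DTranspose(1, 5, strides=1, padding="same",
                             activation="sigmoid")(h)
cond_gen = keras.Model([noise_in, prev_col], out)
```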

Other Approach – DCGAN Extra
- Never converges with the non-random input vector
- The discriminator conflates realness with smoothness

Other Approach – DCGAN Extra
32,400 gradient updates

DCGAN - Simple Comparison
- Run the DCGAN with noise as input
- Append the output to the environment if it fits; rerun the DCGAN as needed (see the sketch below)
- Fit test: Euclidean distance between pixel columns
  - Accept if below a threshold
  - Accept if too much time has passed
- Adjust quality for real-time results
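
An accept/reject sketch of that loop. The distance threshold and the time budget are assumed values chosen for illustration; `gen` is the trained generator from the earlier sketches and `env` a (480, W) grayscale array:

```python
import time
import numpy as np

def next_slice(env, threshold=4.0, budget_s=0.1, noise_dim=100):
    best, best_d = None, float("inf")
    start = time.monotonic()
    while best_d >= threshold and time.monotonic() - start < budget_s:
        noise = np.random.uniform(-1.0, 1.0, (1, noise_dim))
        cand = gen.predict(noise, verbose=0)[0, :, :, 0]   # 480x28 slice
        d = np.linalg.norm(cand[:, 0] - env[:, -1])        # Euclidean seam distance
        if d < best_d:
            best, best_d = cand, d
    return best   # under threshold, or the best seen when time ran out

# usage: grow the cave one slice at a time, indefinitely
# env = np.concatenate([env, next_slice(env)], axis=1)
```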

Simple Comparison Result
50 samples (1400x480), with the original for reference

Future Work
- Use an LSTM for the connection phase
- Apply horizontal smoothing to the results
- Train for longer: visible improvements between 10,000 and 30,000 updates
- More filters in the convolutions, to bring out the extra features on the cave wall (not enough memory available)

Slide Citations
- Neural Networks for Machine Learning. G. Hinton, N. Srivastava, K. Swersky. http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf
- How to Train a GAN? Tips and tricks to make GANs work. S. Chintala. https://github.com/soumith/ganhacks
- GAN by Example using Keras on Tensorflow Backend. R. Atienza. https://medium.com/towards-data-science/gan-by-example-using-keras-on-tensorflow-backend-1a6d515a60d0

Questions