Deep Belief Nets and Ising Model-Based Network Construction

Deep Belief Nets and Ising Model-Based Network Construction Lecturer: Bo Chen

Network Construction Social, academic, etc. Several users (nodes). A line (edge) means a connection between two users (nodes). The weight of the line indicates the strength of the connection.

What can it do? 1. Prediction: predict the information source 2. Recommendation: movies, papers, music 3. Other applications

How? 1. Information collection 2. Feature extraction 3. Construction based on features

How? Information Collection (various information) → Feature Extraction (features) → Network Construction

Our Model Information → (Deep Belief Nets) → Features → (Ising Model) → Weights

Deep Belief Nets Constructed by stacking Restricted Boltzmann Machines greedily, layer by layer: data → h1 → h2 → h3

Restricted Boltzmann Machine

Restricted Boltzmann Machine 1. Undirected (equivalently, bidirectionally connected) 2. One layer of visible units 3. One layer of hidden units 4. No connections between hidden units 5. No connections between visible units 6. The hidden units are conditionally independent given the visible units
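Property 6 can be checked numerically on a tiny RBM: because the energy is linear in each hidden unit, the conditional p(h | v) factorizes into independent per-unit sigmoids. A minimal numpy sketch, with illustrative random parameters (not from the lecture):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n_vis, n_hid = 3, 2
W = rng.normal(scale=0.5, size=(n_vis, n_hid))  # weights between layers
a = rng.normal(size=n_vis)                      # visible biases
b = rng.normal(size=n_hid)                      # hidden biases

def energy(v, h):
    return -(a @ v + b @ h + v @ W @ h)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

v = np.array([1.0, 0.0, 1.0])  # a fixed visible configuration

# Exact conditional p(h | v): enumerate all hidden configurations
hs = [np.array(h, dtype=float) for h in product([0, 1], repeat=n_hid)]
unnorm = np.array([np.exp(-energy(v, h)) for h in hs])
p_joint = unnorm / unnorm.sum()

# Factorized conditional built from per-unit sigmoids
p_on = sigmoid(b + v @ W)  # p(h_j = 1 | v) for each hidden unit j
p_factor = np.array([np.prod([p_on[j] if h[j] else 1 - p_on[j]
                              for j in range(n_hid)]) for h in hs])
```

The two distributions agree exactly, which is what makes block Gibbs sampling in an RBM cheap.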

Terminology in RBM Weights and biases: weights refer to the connections between units; biases are attached to individual units.

Terminology in RBM Energy: E(v, h) = −∑_i a_i v_i − ∑_j b_j h_j − ∑_{i,j} v_i w_ij h_j, where v: visible units (a vector; each usually takes 1 or 0), h: hidden units (a vector; each usually takes 1 or 0), a: biases of the visible units, b: biases of the hidden units, w: weights.

Terminology in RBM Probability: the probability of a configuration over both visible and hidden units is p(v, h) = e^{−E(v,h)} / Z, where Z = ∑_{v,h} e^{−E(v,h)} is the partition function.
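For a small RBM the partition function can be computed by brute-force enumeration, which makes the definition concrete. A minimal sketch with illustrative random parameters (enumeration is only feasible for toy sizes; real RBMs never compute Z exactly):

```python
import numpy as np
from itertools import product

# Tiny RBM: 3 visible, 2 hidden units, illustrative random parameters
rng = np.random.default_rng(1)
W = rng.normal(scale=0.5, size=(3, 2))  # weights w
a = rng.normal(size=3)                  # visible biases a
b = rng.normal(size=2)                  # hidden biases b

def energy(v, h):
    # E(v, h) = -sum_i a_i v_i - sum_j b_j h_j - sum_ij v_i w_ij h_j
    return -(a @ v + b @ h + v @ W @ h)

# Partition function Z: sum of e^{-E} over every (v, h) configuration
configs = [(np.array(v, dtype=float), np.array(h, dtype=float))
           for v in product([0, 1], repeat=3)
           for h in product([0, 1], repeat=2)]
Z = sum(np.exp(-energy(v, h)) for v, h in configs)

def prob(v, h):
    return np.exp(-energy(v, h)) / Z

total = sum(prob(v, h) for v, h in configs)
```

Low-energy configurations get high probability, and the probabilities sum to 1 by construction.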

I & O in RBM Input: visible units (the data). Output: weights and biases (it is a generative model).

Return to Deep Belief Nets Steps to train a DBN: 1. Train an RBM on the raw data and use its parameters to extract a layer of features (the first hidden layer) 2. Treat the activations of those features as the input for the second hidden layer 3. Repeat step 2 to train multiple layers of RBMs
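The greedy layer-wise procedure above can be sketched in numpy. This is a minimal CD-1 (one-step contrastive divergence) trainer, not the exact recipe from the lecture; data sizes and hyperparameters are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=50, seed=0):
    """Train one RBM with CD-1; returns (W, a, b)."""
    rng = np.random.default_rng(seed)
    n_vis = data.shape[1]
    W = rng.normal(scale=0.01, size=(n_vis, n_hidden))
    a = np.zeros(n_vis)      # visible biases
    b = np.zeros(n_hidden)   # hidden biases
    n = len(data)
    for _ in range(epochs):
        v0 = data
        ph0 = sigmoid(v0 @ W + b)                 # p(h = 1 | v0)
        h0 = (rng.random(ph0.shape) < ph0) * 1.0  # sample hidden units
        v1 = sigmoid(h0 @ W.T + a)                # mean-field reconstruction
        ph1 = sigmoid(v1 @ W + b)
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / n   # CD-1 weight update
        a += lr * (v0 - v1).mean(axis=0)
        b += lr * (ph0 - ph1).mean(axis=0)
    return W, a, b

def train_dbn(data, layer_sizes):
    """Greedy layer-wise training: each RBM's hidden activations
    become the input for the next RBM (step 2 of the slide)."""
    params, x = [], data
    for n_hidden in layer_sizes:
        W, a, b = train_rbm(x, n_hidden)
        params.append((W, a, b))
        x = sigmoid(x @ W + b)
    return params, x

rng = np.random.default_rng(0)
data = (rng.random((20, 6)) < 0.5) * 1.0        # toy binary data
params, features = train_dbn(data, [4, 3])      # two stacked RBMs
```

After training, `features` holds the top-layer activations, which here play the role of the extracted features passed on to the Ising model.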

Return to Deep Belief Nets data → (first RBM) → first hidden layer h1 → (second RBM) → second hidden layer h2 → (third RBM) → third hidden layer h3

A Brief Review 1. How to construct a network 2. Our model 3. The introduction of RBMs 4. Training deep belief nets Now we have a considerable number of features!

Ising Model [table: a binary feature-by-user matrix — rows Feature1–Feature4, columns User1–User6; an entry of 1 marks that the feature is observed for that user]

Ising Model What will the Ising model do? 1. Treat every user as a node in an Ising model 2. Treat every feature as one observation of their interactions 3. Infer the interactions between the nodes from all the observations (features)

Ising Model What is the Ising model? An energy-based model. The energy of a configuration (the Hamiltonian function) is H(σ) = −∑_{⟨i,j⟩} J_ij σ_i σ_j − ∑_j h_j σ_j, where the configuration σ is a set of users (each σ_i ∈ {−1, +1}), the interaction J_ij is the connection strength between users i and j, and h_j is the external field effect.

An Illustration [figure: a spin configuration, with the interaction between neighboring nodes and the external field effect labeled]

Ising Model The configuration probability given by the Boltzmann distribution is P(σ) = e^{−βH(σ)} / Z, where β = 1/(k_B T) is the inverse temperature and the normalization constant Z = ∑_σ e^{−βH(σ)} is the partition function.
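For a handful of nodes the Boltzmann distribution can again be computed exactly by enumerating all spin configurations. A minimal sketch with illustrative couplings and fields (not values from the lecture):

```python
import numpy as np
from itertools import product

# 3-node Ising model with illustrative symmetric couplings J and fields h
J = np.array([[0.0,  1.0,  0.0],
              [1.0,  0.0, -0.5],
              [0.0, -0.5,  0.0]])
h = np.array([0.2, 0.0, -0.1])
beta = 1.0  # inverse temperature

def hamiltonian(s):
    # H(sigma) = -sum_{i<j} J_ij s_i s_j - sum_j h_j s_j
    # (the 0.5 compensates for counting each symmetric pair twice)
    return -0.5 * (s @ J @ s) - h @ s

states = [np.array(s, dtype=float) for s in product([-1, 1], repeat=3)]
Z = sum(np.exp(-beta * hamiltonian(s)) for s in states)

def prob(s):
    return np.exp(-beta * hamiltonian(s)) / Z
```

With the positive coupling J_01, configurations where nodes 0 and 1 agree receive lower energy and therefore higher probability, which is exactly the sense in which J_ij encodes connection strength.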

Ising Model The conditional probability of X_j given all other nodes X_\j is logistic: P(X_j = 1 | X_\j) = exp(t_j + ∑_{i≠j} β_ij x_i) / (1 + exp(t_j + ∑_{i≠j} β_ij x_i)), combining the configuration x, the pairwise interactions β_ij, and the node parameter t_j. [figure: the parameters arranged as a matrix over x_1, …, x_j, with node parameters t_1, …, t_j on the diagonal and pairwise interactions β_ij off the diagonal]

How? Step 1 (lasso): find possible neighbors for each node. Step 2 (EBIC): choose the best set of neighbors. Step 3 (AND rule or OR rule): determine the final edges. Step 4: take the average of the two weight estimates.
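Steps 3 and 4 can be sketched directly in numpy. The matrix below is a hypothetical set of per-node lasso estimates (the output of steps 1–2, not real data): entry `beta_hat[j, i]` is node i's coefficient in node j's sparse regression, so the matrix is generally asymmetric and the rules below symmetrize it:

```python
import numpy as np

# Hypothetical per-node lasso estimates from steps 1-2 (illustrative values)
beta_hat = np.array([[0.0, 0.8, 0.0],
                     [0.6, 0.0, 0.3],
                     [0.0, 0.0, 0.0]])

nonzero = beta_hat != 0
and_edges = nonzero & nonzero.T   # step 3, AND rule: keep edge (i, j) only
                                  # if both regressions selected it
or_edges = nonzero | nonzero.T    # step 3, OR rule: keep edge (i, j) if
                                  # either regression selected it
J = 0.5 * (beta_hat + beta_hat.T)  # step 4: average the two estimates
```

Here edge (0, 1) survives under both rules (both directions are nonzero, averaged weight 0.7), while edge (1, 2) survives only under the more permissive OR rule.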

Experiment Not yet… BUT…

An Example An example of using the Ising model to reconstruct a graph. [figure panels: Sample, Reconstruct]

Thanks!

Q&A