
Hilbert Space Embeddings of Conditional Distributions -- With Applications to Dynamical Systems
Le Song, Carnegie Mellon University
Joint work with Jonathan Huang, Alex Smola and Kenji Fukumizu

Hilbert Space Embeddings
A distribution can be summarized by statistics such as its mean E[x], its covariance E[x x^T], or more generally its expected features mu[P] = E_{x~P}[phi(x)], an embedding of P into a reproducing kernel Hilbert space (RKHS).
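These summary statistics are straightforward to compute from a sample. A minimal NumPy sketch (the RBF bandwidth, the sampled distribution, and the evaluation point are illustrative choices, not from the talk):

```python
import numpy as np

def rbf(x, y, sigma=1.0):
    # Gaussian RBF kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.normal(loc=1.0, size=(500, 2))   # sample from an example distribution P

mean = X.mean(axis=0)                    # first moment E[x]
cov = (X.T @ X) / len(X)                 # uncentered second moment E[x x^T]

# Empirical mean embedding mu[P], evaluated (as an RKHS function) at a point z:
# mu[P](z) = (1/m) sum_i k(x_i, z)
z = np.zeros(2)
mu_at_z = np.mean([rbf(x, z) for x in X])
```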

Hilbert Space Embeddings
For certain kernels (e.g. the RBF kernel) the embedding is injective, so the distribution can be reconstructed from mu[P]. The sample average (1/m) sum_i phi(x_i) converges to the true mean embedding at rate O(1/sqrt(m)).

Hilbert Space Embeddings
This talk: embed conditional distributions, p(y|x) -> mu[p(y|x)], with applications to structured prediction and dynamical systems.

Embedding P(Y|X)
Simple case: X binary, Y vector valued. We need to embed p(Y|X=0) and p(Y|X=1). Solution: divide the observations into groups according to their value of X, then average the features phi(y) within each group. But what if X is continuous or structured?
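For binary X this group-and-average recipe is a one-liner. A toy sketch using the identity feature map phi(y) = y, so each group embedding reduces to a conditional mean (the data-generating process is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=200)                        # binary conditioning variable
Y = np.where(X == 1, 2.0, -2.0) + rng.normal(size=200)  # Y depends on X

# Divide by value of X, then average features; with phi(y) = y these
# group averages are the embeddings of p(Y|X=0) and p(Y|X=1).
mu_given_0 = Y[X == 0].mean()
mu_given_1 = Y[X == 1].mean()
```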

Embedding P(Y|X)
Goal: estimate the embedding from a finite sample. Problem: some values X=x are never observed, yet we still want to embed p(Y|X=x) for such x.

Key Idea
Generalize to unobserved X=x via a kernel on X: if x and x' are close under the kernel, then mu[p(Y|x)] and mu[p(Y|x')] will be similar.

Conditional Embedding
Conditional embedding operator: mu[p(Y|x)] = U_{Y|X} phi(x), where U_{Y|X} = C_{YX} C_{XX}^{-1}.

The covariance operator C_{XY} = E[phi(X) tensor phi(Y)] is the RKHS generalization of the covariance matrix.

Kernel Estimate
From a sample {(x_i, y_i)}, i = 1..m, the estimate is mu_hat[p(Y|x)] = sum_i beta_i(x) phi(y_i), with weights beta(x) = (K + lambda m I)^{-1} k_x, where K is the Gram matrix on the x_i and k_x = (k(x_1, x), ..., k(x_m, x))^T. The empirical estimate converges at rate O_p((lambda m)^{-1/2} + lambda^{1/2}).
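A small NumPy sketch of this kernel estimate, using the identity feature map on Y so that the embedding reduces to a conditional-mean (kernel-ridge-regression-style) estimate; the bandwidth, regularizer, and test function are illustrative assumptions:

```python
import numpy as np

def rbf_gram(A, B, sigma=0.5):
    # Gram matrix of the Gaussian RBF kernel between row-sample sets A and B
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

rng = np.random.default_rng(2)
m = 300
X = rng.uniform(-3, 3, size=(m, 1))
Y = np.sin(X) + 0.1 * rng.normal(size=(m, 1))

lam = 1e-3
K = rbf_gram(X, X)
x_star = np.array([[1.0]])
k_x = rbf_gram(X, x_star)                             # (m, 1) vector k_x
beta = np.linalg.solve(K + lam * m * np.eye(m), k_x)  # weights beta(x)

# mu_hat[p(Y|x)] = sum_i beta_i(x) phi(y_i); with phi(y) = y this
# estimates the conditional mean E[Y | x] at x = 1.
y_hat = float(Y.T @ beta)
```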

Sum and Product Rules
Probabilistic relations have Hilbert space counterparts. Sum rule: Q(Y) = E_X[P(Y|X)] corresponds to mu_Y = U_{Y|X} mu_X. Product rule: P(Y,X) = P(Y|X) P(X) corresponds to C_{YX} = U_{Y|X} C_{XX}. We can therefore do probabilistic inference directly with embeddings!
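With the identity feature map, the RKHS sum rule reduces to the familiar law of total expectation, which holds exactly on an empirical distribution. A quick synthetic check (data-generating process invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.integers(0, 2, size=1000)
Y = np.where(X == 1, 3.0, -1.0) + rng.normal(size=1000)

# Sum rule with phi(y) = y:  E[Y] = sum_x P(x) E[Y | x]
p1 = X.mean()                 # empirical P(X = 1)
lhs = Y.mean()                # mu_Y
rhs = (1 - p1) * Y[X == 0].mean() + p1 * Y[X == 1].mean()
```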

Dynamical Systems
Maintain a belief over the hidden state s_t given the observations o_1, ..., o_t by tracking its Hilbert space embedding mu[p(s_t | o_1, ..., o_t)].
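A rough sketch of such a filter: the belief is a weight vector over training state samples, pushed through an estimated transition operator (predict) and reweighted by an observation kernel (correct). This is a crude simplification for illustration, not the exact update from the talk, and all constants are made up:

```python
import numpy as np

def rbf_gram(A, B, sigma=1.0):
    # Gram matrix of the Gaussian RBF kernel between row-sample sets A and B
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

rng = np.random.default_rng(4)

# Training trajectory of a 1-d linear system (illustrative choice)
T = 400
s = np.zeros((T, 1))
for t in range(1, T):
    s[t] = 0.9 * s[t - 1] + 0.3 * rng.normal()

# Estimated transition map: belief weights over s[:-1] -> weights over s[1:]
m = T - 1
K = rbf_gram(s[:-1], s[:-1])
Trans = np.linalg.solve(K + 1e-2 * m * np.eye(m), K)

# Filter a short observation sequence; the belief lives on the samples s[1:]
# (weights over s[1:] are reused as weights over s[:-1] -- a simplification).
belief = np.full(m, 1.0 / m)
for obs in [0.5, 0.6, 0.55]:
    belief = Trans @ belief                                   # predict step
    lik = rbf_gram(s[1:], np.array([[obs]]), sigma=0.2)[:, 0]
    belief = np.maximum(belief * lik, 0.0)                    # correct step
    belief /= belief.sum()                                    # renormalize

state_est = float(s[1:].T @ belief)  # posterior mean of the hidden state
```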

Advantages
– Applies to any domain with a kernel, e.g. reals, rotations, permutations, strings.
– Applies to more general distributions, e.g. multimodal or skewed.

Evaluation with Linear Dynamics
Linear dynamics model (the setting where the Kalman filter is optimal): a trajectory of hidden states evolves under linear dynamics with additive process noise, and each state generates an observation through a linear map with additive observation noise.
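To make the setup concrete, here is a sketch that simulates such a linear dynamical system; the dynamics matrix, observation map, and noise scales are arbitrary illustrative choices, not those used in the talk's experiments:

```python
import numpy as np

rng = np.random.default_rng(5)
d = 2
A = np.array([[0.9, 0.1],      # stable dynamics matrix (eigenvalues inside
              [-0.1, 0.9]])    # the unit circle), chosen for illustration
C = np.eye(d)                  # observation map

T = 1000
s = np.zeros((T, d))
for t in range(1, T):
    s[t] = A @ s[t - 1] + 0.1 * rng.normal(size=d)   # additive process noise
o = s @ C.T + 0.1 * rng.normal(size=(T, d))          # additive observation noise
```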

Gaussian process noise, Gaussian observation noise: process noise ~ N(0, 10 I), observation noise ~ N(0, 10^-2 I). (Plot: RMS error vs. training size in steps; RKHS embedding vs. Kalman filter.)

   (1,1/3),   N(0,10 -2 I) Training Size ( Steps) RMS Error RKHS Kalman Skewed process noise, small observation noise

Bimodal process noise, small observation noise: process noise ~ 0.5 N(-0.2, 10^-2 I) + 0.5 N(0.2, 10^-2 I), observation noise ~ N(0, 10^-2 I). (Plot: RMS error vs. training size; RKHS embedding vs. Kalman filter.)

Camera Rotation
Scenes rendered using the POV-Ray renderer.

Predicted Camera Rotation
Comparison of predicted rotations (trace measure) against ground truth for the RKHS embedding, the Kalman filter, and a random baseline: the dynamical system with Hilbert space embeddings works better.

Video from Predicted Rotation
Original video shown alongside video rendered with the predicted camera rotation.

Conclusion
Embedding conditional distributions:
– Generalizes to unobserved X=x
– Kernel estimate
– Convergence of the empirical estimate
– Probabilistic inference
Application to learning and inference in dynamical systems.

The End Thanks

Future Work
– Estimate the hidden states during training?
– A dynamical system is a chain graphical model; what about tree and general graph topologies?

Small process noise, large observation noise: process noise ~ N(0, 10^-2 I), observation noise ~ N(0, 10 I). (Plot: RMS error vs. training size in 2^n steps; RKHS embedding vs. Kalman filter.)

Dynamics can Help
(Plot: trace measure vs. log10 of the noise variance.)

Conditional Embedding Operator
Desired property: the operator U_{Y|X} should satisfy mu[p(Y|x)] = U_{Y|X} phi(x) for all x. Covariance operator: C_{XY} = E[phi(X) tensor phi(Y)], the RKHS generalization of the covariance matrix.