Probabilistic Relaxation Labelling by Fokker-Planck Diffusion on a Graph Hongfang Wang and Edwin R. Hancock Department of Computer Science University of York

Outline Introduction Spectral graph theory Probabilistic relaxation labelling Diffusion processes Probabilistic relaxation by diffusion Experiments Discussion

Overview The aim is to exploit the similarities between diffusion processes and relaxation labelling to develop a new iterative process for consistent labelling. A compositional graph structure is used to represent the compatibilities between object-label assignments. Evidence combination and label probability update are controlled by the Fokker-Planck equation. We evaluate our new development on the problem of data classification.

Probabilistic Relaxation Labelling A classical object labelling technique, first introduced by Rosenfeld, Hummel and Zucker in the 1970s.

Relaxation Labelling It aims to assign consistent and unambiguous labels to a given set of objects. It relies on contextual information provided by the topology of the object arrangement and on sets of compatibility relations over object configurations. It involves evidence combination to update label probabilities and to propagate label-consistency constraints, and it requires an initial label probability assignment.

Relaxation Labelling Support functions for evidence combination: Hummel & Zucker; Kittler & Hancock. Probability update formula: see the standard forms restated below.
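The equations on this slide were lost with the slide images. The LaTeX below restates the standard forms from the relaxation-labelling literature as a hedged reconstruction, not necessarily the exact variants in the original deck. Here p_i(ω) is the probability that object x_i takes label ω and R_ij(ω, ω') is a compatibility coefficient.

% Hummel & Zucker style support function (standard form):
\[ q_i(\omega) = \sum_{j} \sum_{\omega'} R_{ij}(\omega, \omega')\, p_j(\omega') \]
% Kittler & Hancock style product-form support (assumed standard form):
\[ Q_i(\omega) = \prod_{j} \sum_{\omega'} R_{ij}(\omega, \omega')\, p_j(\omega') \]
% Non-linear probability update:
\[ p_i^{(k+1)}(\omega) = \frac{p_i^{(k)}(\omega)\, q_i^{(k)}(\omega)}{\sum_{\omega'} p_i^{(k)}(\omega')\, q_i^{(k)}(\omega')} \]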

Graph-theoretical setting for relaxation labelling: formulate relaxation labelling as a random walk on a support graph.

Graph-spectral relaxation labelling Set up a support graph whose nodes represent possible object-label assignments and whose edges represent label compatibility. Set up a random walk on the graph: the probability of visiting a node is the probability of the corresponding object-label assignment (the state-vector). Evolve the state-vector of the random walk with time to update probabilities and combine evidence; a minimal statement of this evolution follows.
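For a discrete-time walk with a row-stochastic transition matrix P over support-graph nodes, the evolution referred to above takes the standard form below (a generic sketch, not an equation copied from the deck):

\[ p_{t+1} = P^{\top} p_t, \qquad p_t(v) = \Pr(\text{the walk visits node } v \text{ at time } t) \]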

Support graph Node-set: the Cartesian product of the object-set X and the label-set. The adjacency matrix encodes object proximity and label compatibility, where: W(xi, xj) is the edge weight in the object graph; ωi, ωj are object labels; R(ωi, ωj) is the compatibility value between the two labels. Compatibilities are assigned using prior knowledge of the label domain. Proximity weights are set using an inter-point distance measure (usually Gaussian).
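The adjacency formula itself was an image in the deck; the natural composition of the quantities above is the product form A((x_i, ω_i), (x_j, ω_j)) = W(x_i, x_j) R(ω_i, ω_j), i.e. a Kronecker product of the object-proximity matrix and the label-compatibility matrix. A minimal Python sketch under that assumption (the function name, the Gaussian kernel width sigma, and the array shapes are illustrative, not from the slides):

import numpy as np

def support_graph_adjacency(X, R, sigma=1.0):
    """Support-graph adjacency: object proximity composed with label compatibility.

    X : (n, d) array of object feature vectors.
    R : (m, m) array of label-compatibility values.
    Returns an (n*m, n*m) adjacency over object-label assignments.
    """
    # Gaussian proximity weights from inter-point distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)  # no self-loops in the object graph
    # Product (Kronecker) composition: A = W (x) R.
    return np.kron(W, R)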

Example

Fokker-Planck Diffusion Model the random walk on the graph using the Fokker-Planck equation.

Diffusion Processes Diffusion processes are collections of correlated random variables indexed by time that possess the Markov property. The Markov property states that ``for a collection of random variables indexed by time t, given the current value of the variable, the future is independent of the variable's past''. Examples of using the Markov property: image restoration [e.g., Geman & Geman 1984], texture [Zhu, Wu & Mumford 1998], and object tracking [Isard & Blake 1996; Hua & Wu 2004; Yang, Duraiswami & Davis 2005]. Here we use a diffusion process to model the random walk on the support graph: local evidence collection and propagation … short-time behaviour [Zhu, Wu & Mumford 1998]. [Zhu, Wu & Mumford 1998]: FRAME: Filters, Random Fields And Maximum Entropy: Towards a Unified Theory for Texture Modeling. IJCV 27(2), pp. 1-20. [Isard & Blake 1996]: Contour tracking by stochastic propagation of conditional density. ECCV 1996. [Hua & Wu 2004]: Multi-scale Visual Tracking by Sequential Belief Propagation. CVPR 2004. [Yang, Duraiswami & Davis 2005]: Fast Multiple Object Tracking via a Hierarchical Particle Filter. ICCV 2005.

Probabilistic Relaxation Labelling by Diffusion Diffusion: global propagation and local evidence combination. We derive the diffusion operator from discrete random walks on a graph [Feller Vol. II, 1970]. State space of the process: represented by the support graph. Justification for using diffusion processes on graphs: given sufficient data points, a graph is capable of representing the manifold underlying the given data [Belkin & Niyogi 2002; Singer 2006; Sochen 2001]. Here F is the Fokker-Planck operator [Feller 1970]; see the reconstruction below. [Feller 1970]: W. Feller, An Introduction to Probability Theory and Its Applications, Vol. II. [Belkin & Niyogi 2002]: Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering. NIPS 2001. [Singer 2006]: From graph to manifold Laplacian: the convergence rate. Applied and Computational Harmonic Analysis, Vol. 21, pp. 128-134, 2006. [Sochen 2001]: Stochastic Processes in Vision: From Langevin to Beltrami. ICCV 2001.
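The governing equation on this slide was an image; its standard continuous-time form, with p_t the vector of label probabilities over the support-graph nodes and F the Fokker-Planck operator, is restated here as a hedged reconstruction consistent with the update slide later, not the deck's exact typesetting:

\[ \frac{\partial p_t}{\partial t} = -F\, p_t, \qquad p_t = e^{-tF}\, p_0 \]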

Spectral Graph Theory Solve the Fokker-Planck equation using the spectrum of the operator.

Spectral Graph Theory Adjacency matrix A; weighted Laplacian matrix L; transition matrix P (see the standard forms below). Here E is the graph's edge set, Wij is the weight between nodes vi and vj, and degi is the degree of node vi.
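The three matrix definitions were images in the deck; the standard forms consistent with the quantities named on the slide (a reconstruction; the deck's exact notation may differ), with D = diag(deg_1, …, deg_n):

\[ A_{ij} = \begin{cases} W_{ij}, & (v_i, v_j) \in E \\ 0, & \text{otherwise} \end{cases} \qquad L = D - A \qquad P_{ij} = \frac{W_{ij}}{\deg_i} \]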

Graph-spectral solution of the FP Eqn. Transition matrix P and the Fokker-Planck operator F. Label probability update formula: see the sketch below, where F has the eigendecomposition F = Q Λ Q^T. An iteration is carried out by setting the newly obtained label probabilities as the current initial label probabilities p0. (Haven't given the relation between Q and F …)
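The update formula itself was an image; the standard graph-spectral solution consistent with the surrounding text is p_t = Q exp(-tΛ) Q^T p_0. Below is a minimal Python sketch of one update step under stated assumptions: the operator F is taken as given (the deck does not spell out its construction from P here), F is assumed symmetric, and renormalization is done globally rather than per object for brevity; the function name is illustrative.

import numpy as np

def fokker_planck_update(F, p0, t=1.0):
    """One diffusion update: p_t = Q exp(-t * Lambda) Q^T p_0.

    F  : (N, N) symmetric Fokker-Planck operator on the support graph.
    p0 : (N,) current label probabilities over object-label assignments.
    t  : diffusion time; small t exploits the short-time behaviour.
    """
    lam, Q = np.linalg.eigh(F)                 # F = Q diag(lam) Q^T
    p_t = Q @ (np.exp(-t * lam) * (Q.T @ p0))  # apply exp(-tF) to p0
    p_t = np.clip(p_t, 0.0, None)              # clamp tiny negative values
    return p_t / max(p_t.sum(), 1e-12)         # renormalize (globally, for brevity)

As on the slide, each iteration feeds the new probabilities back in as p0, e.g. p = fokker_planck_update(F, p, t=1.0) inside a loop.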

Experiments The development is applied to data classification tasks. Five synthetic and two real-world data-sets have been used. Small time steps are used to exploit the short-time behaviour of the diffusion process; t is set to a range between 1 and 10. Compatibility values between labels: …

Experiments Five synthetic data-sets: I. Two rings (R2); II. Ring-Gaussian (RG); III. Three Gaussians (G3); IV. Four Gaussians (G4_1); V. Four Gaussians (G4_2). [Figure, top to bottom, left to right: (i) synthetic data with two class labels (R2); (ii) three class labels (RG); (iii) three class labels, a mixture of three Gaussians (G3); (iv) a mixture sample of points from four Gaussian distributions (G4_1); (v) a mixture sample of points from four Gaussian distributions (G4_2).]

Experimental Results Results on the synthetic data-sets. [Figures: I. results on the Three-Gaussian data (G3); II. results on the Ring-Gaussian data (RG); III. results on the Four-Gaussian data (G4_1).] Some data-sets from the UCI Repository of machine learning databases are also used: Iris and Wine.

Experimental Results Real-world data (from the UCI machine learning repository). [Figures: III. results on the Iris data-set; IV. results on the Wine data-set.]

Discussion A new development of probabilistic relaxation labelling in a graph setting. The diffusion process is used on the graph to collect evidence locally and propagate it globally. The process can also be viewed from a kernel-methods perspective. Experiments on data classification tasks are successful. The method can be applied to other tasks: image segmentation; speaker recognition; other applications. (I'd like to relate these concepts together … the possibility of discovering a new world.)

Thank you very much!