Random Swap EM algorithm for GMM and Image Segmentation

Random Swap EM algorithm for GMM and Image Segmentation
Qinpei Zhao, Ville Hautamäki, Ismo Kärkkäinen, Pasi Fränti
Speech & Image Processing Unit, Department of Computer Science
University of Joensuu, Box 111, FIN-80101 Joensuu, FINLAND
zhao@cs.joensuu.fi

Outline
- Background & Status
- RS-EM
- Application

Background: Mixture Model
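The mixture-model equation on this slide did not survive the transcript; the standard Gaussian mixture density it presumably introduced is:

```latex
p(x \mid \Theta) = \sum_{k=1}^{K} \alpha_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k),
\qquad \alpha_k \ge 0, \quad \sum_{k=1}^{K} \alpha_k = 1
```

where the α_k are the mixing coefficients and Θ = {α_k, μ_k, Σ_k} collects the model parameters.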

Background: EM algorithm
E-step (Expectation): compute the posterior responsibilities
  w_ik = α_k N(x_i | μ_k, Σ_k) / Σ_j α_j N(x_i | μ_j, Σ_j)
M-step (Maximization): re-estimate the parameters
  α_k = (1/n) Σ_i w_ik,  μ_k = Σ_i w_ik x_i / Σ_i w_ik,
  Σ_k = Σ_i w_ik (x_i − μ_k)(x_i − μ_k)ᵀ / Σ_i w_ik
Iterate the E and M steps until convergence.
α – mixing coefficient; Θ – model parameters, e.g. {μ, Σ}
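The E/M updates above can be sketched in NumPy. This is a minimal illustration with diagonal covariances, not the authors' implementation; the function name `em_gmm` and the optional `mu0` initializer are ours.

```python
import numpy as np

def em_gmm(X, K, n_iter=100, mu0=None, seed=0):
    """Plain EM for a K-component Gaussian mixture with diagonal covariances.

    A minimal sketch of the E/M updates on the slide: alpha are the mixing
    coefficients, mu/var the per-component means and variances.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = (X[rng.choice(n, K, replace=False)].astype(float)
          if mu0 is None else np.asarray(mu0, float).copy())
    var = np.tile(X.var(axis=0), (K, 1)) + 1e-6   # broad initial variances
    alpha = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities w_ik, computed in the log domain for stability
        log_p = (-0.5 * ((X[:, None, :] - mu) ** 2 / var).sum(-1)
                 - 0.5 * np.log(2 * np.pi * var).sum(-1)
                 + np.log(alpha))
        w = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)
        # M-step: re-estimate alpha, mu, var from the responsibilities
        nk = w.sum(axis=0)
        alpha = nk / n
        mu = (w.T @ X) / nk[:, None]
        var = (w.T @ X**2) / nk[:, None] - mu**2 + 1e-6
    return alpha, mu, var
```

With a reasonable initialization and well-separated clusters, the estimated means converge to the true cluster centers.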

Local Maxima
Let's describe it as mountain climbing…

[Figure: mountain-climbing analogy — peaks of 2160 m and 3099 m along a 600 km terrain profile]

Initialization Effect
[Figures: initialization and resulting fit, example (1) and example (2)]

Sub-optimal Example
The situation of a local-maximum trap.

Status
- Standard EM for mixture models (1977)
- Deterministic Annealing EM (DAEM) (1998)
- Split-and-Merge EM (SMEM) (2000)
- Greedy EM (2002)
- RS-EM: coming…

Outline
- Background & Status
- RS-EM (Random Swap)
- Application

RS-EM: Motivations
Randomization in the search:
- prevents staying near the unstable or hyperbolic fixed points of EM;
- prevents convergence to stable fixed points that correspond to insignificant local maxima of the likelihood function;
- avoids the slow convergence of the EM algorithm;
- makes the result less sensitive to initialization.

Formulas
[Equations for SMEM, Greedy EM and RS-EM shown on the slide]

Random Swap EM
[Figure panels: after EM, after EM, after swap]
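The swap step illustrated above can be sketched as follows. This is a rough NumPy sketch under simple assumptions (diagonal-covariance GMM; a swap re-seeds one randomly chosen component at a random data point; the swap is kept only if the log-likelihood after re-running EM improves). The function names are illustrative, not the authors' code.

```python
import numpy as np

def loglik(X, alpha, mu, var):
    """Total log-likelihood of X under a diagonal-covariance GMM."""
    log_p = (-0.5 * ((X[:, None, :] - mu) ** 2 / var).sum(-1)
             - 0.5 * np.log(2 * np.pi * var).sum(-1)
             + np.log(alpha))
    m = log_p.max(axis=1, keepdims=True)      # log-sum-exp for stability
    return float((m[:, 0] + np.log(np.exp(log_p - m).sum(axis=1))).sum())

def em(X, alpha, mu, var, n_iter=30):
    """A few plain EM iterations (E-step responsibilities, M-step updates)."""
    n = len(X)
    for _ in range(n_iter):
        log_p = (-0.5 * ((X[:, None, :] - mu) ** 2 / var).sum(-1)
                 - 0.5 * np.log(2 * np.pi * var).sum(-1)
                 + np.log(alpha))
        w = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)
        nk = w.sum(axis=0)
        alpha = nk / n
        mu = (w.T @ X) / nk[:, None]
        var = (w.T @ X**2) / nk[:, None] - mu**2 + 1e-6
    return alpha, mu, var

def random_swap_em(X, K, n_swaps=10, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.full(K, 1.0 / K)
    mu = X[rng.choice(n, K, replace=False)].astype(float)
    var = np.tile(X.var(axis=0), (K, 1)) + 1e-6
    alpha, mu, var = em(X, alpha, mu, var)
    best = loglik(X, alpha, mu, var)
    for _ in range(n_swaps):
        # swap: re-seed one randomly chosen component at a random data point
        mu2 = mu.copy()
        mu2[rng.integers(K)] = X[rng.integers(n)]
        a2, m2, v2 = em(X, alpha, mu2, var)
        ll = loglik(X, a2, m2, v2)
        if ll > best:                          # accept only improving swaps
            alpha, mu, var, best = a2, m2, v2, ll
    return alpha, mu, var, best
```

Because a swap is accepted only when the likelihood improves, the result is never worse than plain EM from the same initialization.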

Comparisons (1)

Comparisons (2)
[Figures: results on datasets Q1, Q2, S1 and S4]

Outline
- Background & Status
- RS-EM
- Application

Application
- Image segmentation
- Color quantization
- Image retrieval
- …
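For the segmentation application, once a GMM has been fitted (by any EM variant, including RS-EM), each pixel is labeled with the component of highest posterior responsibility. A minimal sketch, assuming a diagonal-covariance model and a `segment` function of our own naming:

```python
import numpy as np

def segment(image, alpha, mu, var):
    """Label each pixel with the mixture component of highest posterior.

    image: (H, W, C) float array of pixel features (e.g. colors);
    alpha/mu/var: parameters of a fitted diagonal-covariance GMM.
    """
    H, W, C = image.shape
    X = image.reshape(-1, C)
    # per-pixel log posterior (up to a constant) for each component
    log_p = (-0.5 * ((X[:, None, :] - mu) ** 2 / var).sum(-1)
             - 0.5 * np.log(2 * np.pi * var).sum(-1)
             + np.log(alpha))
    return log_p.argmax(axis=1).reshape(H, W)
```

For a grayscale image whose left half is dark and right half is bright, a two-component model with means near the two gray levels labels the halves 0 and 1 respectively.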

Conclusion
- Introduces randomization into the algorithm
- Performs better, without heavy time complexity
- Wider applications
Thanks! ☺