# Gaussian Mixture Example


Figure slides: snapshots of the Gaussian mixture fit at the start and after iterations 1, 2, 3, 4, 5, 6, and 20 (images not reproduced in this transcript).

A Gaussian Mixture Model for Clustering: assume that the data are generated from a mixture of Gaussian distributions. For each Gaussian distribution: center μ_j; variance σ² (ignored here, i.e., treated as known). For each data point: determine its membership, i.e., the probability that it was generated by each Gaussian.
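The generative assumption above can be written as a mixture density (a sketch in the slide's notation, with component index j, mixing weights π_j, and shared known variance σ²):

```latex
p(x) \;=\; \sum_{j=1}^{K} \pi_j \, \mathcal{N}\!\left(x \mid \mu_j, \sigma^2 I\right),
\qquad \sum_{j=1}^{K} \pi_j = 1 .
```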

Learning a Gaussian Mixture Model with known covariance

Log-likelihood of the Data: apply MLE to find the optimal parameters
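For n data points x_1, …, x_n drawn from a K-component Gaussian mixture with known shared variance σ², the log-likelihood that MLE maximizes is:

```latex
\ell(\pi, \mu) \;=\; \sum_{i=1}^{n} \log \sum_{j=1}^{K} \pi_j \, \mathcal{N}\!\left(x_i \mid \mu_j, \sigma^2 I\right)
```

The log of a sum has no closed-form maximizer, which is why the iterative E/M steps on the following slides are needed.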

Learning a Gaussian Mixture (with known covariance)

Learning a Gaussian Mixture Model: E-Step
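A hedged reconstruction of the E-step: compute the responsibility of each component for each point, where z_ij indicates that point x_i came from component j:

```latex
r_{ij} \;=\; p(z_{ij}=1 \mid x_i)
\;=\; \frac{\pi_j \exp\!\left(-\lVert x_i - \mu_j \rVert^2 / 2\sigma^2\right)}
          {\sum_{k=1}^{K} \pi_k \exp\!\left(-\lVert x_i - \mu_k \rVert^2 / 2\sigma^2\right)}
```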

Learning a Gaussian Mixture Model: M-Step
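With known variance, the M-step reduces to responsibility-weighted averages: μ_j = Σ_i r_ij x_i / Σ_i r_ij and π_j = Σ_i r_ij / n. A minimal NumPy sketch of both steps for 1-D data (the function name, quantile initialization, and synthetic data are illustrative, not from the slides):

```python
import numpy as np

def em_gmm_known_var(x, K, sigma2=1.0, n_iter=30):
    """EM for a 1-D Gaussian mixture with known, shared variance sigma2.
    Only the means mu_j and mixing weights pi_j are estimated."""
    n = len(x)
    # spread the initial centers across the data via quantiles
    mu = np.quantile(x, (np.arange(K) + 0.5) / K)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibility r[i, j] = P(z_ij = 1 | x_i)
        w = pi * np.exp(-0.5 * (x[:, None] - mu[None, :]) ** 2 / sigma2)
        r = w / w.sum(axis=1, keepdims=True)
        # M-step: soft-count weighted re-estimates of means and weights
        Nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / Nk
        pi = Nk / n
    return mu, pi

# two well-separated clusters around -5 and +5
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-5.0, 1.0, 200), rng.normal(5.0, 1.0, 200)])
mu, pi = em_gmm_known_var(x, K=2)
```

On well-separated data like this, the estimated means land near the true cluster centers and the weights near 0.5 each.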

Mixture Model for Document Clustering: a set of language models θ_1, …, θ_K, each assigning a probability to a document.

Introduce a hidden variable z_ij: z_ij = 1 if document d_i is generated by the j-th language model θ_j.
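Under the unigram assumption, each language model θ_j is a distribution over words, and a document's probability under the mixture can be sketched as follows (c(w, d_i) is the count of word w in d_i; the multinomial coefficient is omitted):

```latex
p(d_i) \;=\; \sum_{j=1}^{K} \pi_j \, p(d_i \mid \theta_j),
\qquad
p(d_i \mid \theta_j) \;\propto\; \prod_{w} p(w \mid \theta_j)^{c(w, d_i)}
```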

Learning a Mixture Model: E-Step (K is the number of language models)
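The E-step computes the posterior membership of each document, mirroring the Gaussian case (a hedged reconstruction):

```latex
r_{ij} \;=\; p(z_{ij} = 1 \mid d_i)
\;=\; \frac{\pi_j \, p(d_i \mid \theta_j)}{\sum_{k=1}^{K} \pi_k \, p(d_i \mid \theta_k)}
```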

Learning a Mixture Model: M-Step (N is the number of documents)
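The M-step re-estimates π_j = Σ_i r_ij / N and p(w | θ_j) ∝ Σ_i r_ij · c(w, d_i). A minimal NumPy sketch of the full EM loop on a toy term-count matrix (the function name, random initialization, smoothing constant, and toy corpus are all illustrative assumptions, not from the slides):

```python
import numpy as np

def em_doc_mixture(counts, K, n_iter=50, seed=0):
    """EM for a mixture of K unigram language models.
    counts: (N, V) matrix of word counts, one row per document."""
    rng = np.random.default_rng(seed)
    N, V = counts.shape
    pi = np.full(K, 1.0 / K)
    theta = rng.dirichlet(np.ones(V), size=K)       # one word distribution per model
    for _ in range(n_iter):
        # E-step: log p(d_i | theta_j) up to the multinomial coefficient
        log_post = np.log(pi) + counts @ np.log(theta).T   # shape (N, K)
        log_post -= log_post.max(axis=1, keepdims=True)    # numerical stability
        r = np.exp(log_post)
        r /= r.sum(axis=1, keepdims=True)                  # responsibilities r[i, j]
        # M-step: soft-count re-estimates, lightly smoothed to avoid log(0)
        pi = r.sum(axis=0) / N
        theta = r.T @ counts + 1e-9
        theta /= theta.sum(axis=1, keepdims=True)
    return pi, theta, r

# toy corpus: documents 0-1 and 2-3 use disjoint vocabularies
counts = np.array([[5., 3., 0., 0.],
                   [4., 4., 0., 0.],
                   [0., 0., 6., 2.],
                   [0., 0., 3., 5.]])
pi, theta, r = em_doc_mixture(counts, K=2, seed=1)
labels = r.argmax(axis=1)
```

With disjoint vocabularies the responsibilities sharpen quickly, and the hard assignments `labels` recover the two document groups.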