Hierarchical Topic Models and the Nested Chinese Restaurant Process Blei, Griffiths, Jordan, Tenenbaum presented by Rodrigo de Salvo Braz

Document classification One-class approach: one topic per document, with words generated according to the topic. For example, a Naive Bayes model.

Document classification It is more realistic to assume more than one topic per document. Generative model: pick a mixture distribution over K topics and generate words from it.

Document classification Even more realistic: topics may be organized in a hierarchy (they are not independent). Pick a path from root to leaf in a tree, where each node is a topic, and sample words from the mixture of topics along the path.

Dirichlet distribution (DD) A distribution over probability vectors of dimension K: $P(p; u) = \frac{1}{Z(u)} \prod_i p_i^{u_i - 1}$. The parameters u act as pseudo-counts ("previous observations"), so the Dirichlet is a natural prior over distributions. The symmetric Dirichlet distribution assumes a uniform prior ($u_i = u_j$ for all $i, j$).
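
As a quick sketch (assuming NumPy; the values below are illustrative), each draw from a Dirichlet is itself a probability vector, and the pseudo-count u controls how uniform the draws look:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5  # dimension of the probability vectors

# Symmetric Dirichlet: all pseudo-counts u_i equal.
# Small u gives peaked, sparse-looking draws; large u gives near-uniform draws.
for u in (0.1, 1.0, 10.0):
    p = rng.dirichlet(np.full(K, u))
    print(u, p.round(3), float(p.sum()))  # each draw sums to 1
```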

Latent Dirichlet Allocation (LDA) Generative model of multiple-topic documents. Generate a mixture distribution over topics using a Dirichlet distribution; for each word, pick a topic according to that distribution and generate the word according to the topic's word distribution.

Latent Dirichlet Allocation (LDA) [Graphical model: DD hyperparameter → per-document topic distribution → one of K topics per word → word w, repeated over the W words of a document.]
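
A minimal sketch of this generative process, assuming NumPy; the sizes, `alpha`, and `topics` below are toy choices for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
K, V, W = 3, 8, 20                               # topics, vocabulary, words/doc
alpha = np.full(K, 0.5)                          # DD hyperparameter
topics = rng.dirichlet(np.full(V, 0.2), size=K)  # K word distributions

def generate_document():
    theta = rng.dirichlet(alpha)        # per-document mixture over topics
    z = rng.choice(K, size=W, p=theta)  # a topic for each word
    return [int(rng.choice(V, p=topics[k])) for k in z]  # vocab indices

print(generate_document())
```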

Chinese Restaurant Process (CRP) Animated example: nine customers enter a restaurant with an unbounded number of tables, one at a time. Each new customer sits at an occupied table with probability proportional to the number of customers already seated there, or at a new table with probability proportional to the concentration parameter. Once all 9 of 9 customers are seated, the seating arrangement is one sample from the CRP, and a data point (a distribution itself) is sampled for each table.
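
A small simulation of this seating rule, as a sketch; `gamma` is the name used here for the concentration parameter, which the slides do not name:

```python
import numpy as np

def crp_seating(n_customers, gamma, rng):
    """Seat customers one at a time; return each customer's table index."""
    table_sizes, seating = [], []
    for n in range(n_customers):
        # Occupied table k is chosen with probability table_sizes[k]/(n+gamma);
        # a new table is opened with probability gamma/(n+gamma).
        probs = np.array(table_sizes + [gamma], dtype=float) / (n + gamma)
        k = rng.choice(len(probs), p=probs)
        if k == len(table_sizes):
            table_sizes.append(0)
        table_sizes[k] += 1
        seating.append(k)
    return seating

rng = np.random.default_rng(2)
print(crp_seating(9, gamma=1.0, rng=rng))  # e.g. [0, 0, 1, 0, 0, 2, ...]
```

Small gamma concentrates customers at a few tables; large gamma opens new tables more often, which is what lets the number of mixture components grow with the data.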

Species Sampling Mixture Generative model of multiple-topic documents. Generate a mixture distribution over topics using a CRP prior, so the number of topics is unbounded; for each word, pick a topic according to that distribution and generate the word according to the topic's word distribution.

Species Sampling Mixture [Graphical model: CRP hyperparameter → per-document topic distribution → topic per word → word w, repeated over the W words of a document; the number of topics K is unbounded.]
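
A sketch for a single document under this kind of model, with words seated directly by the CRP rule and a fresh topic drawn from a base measure whenever a new table opens; sharing topics across documents would need extra machinery not shown on this slide:

```python
import numpy as np

rng = np.random.default_rng(3)
V, W, gamma = 8, 20, 1.0   # vocabulary size, words per document, concentration

def generate_document():
    topics, counts, words = [], [], []   # per-document tables and their topics
    for n in range(W):
        # CRP rule: join table k with prob. counts[k]/(n+gamma),
        # or open a new table with prob. gamma/(n+gamma).
        probs = np.array(counts + [gamma], dtype=float) / (n + gamma)
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):             # new table: draw a fresh topic
            counts.append(0)
            topics.append(rng.dirichlet(np.full(V, 0.2)))
        counts[k] += 1
        words.append(int(rng.choice(V, p=topics[k])))
    return words

print(generate_document())
```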

Nested CRP An infinite tree of restaurants: each table in a restaurant points to another restaurant one level down. A customer enters the root restaurant, picks a table by the CRP rule, then moves to the restaurant that table points to, and so on, tracing a root-to-leaf path through the tree.
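
A sketch of path sampling in a nested CRP, assuming a dict-based tree; `ncrp_path`, `gamma`, and the crude node identifiers are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)

def ncrp_path(node, depth, gamma):
    """Walk `depth` levels down a tree of CRPs; return the visited nodes.

    Each node is a dict {'count': customers through it, 'children': [...]};
    at every level the CRP rule is applied to the child counts.
    """
    path = [node]
    for _ in range(depth):
        counts = [c['count'] for c in node['children']]
        probs = np.array(counts + [gamma], dtype=float)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(node['children']):                 # open a new branch
            node['children'].append({'count': 0, 'children': []})
        node = node['children'][k]
        node['count'] += 1
        path.append(node)
    return path

root = {'count': 0, 'children': []}
for _ in range(5):                                     # five documents pick paths
    print([id(n) % 997 for n in ncrp_path(root, depth=2, gamma=1.0)])
```

Documents that share a prefix of tables share the corresponding ancestor topics, which is what induces the hierarchy.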

Hierarchical LDA (hLDA) Generative model of multiple-topic documents. For each document, draw a root-to-leaf path through the topic tree using the nested CRP prior, and draw a mixture distribution over the topics on that path; for each word, pick a level of the path according to that distribution and generate the word from the topic at that level.

hLDA graphical model
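
Putting the pieces together, a self-contained sketch of hLDA's generative side; drawing the level proportions from a Dirichlet and all constants below are simplifying assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
V, W, L = 8, 20, 3          # vocabulary size, words per document, path length
alpha = np.full(L, 0.5)     # prior over level proportions (an assumption)
base = np.full(V, 0.2)      # base measure for topic word distributions

def sample_path(node, gamma):
    """Choose a root-to-leaf path of L nodes via nested CRPs."""
    path = [node]
    while len(path) < L:
        counts = [c['count'] for c in node['children']]
        probs = np.array(counts + [gamma], dtype=float)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(node['children']):
            node['children'].append({'count': 0, 'children': []})
        node = node['children'][k]
        node['count'] += 1
        path.append(node)
    return path

def generate_document(root, gamma=1.0):
    path = sample_path(root, gamma)
    for n in path:
        if 'topic' not in n:            # attach a topic when a node is first used
            n['topic'] = rng.dirichlet(base)
    theta = rng.dirichlet(alpha)        # per-document mixture over levels
    levels = rng.choice(L, size=W, p=theta)
    return [int(rng.choice(V, p=path[l]['topic'])) for l in levels]

root = {'count': 0, 'children': []}
for _ in range(3):
    print(generate_document(root))
```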

Artificial data experiment Documents over a 25-term vocabulary; each vertical bar in the figure is a topic.

CRP prior vs. Bayes Factors

Predicting the structure

NIPS abstracts

Comments Accommodates growing collections of data; the hierarchical organization makes sense, but it is not clear to me why the CRP prior is the best prior for it; no mention of running time, which may be very long.