Jonah Shifrin, Bryan Pardo, Colin Meek, William Birmingham

Presentation transcript:

Hidden Markov Models: HMM-Based Musical Query Retrieval. Jonah Shifrin, Bryan Pardo, Colin Meek, William Birmingham. Student: Leela Krishna Kadiri

Introduction A hidden Markov model (HMM) is a model that explicitly maintains a probability distribution over the set of possible observations for each state. More formally, an HMM requires two things in addition to what a standard Markov model requires: a set of possible observations, O = {o1, o2, o3, …, on}, and a probability distribution over the set of observations for each state in S. The paper describes a way to retrieve a piece of music from a database given a sung musical query; the method used is the HMM (hidden Markov model).

The Method Pieces in the database are represented as hidden Markov models (HMMs). The query is treated as an observation sequence, and a piece is ranked as similar to the query if its HMM has a high likelihood of generating that sequence. The query is sung by the user, recorded in .wav format, and transcribed into MIDI. A sequence of values (deltaPitch, IOI, and IOIratio) is then derived from the MIDI representation.

The Method A note transition between note n and note n+1 is described by the duple <deltaPitch, IOIratio>. deltaPitch_n is the difference in pitch between note n and note n+1. The inter-onset interval IOI_n is the difference between the onsets of notes n and n+1. IOIratio_n = IOI_n / IOI_(n+1). For the final transition, which has no following inter-onset interval, IOIratio_n = IOI_n / duration_(n+1).
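To make these definitions concrete, here is a minimal Python sketch (not from the paper) that derives the <deltaPitch, IOIratio> duples from a list of notes. The (onset, duration, MIDI pitch) note format and the sign convention for deltaPitch are assumptions for illustration, and the quantization of values into discrete bins (mentioned on the observation-probability slide) is omitted.

```python
from typing import List, Tuple

# Each note is (onset_time, duration, midi_pitch); the format is an assumption.
Note = Tuple[float, float, int]

def notes_to_duples(notes: List[Note]) -> List[Tuple[int, float]]:
    """Derive the <deltaPitch, IOIratio> duple for every note transition.

    deltaPitch_n is the pitch difference between notes n and n+1, IOI_n is
    the difference between their onsets, and IOIratio_n = IOI_n / IOI_(n+1).
    The final transition has no following IOI, so the duration of the last
    note is used instead.
    """
    onsets = [n[0] for n in notes]
    durations = [n[1] for n in notes]
    pitches = [n[2] for n in notes]

    # One inter-onset interval per transition: len(notes) - 1 of them.
    iois = [onsets[i + 1] - onsets[i] for i in range(len(notes) - 1)]

    duples = []
    for i in range(len(iois)):
        delta_pitch = pitches[i + 1] - pitches[i]
        if i + 1 < len(iois):
            ioi_ratio = iois[i] / iois[i + 1]
        else:
            # Final transition: divide by the duration of the last note.
            ioi_ratio = iois[i] / durations[i + 1]
        duples.append((delta_pitch, ioi_ratio))
    return duples

# Example: four notes yield three transitions.
example = [(0.0, 0.5, 60), (0.5, 0.5, 62), (1.0, 1.0, 64), (2.0, 1.0, 60)]
print(notes_to_duples(example))  # [(2, 1.0), (2, 0.5), (-4, 1.0)]
```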

The Method (figure: an example HMM for a theme, with states as nodes, start probabilities below each node, and transition probabilities on the edges; described on the next slide)

The Method The songs in the database are represented by their themes, and these themes are represented as HMMs. For the HMM construction, the states correspond to note transitions; e.g., eight note transitions may be described by four unique <deltaPitch, IOIratio> duples. In the figure, the nodes represent the states, the arrows represent transitions, the value below each node is the probability that traversal begins at that state, and the numerical values on the edges are transition probabilities.

States Each HMM is built automatically from a MIDI file encoding the theme. The unique duples characterizing the note transitions found in the MIDI file form the states of the model. The previous example showed a passage with eight note transitions characterized by four unique duples; each unique duple is represented as a state.
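A hedged sketch of this construction: each unique duple becomes a state, and transition probabilities are estimated from counts of consecutive duples in the theme. The example sequence is invented (the actual values from the slide's figure are not in the transcript), and the paper's exact choices for smoothing and start probabilities may differ.

```python
from collections import Counter, defaultdict

def build_theme_model(duples):
    """Build HMM states and transition probabilities from a theme's duple
    sequence: each unique <deltaPitch, IOIratio> duple is a state, and
    transition probabilities are normalized counts of consecutive duples.
    """
    states = sorted(set(duples))
    counts = defaultdict(Counter)
    for a, b in zip(duples, duples[1:]):
        counts[a][b] += 1

    transitions = {}
    for s in states:
        total = sum(counts[s].values())
        transitions[s] = ({t: c / total for t, c in counts[s].items()}
                          if total else {})
    return states, transitions

# Eight note transitions described by four unique duples (invented values).
seq = [(2, 1.0), (2, 0.5), (-4, 2.0), (2, 1.0),
       (2, 0.5), (-4, 2.0), (1, 1.0), (2, 1.0)]
states, transitions = build_theme_model(seq)
print(states)
print(transitions)
```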

Hidden Markov Model The model is a weighted automaton that consists of:
- A set of states, S = {s1, s2, s3, …, sn}.
- A set of transition probabilities, T, where each t_ij in T is the probability of a transition from s_i to s_j.
- A probability distribution, π, where π_i is the probability that the automaton begins in state s_i.
- E, a subset of S containing the legal ending states.
- A set of possible observations, O = {o1, o2, o3, …, on}.
- A probability distribution over the set of observations for each state in S.
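For reference, the components listed above can be collected into a single container. This is an illustrative sketch only, not the authors' data structure; the field names are assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List, Set, Tuple

State = Tuple[int, float]        # a <deltaPitch, IOIratio> duple
Observation = Tuple[int, float]  # an observed duple from the sung query

@dataclass
class ThemeHMM:
    """Container mirroring the six components listed above."""
    states: List[State]                               # S
    transitions: Dict[State, Dict[State, float]]      # T: P(s_i -> s_j)
    start: Dict[State, float]                         # pi: P(begin in s_i)
    ending: Set[State]                                # E: legal ending states
    observations: List[Observation]                   # O
    emission: Dict[State, Dict[Observation, float]]   # P(o | s) for each state
```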

Hidden Markov Model In this retrieval method, all states are treated as legal ending states. The probability distribution over the initial state of each model in the database is given by a formula (not reproduced in this transcript) in which |S| is the number of states and p is the probability that the sung query exactly represents the theme in the database.

Estimating Observation Probabilities In a sung query there are 25 possible deltaPitch values and 27 possible IOIratio values, and the observed values depend on the hidden state's <deltaPitch, IOIratio> duple. Two observation/hidden-state tables are built, one 25×25 (observed deltaPitch versus hidden deltaPitch) and one 27×27 (observed IOIratio versus hidden IOIratio). Assuming conditional independence, the probability of encountering an observed duple given a hidden state is the product of the two table entries: P(<dP_o, IOIr_o> | <dP_h, IOIr_h>) = P(dP_o | dP_h) · P(IOIr_o | IOIr_h).
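A sketch of that lookup: the two conditional tables are 25×25 and 27×27 matrices (filled here with random placeholder values rather than the probabilities the authors estimate from sung queries), and the observation probability is the product of the two entries.

```python
import numpy as np

# Hypothetical conditional tables; in the paper these are learned from data.
# P_dp[o, h]  = P(observed deltaPitch bin o | hidden deltaPitch bin h), 25 x 25
# P_ioi[o, h] = P(observed IOIratio bin o  | hidden IOIratio bin h),   27 x 27
rng = np.random.default_rng(0)
P_dp = rng.dirichlet(np.ones(25), size=25).T    # each column sums to 1
P_ioi = rng.dirichlet(np.ones(27), size=27).T

def observation_prob(obs_dp, obs_ioi, hid_dp, hid_ioi):
    """P(observed duple | hidden duple) under the conditional-independence
    assumption described above: the product of the two table entries."""
    return P_dp[obs_dp, hid_dp] * P_ioi[obs_ioi, hid_ioi]

print(observation_prob(12, 13, 12, 13))
```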

Finding the Target Given the observation sequence and the set of models (themes), the Forward algorithm produces, for each model, the probability (a value between 0 and 1) that the HMM generated the observation sequence. Scaling this value by a constant and sorting the results in descending order yields a ranked list of themes; the top-ranked theme is returned as the song sung by the user.
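A minimal sketch of the Forward algorithm used for this scoring step, written over plain dictionaries. The function signature is an assumption, and the constant scaling mentioned above is omitted; only the standard forward recurrence is shown.

```python
def forward_likelihood(states, start, transitions, emission, observations):
    """Return P(observation sequence | model) via the Forward algorithm.

    states:      list of hidden states (duples)
    start:       dict state -> initial probability
    transitions: dict state -> {next_state: probability}
    emission:    dict state -> {observation: probability}
    """
    # alpha[s] = P(observations so far, currently in state s)
    alpha = {s: start.get(s, 0.0) * emission.get(s, {}).get(observations[0], 0.0)
             for s in states}
    for obs in observations[1:]:
        alpha = {
            s: emission.get(s, {}).get(obs, 0.0)
               * sum(alpha[r] * transitions.get(r, {}).get(s, 0.0) for r in states)
            for s in states
        }
    # All states are legal ending states, so sum over all of them.
    return sum(alpha.values())

# Hypothetical usage: score every theme and rank in descending order.
# ranked = sorted(themes.items(),
#                 key=lambda kv: forward_likelihood(*kv[1], query_obs),
#                 reverse=True)
```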

Results The results are compared to a baseline matcher; the proposed system outperformed the baseline in 8 out of 10 cases.

HMM HMMs are employed in this problem mainly to handle the pitch errors introduced by singers. Variants such as the pair HMM offer a more complex but efficient way of modeling the possible states, with the difference that a pair HMM generates two observation sequences simultaneously.

References Jonah Shifrin, Bryan Pardo, Colin Meek, and William Birmingham. "HMM-Based Musical Query Retrieval." In Proceedings of JCDL '02, July 13-17, 2002, Portland, Oregon, USA. ACM 1-58113-513-0/02/0007.