Part 1: Markov Models for Pattern Recognition – Introduction. CSE717, SPRING 2008, CUBS, Univ at Buffalo

Textbook
Markov Models for Pattern Recognition: From Theory to Applications by Gernot A. Fink, 1st Edition, Springer, Nov 2007

Textbook Outline
- Foundations of mathematical statistics
- Vector quantization and mixture density models
- Markov models
  - Hidden Markov Models (HMMs): model formulation, classic algorithms, application domains
  - n-Gram models
- Systems
  - Character and handwriting recognition
  - Speech recognition
  - Analysis of biological sequences

Preliminary Requirements
Familiarity with probability theory and statistics; basic concepts of stochastic processes.

Part 2a: Foundations of Probability Theory, Statistics & Stochastic Processes. CSE717, SPRING 2008, CUBS, Univ at Buffalo

Coin Toss Problem
A coin toss result is modeled by a random variable $X$ with states 'head' and 'tail'; $S_X = \{\text{head}, \text{tail}\}$ is the set of states. Probabilities: $\Pr_X(\text{head})$ and $\Pr_X(\text{tail})$, e.g. 0.5 each for a fair coin.
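A minimal simulation sketch of this slide's example, assuming a fair coin and NumPy (the seed and toss count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n tosses of a fair coin: 0 = head, 1 = tail.
n = 10_000
tosses = rng.integers(0, 2, size=n)

# Empirical frequencies should approach Pr(head) = Pr(tail) = 0.5.
print("Pr(head) ~", np.mean(tosses == 0))
print("Pr(tail) ~", np.mean(tosses == 1))
```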

Discrete Random Variable
A discrete random variable's states are discrete: natural numbers, integers, etc. It is described by the probabilities of its states $\Pr_X(s_1), \Pr_X(s_2), \dots$, where $s_1, s_2, \dots$ are the discrete states (possible values of $X$). The probabilities over all states add up to 1: $\sum_i \Pr_X(s_i) = 1$.

Continuous Random Variable
A continuous random variable's states are continuous: real numbers, etc. It is described by its probability density function (p.d.f.) $p_X(s)$. The probability of $a < X < b$ is obtained by integration: $\Pr(a < X < b) = \int_a^b p_X(s)\,ds$, and the integral from $-\infty$ to $\infty$ is 1: $\int_{-\infty}^{\infty} p_X(s)\,ds = 1$.
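A small numerical sketch of these two integrals, assuming SciPy is available and taking the standard normal as the example density (not specified on the slide):

```python
import numpy as np
from scipy import integrate, stats

# Example continuous random variable: X ~ N(0, 1).
pdf = stats.norm(loc=0.0, scale=1.0).pdf

# Pr(a < X < b) is the integral of the p.d.f. from a to b.
a, b = -1.0, 1.0
prob, _ = integrate.quad(pdf, a, b)
print("Pr(-1 < X < 1) ~", prob)          # about 0.6827

# The p.d.f. integrates to 1 over the whole real line.
total, _ = integrate.quad(pdf, -np.inf, np.inf)
print("total probability ~", total)       # about 1.0
```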

Joint Probability and Joint p.d.f.
Joint probability of discrete random variables: $\Pr_{X,Y}(s, t) = \Pr(X = s, Y = t)$. Joint p.d.f. of continuous random variables: $p_{X,Y}(s, t)$. Independence condition: $\Pr_{X,Y}(s, t) = \Pr_X(s)\Pr_Y(t)$ for all $s, t$ (for continuous variables, $p_{X,Y}(s, t) = p_X(s)\,p_Y(t)$).
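A sketch of the independence check on a hypothetical joint table (the numbers below are illustrative, not from the slides):

```python
import numpy as np

# Hypothetical joint probability table for two discrete variables X, Y.
# Rows index states of X, columns index states of Y; entries sum to 1.
joint = np.array([[0.1, 0.2],
                  [0.3, 0.4]])

# Marginals are obtained by summing out the other variable.
pX = joint.sum(axis=1)
pY = joint.sum(axis=0)

# X and Y are independent iff Pr(X=s, Y=t) = Pr(X=s) * Pr(Y=t) for all s, t.
print("independent:", np.allclose(joint, np.outer(pX, pY)))  # False here
```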

Conditional Probability and p.d.f.
Conditional probability of discrete random variables: $\Pr(X = s \mid Y = t) = \Pr_{X,Y}(s, t) / \Pr_Y(t)$. Conditional p.d.f. for continuous random variables: $p_{X \mid Y}(s \mid t) = p_{X,Y}(s, t) / p_Y(t)$.

Statistics: Expected Value and Variance
For a discrete random variable: $E[X] = \sum_i s_i \Pr_X(s_i)$ and $\operatorname{Var}[X] = E[(X - E[X])^2] = \sum_i (s_i - E[X])^2 \Pr_X(s_i)$. For a continuous random variable: $E[X] = \int_{-\infty}^{\infty} s\,p_X(s)\,ds$ and $\operatorname{Var}[X] = \int_{-\infty}^{\infty} (s - E[X])^2\,p_X(s)\,ds$.
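A direct sketch of the discrete formulas (states and probabilities below are made-up values for illustration):

```python
import numpy as np

# Discrete random variable: states and their probabilities (sum to 1).
states = np.array([1.0, 2.0, 3.0])
probs  = np.array([0.2, 0.5, 0.3])

# E[X] = sum_i s_i * Pr(s_i)
mean = np.sum(states * probs)
# Var[X] = E[(X - E[X])^2] = sum_i (s_i - E[X])^2 * Pr(s_i)
var = np.sum((states - mean) ** 2 * probs)
print("E[X] =", mean, " Var[X] =", var)   # 2.1 and 0.49
```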

Normal Distribution of a Single Random Variable
Notation: $X \sim N(\mu, \sigma^2)$. p.d.f.: $p_X(s) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(s-\mu)^2}{2\sigma^2}\right)$. Expected value: $E[X] = \mu$. Variance: $\operatorname{Var}[X] = \sigma^2$.
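A sketch that codes the p.d.f. above and checks the mean and variance by sampling (the values of mu and sigma are arbitrary):

```python
import numpy as np

mu, sigma = 1.0, 2.0

def normal_pdf(s, mu, sigma):
    """p.d.f. of N(mu, sigma^2), as on the slide."""
    return np.exp(-(s - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

print("p.d.f. at the mean:", normal_pdf(mu, mu, sigma))  # ~0.1995

# Sample-based check: mean and variance of draws match mu and sigma^2.
rng = np.random.default_rng(0)
samples = rng.normal(mu, sigma, size=100_000)
print("sample mean ~", samples.mean())   # ~ 1.0
print("sample var  ~", samples.var())    # ~ 4.0
```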

Stochastic Process
A stochastic process is a time series of random variables $\{X_t\}$: $X_t$ is a random variable and $t$ is a time stamp. Examples: an audio signal, stock market prices.

Causal Process
A stochastic process is causal if it has a finite history. A causal process can be represented by the conditional probabilities $\Pr(x_t \mid x_1, \dots, x_{t-1})$, since the joint probability factorizes as $\Pr(x_1, \dots, x_t) = \prod_{\tau=1}^{t} \Pr(x_\tau \mid x_1, \dots, x_{\tau-1})$.

Stationary Process
A stochastic process is stationary if its joint probabilities are invariant under time shifts, i.e., for any $n$, any times $t_1, \dots, t_n$, and any shift $\tau$: $\Pr(x_{t_1}, \dots, x_{t_n}) = \Pr(x_{t_1+\tau}, \dots, x_{t_n+\tau})$. A stationary process in this sense is sometimes referred to as strictly stationary, in contrast with weak or wide-sense stationarity.

Gaussian White Noise
White noise: the $X_t$ are independent and identically distributed (i.i.d.). Gaussian white noise: the $X_t$ are i.i.d. with $X_t \sim N(\mu, \sigma^2)$.
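A one-liner sketch of generating Gaussian white noise, assuming zero mean (the slide does not fix mu) and NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian white noise: X_t i.i.d. N(0, sigma^2) at every time step t.
sigma, T = 1.0, 1000
noise = rng.normal(0.0, sigma, size=T)

# The draws are independent across time and identically distributed at
# every t, which is what makes the process (strictly) stationary.
print("mean ~", noise.mean(), " std ~", noise.std())
```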

Gaussian White Noise is a Stationary Process
Proof sketch: for any $n$, times $t_1, \dots, t_n$, and shift $\tau$, independence factorizes the joint density into a product of marginals, and the marginals are identical at every time, so $p(x_{t_1}, \dots, x_{t_n}) = \prod_{i=1}^{n} p(x_{t_i}) = p(x_{t_1+\tau}, \dots, x_{t_n+\tau})$.

Temperature
Q1: Is the temperature within a day a stationary process?

Markov Chains
A causal process is a Markov chain if, for any $x_1, \dots, x_t$: $\Pr(x_t \mid x_1, \dots, x_{t-1}) = \Pr(x_t \mid x_{t-k}, \dots, x_{t-1})$, where $k$ is the order of the Markov chain. First-order Markov chain: $\Pr(x_t \mid x_{t-1})$. Second-order Markov chain: $\Pr(x_t \mid x_{t-2}, x_{t-1})$.
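A sketch of sampling a first-order chain over abstract states (the transition matrix below is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# First-order Markov chain over N abstract states: the distribution of
# x_t depends only on x_{t-1}, via a row-stochastic transition matrix P.
N = 3
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.1, 0.1, 0.8]])
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a distribution

x = 0                            # initial state
chain = [x]
for _ in range(10):
    x = rng.choice(N, p=P[x])    # Pr(x_t | x_{t-1}) = P[x_{t-1}, x_t]
    chain.append(int(x))
print(chain)
```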

Homogeneous Markov Chains
A $k$-th order Markov chain is homogeneous if the state transition probability is the same over time, i.e., $\Pr(x_t \mid x_{t-k}, \dots, x_{t-1})$ does not depend on $t$. Q2: Does a homogeneous Markov chain imply a stationary process?

State Transition in Homogeneous Markov Chains
Suppose $\{X_t\}$ is a $k$-th order homogeneous Markov chain and $S$ is the set of all possible states (values) of $x_t$. Then for any $k+1$ states $x_0, x_1, \dots, x_k \in S$, the state transition probability can be abbreviated to $\Pr(x_k \mid x_0, \dots, x_{k-1})$, dropping the time index.

Example of a Markov Chain
Two states: 'Rain' and 'Dry'. Transition probabilities: Pr('Rain'|'Rain') = 0.4, Pr('Dry'|'Rain') = 0.6, Pr('Rain'|'Dry') = 0.2, Pr('Dry'|'Dry') = 0.8.

Short-Term Forecast
Initial (say, Wednesday) probabilities: Pr_Wed('Rain') = 0.3, Pr_Wed('Dry') = 0.7. What is the probability of rain on Thursday? Pr_Thur('Rain') = Pr_Wed('Rain') × Pr('Rain'|'Rain') + Pr_Wed('Dry') × Pr('Rain'|'Dry') = 0.3 × 0.4 + 0.7 × 0.2 = 0.26.
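The same one-step forecast as a matrix-vector product, using the transition probabilities from the slides:

```python
import numpy as np

# Transition matrix from the slides; row = today, column = tomorrow.
# State order: [Rain, Dry].
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

wed = np.array([0.3, 0.7])   # [Pr_Wed(Rain), Pr_Wed(Dry)]

# One-step forecast: Pr_Thur = Pr_Wed @ P.
thur = wed @ P
print("Pr_Thur(Rain) =", thur[0])   # 0.3*0.4 + 0.7*0.2 = 0.26
```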

Condition for the Stationary (Steady-State) Distribution
Pr_t('Rain') = Pr_{t-1}('Rain') × Pr('Rain'|'Rain') + Pr_{t-1}('Dry') × Pr('Rain'|'Dry') = Pr_{t-1}('Rain') × 0.4 + (1 − Pr_{t-1}('Rain')) × 0.2 = 0.2 + 0.2 × Pr_{t-1}('Rain'). Setting Pr_t('Rain') = Pr_{t-1}('Rain') gives Pr('Rain') = 0.25 and Pr('Dry') = 1 − 0.25 = 0.75.

Steady-State Analysis
Pr_t('Rain') = 0.2 + 0.2 × Pr_{t-1}('Rain')
⇒ Pr_t('Rain') − 0.25 = 0.2 × (Pr_{t-1}('Rain') − 0.25)
⇒ Pr_t('Rain') = 0.2^{t-1} × (Pr_1('Rain') − 0.25) + 0.25
⇒ Pr_t('Rain') → 0.25 as t → ∞ (converges to the steady-state distribution).
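A sketch verifying the steady state two ways, by iterating the update and by the left eigenvector of the transition matrix for eigenvalue 1 (the eigenvector route is a standard equivalent method, not shown on the slide):

```python
import numpy as np

P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Iterating the one-step update converges to the steady-state distribution.
p = np.array([0.3, 0.7])
for _ in range(20):
    p = p @ P
print("iterated:", p)                 # ~ [0.25, 0.75]

# Equivalently: the steady state is the left eigenvector of P for
# eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
v = np.real(vecs[:, np.argmax(np.real(vals))])
print("eigenvector:", v / v.sum())    # ~ [0.25, 0.75]
```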

Periodic Markov Chain
A periodic Markov chain never converges to a steady-state distribution, e.g., a chain that alternates deterministically between 'Rain' and 'Dry': Pr('Dry'|'Rain') = Pr('Rain'|'Dry') = 1.
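A sketch of that period-2 example (assuming, as hedged above, that the slide's diagram shows the deterministic alternating chain):

```python
import numpy as np

# Deterministic, period-2 chain: Rain always follows Dry and vice versa.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

p = np.array([0.3, 0.7])   # initial distribution [Rain, Dry]
for t in range(6):
    print(t, p)            # oscillates between [0.3, 0.7] and [0.7, 0.3],
    p = p @ P              # so it never settles into a steady state
```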