The Occasionally Dishonest Casino
Narrated by: Shoko Asei and Alexander Eng


Basic Probability
Basic probability can be used to predict simple, isolated events, such as the likelihood that a tossed coin lands on heads or tails. The occasionally dishonest casino concept can be used to assess the likelihood of a whole sequence of events occurring.

Loaded Dice
A casino uses a fair die most of the time but occasionally switches to a loaded die. A loaded die favors landing on a particular face (or faces). The switch is difficult to detect because the loaded die appears with low probability.

Emissions
Model of a casino where two dice are rolled:
◦ One is fair, with all faces equally probable: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
◦ The other is loaded, with the number “6” carrying half of the probability mass: P(1) = P(2) = P(3) = P(4) = P(5) = 1/10, P(6) = 1/2
◦ These are emission probabilities
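These emission probabilities can be written down directly. A minimal sketch (the variable names are illustrative, not from the slides) that also checks each die’s distribution sums to 1:

```python
from fractions import Fraction

# Emission probabilities for each die (names are illustrative).
FAIR = {face: Fraction(1, 6) for face in range(1, 7)}
LOADED = {face: Fraction(1, 10) for face in range(1, 6)}
LOADED[6] = Fraction(1, 2)  # "6" carries half of the probability mass

# Each distribution must sum to 1.
assert sum(FAIR.values()) == 1
assert sum(LOADED.values()) == 1
```

Using exact fractions avoids floating-point drift in these sanity checks.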

State Transitions
The changes of state are called transitions:
◦ The state is the die in use (fair or loaded), which stays the same or changes at each point in time
◦ The probability of the outcome of a roll is different for each state
The casino may switch dice before each roll:
◦ From a fair to a loaded die with probability 0.05
◦ From a loaded to a fair die with probability 0.1
◦ These are transition probabilities
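The transition probabilities can be tabulated the same way. Since each state’s outgoing probabilities must sum to 1, staying with the fair die has probability 0.95 and staying with the loaded die 0.9 (a sketch; the state labels "F"/"L" are my own):

```python
# Transition probabilities between hidden states "F" (fair) and "L" (loaded).
# Staying probabilities are the complements of the switching probabilities.
TRANS = {
    ("F", "F"): 0.95, ("F", "L"): 0.05,
    ("L", "F"): 0.10, ("L", "L"): 0.90,
}

# Rows sum to 1: the casino always either keeps or switches the die.
for state in ("F", "L"):
    assert abs(sum(TRANS[(state, nxt)] for nxt in ("F", "L")) - 1.0) < 1e-12
```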

Bayes’ Rule
A rule for finding the probability of one event in a related series of events from another. It relates:
◦ The probability of event A conditional on event B, P(A|B)
◦ The probability of event B conditional on event A, P(B|A)
◦ The two are not necessarily the same: P(A|B) = P(B|A)·P(A)/P(B)
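As a concrete sketch using this casino’s own emission probabilities (the 1/2 prior on the loaded die is an assumption for illustration), Bayes’ rule gives the probability that the die was loaded given that a 6 was rolled:

```python
# A = "loaded die in use", B = "a 6 was rolled".
p_A = 0.5               # assumed prior on the loaded die
p_B_given_A = 1 / 2     # P(6 | loaded)
p_B_given_notA = 1 / 6  # P(6 | fair)

# Total probability of B, then Bayes' rule: P(A|B) = P(B|A)*P(A)/P(B).
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)
p_A_given_B = p_B_given_A * p_A / p_B
print(p_A_given_B)  # ≈ 0.75: a single 6 already tilts the odds toward "loaded"
```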

Markov Chain Models
A Markov chain model (MCM) is a sequence of states in which the probability of each state depends only on the state immediately preceding it. It is based on the Markov assumption, which states that “the probability of a future observation given past and present observations depends only on the present”.

Hidden Markov Models
A hidden Markov model (HMM) is a statistical model whose state sequence is not directly observed:
◦ Transitions between states are nondeterministic, with known probabilities
◦ It is an extension of a Markov chain model
The system has observable outputs (emissions) from which the hidden states can be inferred.

The Occasionally Dishonest Casino: An Example
Hidden Markov model structure:
◦ Emission and transition probabilities are known
◦ The observed sequence is “656”
◦ The use of the fair and/or loaded die is unknown
Goal: find the path, or sequence of states, that most likely produced the observed sequence.

The Occasionally Dishonest Casino: An Example
P(FFL, 656) = joint probability of the path “FFL” and the sequence “656”
= P(F) * P(6|F) * P(F→F) * P(5|F) * P(F→L) * P(6|L)
where each P(state)·P(roll|state) pair corresponds to the 1st, 2nd, and 3rd toss in turn.
Emission probabilities: P(6|F) = 1/6, P(5|F) = 1/6, P(6|L) = 1/2
Transition probabilities: P(F→F) = 0.95, P(F→L) = 0.05
P(F) = P(L) = probability of starting with the fair/loaded die
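This product can be computed mechanically. A sketch (the function and dictionary names are my own) that multiplies the start, emission, and transition probabilities along any path:

```python
# Model parameters from the slides; "F" = fair die, "L" = loaded die.
START = {"F": 0.5, "L": 0.5}
EMIT = {"F": {5: 1 / 6, 6: 1 / 6}, "L": {5: 1 / 10, 6: 1 / 2}}
TRANS = {("F", "F"): 0.95, ("F", "L"): 0.05, ("L", "F"): 0.10, ("L", "L"): 0.90}

def joint(path, rolls):
    """Joint probability of a hidden-state path and an observed roll sequence."""
    p = START[path[0]] * EMIT[path[0]][rolls[0]]
    for prev, cur, roll in zip(path, path[1:], rolls[1:]):
        p *= TRANS[(prev, cur)] * EMIT[cur][roll]
    return p

print(joint("FFL", (6, 5, 6)))  # ≈ 0.00033
```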

The Occasionally Dishonest Casino: An Example
Path  Sequence  Calculation
1     FFF       P(F)*P(6|F)*P(F→F)*P(5|F)*P(F→F)*P(6|F)
2     LFF       P(L)*P(6|L)*P(L→F)*P(5|F)*P(F→F)*P(6|F)
3     FLF       P(F)*P(6|F)*P(F→L)*P(5|L)*P(L→F)*P(6|F)
4     FFL       P(F)*P(6|F)*P(F→F)*P(5|F)*P(F→L)*P(6|L)
5     LLF       P(L)*P(6|L)*P(L→L)*P(5|L)*P(L→F)*P(6|F)
6     LFL       P(L)*P(6|L)*P(L→F)*P(5|F)*P(F→L)*P(6|L)
7     FLL       P(F)*P(6|F)*P(F→L)*P(5|L)*P(L→L)*P(6|L)
8     LLL       P(L)*P(6|L)*P(L→L)*P(5|L)*P(L→L)*P(6|L)

The Occasionally Dishonest Casino: An Example
Path  Sequence  Calculation                              Probability
1     FFF       (1/2)*(1/6)*(0.95)*(1/6)*(0.95)*(1/6)    ≈ 0.002089
2     LFF       (1/2)*(1/2)*(0.10)*(1/6)*(0.95)*(1/6)    ≈ 0.000660
3     FLF       (1/2)*(1/6)*(0.05)*(1/10)*(0.10)*(1/6)   ≈ 0.000007
4     FFL       (1/2)*(1/6)*(0.95)*(1/6)*(0.05)*(1/2)    ≈ 0.000330
5     LLF       (1/2)*(1/2)*(0.90)*(1/10)*(0.10)*(1/6)   ≈ 0.000375
6     LFL       (1/2)*(1/2)*(0.10)*(1/6)*(0.05)*(1/2)    ≈ 0.000104
7     FLL       (1/2)*(1/6)*(0.05)*(1/10)*(0.90)*(1/2)   ≈ 0.000188
8     LLL       (1/2)*(1/2)*(0.90)*(1/10)*(0.90)*(1/2)   ≈ 0.010125
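For three rolls, the eight paths can simply be enumerated. A brute-force sketch (names are my own; for longer sequences the Viterbi algorithm finds the best path efficiently instead of enumerating all of them):

```python
from itertools import product

# Model parameters from the slides; "F" = fair die, "L" = loaded die.
START = {"F": 0.5, "L": 0.5}
EMIT = {"F": {5: 1 / 6, 6: 1 / 6}, "L": {5: 1 / 10, 6: 1 / 2}}
TRANS = {("F", "F"): 0.95, ("F", "L"): 0.05, ("L", "F"): 0.10, ("L", "L"): 0.90}

def joint(path, rolls):
    """Joint probability of a hidden-state path and an observed roll sequence."""
    p = START[path[0]] * EMIT[path[0]][rolls[0]]
    for prev, cur, roll in zip(path, path[1:], rolls[1:]):
        p *= TRANS[(prev, cur)] * EMIT[cur][roll]
    return p

rolls = (6, 5, 6)
table = {"".join(p): joint(p, rolls) for p in product("FL", repeat=3)}
best = max(table, key=table.get)
print(best)  # LLL: the loaded die on all three tosses
```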

The Occasionally Dishonest Casino: An Example
The most likely path is the one that accounts for the greatest share of the total probability. Three consecutive tosses of a loaded die most likely produced the sequence “656”:
8     LLL       (1/2)*(1/2)*(0.90)*(1/10)*(0.90)*(1/2)   ≈ 0.010125

A Final Note
The occasionally dishonest casino concept is applicable to many systems. It is commonly used in bioinformatics to model DNA or protein sequences:
◦ Consider a twenty-sided die with a different amino acid representing each face…

Snake Eyes!