

1
Accelerating the Viterbi Algorithm (2012-08-14) Pei-Ching Li

2
Outline
– Introduction of the Viterbi Algorithm (Example)
– Architecture (Parallel on CUDA)
– MIDI (hmmtrain)
– Future Work

3
Introduction of the Viterbi Algorithm: a dynamic programming algorithm for finding the most likely sequence of hidden states, called the Viterbi path, in a hidden Markov model (HMM).

4
Example O = Walk->Walk->Shop->Clean

5
Example O = Walk->Walk->Shop->Clean

6
Example O = Walk->Walk->Shop->Clean

7
Example O = Walk->Walk->Shop->Clean

8
Example O = Walk->Walk->Shop->Clean S = Sunny->Sunny->Rainy->Rainy
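The example slides above refer to figures that did not survive extraction, so the actual probabilities are not in the text. The sketch below uses the classic textbook values for the Rainy/Sunny model (an assumption, not the slides' confirmed numbers); with those values, Viterbi decoding of Walk→Walk→Shop→Clean does reproduce the path shown on this slide.

```python
# Viterbi decoding for the Rainy/Sunny example.
# The probabilities below are assumed (the classic textbook values);
# the deck's actual numbers were in figures lost during extraction.

states = ["Sunny", "Rainy"]
start = {"Sunny": 0.4, "Rainy": 0.6}
trans = {"Sunny": {"Sunny": 0.6, "Rainy": 0.4},
         "Rainy": {"Sunny": 0.3, "Rainy": 0.7}}
emit = {"Sunny": {"Walk": 0.6, "Shop": 0.3, "Clean": 0.1},
        "Rainy": {"Walk": 0.1, "Shop": 0.4, "Clean": 0.5}}

def viterbi(obs):
    # V[t][s] = probability of the best path ending in state s at time t
    V = [{s: start[s] * emit[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev, p = max(((r, V[t - 1][r] * trans[r][s]) for r in states),
                          key=lambda x: x[1])
            V[t][s] = p * emit[s][obs[t]]
            back[t][s] = prev
    # Trace the best final state backwards through the stored pointers.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

print(viterbi(["Walk", "Walk", "Shop", "Clean"]))
# → ['Sunny', 'Sunny', 'Rainy', 'Rainy'], matching the slide
```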

9
Parallel Part CSE551 Final Project: Parallel Viterbi on a GPU – Authors: Seong Jae Lee, Miro Enev – Provenance: Autumn 2009, University of Washington

10
Architecture
– Input: transition probability, emission probability
– Algorithm: hmmgenerate, hmmviterbi, accuracy of hmmviterbi

11
MATLAB 2011: [SEQ, STATES] = HMMGENERATE(LEN, TRANSITIONS, EMISSIONS); STATES = HMMVITERBI(SEQ, TRANSITIONS, EMISSIONS). Use MATLAB Coder to generate C/C++ code.
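As context for HMMGENERATE, a rough Python sketch of what it does: sample a hidden-state path from the transition matrix and an emission at each step. Like MATLAB's function, it assumes the chain starts from the first state; the fair/loaded-die model at the bottom is an illustrative assumption, not from the slides.

```python
import random

def hmm_generate(length, trans, emis, seed=0):
    """Rough Python analogue of MATLAB's HMMGENERATE.
    trans: S x S row-stochastic matrix; emis: S x O row-stochastic matrix.
    Like MATLAB, start the chain from state 0 (MATLAB's state 1)."""
    rng = random.Random(seed)

    def draw(probs):
        # Sample an index according to the given probability row.
        r, acc = rng.random(), 0.0
        for i, p in enumerate(probs):
            acc += p
            if r < acc:
                return i
        return len(probs) - 1

    seq, states, s = [], [], 0
    for _ in range(length):
        s = draw(trans[s])         # next hidden state
        states.append(s)
        seq.append(draw(emis[s]))  # symbol emitted from that state
    return seq, states

# Illustrative model (assumed): fair die vs. loaded die.
trans = [[0.95, 0.05], [0.10, 0.90]]
emis = [[1 / 6.0] * 6, [0.1] * 5 + [0.5]]
seq, states = hmm_generate(10, trans, emis)
print(len(seq), len(states))  # → 10 10
```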

12
Parallel on CUDA: focus on accelerating hmmviterbi()
– compute the per-state values in parallel
– choose the maximum with a parallel reduction

13
Parallel on CUDA (2nd version)

14
Parallel on CUDA

15
MIDI Score: length 1 second. hmmtrain:
– unknown states
– initial guesses for TRANS and EMIS
– hmmtrain(seq, TRANS_GUESS, EMIS_GUESS)

16
TRANS_GUESS: 12x12
– C → D, D → E, E → F, F → G: 0.8
– Others: random
EMIS_GUESS: played or not
– 0.9 vs. 0.1
– Not accepted
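A hedged reconstruction of how such a TRANS_GUESS could be built: put 0.8 on the favoured transitions and split the remaining 0.2 of each of those rows randomly, so every row still sums to 1. The pitch-class indices for C→D, D→E, E→F, F→G are my assumption about the slide's note names, not something the deck states.

```python
import random

def make_trans_guess(n=12, strong=0.8,
                     strong_pairs=((0, 2), (2, 4), (4, 5), (5, 7)), seed=1):
    """Hypothetical reconstruction of TRANS_GUESS: probability `strong`
    on the favoured transitions (C→D, D→E, E→F, F→G as assumed
    pitch-class indices), the rest of the mass spread randomly,
    every row normalized to sum to 1."""
    rng = random.Random(seed)
    strong_map = dict(strong_pairs)
    T = []
    for i in range(n):
        row = [rng.random() for _ in range(n)]
        if i in strong_map:
            j = strong_map[i]
            row[j] = 0.0
            total = sum(row)
            # strong on the favoured target, (1 - strong) shared out.
            row = [strong if k == j else (1 - strong) * v / total
                   for k, v in enumerate(row)]
        else:
            total = sum(row)
            row = [v / total for v in row]
        T.append(row)
    return T

T = make_trans_guess()
print(all(abs(sum(r) - 1) < 1e-9 for r in T))  # → True
```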

17
hmmtrain(seq, TRANS_GUESS, EMIS_GUESS)
seq
– Source: output of a band-pass filter
hmmtrain can use either algorithm:
– Baum-Welch: hmmdecode calculates the posterior state probabilities of a sequence of emissions
– Viterbi: hmmviterbi
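The posterior state probabilities hmmdecode computes are P(state at time t = s | whole sequence), obtained with the forward-backward algorithm. A minimal sketch, assuming the same illustrative Rainy/Sunny parameters as earlier (no scaling, so it is only suitable for short sequences):

```python
def posterior(obs, start, trans, emis):
    """Sketch of what hmmdecode returns: the posterior
    P(state_t = s | obs), via forward-backward (unscaled)."""
    S, T = len(start), len(obs)
    # forward: alpha[t][s] = P(obs[0..t], state_t = s)
    alpha = [[start[s] * emis[s][obs[0]] for s in range(S)]]
    for t in range(1, T):
        alpha.append([sum(alpha[t - 1][r] * trans[r][s] for r in range(S))
                      * emis[s][obs[t]] for s in range(S)])
    # backward: beta[t][s] = P(obs[t+1..] | state_t = s)
    beta = [[1.0] * S for _ in range(T)]
    for t in range(T - 2, -1, -1):
        beta[t] = [sum(trans[s][r] * emis[r][obs[t + 1]] * beta[t + 1][r]
                       for r in range(S)) for s in range(S)]
    z = sum(alpha[T - 1])  # total probability of the observations
    return [[alpha[t][s] * beta[t][s] / z for s in range(S)]
            for t in range(T)]

# Assumed illustrative model: states (Sunny, Rainy),
# observations coded 0=Walk, 1=Shop, 2=Clean.
start = [0.4, 0.6]
trans = [[0.6, 0.4], [0.3, 0.7]]
emis = [[0.6, 0.3, 0.1], [0.1, 0.4, 0.5]]
p = posterior([0, 0, 1, 2], start, trans, emis)
print(all(abs(sum(row) - 1) < 1e-9 for row in p))  # → True
```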

18
hmmtrain: the trained models differ greatly from the initial guesses! The results cannot be used to recover good state sequences when running the Viterbi algorithm.

19
Future Work: finish the 3rd version; modify the guess models to get a better result.

20
THANK YOU

21
Appendix 1: O(nm²), where n is the number of observations and m is the number of possible hidden states.

22
Appendix 2: Reference CSE551 Final Project: Parallel Viterbi on a GPU – Authors: Seong Jae Lee, Miro Enev – Provenance: Autumn 2009, University of Washington

23
Appendix 2 : CSE551 Final Project : Parallel Viterbi on a GPU

25
Appendix 3: Auto-generated Probability Models
Random + constraint
– tmp = (float)rand() / (float)RAND_MAX;
– prob = (tmp <= constraint) ? 0 : tmp;
Guarantee that the probabilities in each row sum to 1.
Verify that the sequence conforms to the models.
– hmmestimate(seq, states)
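The C fragment above thresholds each random draw but does not by itself make a row sum to 1; a normalization pass is still needed. A hedged Python sketch completing the slide's recipe (the constraint value is an assumption):

```python
import random

def random_prob_row(n, constraint=0.3, seed=None):
    """Slide's recipe, completed: draw uniform values, zero any at or
    below the constraint (like `prob = (tmp <= constraint) ? 0 : tmp`),
    then normalize so the row sums to 1. Redraw the whole row in the
    unlikely case that every value was zeroed."""
    rng = random.Random(seed)
    while True:
        row = [v if v > constraint else 0.0
               for v in (rng.random() for _ in range(n))]
        total = sum(row)
        if total > 0:
            return [v / total for v in row]

row = random_prob_row(12, seed=7)
print(round(sum(row), 6))  # → 1.0
```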

26
Appendix 3: Auto-generated Probability Models
Viterbi algorithm
– when back-tracing the likely states, avoid saving state 0: use (rand() % N) + 1
