
1
Character Recognition using Hidden Markov Models Anthony DiPirro, Ji Mei Sponsor: Prof. William Sverdlik

2
Our goal Recognize handwritten Roman and Chinese characters. This is an example of the Noisy Channel Problem.

3
Noisy Channel Problem Find the intended input, given the noisy input that was received. Examples: iPhone 4S Siri speech recognition; human handwriting.
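The noisy-channel idea can be sketched as picking the intended input that maximizes P(intended) × P(observed | intended). The candidates, symbols, and probabilities below are made-up illustrative numbers, not values from the project:

```python
# Minimal noisy-channel decoder sketch: choose the candidate c that
# maximizes prior[c] * likelihood[c][observed]. All numbers here are
# illustrative assumptions, not the project's measured probabilities.

def decode(observed, prior, likelihood):
    """Return the candidate maximizing prior * likelihood of the observation."""
    return max(prior, key=lambda c: prior[c] * likelihood[c].get(observed, 0.0))

# Toy example: did the writer intend 'a' or 'o', given a smudged glyph?
prior = {"a": 0.6, "o": 0.4}
likelihood = {
    "a": {"smudge": 0.2, "clear_a": 0.8},
    "o": {"smudge": 0.5, "clear_o": 0.5},
}

print(decode("smudge", prior, likelihood))  # 'o' (0.4*0.5 > 0.6*0.2)
```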

4
Markov Chain We use a Hidden Markov Model to solve the Noisy Channel Problem. An HMM is a Markov chain for which the state is only partially observable. [Slide shows the Markov chain definition and an illustration.]
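A Markov chain can be sketched as a transition-probability table where the next state depends only on the current one. The states and probabilities below are illustrative assumptions, not the project's tables:

```python
import random

# Sketch of a Markov chain: the next state depends only on the current
# state, via a transition-probability row. States and numbers here are
# illustrative, not taken from the project.

transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state from the current state's transition row."""
    states = list(transitions[state])
    weights = [transitions[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

# Walk the chain for a few steps.
chain = ["sunny"]
for _ in range(5):
    chain.append(step(chain[-1]))
print(chain)
```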

5
Hidden Markov Model
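An HMM adds an emission table on top of the Markov chain: the hidden state is never seen directly, only a symbol emitted from it. The states, symbols, and probabilities below are illustrative assumptions:

```python
import random

# Sketch of an HMM: a hidden Markov chain plus an emission table. We
# only ever observe the emitted symbols, not the hidden states.
# All states, symbols, and probabilities are illustrative.

transition = {"S1": {"S1": 0.7, "S2": 0.3},
              "S2": {"S1": 0.4, "S2": 0.6}}
emission = {"S1": {"x": 0.9, "y": 0.1},
            "S2": {"x": 0.2, "y": 0.8}}

def generate(start, n):
    """Generate n steps; return (hidden states, observed symbols)."""
    hidden, observed, state = [], [], start
    for _ in range(n):
        hidden.append(state)
        syms = list(emission[state])
        observed.append(random.choices(syms, [emission[state][s] for s in syms])[0])
        nxt = list(transition[state])
        state = random.choices(nxt, [transition[state][s] for s in nxt])[0]
    return hidden, observed

h, o = generate("S1", 5)
print(o)  # only this sequence would be visible to a decoder
```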

6
Our Project

8
How to solve our problem? Using an HMM, we can calculate the hidden-state chain from the observation chain. We used our collected samples to calculate the transition-probability table and the emission-probability table, then used the Viterbi algorithm to find the most likely result.
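The Viterbi step described above can be sketched as follows. The transition and emission tables here are made-up toy numbers standing in for the ones the project computed from collected samples:

```python
# Sketch of the Viterbi algorithm: recover the most likely hidden-state
# chain from the observation chain, given transition and emission tables.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden-state sequence for obs."""
    # V[t][s] = best probability of any path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = prob
            back[t][s] = prev
    # Trace back from the best final state.
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Toy tables (illustrative numbers, not the project's measured ones).
states = ["S1", "S2"]
start_p = {"S1": 0.6, "S2": 0.4}
trans_p = {"S1": {"S1": 0.7, "S2": 0.3}, "S2": {"S1": 0.4, "S2": 0.6}}
emit_p = {"S1": {"x": 0.9, "y": 0.1}, "S2": {"x": 0.2, "y": 0.8}}
print(viterbi(["x", "x", "y"], states, start_p, trans_p, emit_p))
# ['S1', 'S1', 'S2']
```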

9
Pre-Processing Shrink, median filter, sharpen
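The median-filter step can be sketched in pure Python on a grayscale grid (0 = ink, 255 = paper); the shrink and sharpen steps are omitted here, and the exact filter the project used may differ:

```python
# Sketch of the median-filter pre-processing step: replace each pixel
# by the median of its k x k neighborhood, which removes speckle noise
# while keeping edges. Grid values: 0 = black ink, 255 = white paper.

def median_filter(img, k=3):
    """Median-filter a 2-D list of grayscale values with a k x k window."""
    h, w, r = len(img), len(img[0]), k // 2
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            window = [img[j][i]
                      for j in range(max(0, y - r), min(h, y + r + 1))
                      for i in range(max(0, x - r), min(w, x + r + 1))]
            window.sort()
            out[y][x] = window[len(window) // 2]
    return out

# A lone black speck in white paper is removed by the filter.
speckled = [[255, 255, 255], [255, 0, 255], [255, 255, 255]]
print(median_filter(speckled)[1][1])  # 255
```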

10
Feature Extraction We count the regions in each area to represent the observation states
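Counting the ink regions in an area can be sketched as a connected-components count. The 4-connected flood fill below is one simple way to do it; the project's exact region definition may differ:

```python
# Sketch of the region-counting feature: count connected ink regions
# (value 1) in one area of the character grid, and use that count as
# the observation symbol for the HMM. Uses 4-connectivity flood fill.

def count_regions(cells):
    """Count 4-connected regions of ink (value 1) in a 2-D grid."""
    h, w = len(cells), len(cells[0])
    seen, regions = set(), 0
    for y in range(h):
        for x in range(w):
            if cells[y][x] == 1 and (y, x) not in seen:
                regions += 1
                stack = [(y, x)]  # flood-fill this region
                while stack:
                    cy, cx = stack.pop()
                    if (cy, cx) in seen:
                        continue
                    seen.add((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and cells[ny][nx] == 1:
                            stack.append((ny, nx))
    return regions

print(count_regions([[1, 0, 1],
                     [0, 0, 0],
                     [1, 1, 0]]))  # 3
```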

11
Compare [Slide diagram: the adjusted input's state sequence (S1, S2, S3) is compared against each canonical character's sequence (Canonical A, Canonical B, …).]

12
Experimenting How to split the character

13
Experimenting How to represent the states

14
Result

15
Conclusions Factors that affect accuracy: pre-processing; how the word is split; number of states.

16
In the future Spend more time on different features (pixel density, counting lines). Use other algorithms, such as a neural network, to implement character recognition.
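The pixel-density feature mentioned above could be sketched as the fraction of ink pixels in an area, used in place of (or alongside) the region counts:

```python
# Sketch of the proposed pixel-density feature: the fraction of ink
# pixels (value 1) in one area of the character grid. This is an
# illustration of the idea, not the project's implementation.

def pixel_density(cells):
    """Return the fraction of ink pixels (value 1) in a 2-D grid."""
    total = sum(len(row) for row in cells)
    ink = sum(v for row in cells for v in row)
    return ink / total

print(pixel_density([[1, 0],
                     [0, 1]]))  # 0.5
```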
