Large Vocabulary Continuous Speech Recognition. Subword Speech Units.

1 Large Vocabulary Continuous Speech Recognition

2 Subword Speech Units

5 HMM-Based Subword Speech Units
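Each subword unit is typically modeled by a small left-to-right HMM. A minimal sketch follows; the 3-state topology and the 0.5 transition probabilities are illustrative defaults, not values taken from the slides:

```python
import numpy as np

# A minimal sketch of a 3-state left-to-right HMM for one subword unit.
# The state count and transition values are illustrative defaults.
class SubwordHMM:
    def __init__(self, n_states=3):
        self.n_states = n_states
        self.trans = np.zeros((n_states, n_states))
        for i in range(n_states):
            if i + 1 < n_states:
                self.trans[i, i] = 0.5        # self-loop
                self.trans[i, i + 1] = 0.5    # move one state forward
            else:
                self.trans[i, i] = 1.0        # final state absorbs

    def viterbi_score(self, frame_logprobs):
        """Best-path log-likelihood of a (T, n_states) array of per-state
        frame log-probabilities under the left-to-right topology."""
        T = frame_logprobs.shape[0]
        score = np.full(self.n_states, -np.inf)
        score[0] = frame_logprobs[0, 0]       # path must start in state 0
        for t in range(1, T):
            prev = score.copy()
            for j in range(self.n_states):
                cands = [prev[i] + np.log(self.trans[i, j])
                         for i in range(self.n_states) if self.trans[i, j] > 0]
                score[j] = (max(cands) + frame_logprobs[t, j]) if cands else -np.inf
        return score[-1]                      # path must end in the last state

hmm = SubwordHMM()
print(hmm.viterbi_score(np.log(np.full((10, 3), 0.1))))  # toy uniform frames
```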

7 Training of Subword Units

10 Training Procedure

14 Errors and performance evaluation in PLU recognition
Substitution error (s)
Deletion error (d)
Insertion error (i)
Performance evaluation: if the total number of PLUs is N, we define:
Correctness rate: (N − s − d) / N
Accuracy rate: (N − s − d − i) / N
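As a quick sanity check on the definitions above, a small helper (the names are mine) that makes the parenthesization explicit:

```python
def plu_rates(N, s, d, i):
    """Correctness and accuracy for PLU recognition, where N is the total
    number of reference PLUs, and s, d, i are the substitution, deletion,
    and insertion counts obtained from the alignment."""
    correctness = (N - s - d) / N          # insertions are not penalized
    accuracy = (N - s - d - i) / N         # insertions are penalized
    return correctness, accuracy

# Example: 1000 reference PLUs, 80 substitutions, 30 deletions, 40 insertions
print(plu_rates(1000, 80, 30, 40))         # (0.89, 0.85)
```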

15 Language Models for LVCSR Word Pair Model: Specify which word pairs are valid

16 Statistical Language Modeling

17 Perplexity of the Language Model
Entropy of the source:
$H = \lim_{Q \to \infty} -\frac{1}{Q} \sum_{w_1, \ldots, w_Q} P(w_1, \ldots, w_Q) \log_2 P(w_1, \ldots, w_Q)$
First-order entropy of the source:
$H_1 = -\sum_{w} P(w) \log_2 P(w)$
If the source is ergodic, meaning its statistical properties can be completely characterized by a sufficiently long sequence that the source puts out, then
$H = \lim_{Q \to \infty} -\frac{1}{Q} \log_2 P(w_1, w_2, \ldots, w_Q)$

18 We often compute H based on a finite but sufficiently large Q:
$\hat{H} = -\frac{1}{Q} \log_2 P(w_1, w_2, \ldots, w_Q)$
$\hat{H}$ measures the degree of difficulty that the recognizer encounters, on average, when it has to determine a word from the same source.
If the N-gram language model $P_N(W)$ is used, an estimate of H is:
$\hat{H} = -\frac{1}{Q} \log_2 P_N(w_1, w_2, \ldots, w_Q)$
In general:
$\hat{H} = -\frac{1}{Q} \sum_{i=1}^{Q} \log_2 P(w_i \mid w_{i-1}, \ldots, w_{i-N+1})$
Perplexity is defined as:
$PP = 2^{\hat{H}}$
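To make the definitions concrete, a minimal sketch that scores one test sentence with a toy bigram model and returns its perplexity; the probabilities and the sentence are invented for illustration:

```python
import math

# Toy bigram model: P(w2 | w1). These values are invented; a real model
# would be estimated from a training corpus.
bigram = {
    ("<s>", "show"): 0.4, ("show", "all"): 0.5,
    ("all", "ships"): 0.6, ("ships", "</s>"): 0.7,
}

def perplexity(words, model):
    """PP = 2^H with H = -(1/Q) * sum_i log2 P(w_i | w_{i-1})."""
    log_prob = sum(math.log2(model[(w1, w2)])
                   for w1, w2 in zip(words[:-1], words[1:]))
    H = -log_prob / (len(words) - 1)       # Q predicted words
    return 2 ** H

print(perplexity(["<s>", "show", "all", "ships", "</s>"], bigram))
```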

19 Overall recognition system based on subword units

20 Naval Resource (Battleship) Management Task
991-word vocabulary
NG (no grammar): perplexity = 991

21 Word pair grammar We can partition the vocabulary into four nonoverlapping sets of words: The overall FSN allows recognition of sentences of the form:
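A word-pair grammar can be stored as a successor set per word. A minimal sketch with invented toy pairs (not the actual Resource Management word-pair list):

```python
# For each word, the set of words allowed to follow it. The vocabulary
# and the pairs below are toy entries for illustration only.
word_pairs = {
    "<s>": {"show", "list"},
    "show": {"all", "the"},
    "all": {"ships"},
    "the": {"ships"},
    "list": {"all"},
    "ships": {"</s>"},
}

def sentence_is_valid(words):
    """A sentence is accepted iff every adjacent word pair is valid."""
    return all(w2 in word_pairs.get(w1, set())
               for w1, w2 in zip(words[:-1], words[1:]))

print(sentence_is_valid(["<s>", "show", "all", "ships", "</s>"]))  # True
print(sentence_is_valid(["<s>", "ships", "show", "</s>"]))         # False
```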

22 WP (word pair) grammar: perplexity = 60
FSN based on partitioning scheme: 995 real arcs and 18 null arcs
WB (word bigram) grammar: perplexity = 20

23 Control of word insertion/word deletion rate
In the discussed structure, there is no control on the sentence length.
We introduce a word insertion penalty into the Viterbi decoding: a fixed negative quantity is added to the likelihood score at the end of each word arc.
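A minimal sketch of the effect: one fixed penalty is charged per word exit, so a hypothesis segmented into many short words scores worse than one with fewer, longer words covering the same speech. The scores and the penalty value are illustrative:

```python
# At each word-end arc in the decoder, a fixed negative amount is added
# to the accumulated log-likelihood, discouraging spurious insertions.
WORD_INSERTION_PENALTY = -10.0   # tuned empirically in a real system

def hypothesis_score(word_arc_loglikes, penalty=WORD_INSERTION_PENALTY):
    """Total path score: log-likelihood of each word arc plus one
    penalty per word exit."""
    return sum(ll + penalty for ll in word_arc_loglikes)

# Two hypotheses covering the same speech: many short words vs. fewer long ones.
short_words = [-20.0, -18.0, -22.0, -19.0]   # 4 word arcs
long_words = [-41.0, -40.0]                  # 2 word arcs
print(hypothesis_score(short_words))  # -119.0: 4 penalties applied
print(hypothesis_score(long_words))   # -101.0: 2 penalties applied
```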

28 Context-dependent subword units
Creation of context-dependent diphones and triphones
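A minimal sketch of deriving intraword context-dependent labels from a word's phone string; the L-C+R triphone notation is an assumption (HTK-style), and the phone strings are toy examples:

```python
# Map each phone of a word to a context-dependent unit labeled with its
# left and right neighbors; word-initial/final phones become diphones.
def intraword_triphones(phones):
    units = []
    for i, p in enumerate(phones):
        left = phones[i - 1] if i > 0 else None
        right = phones[i + 1] if i < len(phones) - 1 else None
        if left and right:
            units.append(f"{left}-{p}+{right}")
        elif right:
            units.append(f"{p}+{right}")    # word-initial diphone
        elif left:
            units.append(f"{left}-{p}")     # word-final diphone
        else:
            units.append(p)                 # one-phone word
    return units

print(intraword_triphones(["sh", "ow"]))           # ['sh+ow', 'sh-ow']
print(intraword_triphones(["sh", "ih", "p", "s"]))
# ['sh+ih', 'sh-ih+p', 'ih-p+s', 'p-s']
```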

29 If c(·) is the occurrence count for a given unit, we can use a unit reduction rule such as:
CD units using only intraword units for “show all ships”:
CD units using both intraword and interword units:
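A minimal sketch of such a count-based reduction rule, backing a rare triphone off to a diphone and finally to the context-independent phone; the threshold and the counts are invented:

```python
# A triphone is kept only if it occurs often enough in training;
# otherwise it backs off to a diphone, then to the monophone.
MIN_COUNT = 50

def reduce_unit(triphone, counts, threshold=MIN_COUNT):
    """triphone is 'L-C+R'; counts maps unit labels to training counts."""
    if counts.get(triphone, 0) >= threshold:
        return triphone
    left_part, rest = triphone.split("-")
    center, right_part = rest.split("+")
    for diphone in (f"{left_part}-{center}", f"{center}+{right_part}"):
        if counts.get(diphone, 0) >= threshold:
            return diphone
    return center                           # context-independent fallback

counts = {"sh-ih+p": 12, "sh-ih": 80, "ih": 3000}
print(reduce_unit("sh-ih+p", counts))       # 'sh-ih'
```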

31 Smoothing and interpolation of CD PLU models
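A minimal sketch of one common smoothing scheme: interpolate a sparse context-dependent distribution with its context-independent parent, with a weight that grows with the CD unit's training count (deleted-interpolation style). The constant k and all values are illustrative:

```python
import numpy as np

def smooth_cd_model(cd_probs, ci_probs, cd_count, k=100.0):
    """lambda grows with the CD unit's training count, so well-trained
    units keep their own statistics and rare units lean on the CI model."""
    lam = cd_count / (cd_count + k)
    return lam * np.asarray(cd_probs) + (1 - lam) * np.asarray(ci_probs)

cd = [0.7, 0.2, 0.1]     # sparse triphone distribution (toy values)
ci = [0.4, 0.4, 0.2]     # well-trained monophone distribution (toy values)
print(smooth_cd_model(cd, ci, cd_count=25))   # leans toward the CI model
```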

32 Implementation issues using CD units

35 Word junction effects
To handle known phonological changes, a set of phonological rules is superimposed on both the training and recognition networks. Some typical phonological rules include:
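A minimal sketch of applying such rules at a word junction; the two rules shown (geminate reduction, palatalization) are common textbook examples, not necessarily the slides' exact list:

```python
# Rewrite the phones at the boundary between two adjacent words.
RULES = [
    (("t", "t"), ("t",)),        # geminate reduction: "set two" -> one [t]
    (("t", "y"), ("ch",)),       # palatalization: "want you" -> [ch]
]

def apply_junction_rules(prev_word_phones, next_word_phones):
    boundary = (prev_word_phones[-1], next_word_phones[0])
    for pattern, replacement in RULES:
        if boundary == pattern:
            return prev_word_phones[:-1] + list(replacement) + next_word_phones[1:]
    return prev_word_phones + next_word_phones

print(apply_junction_rules(["s", "eh", "t"], ["t", "uw"]))
# ['s', 'eh', 't', 'uw']
```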

36 Recognition results using CD units

38 Position-dependent units

39 Unit splitting and clustering

45 A key source of difficulty in continuous speech recognition is the so-called function words, which include words like a, and, for, in, is. The function words have the following properties:

46 Creation of vocabulary-independent units

47 Semantic Postprocessor for Recognition

