Tutorial 2 September 8, 2005

1. If you were a system designer, what kind of speech recognition system would you use for each of the following applications? Consider whether you would use:
   – Speaker-dependent or speaker-independent recognition
   – A large or a limited vocabulary
   – A language grammar or a statistical language model
and carefully consider what would lead to the best outcomes.
Applications:
   – Phone name dialing
   – Taxi booking
   – Pizza ordering
   – Dictation
   – Web navigation (go back, visit ...)

2. Suppose you are given a speech recognizer for a language consisting of only four words: tire, dire, tile, dial, with the following channel model for phone realizations:
   – P(ay -> ay) = 1.0
   – P(t -> t) = 0.8, P(t -> d) = 0.2
   – P(d -> d) = 0.9, P(d -> t) = 0.1
   – P(r -> r) = 0.6, P(r -> l) = 0.4
   – P(l -> l) = 0.6, P(l -> r) = 0.4
Using Bayes' rule and the following unigram language model, calculate the most probable true input for each of the outputs /tayl/ and /dayl/:
   – P(tire) = 0.4, P(dire) = 0.2, P(dial) = 0.3, P(tile) = 0.1
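One way to check the hand calculation for question 2 is to enumerate P(word) * P(output | word) for every word in the lexicon and take the argmax. The Python sketch below is illustrative only: the phone spellings of the four words and all names in the code are my own choices, and it assumes each phone is corrupted independently by the channel model, as the question implies.

    # Minimal sketch for question 2: pick argmax_w P(w) * P(observed phones | w),
    # assuming each phone is corrupted independently by the channel model.

    # Phone spellings of the four words (my own encoding of tire/dire/tile/dial).
    lexicon = {
        "tire": ["t", "ay", "r"],
        "dire": ["d", "ay", "r"],
        "tile": ["t", "ay", "l"],
        "dial": ["d", "ay", "l"],
    }

    # Unigram language model P(word), as given in the question.
    prior = {"tire": 0.4, "dire": 0.2, "dial": 0.3, "tile": 0.1}

    # Channel model P(observed phone | intended phone), keyed (intended, observed).
    channel = {
        ("ay", "ay"): 1.0,
        ("t", "t"): 0.8, ("t", "d"): 0.2,
        ("d", "d"): 0.9, ("d", "t"): 0.1,
        ("r", "r"): 0.6, ("r", "l"): 0.4,
        ("l", "l"): 0.6, ("l", "r"): 0.4,
    }

    def posterior_scores(observed):
        """Return P(word) * P(observed | word) for every word in the lexicon."""
        scores = {}
        for word, phones in lexicon.items():
            likelihood = 1.0
            for intended, heard in zip(phones, observed):
                likelihood *= channel.get((intended, heard), 0.0)
            scores[word] = prior[word] * likelihood
        return scores

    for observed in (["t", "ay", "l"], ["d", "ay", "l"]):
        scores = posterior_scores(observed)
        best = max(scores, key=scores.get)
        print("/" + "".join(observed) + "/", scores, "-> most probable:", best)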

3. Consider an infinite language that accepts only sentences of the following form:
   – the mouse died / the cat died / the dog died / ...
   – the mouse that the cat bit died / the cat that the mouse bit died / the dog that the mouse bit died / ...
   – the mouse that the cat that the dog chased bit died / the cat that the mouse that the dog chased bit died / the dog that the cat that the mouse chased bit died / ...
and so on, where any noun can substitute for mouse, cat, or dog, and any verb can substitute for chased, bit, or died. Write a context-free grammar that accepts exactly this infinite language.
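For comparison after attempting the exercise: one possible grammar uses a recursive NP rule to produce the center embedding. The sketch below is not an official answer key; it expresses one candidate grammar with NLTK (assuming the nltk package is available). The nonterminal names S, NP, Det, N, and V are my own choices, and further nouns and verbs can be added to the N and V rules.

    import nltk

    # One possible grammar for the center-embedding language (sketch only):
    # an NP may recursively contain a relative clause "that NP V".
    grammar = nltk.CFG.fromstring("""
        S   -> NP V
        NP  -> Det N | Det N 'that' NP V
        Det -> 'the'
        N   -> 'mouse' | 'cat' | 'dog'
        V   -> 'died' | 'bit' | 'chased'
    """)

    parser = nltk.ChartParser(grammar)
    sentence = "the mouse that the cat that the dog chased bit died".split()
    for tree in parser.parse(sentence):
        tree.pretty_print()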