Chapter 15 Natural Language Processing (cont)

1 Chapter 15 Natural Language Processing (cont)
Artificial Intelligence, Dr. วิภาดา เวทย์ประสิทธิ์, Department of Computer Science, Faculty of Science, Prince of Songkla University

2 NLP Problems
English sentences are incomplete descriptions of the information they are intended to convey. (Figure 15.1, p. 378)
The same expression means different things in different contexts.
No natural language program can ever be complete, because new words, expressions, and meanings can be generated quite freely.
There are many ways to say the same thing.

3 NLP Problems
1) Processing written text, using lexical, syntactic, and semantic knowledge of the language, together with the real-world information the text requires.
2) Processing spoken language, using all of the knowledge above plus additional knowledge about phonology, to handle the ambiguities that arise in speech.

4 NLP
Natural language processing:
Language translation / multilingual translation
Language understanding
Figure 14.5: interaction among components
Figure 14.6, p. 366: a speech waveform

5 Steps in NLP
1) Morphological Analysis
2) Syntactic Analysis
3) Semantic Analysis
4) Discourse Integration
5) Pragmatic Analysis
The boundaries between these five phases are often fuzzy.
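These five phases can be pictured as a pipeline in which each stage enriches the representation produced by the previous one. The sketch below is a minimal, hypothetical Python illustration of that flow; the function names and toy data structures are assumptions, not taken from the textbook.

```python
# A hypothetical sketch of the five NLP phases as a pipeline: each stage is a
# stub that enriches the representation and hands it to the next stage.

def morphological_analysis(text):
    # Real systems also split suffixes ("Bill's" -> "Bill" + "'s"); see the
    # morphology sketch on the next slide.
    return text.split()

def syntactic_analysis(tokens):
    # A real analyzer builds a parse tree; here we just wrap the tokens.
    return {"parse": tokens}

def semantic_analysis(parse):
    # Map words onto objects in the task domain / knowledge base.
    return {"meaning": parse["parse"]}

def discourse_integration(meaning, context):
    # Resolve references ("I", "it") against the preceding sentences.
    return {"resolved": meaning, "context": list(context)}

def pragmatic_analysis(resolved):
    # Reinterpret the literal meaning as the intended action.
    return {"action": resolved}

def understand(sentence, context=()):
    tokens = morphological_analysis(sentence)
    parse = syntactic_analysis(tokens)
    meaning = semantic_analysis(parse)
    resolved = discourse_integration(meaning, context)
    return pragmatic_analysis(resolved)

print(understand("I want to print Bill's .int file."))
```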

6 1. Morphological Analysis
Individual words are analyzed into their components.
Non-word tokens, such as punctuation, are separated from the words.
Example: "I want to print Bill's .int file."
Here ".int" is a file extension, "Bill" is a proper noun, and "'s" is a possessive suffix.
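A minimal sketch of this kind of token splitting, assuming a small hand-written suffix list (the rules below are illustrative, not the textbook's analyzer):

```python
import re

# Toy morphological analyzer: separate punctuation and the possessive suffix.
SUFFIXES = ["'s"]  # assumed list; a real analyzer knows many more morphemes

def morph_tokens(sentence):
    tokens = []
    for word in sentence.split():
        # Peel trailing punctuation off the word.
        word, punct = re.match(r"^(.*?)([.,!?]*)$", word).groups()
        for suffix in SUFFIXES:
            if word.endswith(suffix):
                tokens.extend([word[:-len(suffix)], suffix])
                break
        else:
            tokens.append(word)
        if punct:
            tokens.append(punct)
    return tokens

print(morph_tokens("I want to print Bill's .int file."))
# ['I', 'want', 'to', 'print', 'Bill', "'s", '.int', 'file', '.']
```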

7 2. Syntactic Analysis
Linear sequences of words are transformed into structures that show how the words relate to each other.
This is done by an English syntactic analyzer.
A sentence that does not pass the syntactic analyzer is rejected, e.g. "Boy the go to store the".

8 Example of Syntactic Analysis
Figure 15.2, p. 382: example of syntactic analysis (rules RM2, RM5, RM5).
Figure 15.3, p. 383: a knowledge base fragment (User073, F1, Printing, File_Structure, Waiting; Mental Event / Physical Event; Animate / Event).
Figure 15.4, p. 384: partial meaning for a sentence.

9 3. Semantic Analysis
The structures created by the syntactic analyzer are assigned meanings.
A mapping is made between the syntactic structures and objects in the task domain.
If no such mapping is possible, the structure is rejected, e.g. "Colorless green ideas sleep furiously".
Semantic analysis must do two things:
1) It must map individual words onto appropriate objects in the knowledge base or database.
2) It must create the correct structures to correspond to the way the meanings of the individual words combine with each other.
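A minimal illustration of the mapping step, assuming a toy knowledge base; the entity names echo the figures cited above, but the code itself is only a sketch:

```python
# Toy semantic analysis: every content word must map onto a known object in
# the knowledge base, otherwise the sentence is rejected.
KB = {
    "i": "User",             # the speaker; resolved to a specific user later
    "want": "Wanting-Event",
    "print": "Printing-Event",
    "bill": "Person-Bill",
    "file": "File-Structure",
}

def semantic_analysis(words):
    mapped = []
    for word in words:
        obj = KB.get(word.lower().strip(".'"))
        if obj is None:
            raise ValueError(f"reject: no mapping for {word!r}")
        mapped.append((word, obj))
    return mapped

print(semantic_analysis(["I", "want", "print", "file"]))
# semantic_analysis(["colorless", "green", "ideas"]) raises: no mapping
```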

10 4. Discourse Integration
The meaning of an individual sentence may depend on the sentences that precede it, and may influence the meanings of the sentences that follow it.
Example: "John wanted it." The referent of "it" depends on the previous sentence.
In the running example, the current user who typed the word "I" is User068 (Susan_Black), and the file referred to is F1, whose filename is in the /wsmith/ directory.
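One very simple way to model this dependence on prior sentences is to keep a discourse context of recently mentioned entities and resolve pronouns against it. The recency heuristic below is a common textbook simplification, not the chapter's own algorithm, and the entity strings are placeholders.

```python
# Toy discourse integration: resolve "I" and "it" against a running context.
class Discourse:
    def __init__(self, current_user):
        self.current_user = current_user   # e.g. "User068 (Susan_Black)"
        self.recent_entities = []          # most recent mention last

    def mention(self, entity):
        self.recent_entities.append(entity)

    def resolve(self, word):
        if word == "I":
            return self.current_user
        if word == "it":
            # Recency heuristic: the most recently mentioned entity.
            return self.recent_entities[-1] if self.recent_entities else None
        return word

d = Discourse("User068 (Susan_Black)")
d.mention("F1 (a file in the /wsmith/ directory)")
print(d.resolve("I"))    # User068 (Susan_Black)
print(d.resolve("it"))   # F1 (a file in the /wsmith/ directory)
```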

11 5. Pragmatic Analysis
The structure representing what was said is reinterpreted to determine what was actually meant.
Example: "Do you know what time it is?" should be understood as a request to be told the time, not as a literal yes/no question.
In other words, the system must decide what to do as a result of the utterance.
Representing the intended meaning: Figure 15.5, p. 385.

12 Syntactic Processing
Grammar: a declarative representation of the syntactic facts about the language (Figure 15.6, p. 387).
Parser: a procedure that compares the grammar against input sentences to produce parsed structures (Figure 15.7, p. 388: a parse tree for a sentence).

13 Syntactic Processing
Top-down parsing: begin with the start symbol and apply the grammar rules forward until the symbols at the terminals of the tree correspond to the components of the sentence being parsed.
Bottom-up parsing: begin with the sentence to be parsed and apply the grammar rules backward until a single tree has been produced whose terminals are the words of the sentence and whose top node is the start symbol.
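A compact recursive-descent (top-down) parser over a toy grammar illustrates the idea; the grammar and lexicon below are an assumed miniature, not Figure 15.6 itself.

```python
# Toy top-down (recursive-descent) parser: rules expand non-terminals
# left-to-right from the start symbol S until the leaves match the input.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"], ["V"]],
}
LEXICON = {
    "Det": {"the", "a"},
    "N":   {"boy", "store", "file"},
    "V":   {"went", "printed"},
}

def parse(symbol, words):
    """Return (tree, remaining words), or None if symbol cannot be derived."""
    if symbol in LEXICON:
        if words and words[0] in LEXICON[symbol]:
            return (symbol, words[0]), words[1:]
        return None
    for expansion in GRAMMAR[symbol]:
        rest, children = words, []
        for child in expansion:
            result = parse(child, rest)
            if result is None:
                break
            subtree, rest = result
            children.append(subtree)
        else:
            return (symbol, children), rest
    return None

print(parse("S", "the boy printed the file".split()))     # parse tree, []
print(parse("S", "boy the go to store the".split()))      # None -> rejected
```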

14 ATN : Augmented Transition Network
Similar to a finite state machine.
Figure 15.8, p. 392: an ATN network. Figure 15.9, p. 393: an ATN grammar in list form.
Example sentence: "The long file has printed."
Trace through the network: S -> NP -> Q1 -> AUX -> Q3 -> V -> Q4 (F) halt; within the NP sub-network: Det -> Q6, Adj -> Q6, N -> Q7 (F).
Resulting structure: (S DCL (NP (FILE (LONG) DEFINITE)) HAS (VP PRINTED))
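An ATN can be approximated in code as a transition network whose arcs test word categories and fill registers as they are traversed. The network below is a much-reduced, assumed version of the one in Figure 15.8, just enough to follow the trace above.

```python
# Toy ATN-style traversal for "The long file has printed."
# The arcs loosely follow the trace S -> NP -> Q1 -> AUX -> Q3 -> V -> Q4.
CATEGORIES = {"the": "DET", "long": "ADJ", "file": "N",
              "has": "AUX", "printed": "V"}

# Each arc: (state, category of the next word, next state, register to set).
NP_ARCS = [("NP", "DET", "Q6", "determiner"),
           ("Q6", "ADJ", "Q6", "adjective"),
           ("Q6", "N",   "Q7", "head_noun")]
S_ARCS  = [("Q1", "AUX", "Q3", "aux"),
           ("Q3", "V",   "Q4", "verb")]

def follow(arcs, state, words, registers):
    """Consume words along matching arcs, filling registers; return end state."""
    while words:
        cat = CATEGORIES.get(words[0].lower())
        arc = next((a for a in arcs if a[0] == state and a[1] == cat), None)
        if arc is None:
            break
        registers[arc[3]] = words.pop(0)
        state = arc[2]
    return state

def parse_sentence(words):
    registers = {}
    # The S network first PUSHes to the NP sub-network for the subject.
    if follow(NP_ARCS, "NP", words, registers) != "Q7":
        return None
    if follow(S_ARCS, "Q1", words, registers) != "Q4" or words:
        return None
    return registers

print(parse_sentence("The long file has printed".split()))
# {'determiner': 'The', 'adjective': 'long', 'head_noun': 'file',
#  'aux': 'has', 'verb': 'printed'}
```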

15 Unification Grammar
DAGs: Directed Acyclic Graphs.
The feature-structure graphs corresponding to "the" and "file" are:
"the":  [CAT: DET, LEX: the]
"file": [CAT: N, LEX: file, NUMBER: SING]
Unifying them under an NP rule yields a structure along the lines of [NP: [DET: the, HEAD: file, NUMBER: SING]].
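Feature structures like these can be represented as nested dictionaries, and unification merges two of them provided they do not disagree on any feature. This is only a minimal sketch, not the DAG machinery of the text, which also handles shared substructure.

```python
# Minimal unification of feature structures represented as nested dicts.
FAIL = object()

def unify(a, b):
    """Merge two feature structures; FAIL if they disagree on any feature."""
    if a == b:
        return a
    if isinstance(a, dict) and isinstance(b, dict):
        merged = dict(a)
        for key, value in b.items():
            result = unify(merged[key], value) if key in merged else value
            if result is FAIL:
                return FAIL
            merged[key] = result
        return merged
    return FAIL  # two different atomic values clash

the_fs  = {"CAT": "DET", "LEX": "the"}
file_fs = {"CAT": "N", "LEX": "file", "NUMBER": "SING"}

# An NP rule can require its determiner and head noun to agree in NUMBER.
print(unify(file_fs, {"NUMBER": "SING"}))            # consistent -> merged
print(unify(file_fs, {"NUMBER": "PLURAL"}) is FAIL)  # number clash -> True
print({"DET": the_fs, "HEAD": file_fs, "NUMBER": file_fs["NUMBER"]})
```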

16 Semantic Processing
Step 1. Lexical processing: look up the individual words in a dictionary (lexicon) and extract their meanings.
Step 2. Lexical disambiguation, or word sense disambiguation: words may have more than one meaning, e.g. "bank" (a financial institution or the bank of a river) and "diamond" (p. 398: the gem or the diamond shape).
Use semantic markers such as PHYSICAL OBJECT, ANIMATE OBJECT, and ABSTRACT OBJECT.
e.g. "I dropped my diamond..."

17 Semantic Processing (cont.)
Semantic markers such as PHYSICAL OBJECT, ANIMATE OBJECT, and ABSTRACT OBJECT resolve the example "I dropped my diamond...": only the gem sense is a physical object that can be dropped.
Note that "My lawn hates the cold" is problematic for these markers, since a lawn is not an animate object and so cannot be the agent of the verb "hate"; nevertheless the sentence can be used perfectly well in ordinary (figurative) English.
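The semantic-marker idea can be sketched as follows; the lexicon entries, marker names, and verb restrictions are assumed for illustration only.

```python
# Toy word-sense disambiguation using semantic markers.
SENSES = {
    "diamond": [
        {"sense": "gem",   "markers": {"PHYSICAL-OBJECT"}},
        {"sense": "shape", "markers": {"ABSTRACT-OBJECT"}},
    ],
    "bank": [
        {"sense": "financial-institution", "markers": {"ABSTRACT-OBJECT"}},
        {"sense": "riverbank",             "markers": {"PHYSICAL-OBJECT"}},
    ],
}

# Selectional restriction: what kind of object each verb expects.
VERB_EXPECTS = {"drop": {"PHYSICAL-OBJECT"}}

def disambiguate(verb, noun):
    expected = VERB_EXPECTS[verb]
    for candidate in SENSES[noun]:
        if candidate["markers"] & expected:
            return candidate["sense"]
    return None  # no sense satisfies the restriction ("My lawn hates the cold")

print(disambiguate("drop", "diamond"))   # 'gem' -- you drop a physical object
```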

18 Sentence-Level Processing
1) Semantic grammars
2) Case grammars
3) Conceptual parsing
4) Approximately compositional semantic interpretation

19 Sentence-Level Processing
1) Semantic grammars: semantic actions are associated with the grammar rules.
Figure, p. 401: e.g. "I want to" -> ACTION.
Figure, p. 402: parsing result with a semantic grammar.
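In a semantic grammar the categories are domain concepts and each rule carries a semantic action that builds meaning as it matches. The toy rule set below, written with regular expressions, is invented for illustration and only mimics the "I want to ..." pattern cited above.

```python
import re

# Toy semantic grammar: rules match domain patterns (ACTION, OBJECT) and each
# rule's action builds a meaning structure directly during parsing.
RULES = [
    (r"^i want to (print|delete) (.+)$",
     lambda m: {"ACTION": m.group(1).upper(), "OBJECT": m.group(2)}),
    (r"^what files do i have\??$",
     lambda m: {"ACTION": "LIST-FILES"}),
]

def semantic_parse(sentence):
    text = sentence.lower().strip()
    for pattern, action in RULES:
        match = re.match(pattern, text)
        if match:
            return action(match)
    return None

print(semantic_parse("I want to print Bill's .int file."))
# {'ACTION': 'PRINT', 'OBJECT': "bill's .int file."}
```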

20 Sentence-Level Processing
2) Case grammars: the parsing process is driven by the sentence's main verb.
Figure: active and passive sentences, e.g. "Susan printed a file." = "The file was printed by Susan."
Figure: similar sentences, e.g. "Mother baked for three hours." and "The pie baked for three hours."
Word cases (p. 405): (A) Agent, (I) Instrument, (F) Factitive, (L) Locative, (S) Source, (G) Goal, (B) Beneficiary, (T) Time, (O) Object.

21 Sentence-Level Processing
2) Case grammars: some verb case frames (Figure):
open [_ _ O (I) (A)], where A is the instigator of the action.
die [_ _ D], where D is the entity affected by the action, e.g. "John died."
kill [_ _ D (I) (A)], e.g. "Bill killed John." / "Bill killed John with a knife."
want [_ _ A O], e.g. "John wanted some ice cream." / "John wanted Mary to go to the store."
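A case frame can be represented as a structure listing which cases a verb requires and which are optional; the encoding below is a hypothetical rendering of the frames above, not the book's notation.

```python
# Toy case-frame dictionary: required cases must be filled, optional ones may be.
CASE_FRAMES = {
    "open": {"required": ["O"],      "optional": ["I", "A"]},
    "die":  {"required": ["D"],      "optional": []},
    "kill": {"required": ["D"],      "optional": ["I", "A"]},
    "want": {"required": ["A", "O"], "optional": []},
}

def check_frame(verb, filled_cases):
    """Accept a sentence only if every required case of the verb is filled
    and no case outside the frame is present."""
    frame = CASE_FRAMES[verb]
    missing = [c for c in frame["required"] if c not in filled_cases]
    allowed = set(frame["required"]) | set(frame["optional"])
    extra = [c for c in filled_cases if c not in allowed]
    return not missing and not extra

# "Bill killed John with a knife."  ->  A=Bill, D=John, I=a knife
print(check_frame("kill", {"A": "Bill", "D": "John", "I": "a knife"}))  # True
print(check_frame("die",  {"A": "Bill"}))  # False: D missing, A not allowed
```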

22 Sentence-Level Processing
3) Conceptual parsing: a strategy for finding both the structure and the meaning of a sentence in one step.
It uses a verb-ACT dictionary (Figure, p. 407), e.g. for "want":
1) stative (wanting something to happen)
2) transitive ATRANS (wanting an object)
3) intransitive PTRANS (wanting a person)

23 Sentence-Level Processing
3) Conceptual parsing (cont.)
CD (Conceptual Dependency) structures: the parsing process is driven by the sentence's main verb, with more detail at the lower levels.
Figure, p. 407: "John wanted Mary to go to the store." (PTRANS)
Figure, p. 409: "John went to the park with the peacocks." (PTRANS)
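A Conceptual Dependency structure for a PTRANS event can be sketched as a small record of the primitive act and its roles; the field names follow the usual CD conventions, but the code itself is only an assumed illustration.

```python
# Toy Conceptual Dependency (CD) representation of a PTRANS event: an actor
# transfers the physical location of an object from one place to another.
def ptrans(actor, obj, source, destination):
    return {
        "ACT": "PTRANS",
        "ACTOR": actor,
        "OBJECT": obj,
        "FROM": source,
        "TO": destination,
    }

# "John went to the park." -- John moves himself to the park.
went = ptrans("John", "John", None, "park")

# "John wanted Mary to go to the store." -- the wanted event is itself a PTRANS.
wanted = {
    "STATE": "WANT",
    "EXPERIENCER": "John",
    "EVENT": ptrans("Mary", "Mary", None, "store"),
}

print(went)
print(wanted)
```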

24 Sentence-Level Processing
4) Approximately compositional semantic interpretation
Semantic interpretation rules: Figure, p. 411.
Combining mapping knowledge: Figure, p. 412.
Example entry for "wanting": agent: (animate); object: (state or event).

25 Discourse and Pragmatic Processing
Discourse and pragmatic knowledge is needed even to understand a single sentence (pp. 415-416).
Kinds of relationships between a sentence and its discourse context include:
Names of individuals, e.g. "Dave went to the movie."
Parts of actions, e.g. 1) "John went on a business trip to New York." 2) "He left on an early morning flight."
Causal chains.
Planning sequences.

26 Discourse and Pragmatic Processing
Kinds of knowledge required:
The current focus of the dialogue.
A model of each participant's current beliefs.
The goal-driven character of dialogue.
The rules of conversation shared by all participants.
Modeling individual beliefs (p. 419). Modal logic: three belief spaces (Figure, p. 420).
Temporal logics allow us to talk about the truth of a set of propositions in the current state of the real world, in the past, and in the future.
Conditional logics allow us to talk about truth or falsehood under particular circumstances.

