What do we mean when we speak of a natural language? Let us contrast two kinds of language: "unnatural" (or formal) and "natural".
Unnatural languages are computer languages, such as C++, Java, Prolog, or LISP. They are highly constrained: they have a very rigid syntax and work with a very limited initial vocabulary.
A program:

program add_numbers_in_the_table(input, output);
var
  table: array [1..200] of integer;
  index: integer;
  sum: real;
begin
  index := 1;
  sum := 0;
  while index <= 100 do
  begin
    sum := sum + table[index];
    index := index + 1;
  end;
end.
Natural languages tend to be far less precisely defined. Examples include the languages we speak, write, and read: English, American, French, Chinese, Japanese, or Sanskrit. They have a very complex syntax or, perhaps, do not even conform to a well-defined syntax, especially as they are actually used. They typically have an enormous vocabulary.
Natural language dialogue:
"Here are some numbers in a table."
"Add them."
"All of them?"
"No, just the first 100."
Natural Language and Perception? Natural language is conveyed in a number of ways. One of the most important is that it is "spoken". Speaking a language places sound patterns in the environment. These sound patterns join other sounds in the environment, produced by the wind rushing through the trees, by machinery, and by other speakers.
Natural language is also conveyed via text, in books, letters, newspapers, and on computer screens. What sensor do we use to interpret text? What is perception in this case? Are we reasoning about our perceptions? Do we perceive that the string "My hat is red" is a code for a conceptual entity?
Machine Translation. One of the original motivating reasons for studying natural language was to build machines which could translate text from one language to another. This was originally thought to be a modest task.
The method first envisioned was to:
1. Replace the words in the language to be translated with equivalent words in the target language.
2. Use syntax rules to clean up the resulting sentences.
Sometimes this works. Consider the following example.
English sentence: I must go home.
Word replacements into German:
I → Ich
must → muss
go → gehen
home → nach Hause
Resulting sentence: Ich muss gehen nach Hause.
Syntax transform: move the verb to the end of the sentence.
German sentence: Ich muss nach Hause gehen.
This is pretty good!
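The two-step method can be sketched in a few lines of Python. The lexicon and the single reorder rule below are invented for this one example; real translation needs far more:

```python
# A toy word-for-word translator with one syntax transform, sketching the
# method described above. Lexicon and reorder rule cover only this example.
LEXICON = {"I": "Ich", "must": "muss", "go": "gehen", "home": "nach Hause"}

def translate(english):
    words = english.rstrip(".").split()
    german = [LEXICON[w] for w in words]        # step 1: word replacement
    if "muss" in german and "gehen" in german:  # step 2: syntax cleanup --
        german.remove("gehen")                  # with a modal verb, move
        german.append("gehen")                  # the infinitive to the end
    return " ".join(german) + "."

print(translate("I must go home."))  # Ich muss nach Hause gehen.
```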
Here is an unlucky example.
English: "The spirit is willing but the flesh is weak."
Translate: English → Russian → English.
The result: "The vodka is strong but the meat is rotten."
Words can mean many things, and often the meanings are unrelated. Here is a simple word which has a number of unrelated meanings:
bow — a ribbon tied into a decorative configuration
bow — an instrument used to project an arrow
bow — the forward portion of a boat
bow — an act performed out of respect
bow — deviation of an object from a straight line
Placement. For one thing, selection of the proper meaning of a word seems to depend on where that word occurs in the sentence.
The pen is in the box.
The box is in the pen.
In the first sentence "pen" is a writing instrument; in the second it is an enclosure.
Context. Sentences change their meaning depending on the context in which the sentence is presented. The following sentence is ambiguous even if we fix the meaning of each of the words:
I saw the man on the hill with a telescope.
How well do you think the word-substitution method of machine translation would work on this one?
Natural Language Understanding. The upshot of the early work in machine translation was that there is more to translating natural language than simply substituting words and massaging the syntax. Something "deeper" was afoot. To properly translate a sentence, the machine must first "understand" it. What do we mean by "deeper" and "understand"? How do we do it?
What does the following phrase mean?
Water pump pulley adjustment screw threads damage report summary.
How do we understand conversations? What is going on in the following dialogue?
"Do you know the time?"
"Yes."
"Could you tell me the time?"
"Will you tell me the time?"
"I need to know the time."
"I understand."
Eliza: a step toward natural language understanding? Although its author, Joseph Weizenbaum, would strongly disagree, it has often been said that Eliza is a demonstration of natural language understanding ... at least to some extent.
Eliza's understanding was based on key words. Eliza had a set of templates, each looking for a key word in the input sentence. These templates were of the following form:
(* keyword *)
where the *'s are meant to be wildcards, like the ones used in UNIX for filename descriptions. They are meant to match any string.
The template
(* computers *)
for example, would match the sentence
I am really frustrated with computers.
by matching the first "*" of the template with the string "I am really frustrated with " and the second "*" with the string ".".
Eliza's "understanding" of the sentence was simple, to be sure, but it was enough to return "meaningful" sentences to the user ... in the context of a therapy session. The program would store the strings which were matched to the *'s and sometime later generate a new sentence, often using these stored strings. The responses were built into a table, referenced by the template.
Whenever Eliza found a match, she would generate one of the corresponding responses. Question: is this understanding?
Template → Responses
(* computers *) → Do computers frighten you?
(* mother *) → Tell me more about your family.
(I hate *) → You say you hate *. Why do you hate *?
no match → Please go on. Tell me more about *.
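A minimal Python sketch of this keyword matching, with regular expressions standing in for the (* keyword *) templates. The rules mirror the table above, but the implementation details are invented; Weizenbaum's actual script was far richer:

```python
import random
import re

# Invented Eliza-style rules mirroring the table above: each pattern looks
# for a key word; matched wildcard text can be spliced into the reply.
RULES = [
    (r"^I hate (.*?)\.?$", ["You say you hate {0}. Why do you hate {0}?"]),
    (r"computers", ["Do computers frighten you?"]),
    (r"mother", ["Tell me more about your family."]),
]

def respond(sentence):
    for pattern, responses in RULES:
        m = re.search(pattern, sentence, re.IGNORECASE)
        if m:
            # The stored wildcard matches fill the {0} slots in the reply
            return random.choice(responses).format(*m.groups())
    return "Please go on."

print(respond("I am really frustrated with computers."))
print(respond("I hate homework."))
print(respond("Nice weather today."))
```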
Conceptual Dependency: an attempt to represent sentences about actions in a way that addresses the similarity in meaning.
Capturing the similarity of meaning found in sentences like:
Mary took the book from John.
Mary received the book from John.
Mary bought the book from John.
John gave the book to Mary.
John sold the book to Mary.
John sold Mary the book.
John bought the book from Mary.
John traded a cigar to Mary for the book.
In all these examples the "ownership" of the book is transferred between John and Mary. The direction may vary and the intention may change, but the result of the event is similar.
Frames. Schank and Abelson used frames to represent events. These frames had four slots:
actor: the agent causing the event
action: the action performed by the actor
object: the object being acted upon
direction: the direction in which that action is oriented
All actions are expressed in terms of a small set of "primitive actions". One list of primitive actions they worked with was:
atrans — transfer of possession
mtrans — transfer of mental information
ptrans — physical transfer of an object from one place to another
mbuild — to build mental structures
speak — the act of making sounds
ingest — to eat
grasp — to hold in one's hand
propel — to apply a force to an object
attend — focusing one's consciousness upon something
move — to move a body part, e.g., moving an arm
Four similar sentences.
John gave Mary the book.
  actor: John; action: atrans; object: book; direction: from John to Mary.
John took the book from Mary.
  direction: from Mary to John.
John bought the book from Mary.
Mary sold the book to John.
  actor: Mary
Enhancements. One criticism of this representation was that the "understanding" did not capture some of the subtleties present in many sentences. In response, enhancements were made to the frame structure:
actor: the agent causing the event
action: the action performed by the actor
object: the object being acted upon
direction: the direction in which that action is oriented
instrument: the device used to accomplish the action
cause: events caused by the action
time: timeframe
John went to the store.
  actor: John
  action: ptrans
  object: John
  direction: to (the store)
  instrument: unspecified
  cause: unspecified
  time: unspecified
John flew to New York.
  actor: John
  action: ptrans
  object: John
  direction: to (New York)
  instrument:
    actor: plane
    action: propel
    object: plane
    other fields: unspecified
  time: past
Challenges: John took a plane to New York. Is it
  actor: John
  action: ptrans
  object: plane
  direction: to (New York)
or should it be ...
  object: John
  direction: to (New York)
  instrument:
    actor: plane
    action: propel
  time: past
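One way to encode these enhanced frames is as a small Python dataclass. The CDFrame class and its field types are an assumed encoding, not Schank's notation; note how the instrument slot is itself a frame, as in "John flew to New York":

```python
from dataclasses import dataclass
from typing import Optional

# An assumed Python encoding of the enhanced CD frame; slot names follow
# the slides, but the class itself is illustrative, not Schank's notation.
@dataclass
class CDFrame:
    actor: str
    action: str                             # a primitive: atrans, ptrans, propel, ...
    object: str
    direction: Optional[str] = None
    instrument: Optional["CDFrame"] = None  # an instrument is itself an event
    time: Optional[str] = None

# "John flew to New York": a ptrans whose instrument is a propel event
flew = CDFrame(actor="John", action="ptrans", object="John",
               direction="to (New York)",
               instrument=CDFrame(actor="plane", action="propel", object="plane"),
               time="past")
print(flew.instrument.action)  # propel
```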
MARGIE (Schank et al., 1975). MARGIE was a natural language program composed of a conceptual analyzer, a sentence generator, and an inference module. The conceptual analyzer was a kind of compiler that converted English sentences into conceptual dependency structures.
English sentences → Conceptual Dependency (CD).
The sentence generator produced "natural language sentences" from conceptual dependency structures:
Conceptual Dependency → natural language sentences.
Schank and his students experimented with sentence generators in a number of languages:
CD → English
CD → French
The hope in these efforts was, of course, to get a handle on a solution to the language translation problem. The inference module would make inferences from the CD expressions in a database together with the new CD expressions created by the new sentences.
Experiments were conducted to see what would happen with and without the inference module enabled.
Inferencing disabled. Here is an example of the output generated with inferencing disabled.
Input: John killed Mary by choking her.
Output paraphrases:
John strangled Mary.
John choked Mary and she died because she could not breathe.
Mary died because she was unable to inhale some air, and she was unable to inhale some air because John grabbed her by the neck.
Inferencing enabled. When inferencing was enabled the output became even more interesting.
Input: John gave Mary an aspirin.
Output sentences:
John believes that Mary is sick.
Mary is sick.
Mary wants to feel better.
Mary will ingest the aspirin.
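The flavor of the inference module can be sketched as a forward rule applied to a CD expression. The rule and the dictionary encoding below are invented for illustration; MARGIE's actual inference machinery was far more general:

```python
# Invented forward rule in the spirit of MARGIE's inference module: given a
# CD for "John gave Mary an aspirin", draw the inferences shown above.
def infer(cd):
    facts = []
    if cd["action"] == "atrans" and cd["object"] == "aspirin":
        actor, recipient = cd["actor"], cd["to"]
        facts += [
            f"{actor} believes that {recipient} is sick.",
            f"{recipient} is sick.",
            f"{recipient} wants to feel better.",
            f"{recipient} will ingest the aspirin.",
        ]
    return facts

cd = {"actor": "John", "action": "atrans", "object": "aspirin", "to": "Mary"}
for fact in infer(cd):
    print(fact)
```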
Programs like MARGIE are very interesting. Their successes and failures help us understand the natural language understanding process. When we write programs like this we are trying to discover what it is about sentences that makes it possible for us to:
Understand them.
Understand ones we have not seen before.
Produce new ones that others can understand.
Language building blocks, or units. Printed text, for example, is organized according to the following language units:
Letters
Words
Phrases
Sentences
Paragraphs
Sections
Chapters
Books
Fields
Syntax, or sentence structure. It seems compelling that syntax, or sentence structure, plays a role in determining the meaning of a sentence. Consider again "I saw the man on the hill with a telescope": if we can determine the structure of the sentence, it seems to help resolve the ambiguity.
If we can determine that "on the hill" modifies "the man" and "with the telescope" modifies the action of seeing, then the sentence meaning is resolved: it means that I used the telescope to see the man, and that the man was on the hill. If, on the other hand, "with the telescope" modifies "on the hill", then the sentence means that I saw the man who was located on the particular hill that had a telescope on it. And so on.
Grammar. One form of grammar describes a sentence in terms of the concepts "noun phrase", "verb phrase", "noun", "verb", "prepositional phrase", "adverb", "preposition", etc.
Parsing in Prolog. To begin with, we will simply determine if a sentence is a legal sentence. In other words, we will write a predicate sentence/1, which will determine if its argument is a sentence. Our two examples assume we have broken the sentences into words (by testing for the whitespace between words) and stored them in the following Prolog lists:
[the,dog,ate,the,bone]
[the,big,brown,mouse,chases,a,lazy,cat]
Basic strategies for parsing. In the generate-and-test strategy, the list to be parsed is split in different ways, with the splittings tested to see if they are components of a legal sentence. Prolog code:

sentence(L) :-
    append(NP, VP, L),
    nounphrase(NP),
    verbphrase(VP).
The append predicate will generate possible values for the variables NP and VP by splitting the original list L. The next two goals test each of the portions of the list to see if they are grammatically correct. If not, backtracking into append/3 causes another possible splitting to be generated. The clauses for nounphrase/1 and verbphrase/1 are similar to sentence/1, and call further predicates that deal with smaller units of a sentence, until the word definitions are met, such as:

noun([dog]).    verb([ate]).
noun([mouse]).  verb([chases]).
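A Python analogue of the generate-and-test Prolog code above may make the strategy concrete. The tiny lexicon and phrase rules are assumptions made for this sketch, covering only the two example sentences:

```python
# Invented lexicon mirroring the Prolog facts noun([dog]). verb([ate]). etc.
NOUNS = {"dog", "bone", "mouse", "cat"}
VERBS = {"ate", "chases"}
DETS = {"the", "a"}
ADJS = {"big", "brown", "lazy"}

def nounphrase(words):
    # determiner, zero or more adjectives, then a noun
    if len(words) < 2 or words[0] not in DETS or words[-1] not in NOUNS:
        return False
    return all(w in ADJS for w in words[1:-1])

def verbphrase(words):
    # a verb followed by a noun phrase
    return len(words) >= 1 and words[0] in VERBS and nounphrase(words[1:])

def sentence(words):
    # generate-and-test: try every split, like append/3 under backtracking
    return any(nounphrase(words[:i]) and verbphrase(words[i:])
               for i in range(1, len(words)))

print(sentence(["the", "dog", "ate", "the", "bone"]))  # True
```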
The difference strategy. A more efficient strategy is to skip the generation step and pass the entire list to the lower-level predicates, which in turn take the grammatical portion of the sentence they are looking for from the front of the list and return the remainder of the list.
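The difference strategy can likewise be sketched in Python: each parser consumes what it needs from the front of the list and returns the remainder, so no splitting step is needed. Lexicon and phrase rules are again invented for the sketch:

```python
# Invented lexicon, as in the generate-and-test sketch
NOUNS = {"dog", "bone", "mouse", "cat"}
VERBS = {"ate", "chases"}
DETS = {"the", "a"}
ADJS = {"big", "brown", "lazy"}

def nounphrase(words):
    """Consume a noun phrase from the front; return the remainder, or None."""
    if not words or words[0] not in DETS:
        return None
    rest = words[1:]
    while rest and rest[0] in ADJS:   # skip any adjectives
        rest = rest[1:]
    if rest and rest[0] in NOUNS:
        return rest[1:]
    return None

def verbphrase(words):
    if not words or words[0] not in VERBS:
        return None
    return nounphrase(words[1:])

def sentence(words):
    # No generation step: each parser takes its portion off the front
    rest = nounphrase(words)
    return rest is not None and verbphrase(rest) == []

print(sentence(["the", "big", "brown", "mouse", "chases", "a", "lazy", "cat"]))  # True
```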
CD "grammar". Another grammar is conceptual dependency (CD, Schank and Abelson). It describes a sentence as a flat structure with well-defined roles: actor, action, object, direction, instrument, time, etc.
Transformational grammar (Noam Chomsky). The key idea here is that somewhere "deep" in the mind is a structure (to be determined) that represents ideas and thoughts, perhaps utterances. From this structure, a set of transformations translates, or converts, "deep" structures into a "phrase" structure, like the ones grammar school exercises would produce. Of course, analogous transformations would convert the phrase structures into "deep" structures. This work distinguishes between the structure of the sentence, namely phrase structure, and the structure of the ideas, the deep structure. The conceptual dependency work of Schank and Abelson does not.
A key question. Once we decide on a grammar, another key question is: "How can we design an algorithm that can take a sentence and determine its structure?" Such an algorithm is referred to as a parsing algorithm.
Consider the sentence group:
I saw the man on the hill with a telescope. He seemed to be looking at the moon.
Yes, your honor, I know he was quite a distance away, but I had been setting up my gear for an evening of star watching. I saw the man on the hill with a telescope.
Context. Clearly the sentence "I saw the man on the hill with the telescope" loses a lot of its ambiguity when it is embedded in a story or paragraph.
SAM (Script Applier Mechanism). Another Schank and Abelson program, SAM, worked with conceptual dependency structures that were collected into larger structures called scripts. These scripts represented stories, or patterns of events we often encounter together.
A typical script: the restaurant script. We could think of
CD1 as representing the event of entering the restaurant;
CD2 as the waiter asking if you wanted dinner or cocktails;
CD31 as the event of moving to a seat in the lounge;
CD32 as ordering a drink; etc.
SAM had a number of scripts in its internal representations. As sentences were entered, SAM would convert them to CD and focus on a script that "explained" them. The script provided the logical thread that tied the sentences together and helped resolve ambiguities in individual sentences. In a sense, the selection of the appropriate script was an "understanding" of the collection of sentences.
John went to the restaurant. He ordered a Big Mac and an order of fries. He ate and returned to the turnpike.
Depending on your collection of scripts, one might imagine the first sentence selecting a number of restaurant scripts, including one for McDonald's. The second sentence would eliminate all but the McDonald's script. The third sentence could invoke, among other scripts, one about driving long distances, and could identify the McDonald's script with the CD in the long-distance-driving script associated with stopping to eat.
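Script selection in this spirit can be sketched as scoring each script by how many of its expected concepts the story mentions. The script names and concept sets below are invented stand-ins for SAM's CD-based matching:

```python
# Invented scripts: each maps a name to the concepts it expects.
SCRIPTS = {
    "mcdonalds": {"restaurant", "big mac", "fries", "order", "eat"},
    "fancy_restaurant": {"restaurant", "waiter", "menu", "wine", "eat"},
    "long_distance_driving": {"turnpike", "gas", "drive", "eat"},
}

def select_script(story_concepts):
    # Pick the script that "explains" the most concepts in the story
    scores = {name: len(expected & story_concepts)
              for name, expected in SCRIPTS.items()}
    return max(scores, key=scores.get)

story = {"restaurant", "big mac", "fries", "eat", "turnpike"}
print(select_script(story))  # mcdonalds
```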