1 Kees van Deemter Matthew Stone Formal Issues in Natural Language Generation Lecture 1 Overview + Reiter & Dale 1997

2 We are Kees van Deemter… Principal Research Fellow, ITRI, University of Brighton, UK. PhD, University of Amsterdam, 1991. Research interests: formal semantics of natural language (underspecification, anaphora, prosody); generation of text, documents, speech.

3 …and Matthew Stone. Assistant Professor, Computer Science & Cognitive Science, Rutgers University, USA. PhD, University of Pennsylvania, 1998. Research interests: knowledge representation (action, knowledge, planning, inference); task-oriented spoken language.

4 Our Question This Week What are the possible ways of using knowledge (of the world and of language) in formulating an utterance?

5 Knowledge in utterances. Knowledge of the world: the utterance says something useful and reliable. Knowledge of language: the utterance is natural and concise; in other words, it fits hearer and context.

6 A Concrete Example. Our partner is working with equipment that looks like: [figure: a handle that rotates between an OPEN and a LOCKED position]. The instruction that we'd like to give them is: Turn handle to locked position.

7 Knowledge in this utterance. Knowledge of the world: the utterance says something useful and reliable. [figure: the handle moving from OPEN to LOCKED] This is what has to happen next.

8 Knowledge in this utterance. Knowledge of language: the utterance is natural and concise. Consider the alternatives… "Move the thing around."

9 Knowledge in this utterance. Knowledge of language: the utterance is natural and concise. Consider the alternatives… "You ought to readjust the fuel-line access panel handle by pulling clockwise 48 degrees until the latch catches."

10 Our Question This Week What are the possible ways of using knowledge (of the world and of language) in formulating an utterance? This is a formal question; the answers will depend on the logics behind grammatical information and real-world inference.

11 The NLG problem depends on the input to the system. If the input looked like this: INPUT: turn(handle, locked), then deriving the output would be easy: OUTPUT: Turn handle to locked position.
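
To make concrete why deriving the output would be easy from such an input, here is a minimal template-based sketch in Python; the TEMPLATES table and realize function are our own illustration, not part of the lecture.

    # Minimal template-based realization: if the input were a ground
    # logical term like turn(handle, locked), generation would reduce
    # to a lookup plus slot-filling. Names here are illustrative.
    TEMPLATES = {
        "turn": "Turn {0} to {1} position.",
    }

    def realize(predicate, *args):
        """Fill in the template associated with the predicate."""
        return TEMPLATES[predicate].format(*args)

    print(realize("turn", "handle", "locked"))
    # -> Turn handle to locked position.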

12 Real conceptual input is richer and organized differently. It must support correct, useful domain reasoning: e.g., characterizing the evident function of equipment; e.g., simulating/animating the intended action.

13 Difference in Content. Input: new information complete, separate from old. Output: new information cut down, mixed with old. "Turn handle to locked position."

14 Difference in Organization. Input: deictic representation for objects through atomic symbols that index a flat database: handle(object388). number(object388, 16A46164-1). composedOf(object388, steel). color(object388, black). goal(object388, activity116). partOf(object388, object486). Output: descriptions for objects through complex, hierarchical structures: [NP [DET the] [N handle]].
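
The contrast can be made concrete in a few lines; this sketch (our own, reusing the slide's example symbols) sets the flat, atomic input facts next to a hierarchical output description.

    # Input side: atomic symbols indexing a flat database of facts.
    facts = [
        ("handle", "object388"),
        ("number", "object388", "16A46164-1"),
        ("composedOf", "object388", "steel"),
        ("color", "object388", "black"),
        ("goal", "object388", "activity116"),
        ("partOf", "object388", "object486"),
    ]

    # Output side: a hierarchical description of the same object,
    # a nested (category, children) tree for the NP "the handle".
    np = ("NP", [("DET", "the"), ("N", "handle")])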

15 Formal problem. NLG means applying input domain knowledge that looks quite different from the output language!

16 Formal problem. How can we characterize these different sources of information in a common framework, as part of a coherent model of language use? For example: how can we represent linguistic distinctions that make choices in NLG good or bad?

17 Our Question This Week. What are the possible ways of using knowledge (of the world and of language) in formulating an utterance? This is not just a mathematical question but also a computational one; possible ways of using knowledge will be algorithms.

18 No Simple Strategy to Resolve Differences. There is lots of variability in natural instructions: Lift assembly at hinge. Disconnect cable from receptacle. Rotate assembly downward. Slide sleeve onto tube. Push in on poppet.

19 Strategy Must Decide When to Say More, using utterance interpretation as a whole: "Turn handle by hand-grip from current open position for handle 48 degrees clockwise to locked position for handle."

20 In particular, the hearer matches the shared initial state, so describe objects and places succinctly: "Turn handle by hand-grip from current open position for handle 48 degrees clockwise to locked position for handle."

21 In particular, the hearer applies knowledge of the domain, so omit inevitable features of the action: "Turn handle by hand-grip from current open position for handle 48 degrees clockwise to locked position for handle."

22 Computational problem. Because this process is so complex, it takes special effort to specify the process, to make it effective, and to ensure that it works appropriately. For example: How much search is necessary? How can we control search when search is required? What will such a system be able to say?

23 OK, so generation is hard. Is this worth doing? Why does it matter? NLG systems also have practical difficulties: actually getting domain information; successful software engineering; knowing when you've completed them. These are not to be underestimated; they've been the focus of most NLG research.

24 Why does this matter? Formally precise work in NLG is starting, with motivations in computer science, cognitive science, and linguistics. It is particularly challenging because generation has received much less attention than understanding.

25 Computer Science Motivations Dialogue systems are coming. Spoken language interfaces are standard for simple phone applications; they will soon be the norm for customer service, information, etc. Generation is an important bottleneck. After speech recognizer performance, the biggest impediment to user satisfaction is the lack of concise, natural, context-dependent responses. (More Wednesday.)

26 Computer Science Motivations. On-line help can be enhanced when help messages are generated. (Assumption: there are too many to be stored separately.) Multilingual document authoring can be an efficient alternative to MT. (Different texts can be generated from the same input.) Slightly less obvious: sophisticated text summarization requires NLG; some approaches to MT require NLG.

27 Computer Science Motivations. Better formal understanding of NLG promises more flexible architectures, more robust and natural behavior, and less labor-intensive programming methods. In short, NLG systems that work better and are easier to build.

28 Motivations in Cognitive Science. We want to know how language structure supports language use: different representations make different processing problems.

29 Motivations in Cognitive Science. Different representations make different processing problems. E.g., incremental interpretation: how do you understand an incomplete sentence like "John cooked…"?

30 Motivations in Cognitive Science. E.g., incremental interpretation: how do you understand an incomplete sentence like "John cooked…"? CFG: compilation, i.e., metalevel reasoning that abstracts the meaning in common to many derivations. CCG: just use the grammar itself; "John cooked" parses as S/NP.
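
A small sketch of the CCG point, under the standard category assignments John := NP and cooked := (S\NP)/NP; the encoding and helper functions are our own illustration.

    # Incremental CCG derivation of "John cooked" as S/NP.
    # A category is an atomic string or a tuple (slash, result, argument).

    def type_raise(cat, result="S"):
        """Forward type-raising: X => T/(T\\X)."""
        return ("/", result, ("\\", result, cat))

    def fwd_compose(x, y):
        """Forward composition: X/Y combined with Y/Z gives X/Z."""
        assert x[0] == "/" and y[0] == "/" and x[2] == y[1]
        return ("/", x[1], y[2])

    john = "NP"                              # John := NP
    cooked = ("/", ("\\", "S", "NP"), "NP")  # cooked := (S\NP)/NP

    partial = fwd_compose(type_raise(john), cooked)
    print(partial)  # ('/', 'S', 'NP'): the incomplete sentence is S/NP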

31 Motivations in Cognitive Science. We want to know how language structure supports language use: different representations make different processing problems. What do we learn when we think of speaking as well as understanding?

32 Linguistic Motivations. Generation brings an exacting standard for theoretical precision. Grice says "be brief"; for NLG we must say precisely how. (More Tuesday.) More generally, is a model of the meaning of forms enough to explain when people use them? If not, we need to say more. (Or maybe this whole meaning stuff is wrongheaded!)

33 Our Ulterior Motive Explain why NLG is theoretically interesting (formally, computationally, and linguistically). Get more people working on NLG.

34 Our strategy this week: zoom in on Generation of Referring Expressions (henceforth GRE), and suggest that the rest of NLG is equally interesting.

35 What is GRE? (Very briefly.) Given: an object to refer to (the target object). Given: properties of all objects. (Given: a context.) Task: produce a description that identifies the target object.
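
As a preview of the coming lectures, here is a minimal sketch in the spirit of the Incremental Algorithm of Dale & Reiter (1995), our next reading; the domain, property names, and preference order are our own toy example.

    # Incremental GRE sketch: walk a preference-ordered list of
    # properties, keeping each property that rules out at least one
    # remaining distractor, until only the target fits the description.

    def refer(target, props, preference):
        distractors = {o for o in props if o != target}
        description = []
        for p in preference:
            value = props[target].get(p)
            if value is None:
                continue
            ruled_out = {o for o in distractors if props[o].get(p) != value}
            if ruled_out:
                description.append((p, value))
                distractors -= ruled_out
            if not distractors:
                return description  # distinguishing description found
        return None  # target cannot be distinguished

    props = {
        "h1": {"type": "handle", "color": "black"},
        "h2": {"type": "handle", "color": "red"},
        "p1": {"type": "panel", "color": "black"},
    }
    print(refer("h1", props, ["type", "color"]))
    # -> [('type', 'handle'), ('color', 'black')], i.e. "the black handle"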

36 Our Plan. Today: overview of generation. Tuesday: case study in generation: GRE. Wednesday: pragmatics in GRE. Thursday: semantics in GRE. Friday: syntax in GRE. (Note: relatively complementary lectures.) Of course it's backwards: it's generation!

37 Rest of Today: a discussion of how generation systems usually work (after Reiter & Dale, 1997).

38 In practice, NLG systems work the way we can build them. They solve a specific, carefully delineated task. They can verbalize only specific knowledge. They can verbalize it only in specific, often quite stereotyped ways.

39 In practice, NLG systems work the way we can build them. That means starting with the available input and the desired output, and putting together something that maps from one to the other. Any linguistics is a bonus. Any formal analysis of computation is a bonus.

40 Input can come from… an existing database (e.g., tables) whose format facilitates update, etc.; an interface that allows a user to specify it (e.g., by selecting from menus); language interpretation.

41 For Example. Input: rail schedule database; current train status; user query "When is the next train to Glasgow?" Output: "There are 20 trains each day from Aberdeen to Glasgow. The next train is the Caledonian express; it leaves Aberdeen at 10am. It is due to arrive in Glasgow at 1pm, but arrival may be slightly delayed."

42 To get from input to output means selecting and organizing information. The selection and organization typically happens in a cascade of processes that use special data structures or representations. Each makes explicit a degree of selection and organization that the system is committed to. Indirectly, each indicates the degree of selection and organization the system has still to create.

43 The NLG Pipeline: Goals → Text Planning → Text Plans → Sentence Planning → Sentence Plans → Linguistic Realization → Surface Text.
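
Rendered as code, the pipeline is just function composition over successively more linguistic representations; the stage functions below are stubs of our own, meant only to fix the data flow, not the paper's API.

    # The pipeline as data flow: each stage consumes the previous
    # stage's representation. Bodies are stubs for illustration.

    def text_planning(goals):
        """Goals -> text plan (a tree of messages)."""
        ...

    def sentence_planning(text_plan):
        """Text plan -> sentence plans (lexicalized, aggregated)."""
        ...

    def linguistic_realization(sentence_plans):
        """Sentence plans -> surface text."""
        ...

    def generate(goals):
        return linguistic_realization(sentence_planning(text_planning(goals)))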

44 Overview of Processes and Representations, 1: Goals → Text Planning (= Content Planning + Discourse Planning) → Messages / Text Plans.

45 Message A message represents a piece of information that the text should convey, in domain terms.

46 Example Messages
    message-id: msg01
    relation: IDENTITY
    arguments:
        arg1: NEXT-TRAIN
        arg2: CALEDONIAN-EXPRESS
"The next train is the Caledonian Express."

47 Example Messages
    message-id: msg02
    relation: DEPARTURE
    arguments:
        entity: CALEDONIAN-EXPRESS
        location: ABERDEEN
        time: 1000
"The Caledonian Express leaves Aberdeen at 10am."

48 A close variant. Q: When is the next train to New Brunswick? A: It's the 7:38 Trenton express. I know something about the domain in this case, and can highlight how nonlinguistic the domain representation will be.

49 Variant message
    message-id: msg03
    relation: NEXT-SERVICE
    arguments:
        station-stop: STATION-144
        train: TRAIN-3821
"The next train to New Brunswick is the Trenton Local."

50 Closer to home
    message-id: msg04
    relation: DEPARTURE
    arguments:
        origin: STATION-000
        train: TRAIN-3821
        time: 0738
"It leaves Penn Station at 7:38."

51 How I got domain knowledge: NY Penn Station really is NJT Station 000, New Brunswick really is Station 144 (you have to key this into ticket machines!), and this really is train #3821 (it's listed with this number on the schedule!).

52 Text Plan A text plan represents the argument that the text should convey; it is a hierarchical structure of interrelated messages.

53 Example Text Plan: NextTrainInformation, an ELABORATION relation linking the [IDENTITY] and [DEPARTURE] messages.

54 Overview of Processes and Representations, 2: Text Plans → Sentence Planning (= Lexical Choice + Aggregation + Referring Expression Generation + …?) → Sentence Plans.

55 Sentence Plans A sentence plan makes explicit the lexical elements and relations that have to be realized in a sentence of the output text.

56 Example Sentence Plan
    (S1 / be
        :subject (NEXT-SERVICE / it)
        :object (TRAIN-3821 / express
            :modifier Trenton
            :modifier 7:38
            :status definite))
"It's the 7:38 Trenton express."
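
To show what realization of such a plan might look like, here is a toy realizer of our own; the dictionary encoding mirrors the (S1/be :subject … :object …) notation above, but the rules are illustrative, not the system's.

    # Toy realizer for the sentence plan above. The contraction of
    # "it is" to "it's" is hard-coded for this be-clause pattern.
    plan = {
        "head": "be",
        "subject": {"head": "it"},
        "object": {"head": "express", "modifiers": ["7:38", "Trenton"],
                   "status": "definite"},
    }

    def realize_np(np):
        det = "the " if np.get("status") == "definite" else ""
        mods = " ".join(np.get("modifiers", []))
        return " ".join((det + mods + " " + np["head"]).split())

    def realize_clause(plan):
        assert plan["head"] == "be"
        subject = realize_np(plan["subject"]).capitalize()
        return subject + "'s " + realize_np(plan["object"]) + "."

    print(realize_clause(plan))  # It's the 7:38 Trenton express.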

57 We know what's happened. Aggregation: we have constructed a single sentence that realizes two messages. Once we have the first message: "It's the Trenton express." We just add 7:38 to realize the second message: "It's the 7:38 Trenton express."

58 We know what's happened. Referring expression generation: we have figured out how to realize the next-service as "it", and how to identify the train by its destination and frequency of stops.

59 We know what's happened. Lexical (and grammatical) choice: to use the verb "be" with "it" as the subject and a reference to the train second; to say "express" rather than "express train"; to say "Trenton" rather than "Northeast Corridor".

60 But there's no consensus method for how to do it. Reiter (1994, survey of 5 NLG systems): most practical systems follow a pipeline, even though this makes some things difficult to do (example: avoidance of ambiguity). Cahill et al. (1999, survey of 18 NLG systems): tasks like aggregation and GRE can happen almost anywhere in the system, e.g., as early as content planning or as late as sentence realization.

61 But there's no consensus method for how to do it. And we'll see that formal and computational questions raise important difficulties for what representations you can have, what processes and algorithms you can use, and how you bring knowledge of language into the loop.

62 Overview of Processes and Representations, 3: Sentence Plans → Linguistic Realization → Surface Text.

63 This is easier to think about We all know what a surface text looks like! And we all know you have to have a grammar (of some kind or other) to get one!

64 Concluding remarks. Our overview has followed the "standard model" of Reiter & Dale (1997). Even though the paper has an applied motivation, the formal problems we described earlier really come up: we need linguistic representations that correspond to the domain and enable choices, and we need good algorithms to put that correspondence to work.

65 Next Time: GRE in particular. A microcosm of NLG, requiring choices of content and form. A proving ground for formal questions: how to formalize knowledge of language, characterize good communication, and design effective algorithms.

66 What to do. If you've read Reiter & Dale 1997, great! This lecture probably made more sense to you. But we won't touch general issues till Friday, so otherwise there is no hurry to do Reiter & Dale 1997. The next reading is Dale & Reiter 1995; we'll be covering the whole paper rather closely.

