Dialogue systems II Understanding speakers’ intentions Cooperative responses
Speaker’s intentions
Speech acts and illocutionary force (Austin 1962, Searle 1969)
- Traditionally, the meanings of utterances have been expressed in terms of truth conditions
- Austin distinguished utterances as:
  - Constative: utterances that describe states of affairs, as above
  - Performative: utterances constituting verbal actions per se
    - Can you pass the salt?
    - It’s too hot in here.
Types of verbal act
- Locution: the literal use of an utterance
- Illocution: the intended meaning of the utterance (hence “illocutionary force”)
- Perlocution: what is achieved by the utterance
The fun starts when locution and illocution do not match.
Can we characterize the circumstances in which a given locution should (or should not) have a given illocutionary force?
Examples of illocutionary acts
- Assertives: bare declaratives, I’d like to report that…, Did you know that…
- Directives: imperatives, Can I have…, I’d like…, Can you…
- Commissives: I promise…, Would you like…, Can I give you…
- Expressives: Thank you, (I’m) sorry, Well done, I’d like to say well done
- Declarations: I resign, You’re fired, I declare this supermarket open, OK that’s enough (cricket declaration), j’adoube, Pax, Barley
Mismatches
Can you X? Can I X?
- Apparently yes/no questions about ability or permission, but may be…
  - Directive: Can you pass the salt?
  - Commissive: Can I give you a lift?
  - Expressive: Can I just say well done?
- Ambiguity: Can you tell me if Mr Smith was on that plane?
- Context:
  - Why have you taken your shirt off? It’s too hot in here.
  - Yes, Joe, what do you want? It’s too hot in here.
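The surface-form ambiguity above can be sketched in code. This is a minimal, hypothetical cue-pattern matcher (the patterns and force labels are my own illustrations, not from the slides): each "Can you/Can I" utterance can match both a conventional non-literal force and the literal yes/no-question reading, so a real system would still need context and beliefs to pick one.

```python
import re

# Hypothetical cue patterns mapping surface forms to *candidate* illocutionary
# forces; a real system would also need context and belief modelling.
CUE_PATTERNS = [
    (r"^can you pass\b", "directive"),
    (r"^can i give you\b", "commissive"),
    (r"^can i just say\b", "expressive"),
    (r"^can (you|i)\b", "yes/no question"),  # literal fallback reading
]

def candidate_forces(utterance: str) -> list[str]:
    """Return every candidate force whose cue pattern matches the utterance."""
    text = utterance.lower().strip(" ?!.")
    return [force for pattern, force in CUE_PATTERNS if re.search(pattern, text)]

print(candidate_forces("Can you pass the salt?"))  # ['directive', 'yes/no question']
print(candidate_forces("Can I give you a lift?"))  # ['commissive', 'yes/no question']
```

The point of the sketch is that pattern matching alone yields several candidates; disambiguation is exactly what the rest of this lecture is about.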
Beliefs and intentions
To correctly understand the speech act involved, we must know about:
- The literal meaning (“propositional content”)
- Conventions associated with certain constructions
- The conditions/context in which it was uttered
- The beliefs/intentions of the speaker
E.g., if there is no way of making it cooler, “It’s too hot in here” is just a statement.
Understanding dialogues
A. Do you want some coffee?
B. Coffee would keep me awake.
Does B mean “Yes” or “No”? Ambiguous, because being kept awake could be good or bad, depending on the circumstances.
A. Will you have another beer?
B. Well, I’m driving.
Here “No” is understood, and the statement is added as an explanation. How do you know that “no” is implied?
Understanding dialogues
A. I’d like to buy a ticket to London.
B. Which train do you want to get?
A. Well, my meeting is at 2 o’clock.
A’s response is a perfectly reasonable answer to the question, though it assumes (or begs) knowledge of where the meeting is in relation to the station. The assumptions and beliefs of both parties have to be modelled; in a real application, this can only be done within a restricted domain.
Cooperative responses
The other side of the coin:
- In any dialogue, responses generally express what the speaker thinks the hearer wants to know
- From an analysis point of view, we need to understand why a given response is appropriate, and what it means
- Equally, when generating dialogue, responses need to be “cooperative”
Grice’s Maxims
H. P. Grice (1975): principles of cooperative dialogue
- Maxim of Quantity:
  - Make your contribution to the conversation as informative as necessary.
  - Do not make your contribution more informative than necessary.
- Maxim of Quality:
  - Do not say what you believe to be false.
  - Do not say that for which you lack adequate evidence.
- Maxim of Relevance:
  - Be relevant (i.e., say things related to the current topic of the conversation).
- Maxim of Manner:
  - Avoid obscurity of expression.
  - Avoid ambiguity.
  - Be brief (avoid unnecessary wordiness).
  - Be orderly.
Maxim of quantity
Maggie ate some of the chocolate.
- Implies she didn’t eat it all, because otherwise you would say so
- Logically, no such implication can be made
A. What time is the next train to London?
B. The next train is 1015. It arrives at 1245.
- The information given is more than was asked for
- B guesses that this is the sort of information A might also need, and so offers it unsolicited
Maxim of quantity
A. What films are there on this afternoon?
B. [Lists all the films showing at all 12 screens]
- This does answer the question literally, but the maxim tells you that listing all the films is too much information
- Better to say “There are 12 screens, shall I list everything?”, or narrow down the search: “What time?” or “What genre?”
A. Who is enrolled in LELA10011?
B. There are 145 students enrolled in that course. Shall I list them?
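The policy in the second dialogue can be sketched as a quantity-aware answer function. This is a minimal illustration under my own assumptions (the item list and the threshold are invented): give the full answer when it is short, otherwise summarise and offer to elaborate.

```python
# A minimal sketch of a quantity-aware answer policy. MAX_ITEMS is an assumed
# threshold, not from the slides: above it, a full listing would violate the
# maxim of quantity, so we summarise and offer to list instead.
MAX_ITEMS = 5

def cooperative_answer(items: list[str]) -> str:
    if len(items) <= MAX_ITEMS:
        return ", ".join(items)
    return f"There are {len(items)} matches. Shall I list them all?"

films = [f"Film {i}" for i in range(1, 13)]  # hypothetical listings for 12 screens
print(cooperative_answer(films))             # There are 12 matches. Shall I list them all?
print(cooperative_answer(["Jaws", "Alien"])) # Jaws, Alien
```

The design choice is that the system never refuses information; it defers the long listing behind a clarifying question, exactly as B does in the enrolment dialogue.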
Maxim of quality
There is an obvious requirement to tell the truth, but this includes “the whole truth”:
A. How many first-year students are taking LELA20032?
B. None.
A. How many first-year students are taking LELA20032?
B. None, because it’s a second-year course.
This kind of informative answer could be triggered by recognizing that the original question implies some mistaken assumption about the data.
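That trigger can be sketched in code. This is a toy illustration, assuming an invented course database: when a count query returns zero, the system checks whether the question’s presupposition (that first-year students could be on this course at all) could ever hold, and corrects it if not.

```python
# Hypothetical course data: LELA20032 is a year-2 course with two students.
COURSES = {"LELA20032": {"year": 2, "students": {"ann": 2, "bob": 2}}}

def answer_count(course: str, year: int) -> str:
    """Answer 'how many year-N students take this course?' cooperatively."""
    info = COURSES[course]
    count = sum(1 for y in info["students"].values() if y == year)
    if count == 0 and info["year"] != year:
        # Zero because the question rests on a mistaken assumption:
        # correct it rather than give the bare (but true) answer "None."
        return f"None, because it's a year-{info['year']} course."
    return str(count) if count else "None."

print(answer_count("LELA20032", 1))  # None, because it's a year-2 course.
```

Both answers are true; the cooperative one also repairs the questioner’s model of the data.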
Maxim of quality
A. What is the capital of Edinburgh?
B1. I don’t know.
B2. Edinburgh is a city: only countries have capitals.

A. How much is a ticket to London?
B. £63.
A. Does the train stop at Watford?
B. Yes. If you want to get off at Watford, it’s only £60.

A. Is The Exorcist showing this half term?
B. Yes, but it’s a 15.

A. Did Everton win this weekend?
B1. No.
B2. No, they drew.
B3. No, they didn’t play.
Maxim of relevance
The answer must be relevant.
- How do we determine relevance?
- We need a model of the speaker’s and hearer’s beliefs, and of the domain
This works both ways: an answer that seems to be irrelevant must be interpreted as if it obeyed this maxim.
A. Did Everton win this weekend?
B. It was an international break.
When maxims are disobeyed
Deceit, long-windedness, irrelevance, obscurity, and taciturnity are all part of natural communication:
- They may be for rhetorical effect, e.g. an apparent lie might be sarcasm
- Obscurity may be used to cover up a lack of knowledge
- An irrelevant answer may reveal a mismatch in mutual belief systems
Conversation machines
In contrast to purposeful dialogue systems, these often involve little or no genuine linguistic processing. They are quite good fun, and interesting to look at as a demonstration of what can be done without genuine understanding or processing.
Conversation machines
Early prototypes: ELIZA, PARRY
- Worked by paraphrasing the input, or by recognizing keywords
A. I have been feeling very worried lately.
B. You have been feeling very worried lately?
A. My mother doesn’t love me any more.
B. Tell me more about your family.
A. I am very nervous.
B. Why are you very nervous?
Other examples: Racter, ALICE, Verbot, …
Easy to trick.
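The two ELIZA-style mechanisms just mentioned, keyword spotting and paraphrase, can be reproduced in a few lines. This is a minimal sketch with toy rules of my own devising (the pronoun table and the two rules are illustrative, not ELIZA’s actual script):

```python
import re

# Simple first/second-person swaps used when reflecting the input back.
PRONOUN_SWAPS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(text: str) -> str:
    """Swap pronouns word by word, e.g. 'my mother' -> 'your mother'."""
    return " ".join(PRONOUN_SWAPS.get(w, w) for w in text.lower().split())

def respond(utterance: str) -> str:
    text = utterance.strip(" .!?")
    if "mother" in text.lower() or "father" in text.lower():
        return "Tell me more about your family."      # keyword rule
    m = re.match(r"(?i)i am (.*)", text)
    if m:
        return f"Why are you {reflect(m.group(1))}?"  # pattern rule
    return f"{reflect(text).capitalize()}?"           # default: paraphrase back

print(respond("I am very nervous."))                   # Why are you very nervous?
print(respond("My mother doesn't love me any more."))  # Tell me more about your family.
```

Note that there is no understanding anywhere in this code, only string manipulation, which is exactly why such systems are easy to trick.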