Commonsense Reasoning and Argumentation 14/15 HC 13: Dialogue Systems for Argumentation (1) Henry Prakken 25 March 2015.


1 Commonsense Reasoning and Argumentation 14/15 HC 13: Dialogue Systems for Argumentation (1) Henry Prakken 25 March 2015

2 Why do agents need argumentation?
- For their internal reasoning: reasoning about beliefs, goals, intentions etc. is often defeasible
- For their interaction with other agents: information exchange involves explanation; collaboration and negotiation involve conflicts of opinion and persuasion

3 Overview
- Dialogue systems for argumentation
- Inference vs. dialogue
- Use of argumentation in MAS
- General ideas
- Two systems (1)

4 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … Prof. P has political ambitions People with political ambitions are not objective Prof. P is not objective Increased inequality is good Increased inequality stimulates competition Competition is good USA lowered taxes but productivity decreased

5 We should lower taxes claim

6 We should lower taxes claim why

7 We should lower taxes Lower taxes increase productivity Increased productivity is good since claim why

8 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad since claim why

9 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Increased inequality is good since claim why claim

11 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Increased inequality is good Increased inequality stimulates competition Competition is good since claim why claim

12 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Increased inequality is good Increased inequality stimulates competition Competition is good since claim why claim

14 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Increased inequality is good Increased inequality stimulates competition Competition is good USA lowered taxes but productivity decreased since claim why claim

16 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … Increased inequality is good Increased inequality stimulates competition Competition is good USA lowered taxes but productivity decreased since claim why claim

17 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … Prof. P has political ambitions People with political ambitions are not objective Prof. P is not objective Increased inequality is good Increased inequality stimulates competition Competition is good USA lowered taxes but productivity decreased since claim why claim

18 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … Prof. P has political ambitions People with political ambitions are not objective Prof. P is not objective Increased inequality is good Increased inequality stimulates competition Competition is good USA lowered taxes but productivity decreased since claim why claim concede

19 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … Prof. P has political ambitions People with political ambitions are not objective Prof. P is not objective Increased inequality is good Increased inequality stimulates competition Competition is good USA lowered taxes but productivity decreased since claim why retract claim concede

20 Types of dialogues (Walton &amp; Krabbe)
Dialogue type        | Dialogue goal            | Initial situation
Persuasion           | resolution of conflict   | conflict of opinion
Negotiation          | making a deal            | conflict of interest
Deliberation         | reaching a decision      | need for action
Information seeking  | exchange of information  | personal ignorance
Inquiry              | growth of knowledge      | general ignorance

21 Example
P: I offer you this Peugeot for $10,000.
O: I reject your offer.
P: why do you reject my offer?
O: since French cars are no good.
P: why are French cars no good?
O: since French cars are unsafe.
P: why are French cars unsafe?
O: since magazine Meinwagen says so.
P: Meinwagen is biased since German car magazines usually are biased against French cars.
O: I concede that German car magazines usually are biased against French cars, but Meinwagen is not since it has a very high reputation.
P: why does Meinwagen have a very high reputation?
O: OK, I retract that French cars are no good. Still I cannot pay $10,000; I offer $8,000.
P: OK, I accept your offer.

24 Inference vs. dialogue
Dialogue systems for argumentation have:
- a communication language (well-formed utterances)
- a protocol (which utterances are allowed at which point?)
- termination and outcome rules
Argument games are a proof theory for a logic, but real argumentation dialogues have real players:
- distributed information
- richer communication languages
- dynamics
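
The components above (a communication language of well-formed utterances plus a protocol deciding which utterances are legal when) can be sketched as simple data types. The `Move`/`legal` names and the no-self-repetition rule used for illustration are my own, not from the slides:

```python
from dataclasses import dataclass
from enum import Enum

class Act(Enum):
    CLAIM = "claim"
    WHY = "why"
    CONCEDE = "concede"
    RETRACT = "retract"
    SINCE = "since"   # "p since Q": an argument move

@dataclass(frozen=True)
class Move:
    speaker: str      # e.g. "P" or "O"
    act: Act
    content: tuple    # sentence(s) from the topic language Lt

def legal(dialogue, move):
    """Toy protocol rule: no participant may repeat one of its own moves."""
    return all(not (m.speaker == move.speaker and m.act == move.act
                    and m.content == move.content)
               for m in dialogue)

d = [Move("P", Act.CLAIM, ("q",))]
print(legal(d, Move("O", Act.WHY, ("q",))))    # True: a fresh challenge
print(legal(d, Move("P", Act.CLAIM, ("q",))))  # False: P repeats itself
```

A real protocol would add turn-taking, termination, and outcome rules on top of this legality check.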

25 Standards for argumentation formalisms
- Logical argument games: soundness and completeness wrt some semantics of an argumentation logic
- Dialogue systems: effectiveness wrt the dialogue goal and fairness wrt the participants' goals
For argumentation:
- dialogue goal = rational resolution of conflicts of opinion
- participants' goal = to persuade
Argumentation is often instrumental to other dialogue types: does argumentation promote the goals of, e.g., negotiation or deliberation?

26 Some properties of dialogue systems that can be studied
Correspondence of the outcome with the players' beliefs:
- If the union of the participants' beliefs justifies p, can/will agreement on p result? ('completeness')
- If the participants agree on p, does the union of their beliefs justify p? ('soundness')
Disregarding vs. assuming the participants' personalities

27 Game for grounded semantics unsound in distributed settings
Knowledge bases: Paul: p, r; Olga: s, t
Inference rules: p → q; s → ¬q; r → ¬s; r, t → ¬p
P1: q since p

28 Game for grounded semantics unsound in distributed settings
Knowledge bases: Paul: p, r; Olga: s, t
Inference rules: p → q; s → ¬q; r → ¬s; r, t → ¬p
P1: q since p
O1: ¬q since s

29 Game for grounded semantics unsound in distributed settings
Knowledge bases: Paul: p, r; Olga: s, t, r
Inference rules: p → q; s → ¬q; r → ¬s; r, t → ¬p
P1: q since p
O1: ¬q since s
P2: ¬s since r

30 Game for grounded semantics unsound in distributed settings
Knowledge bases: Paul: p, r; Olga: s, t, r
Inference rules: p → q; s → ¬q; r → ¬s; r, t → ¬p
P1: q since p
O1: ¬q since s
P2: ¬s since r
O2: ¬p since r, t
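
The point of this slide can be checked by computing the grounded extension of the framework built from the pooled knowledge bases. Below is a small sketch; the argument names A1–A4 and the attack relation are my reading of the four moves (a counterargument attacks the argument whose conclusion or premise it negates):

```python
# Arguments from the pooled knowledge bases of Paul and Olga:
#   A1: q since p        A2: not-q since s
#   A3: not-s since r    A4: not-p since r, t
args = {"A1", "A2", "A3", "A4"}
attacks = {("A2", "A1"),   # A2 rebuts A1's conclusion q
           ("A3", "A2"),   # A3 undermines A2's premise s
           ("A4", "A1")}   # A4 undermines A1's premise p

def attackers(a):
    return {b for (b, c) in attacks if c == a}

def grounded(args):
    """Least fixpoint of the characteristic function F:
    a is acceptable wrt S iff every attacker of a is attacked by S."""
    s = set()
    while True:
        nxt = {a for a in args
               if all(s & attackers(b) for b in attackers(a))}
        if nxt == s:
            return s
        s = nxt

print(sorted(grounded(args)))  # → ['A3', 'A4']: A1, the argument for q, is out
```

So relative to the union of the players' beliefs, q is not justified, even though Paul may win the two-player game from his own knowledge base alone.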

31 Example 1
Knowledge bases: Paul: r; Olga: s
Inference rules: p → q; r → p; s → ¬r
P1: q since p
Paul ∪ Olga does not justify q, but they could agree on q.
Olga is credulous: she concedes everything for which she cannot construct a (defensible or justified) counterargument.

32 Example 1
Knowledge bases: Paul: r; Olga: s
Inference rules: p → q; r → p; s → ¬r
P1: q since p
O1: concede p, q
Paul ∪ Olga does not justify q, but they could agree on q.

33 Example 1
Knowledge bases: Paul: r; Olga: s
Inference rules: p → q; r → p; s → ¬r
P1: q since p
Paul ∪ Olga does not justify q, but they could agree on q.
Olga is sceptical: she challenges everything for which she cannot construct a (defensible or justified) argument.

34 Example 1
Knowledge bases: Paul: r; Olga: s
Inference rules: p → q; r → p; s → ¬r
P1: q since p
O1: why p?
Paul ∪ Olga does not justify q, but they could agree on q.

35 Example 1
Knowledge bases: Paul: r; Olga: s
Inference rules: p → q; r → p; s → ¬r
P1: q since p
O1: why p?
P2: p since r
Paul ∪ Olga does not justify q, but they could agree on q.

36 Example 1
Knowledge bases: Paul: r; Olga: s
Inference rules: p → q; r → p; s → ¬r
P1: q since p
O1: why p?
P2: p since r
O2: ¬r since s
Paul ∪ Olga does not justify q, but they could agree on q.

37 Example 2
Knowledge bases: Paul: p, q; Olga: p, q → ¬p
Inference rules: modus ponens, …
P1: claim p
Paul ∪ Olga does not justify p, but they will agree on p if the players are conservative, that is, if they stick to their beliefs if possible.

38 Example 2
Knowledge bases: Paul: p, q; Olga: p, q → ¬p
Inference rules: modus ponens, …
P1: claim p
O1: concede p
Paul ∪ Olga does not justify p, but they will agree on p if the players are conservative, that is, if they stick to their beliefs if possible.

39 Example 2
Knowledge bases: Paul: p, q; Olga: p, q → ¬p
Inference rules: modus ponens, …
Possible solution (for open-minded agents, who are prepared to critically test their beliefs):
P1: claim p
O1: what about q?

40 Example 2
Knowledge bases: Paul: p, q; Olga: p, q → ¬p
Inference rules: modus ponens, …
Possible solution (for open-minded agents, who are prepared to critically test their beliefs):
P1: claim p
O1: what about q?
P2: claim q

41 Example 2
Knowledge bases: Paul: p, q; Olga: p, q → ¬p
Inference rules: modus ponens, …
Possible solution (for open-minded agents, who are prepared to critically test their beliefs):
P1: claim p
O1: what about q?
P2: claim q
O2: ¬p since q, q → ¬p
Problem: how to ensure relevance?

42 Dialogue game systems in more detail
- A dialogue purpose
- Participants (with roles)
- A topic language Lt (with a logic)
- A communication language Lc (with a protocol)
- Move legality rules
- Effect rules for Lc ("commitment rules")
- Turntaking rules
- Termination and outcome rules

43 Effect rules
Specify commitments:
- "Claim p" and "Concede p" commit to p
- "p since Q" commits to p and Q
- "Retract p" ends commitment to p
- ...
Commitments used for:
- determining the outcome
- enforcing 'dialogical consistency'
- ...
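
The effect rules above can be sketched as a pure update function on commitment stores; the dict-of-sets representation is my illustration, not a definition from the slides:

```python
# Commitment stores: dict mapping each participant to a set of sentences.
def update(commitments, speaker, act, content):
    c = {k: set(v) for k, v in commitments.items()}  # copy, don't mutate
    if act in ("claim", "concede"):
        c[speaker].add(content)          # "Claim p" / "Concede p" commit to p
    elif act == "since":                 # "p since Q": content = (p, Q)
        p, premises = content
        c[speaker] |= {p, *premises}     # commits to p and all of Q
    elif act == "retract":
        c[speaker].discard(content)      # "Retract p" ends commitment to p
    return c

c = {"P": set(), "O": set()}
c = update(c, "P", "since", ("p", {"q", "r"}))  # P committed to p, q, r
c = update(c, "P", "retract", "r")              # P committed to p, q
c = update(c, "O", "concede", "p")              # O committed to p
print(sorted(c["P"]), sorted(c["O"]))  # → ['p', 'q'] ['p']
```

Outcome rules can then simply inspect the final stores, e.g. agreement on p holds when p is in every participant's store.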

44 Public semantics for dialogue protocols
Public semantics: can protocol compliance be externally observed?
Commitments are a participant's publicly declared standpoints, so not the same as beliefs!
Only commitments and dialogical behaviour should count for move legality:
"Claim p is allowed only if you believe p" vs. "Claim p is allowed only if you are not committed to ¬p and have not challenged p"

45 More and less strict protocols
- Single- vs. multi-move: one or more moves per turn allowed
- Single- vs. multi-reply: one or more replies to the same move allowed
- Deterministic: no choice from the legal moves
- Deterministic in Lc: no choice from the speech act types
- Only reply to moves from the previous turn?

46 Two systems for persuasion dialogue
- Parsons, Wooldridge &amp; Amgoud, Journal of Logic and Computation 13 (2003)
- Prakken, Journal of Logic and Computation 15 (2005)

47 PWA: languages, logic, agents
- Lc: Claim p, Why p, Concede p, Claim S (where p ∈ Lt and S ⊆ Lt)
- Lt: propositional
- Logic: argumentation logic
  - Arguments: (S, p) such that S ⊆ Lt, S is consistent, and S propositionally implies p
  - Defeat: (S, p) defeats (S', p') iff ¬p ∈ S' and level(S) ≥ level(S')
  - Semantics: grounded
- Assumptions on agents:
  - they have a knowledge base KB ⊆ Lt
  - they have an assertion and an acceptance attitude
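
The defeat condition just stated is easy to mechanize. In this sketch, atoms are strings with `~` as my encoding of negation, and the concrete level values are hypothetical (PWA derive levels from a priority ordering on the knowledge base):

```python
def neg(p):
    """Syntactic negation over string atoms ('~' is my encoding)."""
    return p[1:] if p.startswith("~") else "~" + p

def defeats(arg1, arg2, level):
    """(S, p) defeats (S', p') iff neg(p) is in S' and level(S) >= level(S')."""
    (s, p), (s2, _) = arg1, arg2
    return neg(p) in s2 and level(s) >= level(s2)

a = ({"p", "p->q"}, "q")      # argument for q from premises p, p->q
b = ({"s", "s->~p"}, "~p")    # its conclusion negates a's premise p
levels = {frozenset(a[0]): 1, frozenset(b[0]): 2}  # hypothetical priorities
level = lambda s: levels[frozenset(s)]
print(defeats(b, a, level), defeats(a, b, level))  # → True False
```

Note the asymmetry: b defeats a by negating one of a's premises, while a's conclusion q negates nothing in b's support, so a does not defeat b.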

48 Assertion/acceptance attitudes
Relative to the speaker's own KB + the hearer's commitments:
- Confident/credulous agent: can assert/accept P iff she can construct an argument for P
- Careful/cautious agent: can assert/accept P iff she can construct an argument for P and no stronger argument for ¬P
- Thoughtful/skeptical agent: can assert/accept P iff she can construct a justified argument for P
If these checks are part of the protocol, then the protocol has no public semantics!
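
The three attitudes can be expressed as predicates over an argumentation engine. The `argument_for` / `stronger_counter` / `justified` callbacks here are my stand-ins for the checks an agent performs against its own KB plus the hearer's commitments:

```python
from types import SimpleNamespace

def can_assert(attitude, p, engine):
    if attitude == "confident":    # any argument for P suffices
        return engine.argument_for(p)
    if attitude == "careful":      # ... and no stronger counterargument
        return engine.argument_for(p) and not engine.stronger_counter(p)
    if attitude == "thoughtful":   # a justified argument is required
        return engine.justified(p)
    raise ValueError(attitude)

# Stub engine: an argument for p exists, but so does a stronger one
# against p, and p is not justified under grounded semantics.
eng = SimpleNamespace(
    argument_for=lambda p: True,
    stronger_counter=lambda p: True,
    justified=lambda p: False,
)
print([can_assert(a, "safe", eng)
       for a in ("confident", "careful", "thoughtful")])
# → [True, False, False]
```

The same predicates, applied to the hearer's side, give the acceptance attitudes (credulous/cautious/skeptical).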

49 PWA: protocol
1. W claims p;
2. B concedes if allowed by its attitude; if not, B claims ¬p if allowed by its attitude, or else challenges p;
3. If B claims ¬p, then go to 2 with the players' roles reversed and ¬p in place of p;
4. If B has challenged, then:
   a. W claims S, an argument for p;
   b. go to 2 for each s ∈ S in turn;
5. B concedes p if allowed by its attitude, or the dialogue terminates without agreement.
Also:
- no player repeats its own moves
- if the 'indicated' move cannot be made (i.e., would repeat a move), the dialogue terminates
Outcome: do the players agree at termination?
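
One pass through steps 1–5 can be sketched as below, simplified to a single claim with no recursion into the argument's premises and no role reversal; the agent interface (`can_concede` / `can_claim` / `argument_for` callbacks) is my own simplification of the attitude checks:

```python
def pwa_round(w, b, p):
    moves = [("W", "claim", p)]
    if b["can_concede"](p):                      # step 2: B agrees
        moves.append(("B", "concede", p))
        return moves, True
    if b["can_claim"]("~" + p):                  # step 3: counter-claim
        moves.append(("B", "claim", "~" + p))    # (roles would now reverse)
        return moves, False
    moves.append(("B", "why", p))                # step 2: challenge
    moves.append(("W", "claim", w["argument_for"](p)))  # step 4a: argument for p
    if b["can_concede"](p):                      # step 5
        moves.append(("B", "concede", p))
        return moves, True
    return moves, False                          # terminate without agreement

# A run where B's attitude lets it concede at once (as in dialogue (1) below):
w = {"argument_for": lambda p: {"airbag", "airbag->safe"}}
b = {"can_concede": lambda p: True, "can_claim": lambda p: False}
moves, agreed = pwa_round(w, b, "safe")
print(agreed, moves)  # → True [('W', 'claim', 'safe'), ('B', 'concede', 'safe')]
```

The full protocol recurses through 4b for every premise in S, which is what makes the breadth-first reading of 4a multi-move.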

50 The agents' KBs
P: airbag; airbag → safe
O: newspaper; newspaper → ¬safe

51 PWA: example dialogue (1)
P (thoughtful/skeptical): airbag; airbag → safe
O (careful/cautious): newspaper; newspaper → ¬safe
P1: claim safe

52 PWA: example dialogue (1)
P (thoughtful/skeptical): airbag; airbag → safe
O (careful/cautious): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)

53 PWA: example dialogue (1)
P (thoughtful/skeptical): airbag; airbag → safe
O (careful/cautious): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: concede safe

54 PWA: example dialogue (2)
P (thoughtful/skeptical): airbag; airbag → safe
O (thoughtful/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)

55 PWA: example dialogue (2)
P (thoughtful/skeptical): airbag; airbag → safe
O (thoughtful/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: why safe

56 PWA: example dialogue (2)
P (thoughtful/skeptical): airbag; airbag → safe
O (thoughtful/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: why safe
P2: claim {airbag, airbag → safe} (+ airbag, airbag → safe)

57 PWA: example dialogue (2)
P (thoughtful/skeptical): airbag; airbag → safe
O (thoughtful/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: why safe
P2: claim {airbag, airbag → safe} (+ airbag, airbag → safe)
O2: why airbag

58 PWA: example dialogue (2)
P (thoughtful/skeptical): airbag; airbag → safe
O (thoughtful/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: why safe
P2: claim {airbag, airbag → safe} (+ airbag, airbag → safe)
O2: why airbag
P3: claim {airbag}

59 PWA: example dialogue (2)
P (thoughtful/skeptical): airbag; airbag → safe
O (thoughtful/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: why safe
P2: claim {airbag, airbag → safe} (+ airbag, airbag → safe)
O2: why airbag
P3: claim {airbag}
O3: why airbag → safe

60 PWA: example dialogue (2)
P (thoughtful/skeptical): airbag; airbag → safe
O (thoughtful/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: why safe
P2: claim {airbag, airbag → safe} (+ airbag, airbag → safe)
O2: why airbag
P3: claim {airbag}
O3: why airbag → safe
P4: claim {airbag → safe}

61 PWA: example dialogue (3)
P (thoughtful/skeptical): airbag; airbag → safe
O (confident/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)

62 PWA: example dialogue (3)
P (thoughtful/skeptical): airbag; airbag → safe
O (confident/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: claim ¬safe (+ ¬safe)
P2: why ¬safe

63 PWA: example dialogue (3)
P (thoughtful/skeptical): airbag; airbag → safe
O (confident/skeptical): newspaper; newspaper → ¬safe
P1: claim safe (+ safe)
O1: claim ¬safe (+ ¬safe)
P2: why ¬safe
O2: claim {newspaper, newspaper → ¬safe} (+ newspaper, newspaper → ¬safe)
P3a: why newspaper
O3a: claim {newspaper}
P3b: why newspaper → ¬safe
O3b: claim {newspaper → ¬safe}

64 PWA: characteristics
Protocol:
- multi-move (if 4a is breadth-first)
- (almost) unique-reply
- deterministic in Lc
Dialogues:
- short (no stepwise construction of arguments, no alternative replies)
- only one side develops arguments
Logic is used by a single agent: to check attitudes and construct arguments
Commitments: used for attitudes and outcome

