
1 Argumentation Henry Prakken SIKS Basic Course Learning and Reasoning May 26th, 2009

2 Why do agents need argumentation? For their internal reasoning: reasoning about beliefs, goals, intentions etc. is often defeasible. For their interaction with other agents: information exchange, negotiation, collaboration, …

3 Overview Inference (logic) Abstract argumentation Rule-based argumentation Dialogue

4 Part 1: Inference

5 We should lower taxes Lower taxes increase productivity Increased productivity is good

6 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad

7 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity USA lowered taxes but productivity decreased

8 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … USA lowered taxes but productivity decreased

9 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … Prof. P has political ambitions People with political ambitions are not objective Prof. P is not objective USA lowered taxes but productivity decreased

11 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … Prof. P has political ambitions People with political ambitions are not objective Prof. P is not objective Increased inequality is good Increased inequality stimulates competition Competition is good USA lowered taxes but productivity decreased

12 Sources of conflict Default generalisations Conflicting information sources Alternative explanations Conflicting goals, interests Conflicting normative, moral opinions …

13 Application areas Medical diagnosis and treatment Legal reasoning Interpretation Evidence / crime investigation Intelligence Decision making Policy design …

14 We should lower taxes Lower taxes increase productivity Increased productivity is good We should not lower taxes Lower taxes increase inequality Increased inequality is bad Lower taxes do not increase productivity Prof. P says that … Prof. P has political ambitions People with political ambitions are not objective Prof. P is not objective Increased inequality is good Increased inequality stimulates competition Competition is good USA lowered taxes but productivity decreased

15 (figure: the tax debate abstracted to a defeat graph with arguments A, B, C, D, E)

16 Status of arguments: abstract semantics (Dung 1995). INPUT: a pair ⟨Args, Defeat⟩. OUTPUT: an assignment of the status ‘in’ or ‘out’ to all members of Args. So: a semantics specifies conditions for labeling the ‘argument graph’. It should capture reinstatement. (figure: a three-argument defeat chain A, B, C in which the defeater of the defeated argument is itself defeated)

17 Possible labeling conditions. Every argument is either ‘in’ or ‘out’. 1. An argument is ‘in’ if all arguments defeating it are ‘out’. 2. An argument is ‘out’ if it is defeated by an argument that is ‘in’. Works fine with the three-argument defeat chain, but not with two arguments that defeat each other. (figures: the chain on A, B, C and the mutually defeating pair A, B)

18 Two solutions: change the conditions so that a unique status assignment always results, or use multiple status assignments. (figures: the two labelings of the mutually defeating pair A, B, and the chain on A, B, C)

19 Unique status assignments. Grounded semantics (Dung 1995): S0 = the empty set; Si+1 = Si plus all arguments defended by Si; and so on. (S defends A if all defeaters of A are defeated by a member of S.)
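A minimal sketch of this construction (my own code, not from the slides; the framework is a set of arguments and a set of defeat pairs):

```python
# Grounded extension by iteration: S0 = {}, S_{i+1} = S_i plus all
# arguments defended by S_i, until nothing changes.

def defeaters(a, defeat):
    """All arguments that defeat a."""
    return {x for (x, y) in defeat if y == a}

def defends(s, a, defeat):
    """S defends a if every defeater of a is defeated by some member of S."""
    return all(any((d, b) in defeat for d in s) for b in defeaters(a, defeat))

def grounded_extension(args, defeat):
    s = set()
    while True:
        new = s | {a for a in args if defends(s, a, defeat)}
        if new == s:
            return s
        s = new

# Reinstatement example: B defeats A, C defeats B; A is reinstated.
print(grounded_extension({"A", "B", "C"}, {("B", "A"), ("C", "B")}))  # {'A', 'C'}
```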

20 (figure: a defeat graph on arguments A, B, C, D, E illustrating the grounded construction) Is B, D or E defended by S1? Is B or E defended by S2?

21 A problem(?) with grounded semantics: what we have versus what we (perhaps) want. (figures: two labelings of a defeat graph on arguments A, B, C, D)

22 A problem(?) with grounded semantics. (figure: defeat graph on A, B, C, D) A = Frederic Michaud is French since he has a French name. B = Frederic Michaud is Dutch since he is a marathon skater. C = F.M. likes the EU since he is European (assuming he is not Dutch or French). D = F.M. does not like the EU since he looks like a person who does not like the EU.

23 A problem(?) with grounded semantics. (figure: defeat graph on A, B, C, D, E) A = Frederic Michaud is French since Alice says so. B = Frederic Michaud is Dutch since Bob says so. C = F.M. likes the EU since he is European (assuming he is not Dutch or French). D = F.M. does not like the EU since he looks like a person who does not like the EU. E = Alice and Bob are unreliable since they contradict each other.

24 Multiple labellings. (figures: the two labellings of the defeat graph on A, B, C, D)

25-29 Status assignments (1). Given ⟨Args, Defeat⟩: a status assignment is a partition of Args into sets In and Out such that: 1. An argument is in In if all arguments defeating it are in Out. 2. An argument is in Out if it is defeated by an argument that is in In. (figures: a series of examples on three-argument defeat graphs A, B, C)

30 Status assignments (2). Given ⟨Args, Defeat⟩: a status assignment is a partition of Args into sets In, Out and Undecided such that: 1. An argument is in In if all arguments defeating it are in Out. 2. An argument is in Out if it is defeated by an argument that is in In. A status assignment is stable if Undecided = ∅; then In is a stable extension. A status assignment is preferred if Undecided is ⊆-minimal; then In is a preferred extension. A status assignment is grounded if Undecided is ⊆-maximal; then In is the grounded extension.

31 Dung’s original definitions. Given ⟨Args, Defeat⟩, S ⊆ Args and A ∈ Args: S is conflict-free if no member of S defeats a member of S. S defends A if all defeaters of A are defeated by a member of S. S is admissible if it is conflict-free and defends all its members. S is a preferred extension if it is ⊆-maximally admissible. S is a stable extension if it is conflict-free and defeats all arguments outside it. S is the grounded extension if S is the ⊆-smallest set such that A ∈ S iff S defends A.
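A brute-force sketch of these definitions (my own illustration; it enumerates all subsets, so it only suits small graphs):

```python
from itertools import chain, combinations

def subsets(args):
    args = list(args)
    return [set(c) for c in chain.from_iterable(
        combinations(args, r) for r in range(len(args) + 1))]

def conflict_free(s, defeat):
    return not any((a, b) in defeat for a in s for b in s)

def defends(s, a, defeat):
    return all(any((c, b) in defeat for c in s)
               for (b, target) in defeat if target == a)

def admissible(s, defeat):
    return conflict_free(s, defeat) and all(defends(s, a, defeat) for a in s)

def preferred_extensions(args, defeat):
    adm = [s for s in subsets(args) if admissible(s, defeat)]
    return [s for s in adm if not any(s < t for t in adm)]  # maximal admissible

def stable_extensions(args, defeat):
    return [s for s in subsets(args)
            if conflict_free(s, defeat)
            and all(any((a, b) in defeat for a in s) for b in set(args) - s)]

# Mutual defeat between A and B: two preferred (and stable) extensions.
args, defeat = {"A", "B"}, {("A", "B"), ("B", "A")}
print(preferred_extensions(args, defeat))  # [{'A'}, {'B'}]
print(stable_extensions(args, defeat))     # [{'A'}, {'B'}]
```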

32-35 (figure: a defeat graph on arguments A, B, C, D, E) S defends A if all defeaters of A are defeated by a member of S. S is admissible if it is conflict-free and defends all its members. Which sets of arguments are admissible?

36-38 (figure: the same defeat graph on A, B, C, D, E) S is preferred if it is maximally admissible. Which sets are preferred extensions?

39-40 (figure: the same defeat graph on A, B, C, D, E) S is grounded if it is the smallest set such that A ∈ S iff S defends A. Which set is the grounded extension?

41 Properties The grounded extension is unique Every stable extension is preferred (but not v.v.) There exists at least one preferred extension The grounded extension is a subset of all preferred and stable extensions …

42 The ‘ultimate’ status of arguments (and conclusions). With grounded semantics: A is justified if A ∈ g.e.; A is overruled if A ∉ g.e. and A is defeated by the g.e.; A is defensible otherwise. With preferred semantics: A is justified if A ∈ p.e. for all p.e.; A is defensible if A ∈ p.e. for some but not all p.e.; A is overruled otherwise (?). In all semantics: φ is justified if φ is the conclusion of some justified argument; φ is defensible if φ is not justified and φ is the conclusion of some defensible argument.
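A small illustration of these definitions (my own naming; it assumes grounded_extension and preferred_extensions from the earlier sketches are in scope):

```python
def status_grounded(a, args, defeat):
    ge = grounded_extension(args, defeat)
    if a in ge:
        return "justified"
    if any((b, a) in defeat for b in ge):
        return "overruled"      # not in g.e. and defeated by the g.e.
    return "defensible"

def status_preferred(a, args, defeat):
    pes = preferred_extensions(args, defeat)
    if all(a in pe for pe in pes):
        return "justified"      # in every preferred extension
    if any(a in pe for pe in pes):
        return "defensible"     # in some but not all preferred extensions
    return "overruled"
```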

43 The status of arguments: proof theory Argument games between proponent and opponent: Proponent starts with an argument Then each party replies with a suitable counterargument Possibly backtracking A winning criterion E.g. the other player cannot move An argument is (dialectically) provable iff proponent has a winning strategy in a game for it.

44 The G-game for grounded semantics: A sound and complete game: Each move replies to previous move Proponent does not repeat moves Proponent moves strict defeaters, opponent moves defeaters A player wins iff the other player cannot move Result: A is in the grounded extension iff proponent has a winning strategy in a game about A.
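A rough sketch of such a test (my own code, following the stated rules; ‘strict defeat’ is read here as defeat without defeat back, and proponent is barred from repeating its own arguments within a line of play):

```python
def strictly_defeats(x, y, defeat):
    """x defeats y, and y does not defeat x back."""
    return (x, y) in defeat and (y, x) not in defeat

def proponent_wins(arg, args, defeat, used=frozenset()):
    """Does proponent have a winning strategy for arg in the G-game?"""
    used = used | {arg}
    for o in args:
        if (o, arg) in defeat:                        # a possible opponent reply
            answers = [p for p in args
                       if strictly_defeats(p, o, defeat) and p not in used]
            if not any(proponent_wins(p, args, defeat, used) for p in answers):
                return False                          # some opponent reply is unanswerable
    return True

# Reinstatement chain: B defeats A, C defeats B.
args, defeat = {"A", "B", "C"}, {("B", "A"), ("C", "B")}
print(proponent_wins("A", args, defeat))   # True: A is in the grounded extension
print(proponent_wins("B", args, defeat))   # False
```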

45-52 A game tree (figure, built up move by move, for a defeat graph on arguments A, B, C, D, E, F). Proponent starts with P: A. Opponent can reply with O: F, which proponent answers with P: E; or with O: B, which proponent answers with P: C (countered by O: D) or, backtracking, with P: E.

53 The structure of arguments: current accounts. Assumption-based approaches (Dung-Kowalski-Toni, Besnard & Hunter, …): K = theory; A = assumptions, with a conflict relation on A; R = inference rules (strict). An argument for p is a set A’ ⊆ A such that A’ ∪ K ⊢R p. Arguments attack each other on their assumptions. Rule-based approaches (Pollock, Vreeswijk, DeLP, Prakken & Sartor, Defeasible Logic, …): K = theory; R = inference rules (strict and defeasible). K yields an argument for p if K ⊢R p. Arguments attack each other on applications of defeasible inference rules.

54 ASPIC system: overview. Argument structure based on Vreeswijk (1997): roughly trees whose nodes are wffs of a logical language L closed under negation, and whose links are applications of inference rules, either strict (φ1, …, φn → φ) or defeasible (φ1, …, φn ⇒ φ). Reasoning starts from a knowledge base K ⊆ L. Defeat based on Pollock. Argument acceptability based on Dung (1995).

55 ASPIC system: structure of arguments. An argument A is: φ if φ ∈ K, with Conc(A) = φ and Sub(A) = {A}; or A1, …, An → φ if there is a strict inference rule Conc(A1), …, Conc(An) → φ, with Conc(A) = φ and Sub(A) = Sub(A1) ∪ … ∪ Sub(An) ∪ {A}; or A1, …, An ⇒ φ if there is a defeasible inference rule Conc(A1), …, Conc(An) ⇒ φ, with Conc(A) = φ and Sub(A) = Sub(A1) ∪ … ∪ Sub(An) ∪ {A}. A is strict if all members of Sub(A) apply strict rules; else A is defeasible.
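One possible encoding of this definition (an illustrative sketch of my own, not the official ASPIC implementation):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    antecedents: tuple    # formulas
    consequent: str
    strict: bool          # True for '->', False for '=>'

@dataclass(frozen=True)
class Argument:
    conc: str             # Conc(A)
    subs: tuple = ()      # immediate subarguments A1, ..., An
    rule: Rule = None     # None if A is just a premise from K

    def sub(self):
        """Sub(A): A together with the subarguments of its subarguments."""
        out = {self}
        for a in self.subs:
            out |= a.sub()
        return out

    def is_strict(self):
        """A is strict if no defeasible rule is applied anywhere in Sub(A)."""
        return all(a.rule is None or a.rule.strict for a in self.sub())

# The example on the next slide: Q1, R1, R2 in K; R1, R2 => Q2; Q1, Q2 -> P
q1, r1, r2 = Argument("Q1"), Argument("R1"), Argument("R2")
q2 = Argument("Q2", (r1, r2), Rule(("R1", "R2"), "Q2", strict=False))
p  = Argument("P", (q1, q2), Rule(("Q1", "Q2"), "P", strict=True))
print(p.is_strict())   # False: a defeasible rule is applied in a subargument
```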

56 (figure: an argument tree with premises Q1, R1, R2, intermediate conclusion Q2 and final conclusion P) R1, R2 ⇒ Q2; Q1, Q2 → P; Q1, R1, R2 ∈ K.

57 Domain-specific vs. general inference rules. With domain-specific rules: R1: Bird ⇒ Flies; R2: Penguin ⇒ Bird; Penguin ∈ K. With general inference rules: R1: φ, φ ⊃ ψ ⇒ ψ; strict rules: all deductively valid inference rules; Bird ⊃ Flies ∈ K; Penguin ⊃ Bird ∈ K; Penguin ∈ K. (figures: the two corresponding argument trees for Flies)

58 ASPIC system: attack and defeat. ≥ is a preference ordering between arguments such that if A is strict and B is defeasible then A > B. A rebuts B if Conc(A) = ¬Conc(B’) for some B’ ∈ Sub(B), B’ applies a defeasible rule, and not B’ > A. A undercuts B if Conc(A) = ¬B’ for some B’ ∈ Sub(B) and B’ applies a defeasible rule (naming convention implicit). A defeats B if A rebuts or undercuts B.
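Continuing the earlier sketch (again my own illustration; the string-based negation, the rule-naming function and the preference test are simplifying assumptions, not part of the slide):

```python
def neg(f):
    """Crude syntactic negation on string formulas."""
    return f[1:] if f.startswith("~") else "~" + f

def rebuts(a, b, preferred):
    """A rebuts B on a defeasible subargument B' with opposite conclusion,
    unless B' is strictly preferred to A."""
    return any(a.conc == neg(bp.conc) and not preferred(bp, a)
               for bp in b.sub()
               if bp.rule is not None and not bp.rule.strict)

def undercuts(a, b, rule_name):
    """A undercuts B by concluding the negation of the name of a
    defeasible rule applied in B (the implicit naming convention)."""
    return any(a.conc == neg(rule_name(bp.rule))
               for bp in b.sub()
               if bp.rule is not None and not bp.rule.strict)

def defeats(a, b, preferred, rule_name):
    return rebuts(a, b, preferred) or undercuts(a, b, rule_name)

# Example: an argument for ~Q2 rebuts the argument p from the earlier sketch.
not_q2 = Argument("~Q2")
no_pref = lambda x, y: False          # no argument strictly preferred to another
print(rebuts(not_q2, p, no_pref))     # True
```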

59 (figure: the argument for P from the earlier example attacked on its Q2 subargument by a counterargument for ¬Q2, with further arguments built from V1, V2, V3, S2, T1 and T2)

60 Argument acceptability Dung-style semantics and proof theory directly apply!

61 Additional properties (cf. Caminada & Amgoud 2007). Let E be any stable, preferred or grounded extension: 1. If B ∈ Sub(A) and A ∈ E then B ∈ E. 2. If the strict rules RS are closed under contraposition, then {φ | φ = Conc(A) for some A ∈ E} is closed under RS, and consistent if K is consistent.

62 Argument schemes Many arguments (and attacks) follow patterns. Much work in argumentation theory (Perelman, Toulmin, Walton,...) Argument schemes Critical questions Recent applications in AI (& Law)

63 Argument schemes: general form. Premise 1, …, Premise n; therefore (presumably), conclusion. But schemes also come with critical questions; negative answers are counterarguments.
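One way (my own, not from the slides) to represent a scheme together with its critical questions as data; the instance shown is the expert testimony scheme from the next slide:

```python
from dataclasses import dataclass

@dataclass
class Scheme:
    name: str
    premises: list            # premise patterns
    conclusion: str           # presumptive conclusion
    critical_questions: list  # negative answers yield counterarguments

expert_testimony = Scheme(
    name="Expert testimony (Walton 1996)",
    premises=["E is expert on D", "E says that P", "P is within D"],
    conclusion="Therefore (presumably), P is the case",
    critical_questions=[
        "Is E biased?",
        "Is P consistent with what other experts say?",
        "Is P consistent with known evidence?",
    ])
```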

64 Expert testimony (Walton 1996). Scheme: E is expert on D; E says that P; P is within D; therefore (presumably), P is the case. Critical questions: Is E biased? Is P consistent with what other experts say? Is P consistent with known evidence?

65 Witness testimony. Scheme: Witness W says P; therefore (presumably), P. Critical questions: Is W sincere? (veracity) Was P evidenced by W’s senses? (objectivity) Did P occur? (observational sensitivity)

66 Perception. Scheme: P is observed; therefore (presumably), P. Critical questions: Are the circumstances such that reliable observation of P is impossible? …

67 Memory. Scheme: P is recalled; therefore (presumably), P. Critical questions: Was P originally based on beliefs of which one is false? …

68-70 ‘Unpacking’ the witness testimony scheme. Witness W says “I remember I saw P”; therefore (presumably), W remembers he saw P (witness testimony); therefore (presumably), W saw P (memory); therefore (presumably), P (perception). Critical questions: Is W sincere? (veracity) Was P evidenced by W’s senses? (objectivity) Did P occur? (observational sensitivity)

71 Applying commonsense generalisations. Scheme: P; if P then usually Q; therefore (presumably), Q. Example: Flees; if flees then usually consciousness of guilt (people who flee from a crime scene usually have consciousness of guilt); therefore (presumably), consciousness of guilt. Critical questions: are there exceptions to the generalisation? Exceptional classes of people may have other reasons to flee: illegal immigrants, customers of prostitutes, …

72 Arguments from consequences. Scheme: Action A brings about G; G is good (bad); therefore (presumably), A should (not) be done. Critical questions: Does A also have bad (good) consequences? Are there other ways to bring about G? …

73 Other work on argument-based inference Reasoning about priorities and defeat Abstract support relations between arguments Gradual defeat Other semantics Dialectical proof theories Combining modes of reasoning...

74 Part 2: Dialogue

75 ‘Argument’ is ambiguous. As inferential structure: single agents, (nonmonotonic) logic, a fixed information state. As a form of dialogue: multiple agents, dialogue theory, a changing information state.

76-78 Example
P: Tell me all you know about recent trading in explosive materials (request)
O: No I won’t (reject)
P: Why don’t you want to tell me?
O: Since I am not allowed to tell you
P: Why aren’t you allowed to tell me?
O: Since sharing such information could endanger an investigation
P: You may be right in general (concede), but in this case there is an exception since this is a matter of national importance
O: Why is this a matter of national importance?
P: Since we have heard about a possible terrorist attack
O: I concede that there is an exception, so I retract that I am not allowed to tell you. I will tell you on the condition that you don’t exchange the information with other police officers (offer)
P: OK, I agree (offer accepted).

79 Types of dialogues (Walton & Krabbe). For each dialogue type, its dialogue goal and initial situation:
Persuasion: resolution of conflict; conflict of opinion.
Negotiation: making a deal; conflict of interest.
Deliberation: reaching a decision; need for action.
Information seeking: exchange of information; personal ignorance.
Inquiry: growth of knowledge; general ignorance.

80 Dialogue systems (according to Carlson 1983) Dialogue systems define the conditions under which an utterance is appropriate An utterance is appropriate if it promotes the goal of the dialogue in which it is made Appropriateness defined not at speech act level but at dialogue level Dialogue game approach Protocol should promote the goal of the dialogue

81 Formal dialogue systems Topic language With a logic (possibly nonmonotonic) Communication language Locution + content (from topic language) With a protocol: rules for when utterances may be made Should promote the goal of the dialogue Effect rules (e.g. on agent’s commitments) Termination and outcome rules

82 Negotiation Dialogue goal: making a deal Participants’ goals: maximise individual gain Typical communication language: Request p, Offer p, Accept p, Reject p, …

83 Persuasion Participants: proponent (P) and opponent (O) of a dialogue topic T Dialogue goal: resolve the conflict of opinion on T Participants’ goals: P wants O to accept T O wants P to give up T Typical speech acts: Claim p, Concede p, Why p, p since S, Retract p, Deny p … Goal of argument games: Verify logical status of argument or proposition relative to given theory

84 Standards for dialogue systems Argument games: soundness and completeness wrt some logical semantics Dialogue systems: Effectiveness wrt dialogue goal Efficiency, relevance, termination,... Fairness wrt participants’ goals Can everything relevant be said?,...

85 Some standards for persuasion systems Correspondence With participants’ beliefs If union of beliefs implies p, can/will agreement on p result? If parties agree that p, does the union of their beliefs imply p?... With ‘dialogue theory’ If union of commitments implies p, can/will agreement on p result?...

86 A communication language (Dijkstra et al. 2007). For each speech act, its attacking replies and surrendering replies:
request(φ): attacks: offer(φ’), reject(φ); surrenders: none.
offer(φ): attacks: offer(φ’) (φ ≠ φ’), reject(φ); surrenders: accept(φ).
reject(φ): attacks: offer(φ’) (φ ≠ φ’), why-reject(φ); surrenders: none.
accept(φ): no replies.
why-reject(φ): attacks: claim(φ’); surrenders: none.
claim(φ): attacks: why(φ); surrenders: concede(φ).
why(φ): attacks: φ since S (an argument); surrenders: retract(φ).
φ since S: attacks: why(ψ) (ψ ∈ S), φ’ since S’ (a defeater); surrenders: concede(φ), concede(ψ) (ψ ∈ S).
concede(φ): no replies.
retract(φ): no replies.
deny(φ): no replies.
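A hypothetical encoding of the reply table as data (mine, not Dijkstra et al.’s; the content conditions such as φ ≠ φ’ and ψ ∈ S are not captured):

```python
# For each speech act: which speech acts count as attacking replies and
# which as surrendering replies.
REPLIES = {
    "request":    {"attack": ["offer", "reject"],     "surrender": []},
    "offer":      {"attack": ["offer", "reject"],     "surrender": ["accept"]},
    "reject":     {"attack": ["offer", "why-reject"], "surrender": []},
    "accept":     {"attack": [],                      "surrender": []},
    "why-reject": {"attack": ["claim"],               "surrender": []},
    "claim":      {"attack": ["why"],                 "surrender": ["concede"]},
    "why":        {"attack": ["since"],               "surrender": ["retract"]},
    "since":      {"attack": ["why", "since"],        "surrender": ["concede"]},
    "concede":    {"attack": [],                      "surrender": []},
    "retract":    {"attack": [],                      "surrender": []},
    "deny":       {"attack": [],                      "surrender": []},
}
```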

87 A protocol (Dijkstra et al. 2007). Start with a request. Reply to a previous move of the other agent. Pick your replies from the table. Finish persuasion before resuming negotiation. Turntaking: in negotiation, after each move; in persuasion, various rules are possible. Termination: in negotiation, when an offer is accepted or someone withdraws; in persuasion, when the main claim is retracted or conceded.
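A minimal legality check along these lines (my own sketch using the REPLIES table above; it covers only the reply structure, not turn-taking, persuasion embedding or termination):

```python
def legal(move, dialogue):
    """move = (speaker, act, target); dialogue = list of earlier moves.
    target is the index in dialogue of the move being replied to."""
    speaker, act, target = move
    if not dialogue:
        return act == "request" and target is None    # start with a request
    replied_to = dialogue[target]
    if replied_to[0] == speaker:
        return False                                  # reply to the other agent's move
    table = REPLIES[replied_to[1]]
    return act in table["attack"] + table["surrender"]

# Example: O rejects P's opening request, then P asks why-reject.
d = [("P", "request", None)]
print(legal(("O", "reject", 0), d))       # True
d.append(("O", "reject", 0))
print(legal(("P", "why-reject", 1), d))   # True
print(legal(("P", "accept", 1), d))       # False: accept does not reply to reject
```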

88 Example dialogue formalised.
P: Request to tell.
O: Reject to tell.
P: Why reject to tell?
[embedded persuasion …]
O: Offer to tell if no further exchange.
P: Accept to tell if no further exchange.

89-91 Persuasion part formalised.
O: Claim Not allowed to tell.
P: Why not allowed to tell?
O: Not allowed to tell since Telling endangers investigation & What endangers an investigation is not allowed (R1).
P: Concede What endangers an investigation is not allowed.
P: Exception to R1 since National importance & National importance ⇒ Exception to R1.
O: Why National importance?
P: National importance since Terrorist threat & Terrorist threat ⇒ National importance.
O: Concede Exception to R1.
O: Retract Not allowed to tell.

92 Theory building in dialogue In my 2005 approach to (persuasion) dialogue: Agents build a joint theory during the dialogue A dialectical graph Moves are operations on the joint theory

93-101 (figures: the joint dialectical graph built up move by move) O claims ‘Not allowed to tell’; P asks why; O supports it with ‘Telling endangers investigation’ and R1: ‘What endangers an investigation is not allowed’; P concedes R1; P counters with ‘Exception to R1’, supported by ‘National importance’ and R2: ‘national importance ⇒ not R1’; O asks why ‘National importance’; P supports it with ‘Terrorist threat’ and ‘Terrorist threat ⇒ national importance’; O concedes; finally O retracts ‘Not allowed to tell’.

102 Research issues Investigation of protocol properties Mathematical proof or experimentation Combinations of dialogue types Deliberation! Multi-party dialogues Dialogical agent behaviour (strategies)...

103 Further information http://people.cs.uu.nl/henry/siks/siks09.html

