
1 Models of Legal Argumentation Trevor Bench-Capon Department of Computer Science The University of Liverpool Liverpool UK

2 Overview Argument and Proof Arguments Based on Cases –HYPO –CATO Arguments Based on Rules Bodies of Arguments – Dung Argument Schemes – Toulmin Persuasion Using Purpose and Value

3 Argument and Proof Argument –John is old because he is aged 82 –Arguments persuade, not compel –Arguments leave things implicit. The hearer fills in the gaps and may be convinced Proof –John is aged 82 –John is a man –All men aged greater than 70 are old –82 > 70 –Therefore, John is old

4 Argument and Proof –Arguments may contain open textured concepts –The proof requires a threshold for old –The hearer needs only to accept that 82 is above the threshold Proof –John is aged 82 –John is a man –All men aged greater than 70 are old –82 > 70 –Therefore, John is old Argument –John is old because he is aged 82

5 Argument and Proof –Arguments may introduce new information –Speaker may assume John is man, but hearer knows he is a tortoise Proof –John is aged 82 –John is a man –All men aged greater than 70 are old –82 > 70 –Therefore, John is old Argument –John is old because he is aged 82

6 Legal Argument Legal Argument displays these typical characteristics of argument: –Unstated background and uncontested facts –Open texture and context dependent interpretation –New information and considerations

7 Arguments are Defeasible A sound proof has to be accepted But arguments are inherently and inescapably defeasible –They may be accepted, or challenged Different audiences may respond differently, accepting for different reasons, or offering different challenges –The challenge can be accepted and the argument withdrawn, or it can be rebutted –Thus arguments are embedded in a dialectic context – and the audience is important

8 Arguments Based on Cases (map slide showing systems and research sites)
–SMILE and IBP (Ashley and Bruninghaus) – Trade Secrets Law, outcome prediction – Pittsburgh
–GREBE (Branting) – Industrial Injuries, semantic-net based – Wyoming
–Other sites shown: Amherst, Rutgers

9 Arguments Based on Cases This was a focus of AI and Law from the beginning I will focus on –HYPO (Rissland and Ashley) –CATO (Ashley and Aleven) Both systems operate in US Trade Secrets Law

10 HYPO Main Features –Three-Ply Argument Structure –Use of Dimensions to Represent and Compare Cases

11 Three Ply Argument First Ply: –A case is cited by proponent Second Ply: –The citation is attacked by opponent By distinguishing the case With a counter example Third Ply: –The attack is rebutted by proponent Distinguishing the counter examples

12 Three Ply Argument Provides a simple but effective way of organising the argument Is clearly adversarial in nature Allows for both distinguishing and counter examples Can be considered as an argument scheme

13 Dimensions A dimension is a feature of the case which may need to be considered –In Trade Secret Law e.g. Security Measures Adopted Disclosures Subject to Restrictions Competitive Advantage Gained

14 Dimensions Are associated with –Preconditions –A range –Facts which determine the position within the range –A direction

15 Security Measures Adopted Precondition –The plaintiff adopted some security measures Range –From Minimal measures through some physical measures to nondisclosure agreements Facts –List of security measures adopted Direction –Stronger measures favour the plaintiff

16 Disclosures Subject to Restriction Precondition –Some disclosures were restricted Range –0-100% of disclosures restricted Facts –Percentage of disclosures restricted Direction –Plaintiff favoured by more restricted disclosures

17 Competitive Advantage Gained Precondition –Defendant saved development cost Range –$10,000 - $10,000,000 Facts –Plaintiff development time and cost –Defendant development time and cost Direction –Greater savings favour plaintiff
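A HYPO-style dimension can be pictured as a small data structure: a precondition, a way of locating the case on the range, and a direction. This is an illustrative sketch only, not HYPO's actual implementation; the class and field names are mine, and the ordinal encoding of Security Measures (0 minimal, 1 physical measures, 2 nondisclosure agreements) is a hypothetical simplification.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Dimension:
    """A HYPO-style dimension (illustrative sketch, not HYPO's own code)."""
    name: str
    precondition: Callable[[dict], bool]   # does the dimension apply to these facts?
    position: Callable[[dict], Any]        # where the case sits on the range
    favours_plaintiff_when_higher: bool    # direction

# Security Measures Adopted, encoded on an ordinal range:
# 0 = minimal measures, 1 = some physical measures, 2 = nondisclosure agreements
security_measures = Dimension(
    name="Security Measures Adopted",
    precondition=lambda facts: len(facts.get("measures", [])) > 0,
    position=lambda facts: max(facts["measures"]),
    favours_plaintiff_when_higher=True,
)

case_facts = {"measures": [1, 2]}   # hypothetical: physical measures plus NDAs
applies = security_measures.precondition(case_facts)
strength = security_measures.position(case_facts)
```

Comparing two cases on a dimension then reduces to comparing their positions, with the direction saying which side the stronger position favours.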

18 HYPO Trade Secret Example (π = plaintiff, ∆ = defendant)
CASE16 Yokana (∆): F7 Brought-Tools (π), F10 Secrets-Disclosed-Outsiders (∆), F16 Info-Reverse-Engineerable (∆)
CASE30 American Precision (π): F7 Brought-Tools (π), F16 Info-Reverse-Engineerable (∆), F21 Knew-Info-Confidential (π)
CASE Mason (?): F1 Disclosure-in-Negotiations (∆), F6 Security-Measures (π), F15 Unique-Product (π), F16 Info-Reverse-Engineerable (∆), F21 Knew-Info-Confidential (π)
(Diagram: lattice relating Mason to American Precision (π) and Yokana (∆) through the current fact situation, with factors F1, F6, F7, F9, F10, F15, F16, F18, F19, F21 marked for the side they favour)

19 Comparing Cases On the basis of similarities between past cases and the current fact situation, HYPO forms a case lattice

20 Case Lattice

21 Typically The first level will contain both plaintiff and defendant cases These are available to be cited –KFC, American Precision or Digital Development for plaintiff –Speciner, Carver or Speedry for defendant If no case is available at the first level, we would need to descend a level until we found a case supporting our side –If F1 absent, Midland Ross or Yokana for defendant
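The on-pointness ordering behind the case lattice can be sketched as set operations on factors: a precedent's relevant similarity is the set of factors it shares with the current fact situation (CFS), and one case is more on point than another when its shared set strictly contains the other's. The function names are mine; the factor sets are taken from the Mason example slide.

```python
def shared(cfs: set, case: set) -> frozenset:
    """Factors a precedent shares with the current fact situation."""
    return frozenset(cfs & case)

def more_on_point(cfs: set, case_a: set, case_b: set) -> bool:
    """case_a is more on point than case_b if its shared set strictly contains case_b's."""
    return shared(cfs, case_a) > shared(cfs, case_b)

# Factor sets from the HYPO trade secret example
mason = {"F1", "F6", "F15", "F16", "F21"}       # current fact situation
yokana = {"F7", "F10", "F16"}                   # decided for defendant
american_precision = {"F7", "F16", "F21"}       # decided for plaintiff

# American Precision shares {F16, F21} with Mason; Yokana shares only {F16}
ap_more = more_on_point(mason, american_precision, yokana)
```

Cases at the top level of the lattice are those whose shared set is not strictly contained in any other precedent's shared set.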

22 First Ply (for Plaintiff) Where disclosure in negotiations, security measures, knew information confidential and unique product, plaintiff should win. Digital Development Note that pro-defendant factors are included here

23 Distinguishing Either additional pro-defendant factor in current case Or additional pro-plaintiff factor in cited case Thus we may distinguish Mason from Digital Development since the product was reverse engineerable in Mason but not Digital Development. Note that Unique Product does not distinguish Mason from KFC – it makes Mason better
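Both kinds of distinction are set computations: pro-defendant factors present in the current case but absent from the cited pro-plaintiff precedent, and pro-plaintiff factors present in the precedent but absent from the current case. A minimal sketch (the function is illustrative; factor sets and polarities are from the Mason example, since the Digital Development factor set is not given on these slides):

```python
def distinctions(current: set, precedent: set,
                 pro_plaintiff: set, pro_defendant: set) -> set:
    """Factors a defendant can use to distinguish a pro-plaintiff precedent (sketch)."""
    extra_pro_d = (current - precedent) & pro_defendant   # current case is weaker
    extra_pro_p = (precedent - current) & pro_plaintiff   # cited precedent was stronger
    return extra_pro_d | extra_pro_p

PRO_P = {"F6", "F7", "F15", "F21"}
PRO_D = {"F1", "F10", "F16"}

mason = {"F1", "F6", "F15", "F16", "F21"}
american_precision = {"F7", "F16", "F21"}   # cited for the plaintiff

ds = distinctions(mason, american_precision, PRO_P, PRO_D)
```

Here F1 (Disclosure-in-Negotiations) makes Mason weaker than the precedent, and F7 (Brought-Tools) made the precedent stronger than Mason; note that an extra pro-plaintiff factor in the current case (like F15) is not a distinction for the defendant.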

24 Counter Example A case at the same level of the case lattice held for the other side E.g. Carver is CE to Digital Development Better a case with all the shared factors and more (“trumping CE”) E.g. American Precision if Midland Ross cited for the defendant

25 Third Ply - Rebuttal Distinguishes the counter examples E.g. Carver is distinguishable because security measures, knew information confidential and unique product in Mason, but not Carver

26 Argument, not a Decision Except for the trumping, more on point, counter example, we may choose which side should win We may reject the distinctions as unimportant We may follow the cited case or the counter example

27 CATO Also in US Trade Secrets Law Also uses the three-ply argument But –Uses factors, not dimensions –Organises factors into a hierarchy, allowing additional argument moves –Some additional rebutting moves

28 Factors No degree – factors either apply or do not apply The presence of a factor always favours either the plaintiff or the defendant –Security measures – plaintiff –No security measures – defendant –Outsider disclosures restricted – plaintiff –Competitive advantage - plaintiff

29 Factor Hierarchy (diagram; indentation reconstructed)
–Info Trade Secret (p)
 –Efforts to Maintain Secrecy (p)
  –Security Measures (p)
  –No Security Measures (d)
  –Waiver of Confidentiality (d)
 –Info Valuable (p)
  –Competitive Advantage (p)

30 Emphasising and Downplaying Distinctions Precedent: No Security Measures Case 1: Waiver of Confidentiality Case 2: Security Measures Case 1 and Case 2 can both be distinguished, because No Security Measures is absent from each Case 1 can downplay the distinction, because Waiver of Confidentiality gives an alternative argument against efforts to maintain secrecy Case 2 can emphasise the distinction, because there is now no argument against efforts to maintain secrecy

31 Argument Moves in CATO
–Analogising a case to a past case with a favourable outcome
–Distinguishing a case with an unfavourable outcome
–Downplaying the significance of a distinction
–Emphasising the significance of a distinction
–Citing a favourable case to emphasise strengths
–Citing a favourable case to argue that weaknesses are not fatal
–Citing a more on point counter example to a case cited by an opponent
–Citing an as on point counter example to a case cited by an opponent

32 Arguments Based on Cases Cases are compared according to common features –Features tend to be at a level of abstraction above facts (issues) Arguments mainly based on differences between cases And the significance of these differences

33 Arguments Based on Rules In Law, rules often conflict –The person named in a will should inherit –A murderer should not inherit Conflicting rules provide an argument for and an argument against How do we resolve such arguments?

34 Types of Conflict Rules may conflict in several ways: –Contradictory conclusions If P then Q, If R then not Q –Denial of premises If P then Q, if R then not P –Rule inapplicable If P then Q, if R then not (if P then Q)

35 Resolution Through General Principles Prefer most specific rule –Statutes are often written as general rule and exceptions Prefer most recent rule –A recent case is preferred to an old one Prefer most authoritative rule –Supreme court better than lower courts These principles can conflict No general ordering seems possible

36 Weighing Reasons We can see the antecedents as reasons for the conclusion Some reasons may be stronger than others We should prefer the stronger reasons to the weaker reasons

37 Explicit Rule Priorities We can simply state which of a pair of conflicting rules has priority over the other Note: such priorities may themselves be the subject of debate
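A minimal sketch of resolving a pair of contradictory rules through an explicit priority, using the will/murderer example from the earlier slide (rule names and the data layout are mine, for illustration):

```python
# Each rule: (name, premises, conclusion); a conclusion is (slot, truth value).
rules = [
    ("r1", {"named_in_will"}, ("inherit", True)),
    ("r2", {"murderer"}, ("inherit", False)),
]
priority = {("r2", "r1")}   # r2 has priority over r1 (e.g. as the more specific rule)

def conclude(facts: set) -> set:
    """Conclusions of applicable rules, dropping rules beaten by a prioritised conflicting rule."""
    applicable = [r for r in rules if r[1] <= facts]
    winners = [r for r in applicable
               if not any((s[0], r[0]) in priority          # s out-prioritises r
                          and s[2][0] == r[2][0]            # same conclusion slot
                          and s[2][1] != r[2][1]            # conflicting truth value
                          for s in applicable)]
    return {r[2] for r in winners}

result = conclude({"named_in_will", "murderer"})
```

With both rules applicable, only r2 survives, so the murderer named in the will does not inherit; as the slide notes, the priority statement itself could be the subject of a further debate.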

38 Dialogical Justification
–P → Q
–R → ¬Q
–S → ¬R
–T → ¬(S → ¬R)
–U → ¬T
(Diagram labels: Proponent wins, Opponent wins)

39 Reconstruction of HYPO Prakken and Sartor Cases are represented as 3 implications: (i) if pro-plaintiff factors then plaintiff (ii) if pro-defendant factors then defendant (iii) (i) < (ii) if defendant won, else (ii) < (i) –May be broadened by omitting factors –May be distinguished –Are deployed in a dialogue game

40 HYPO Trade Secret Example (π = plaintiff, ∆ = defendant)
CASE16 Yokana (∆): F7 Brought-Tools (π), F10 Secrets-Disclosed-Outsiders (∆), F16 Info-Reverse-Engineerable (∆)
CASE30 American Precision (π): F7 Brought-Tools (π), F16 Info-Reverse-Engineerable (∆), F21 Knew-Info-Confidential (π)
CASE Mason (?): F1 Disclosure-in-Negotiations (∆), F6 Security-Measures (π), F15 Unique-Product (π), F16 Info-Reverse-Engineerable (∆), F21 Knew-Info-Confidential (π)
(Diagram: lattice relating Mason to American Precision (π) and Yokana (∆) through the current fact situation, with factors F1, F6, F7, F9, F10, F15, F16, F18, F19, F21 marked for the side they favour)

41 Example Yokana gives 3 rules –R1: F7 → P –R2: F16 & F10 → D –R3: R2 > R1 American Precision gives 3 rules –R4: F7 & F21 → P –R5: F16 → D –R6: R4 > R5
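The Prakken-Sartor reading of a precedent is mechanical enough to sketch in code: from the pro-plaintiff factors, the pro-defendant factors, and the outcome, produce two conflicting rules and a priority favouring the winner's rule. The function and naming scheme are mine, for illustration.

```python
def rules_from_precedent(name: str, pro_p: set, pro_d: set, winner: str):
    """Prakken–Sartor style reading of a precedent (sketch).

    Returns the pro-plaintiff rule, the pro-defendant rule, and a priority
    pair (stronger_rule, weaker_rule) reflecting who actually won.
    """
    r_p = (f"{name}_p", frozenset(pro_p), "P")
    r_d = (f"{name}_d", frozenset(pro_d), "D")
    priority = (r_p[0], r_d[0]) if winner == "P" else (r_d[0], r_p[0])
    return r_p, r_d, priority

# Yokana (decided for the defendant): F7 pro-plaintiff; F16 and F10 pro-defendant
r1, r2, pr = rules_from_precedent("yokana", {"F7"}, {"F16", "F10"}, "D")
```

Broadening then corresponds to dropping factors from a rule's antecedent, and distinguishing to pointing at factors the rule omits; both become moves in the dialogue game.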

42 Rationales – Loui and Norman This records the progress of the dispute which may be important. –Consider a precedent which has F1 and F2 favouring plaintiff and F3 favouring the defendant and was won by plaintiff –Given a new case with only F1 it is unclear that the plaintiff should win –But suppose F2 was used to defeat F3: Now it can be seen that F1 can stand alone

43 Compare With the rationale recorded:
–R4: F1 → P
–R2: F3 → D
–R5: F2 → ¬(F3 → D)
–R6: R5 > R2
Now we can confidently apply R4
Without the rationale:
–R1: F1 & F2 → P
–R2: F3 → D
–R3: R1 > R2
Not clear that R4: F1 → P
We need a record of the dispute to decide which description is the right one

44 Argumentation Frameworks We can often view a legal dispute as a set of conflicting arguments P.M. Dung has developed an elegant way of looking at and reasoning about sets of conflicting arguments

45 Dung’s Argument Framework Introduced in AIJ 1995 Arguments at their most abstract –Only: which other arguments does an argument attack? Attacks always succeed –We cannot accept an argument and its attacker

46 Definitions An argumentation framework is a pair AF = ⟨AR, attacks⟩ –where AR is a set of arguments and attacks is a binary relation on AR, i.e. attacks ⊆ AR × AR. An argument A ∈ AR is acceptable with respect to a set of arguments S if: (∀x)((x ∈ AR ∧ attacks(x,A)) → (∃y)((y ∈ S) ∧ attacks(y,x))). A set S of arguments is conflict-free if ¬(∃x)(∃y)((x ∈ S) ∧ (y ∈ S) ∧ attacks(x,y)). A conflict-free set of arguments S is admissible if (∀x)((x ∈ S) → acceptable(x,S)).

47 Preferred Extension A set of arguments S in an argumentation framework AF is a preferred extension if it is a maximal (with respect to set inclusion) admissible subset of AR. Preferred Extensions are interesting because they represent maximal coherent positions, able to defend themselves against all attackers BUT: there may be multiple preferred extensions, and no way to choose between them
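For small frameworks these definitions can be checked directly by brute force over subsets of AR. This is a sketch for illustration: fine for a handful of arguments, exponential in general (the decision problems here are computationally hard).

```python
from itertools import combinations

def preferred_extensions(args, attacks):
    """All maximal admissible sets of a Dung framework (brute-force sketch)."""
    attacks = set(attacks)
    def conflict_free(s):
        return not any((x, y) in attacks for x in s for y in s)
    def acceptable(a, s):
        # every attacker of a is attacked by some member of s
        return all(any((y, x) in attacks for y in s)
                   for x in args if (x, a) in attacks)
    admissible = [frozenset(c)
                  for r in range(len(args) + 1)
                  for c in combinations(args, r)
                  if conflict_free(c) and all(acceptable(a, c) for a in c)]
    # keep only the maximal admissible sets
    return {s for s in admissible if not any(s < t for t in admissible)}

# the even cycle a -> b -> c -> d -> a from the slides
exts = preferred_extensions(["a", "b", "c", "d"],
                            [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")])
```

On the even four-cycle this yields the two preferred extensions {a, c} and {b, d}; on an odd three-cycle it yields only the empty set, matching the slides.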

48 Odd Cycle (diagram: a attacks b, b attacks c, c attacks a) We can't accept anything here Akin to paradoxes Preferred extension is the empty set

49 Even Cycle (diagram: a attacks b, b attacks c, c attacks d, d attacks a) We can accept either a and c, or b and d Akin to dilemmas Two preferred extensions: {a,c} and {b,d}

50 In general Every AF has a preferred extension –Which may be the empty set AFs do not have a unique preferred extension –Even cycles give rise to choices An argument may be in every preferred extension (sceptically acceptable) An argument may be in some preferred extensions (credulously acceptable) An argument may be in no preferred extension (indefensible)
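The three acceptance statuses follow directly once the preferred extensions are known; a small sketch (function name mine), using the cycle examples from the preceding slides:

```python
def status(arg, preferred_exts):
    """Classify an argument given all preferred extensions of its framework."""
    if all(arg in e for e in preferred_exts):
        return "sceptically acceptable"
    if any(arg in e for e in preferred_exts):
        return "credulously acceptable"
    return "indefensible"

# Odd three-cycle: the unique preferred extension is the empty set
odd = status("a", [frozenset()])
# Even four-cycle: preferred extensions {a, c} and {b, d}
even = status("a", [frozenset({"a", "c"}), frozenset({"b", "d"})])
```

So every argument in the odd cycle is indefensible, while every argument in the even cycle is credulously but not sceptically acceptable.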

51 Decision Problems and Complexity Proofs of these results can be found in a series of papers by Paul Dunne and myself.

52 Example Set of Cases Pierson: Plaintiff is hunting a fox on open land. Defendant kills the fox. Keeble: Plaintiff is a professional hunter. Lures ducks to his pond. Defendant scares the ducks away Young: Plaintiff is a professional fisherman. Spreads his nets. Defendant gets inside the nets and catches the fish.

53 Ghen vs Rich: Ghen harpooned a whale, lost it. Ellis found it, sold it to Rich, who processed it. Found for Ghen. – “the iron holds the whale” Whaling is governed by conventions which the court respects

54 Conti vs ASPCA Chester, a talking parrot used by ASPCA for educational purposes, escaped. Conti found it and kept it as a pet. ASPCA reclaimed it. Found for ASPCA –Chester was domesticated, and so ferae naturae did not apply

55 Burros Cases New Mexico vs Morton Kleppe vs New Mexico –Unbranded burros straying from state lands –Showed that: Branding established possession Presence on land had to be more than accidental straying

56 Representing Keeble A: Pursuer had a right to the animal B: Pursuer not in possession C: Owns the land (so owns the animals) D: Wild animals not confined E: Efforts made to secure animals F: Pursuer has right to pursue livelihood unmolested

57 Keeble as AF (diagram: attacks among arguments A–F) Preferred extension is {A,C,E,F} Two ways to win

58 Pierson as AF
{A, B, E} as in Keeble
I, M: Pursuit is not enough
J: Hypothetical: the animal was taken
K: Hypothetical: the animal was wounded
L: Hypothetical: certain control is enough
O: Reasonable prospect of capture
P: Reasonable prospect too uncertain
R: Reasonable prospect encourages desirable activity
G: Not relevant: interferer was trespassing
H: Not relevant: pursuer was trespassing
Q: The land was open
(Diagram annotations: the hypotheticals describe situations which would establish the right; O and R are labelled with values; Q excludes some past cases)

59 Pierson as AF (diagram over A, B, E, G, H, I, J, K, L, M (P), O (R), Q) Preferred extensions are {B,I,M,P,Q} and {A,E,O,Q,R} M (P) and O (R) form a two-cycle

60 Young as AF Arguments in Pierson are all relevant – but now L is applicable and P is not F from Keeble is present S: Defendant was in competition with the plaintiff T: The competition was unfair U: Not for the court to rule on what is unfair competition

61 Young as AF (trespass omitted; diagram adds F, S, T, U to the Pierson arguments) Preferred extension is {B,L,S,U} Argument U breaks the four-cycles

62 Ghen Versus Rich New argument V: –"The iron holds the whale" is a convention throughout the whaling industry Attacks U: establishes what is unfair competition in whaling Attacks B: establishes what counts as possession in whaling

63 Ghen as AF (trespass omitted; diagram adds V to the Young arguments) Preferred extension is {A,E,F,K,T,V}

64 Conti and Burros Cases Add some special cases –W: Domestication sufficient –X: Unbranded animals go to the owner of the land –Y: Branding sufficient –Z: Animals must live on the land: straying on to someone’s land does not affect title

65 Effect on AF (diagram: the new arguments W, X, Y and Z added to the Keeble framework of A–F)

66 Argumentation Framework for the Animals Cases (diagram over arguments A–Z) Analysis taken from Bench-Capon, Jurix 2002

67 Some cycles here (the same diagram, with its cycles highlighted)

68 Argument Schemes In Dung’s framework anything can count as an argument, and anything can count as an attack Argument schemes suggest a form that arguments should have Argument schemes prescribe what will count as an attack

69 Modus Ponens as Argument Scheme Form: –If Antecedent then Consequent and –Antecedent: therefore –Consequent Attacks: –Consequent is not the case –Antecedent is not true –Consequent does not follow from Antecedent Compare: The three kinds of conflict for rule systems

70 Witness Testimony Form: –Witness 1 says that A and –Witness 1 is in a position to have observed A; therefore –A Attacks: –Witness 2 says A is not the case –Witness 1 not in position to have observed A –Witness 1 is mistaken –Witness 1 is lying
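An argument scheme can itself be treated as data: a named form with premises, a conclusion, and the attacks it prescribes. A minimal sketch (the class and its fields are illustrative, not drawn from any particular system):

```python
from dataclasses import dataclass, field

@dataclass
class Scheme:
    """An argument scheme: its form and its characteristic attacks (sketch)."""
    name: str
    premises: list = field(default_factory=list)
    conclusion: str = ""
    attacks: list = field(default_factory=list)   # the admissible counters

witness_testimony = Scheme(
    name="Witness Testimony",
    premises=["Witness 1 says that A",
              "Witness 1 is in a position to have observed A"],
    conclusion="A",
    attacks=["Witness 2 says A is not the case",
             "Witness 1 not in position to have observed A",
             "Witness 1 is mistaken",
             "Witness 1 is lying"],
)
```

Representing schemes this way makes it easy for a dialogue system to check whether a proposed counter-move is one of the attacks the scheme licenses.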

71 Toulmin Argument Schema One general Argument Schema that has been much used in AI and Law derives from Stephen Toulmin. Introduces –Modal Qualification –Backing –Rebuttal

72 Toulmin's Argument Scheme (diagram: Data → Claim, qualified by a Modal, supported by a Warrant which rests on a Backing, and subject to a Rebuttal)

73 Toulmin Example (diagram)
Data: John is 82
Claim: John is old
Modal: Normally
Warrant: Over 80 is old
Backing: Demographic data
Rebuttal: John is a tortoise

74 Useful To identify different roles for premises: –Basic data –General Rules –Justification –Degree of support –Exceptions Used –in explanation, –to organise the presentation of an argument –as the basis of dialogue games

75 Systems Using Toulmin Toulmin’s schema (sometimes adapted) has been used by –Marshall (1989) – organisation of legal argument –Lutomski (1989) – presentation of expert testimony –Stoors (1991) – organisation of policy argument –Bench-Capon (1985) – explanation –Bench-Capon (1998) – dialogue game –Zeleznikow and Stranieri - explanation

76 Argument Schemes Potentially a very fruitful area of study –Especially particular schemes (such as witness testimony) As yet rather under-researched

77 Disagreement and Persuasion In the remainder of the talk I will look at some of my current work The starting point is: why do people disagree? And when they do, how do they persuade one another? I will look at –an extension to Dung's framework, –an argument scheme for persuasive argument

78 Perelman says: If men oppose each other concerning a decision to be taken, it is not because they commit some error of logic or calculation. They discuss apropos the applicable rule, the ends to be considered, the meaning to be given to values, the interpretation and characterisation of facts.

79 Taxation Debate Raise taxes to promote equality Lower taxes to promote enterprise Brown sees force in both arguments – but what Brown does depends on (reveals?) whether Brown prefers equality or enterprise at a given time

80 To allow for rational disagreement We must distinguish attack from defeat We can accept arguments which are attacked, AND their attackers, provided the attacks fail Dung’s framework is too abstract to allow such talk – we need to be able to discuss value as well as conflict

81 Value-based Argumentation Framework A value-based argumentation framework (VAF) is a 5-tuple: VAF = ⟨AR, attacks, V, val, P⟩ –⟨AR, attacks⟩ as for a standard AF –V is a set of values –val is a function mapping elements of AR to elements of V –P is the set of possible audiences

82 Following Perelman we want to use the notion of an audience Audiences will have different preferences between values We individuate audiences by their ordering on values There are as many audiences as there are value orderings

83 Audience Specific VAF An audience specific VAF (AVAF) is a 5-tuple: AVAF = ⟨AR, attacks, V, val, valpref_a⟩ –⟨AR, attacks, V, val⟩ as for a VAF –valpref_a is the value preference of audience a

84 Defeat in an AVAF An argument A ∈ AR defeats an argument B ∈ AR for audience a if and only if both attacks(A,B) and not valpref_a(val(B),val(A)). Note: –An argument is defeated by an attacker with the same value –Defeat is always relative to an audience –If there is only one value in V we have a standard argumentation framework

85 Definitions for an AVAF An argument A ∈ AR is acceptable to audience a with respect to a set of arguments S if: (∀x)((x ∈ AR ∧ defeats_a(x,A)) → (∃y)((y ∈ S) ∧ defeats_a(y,x))). A set S of arguments is conflict-free for audience a if (∀x)(∀y)((x ∈ S ∧ y ∈ S) → (¬attacks(x,y) ∨ valpref_a(val(y),val(x)))). A conflict-free set of arguments S is admissible for audience a if (∀x)((x ∈ S) → acceptable_a(x,S)).

86 Preferred Extension of an AVAF A set of arguments S in a value-based argumentation framework is a preferred extension for audience a if it is a maximal (with respect to set inclusion) set admissible for audience a.

87 Objective Acceptance An argument is objectively acceptable if it is in the preferred extension for every audience An argument is subjectively acceptable if it is in the preferred extension for some audience An argument is indefensible if it is in no preferred extension for any audience

88 Two-Valued Three Cycle (diagram: cycle a → b → c → a, with two values shown as colours) If blue > red, preferred extension is {a,b} If red > blue, preferred extension is {b,c} Note: b is in the preferred extension whatever the value order
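The two-valued three-cycle can be checked mechanically: for a given audience, delete every attack whose target's value the audience prefers, then compute preferred extensions of the resulting standard AF by brute force. A sketch (the colour assignment, b blue and a, c red, is my reconstruction of the diagram; function names are mine):

```python
from itertools import combinations

def preferred_for_audience(args, attacks, val, prefer):
    """Preferred extensions for one audience: drop failed attacks, then brute force."""
    # an attack succeeds (is a defeat) unless the audience prefers the target's value
    defeats = {(x, y) for (x, y) in attacks if not prefer(val[y], val[x])}
    def conflict_free(s):
        return not any((x, y) in defeats for x in s for y in s)
    def acceptable(a, s):
        return all(any((y, x) in defeats for y in s)
                   for x in args if (x, a) in defeats)
    admissible = [frozenset(c)
                  for r in range(len(args) + 1)
                  for c in combinations(args, r)
                  if conflict_free(c) and all(acceptable(a, c) for a in c)]
    return {s for s in admissible if not any(s < t for t in admissible)}

# cycle a -> b -> c -> a, with b carrying one value and a, c the other
attacks = [("a", "b"), ("b", "c"), ("c", "a")]
val = {"a": "red", "b": "blue", "c": "red"}
ext_blue = preferred_for_audience(["a", "b", "c"], attacks, val,
                                  lambda v, w: (v, w) == ("blue", "red"))
ext_red = preferred_for_audience(["a", "b", "c"], attacks, val,
                                 lambda v, w: (v, w) == ("red", "blue"))
```

The blue-preferring audience gets {a, b}, the red-preferring audience gets {b, c}, and b appears in both: b is objectively acceptable, a and c only subjectively, as the slide says. Note that same-value attacks always succeed, since no value is preferred to itself.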

89 Two-Valued Four Cycle, Connected Colours (diagram: cycle a → b → c → d → a) If blue > red, preferred extension is {a,c} If red > blue, preferred extension is {a,c} Preferred extension is unique, AND independent of value order

90 Some Technical Results on VAFs If there are no cycles with a single value, then there is a unique preferred extension –Efficient algorithm to find the preferred extension Cycles can give rise to objective acceptance –Odd cycles with more than one value –Some configurations of even cycles Possibilities to prune lines of argument with repeating values Heuristics to select attacks Note: what causes difficulties without values is a source of objective acceptance with them

91 Recall that there were cycles in our Animals Cases (diagram repeated from slide 66) How does considering values help?

92 Pierson (diagram; blue: need for clear law, orange: encourage useful activity)
A: Pierson had a right to the animal
B: Pierson had no possession
E: Pierson was in full pursuit
I: Pursuit not enough
O: Seizure not necessary (we want to encourage socially useful activity)
M: We must insist on possession for clear law
M and O form a two-cycle: resolved by value
So A is subjectively acceptable

93 Keeble I (diagram; green: protect property rights)
C: Owns the land (so owns the animals)
D: Wild animals not confined
X: Unbranded animals belong to the landowner
–Not needed: useless if blue is preferred to green, unnecessary otherwise

94 Keeble II (diagram; red: promote economic activity) F: Keeble was pursuing his livelihood

95 Young (diagram; purple: restrictive view of the role of the courts)
S: Defendant in competition
T: Competition was unfair
U: Not for the court to rule on what is unfair competition
U breaks the even cycle B–T–S–E–B
Without U, B is defeated by its position in the even cycle
Note: in the four-cycle B–T–S–E–B, T and E are objectively acceptable

96 Schema For Persuasive Argument To consider individual arguments we need to look inside the nodes to see what an argument looks like and how it can be attacked We have developed a general schema for persuasive argument in practical reasoning This schema can be applied to reasoning with legal cases

97 Form of Justification of An Action It was right to do action A In those circumstances R To bring about these new circumstances S Which realised this goal G Which promoted this value V

98 Schematically In circumstances R, action A brings about new circumstances S; the goal G is a subset of S; G promotes value V –The effect of an action depends on the situation –The goal is the subset of S that we wanted to bring about, the reason we did A –The value is the purpose for which we wanted to realise the goal We refer to a justification of this form as a position

99 Attacking A Position A position can be attacked in a variety of ways: –Denial of an element E.g. A will not produce S from R –Contradiction of an element E.g. G promotes W not V –Alternatives E.g. B will also produce S from R –Side Effects E.g. G demotes W as well as promoting V We have identified 15 possible attacks – some with variants

100 Law As Practical Reasoning We need to choose one of two actions –Decide for the plaintiff –Decide for the defendant Circumstances are the case facts + a record of the decision Goals are subsets of the facts + a record of the decisions Values are behaviours to encourage and discourage Note: we see the judgement as a choice of action Not the identification of a property of the case

101 To Illustrate (diagram)
–Undecided case: F1 F2 F3 … Fn, with Pwins/Dwins undecided
–Deciding for P produces the decided case: F1 F2 F3 … Fn, Pwins
–The goal is a selection from the decided case
–The value: encourages potential plaintiffs to realise Fn and not F2

102 In this Representation 7 of the 15 attacks are not possible 2 pairs of attacks are identical –Only two actions –Actions always achieve the same result –Goals straightforward consequences of decided case –Distinct actions realise distinct goals One attack has two distinct variants Thus we can look for seven distinct forms of attack

103 Attacks and Argument Moves Two challenge the representation –Factors used to represent a case –Values associated with factors Four are variants of distinguishing a case One seems to be in neither HYPO nor CATO: disagreement as to which value is promoted in this context

104 Four Types of Distinction Precedent stronger: can be downplayed Current case weaker: can be downplayed Precedent stronger: can be emphasised Current case weaker: can be emphasised A single move in HYPO Two moves in CATO

105 Counter Examples A different position based on another precedent justifying the other action –Can be attacked in the same ways as the original position A rebuttal of the choice of goal –Same factors as G, but different outcome –Can only be met by reformulating the goal –Like a trumping counter example in HYPO

106 Summary We have seen –How arguments differ from proof –Two systems for case based argument in AI and Law –Arguments based on conflicting rules –Reasoning about sets of arguments –Argument schemes –How notions of value and purpose can be used


