
1 Representing Epistemic Uncertainty by means of Dialectical Argumentation
Peter McBurney and Simon Parsons
Agent Applications, Research and Technology (Agent ART) Group
Department of Computer Science, University of Liverpool, Liverpool UK
{p.j.mcburney,s.d.parsons}@csc.liv.ac.uk
Presentation to: Department of Computer Science, University of Liverpool, 6 February 2001

2 Nature of the problem
- Problem: assessing the health risks of new chemicals and technologies.
- Classical decision theory methods require:
  - Explicit delineation of all outcomes
  - Quantification of uncertainties and consequences.
- But for most domains:
  - Scientific knowledge is often limited (especially at the outset)
  - Experimental evidence is ambiguous and conflicting
  - There is no agreement on quantification.

3 Types of evidence for chemical carcinogenicity
- Chemical structure comparison
- Mutagenic tests on tissue cultures
- Animal bioassays
- Human epidemiological studies
- Explication of biomedical causal pathways.
- These different sources of evidence may conflict, e.g. formaldehyde.

4 Risk Assessment for chemical X
- Are there adverse health effects from exposure to chemical X?
- What is the likelihood and size of impact?
- What should be done about chemical X?

5 Argumentation to represent uncertainty
- Two meanings of argument:
  - A case for a claim (a tentative proof)
  - A debate between people about a claim.
- Our degree of certainty in a claim depends on the cases for and against it:
  - The more and stronger the cases against, the less the certainty.
  - A consensus in favour of a claim indicates the greatest certainty.
- We can therefore represent uncertainty by means of dialectical argumentation.
- We also require a mechanism for generating inferences from the dialectical status of a claim.

6 Philosophical underpinning
- We have adopted an explicit philosophy of science:
  - Pera's (1994) model of science as a 3-person game:
    - The Experimenter + Nature + The Scientific Community.
  - Feyerabend's (1971) philosophy of science as epistemological anarchism:
    - There are no absolute standards which distinguish science from non-science
    - Standards differ by time, by discipline and by context.
- We see two principles as necessary for an activity to be called science:
  - All claims are contestable by anyone (in the community)
  - All claims are defeasible, with reasoning always to the best explanation.

7 Pera's Philosophy of Science
[Diagram: the Experimenter proposes and undertakes an experiment; Nature responds to the experiment; the Scientific Community interprets the results of the experiment.]

8 To model these, we need:
- A theory of rational discourse between reasonable, consenting participants:
  - Hitchcock's (1991) principles of rational mutual inquiry
  - The discourse ethics of Habermas and Alexy (1978).
- A model for an argument:
  - Toulmin's (1958) argument schema.
- A means to formalize complex dialogues:
  - Walton and Krabbe's (1995) characterization of different types of dialogues
  - The formal dialogue-games of Hamblin (1970, 1971) and MacKenzie (1979, 1990).

9 Hitchcock's Principles
- 18 principles of rational mutual discourse, for example:
  - Dialectification: the content and methods of dialogue should be decided by the participants.
  - Mutuality: no statement becomes a commitment of a participant unless he or she specifically accepts it.
  - Orderliness: one issue is raised and discussed at a time.
  - Logical pluralism: both deductive and non-deductive inference is permitted.
  - Rule-consistency: there should be no situation where the rules prohibit all acts, including the null act.
  - Realism: the rules must make agreement between participants possible.
  - Retraceability: participants must be free at all times to supplement, change or withdraw previous tentative commitments.
  - Role reversibility: the rules should permit the responsibility for initiating suggestions to shift between participants.

10 Alexy's Discourse Rules
- Rules for discourse over moral and ethical questions, for example:
  - Freedom of assembly
  - Common language
  - Freedom of speech
  - Freedom to challenge claims
  - Arguments required for claims
  - Freedom to challenge arguments
  - Freedom to disagree over modalities
  - Requirement for clarification and precization
  - Proportionate defence
  - No self-contradictions permitted.

11 Toulmin's Argument Schema
[Diagram, illustrated for chemical X:]
- Data: X is a chemical of type T
- Warrant: most other type T chemicals are carcinogenic to humans
- Backing: epidemiological evidence for the others
- Modality: probably
- Claim: X is carcinogenic to humans
- Rebuttal: X is not carcinogenic to rats
- Undercut (Pollock): the epidemiological evidence is not unambiguous

12 Walton and Krabbe's Typology of Dialogues
- Information-seeking dialogues: one participant seeks the answer to a question.
- Inquiries: all participants collaborate to find the answer to a question.
- Persuasions: one participant seeks to persuade other(s) of the truth of a proposition.
- Negotiations: participants seek to divide a scarce resource.
- Deliberations: participants collaborate to decide a course of action in some situation.
- Eristic dialogues: participants quarrel verbally as a substitute for physical fighting.

13 Risk Assessment for chemical X
- Scientific dialogues:
  - Are there adverse health effects from exposure to chemical X?
  - What is the likelihood and size of impact?
- Regulatory dialogue:
  - What should be done about chemical X?

14 Risk Assessment Dialogues
- Scientific dialogues: does exposure (in a certain way, at certain dose levels) to chemical X lead to adverse health effects? If so, what is the likelihood and magnitude of impact?
  - A mix of:
    - Inquiries
    - Persuasion dialogues.
- A regulatory dialogue: what regulatory actions (if any) should be taken regarding chemical X?
  - A mix of:
    - Inquiries
    - Deliberations
    - Negotiations
    - Persuasion dialogues.

15 Dialogue Games
- Games between 2+ players where each moves by uttering a locution.
  - Developed by philosophers to study fallacious reasoning.
  - Used in: agent dialogues (Parsons & Amgoud), software development (Stathis), modelling legal reasoning (Bench-Capon et al., Prakken).
- Rules define the circumstances of:
  - Commencement of the dialogue
  - Permitted locutions
  - Combinations of locutions
    - e.g. a player cannot assert a proposition and its negation
  - Commitment
    - When does a player commit to some claim?
  - Termination of the dialogue.
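The combination and commitment rules above can be sketched in code. This is a minimal illustration, not the paper's formal specification: the names `CommitmentStore`, `can_assert` and the `not-` prefix convention for negation are assumptions introduced here.

```python
# Sketch of one dialogue-game combination rule: a player may not assert a
# proposition whose negation is already among their commitments.
# All names here are illustrative, not from the paper.

def negate(prop: str) -> str:
    """Negate a propositional literal, using a 'not-' prefix convention."""
    return prop[4:] if prop.startswith("not-") else "not-" + prop

class CommitmentStore:
    """One player's commitments, updated as the player asserts claims."""

    def __init__(self):
        self.commitments = set()

    def can_assert(self, prop: str) -> bool:
        # Combination rule: no self-contradiction is permitted.
        return negate(prop) not in self.commitments

    def assert_(self, prop: str) -> None:
        if not self.can_assert(prop):
            raise ValueError(f"Illegal move: {prop} contradicts a prior commitment")
        self.commitments.add(prop)

store = CommitmentStore()
store.assert_("P")                # legal: no prior commitments
print(store.can_assert("not-P"))  # False: would contradict the commitment to P
```

A real dialogue-game would add commencement, locution and termination rules in the same style, each as a check on the move history.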

16 The Risk Agora
- A formal framework for representing dialogues concerning the carcinogenic risk of chemicals:
  - Represent the arguments for and against a chemical being a carcinogen.
  - Represent the current state of scientific knowledge, including epistemic uncertainty.
  - Enable contestation and defence of claims and arguments.
  - Enable comparison and synthesis of arguments for specific claims.
  - Enable summary snapshots of the debate at any time.
- We have fully specified the locutions and rules for a dialogue-game for scientific discourses.

17 Speaking in the Agora
- Participants can:
  - Propose or assert claims, arguments, grounds, inference-rules and consequences
  - Modify each with modalities
  - Question or contest others' proposals or assertions
  - Accept others' proposals or assertions.
- Examples of locutions:
  - propose(Participant 1: (claim, modality))
  - assert(Participant 1: (claim, modality))
  - show_arg(Participant 1: (arg_for_claim, modalities))
  - contest(Participant 2: propose(Participant 1: (claim, modality)))
  - etc.
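The locutions above nest: a `contest` wraps the locution it attacks. One hypothetical way to encode this is as tagged records; the `Locution` dataclass and helper functions below are assumptions introduced for illustration, not the paper's syntax.

```python
# A hypothetical encoding of Agora locutions as nested, immutable records.
from dataclasses import dataclass

@dataclass(frozen=True)
class Locution:
    speaker: str    # e.g. "Participant 1"
    act: str        # e.g. "propose", "assert", "show_arg", "contest"
    content: tuple  # e.g. (claim, modality), or a wrapped Locution

def propose(speaker: str, claim: str, modality: str) -> Locution:
    return Locution(speaker, "propose", (claim, modality))

def contest(speaker: str, target: Locution) -> Locution:
    # A contest wraps the locution it attacks, mirroring
    # contest(Participant 2: propose(Participant 1: (claim, modality))).
    return Locution(speaker, "contest", (target,))

p = propose("Participant 1", "X is carcinogenic to humans", "Plausible")
c = contest("Participant 2", p)
print(c.content[0].speaker)  # Participant 1
```

Nesting locutions this way lets the dialogue rules inspect exactly which earlier utterance a move responds to.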

18 Representing uncertainty in the Agora
- We represent the degree of uncertainty in a claim by means of its dialectical argument status in the Agora.
- We use a dictionary of labels due to Krause, Fox et al. (1998):
  - We have modified the definitions slightly to allow for counter-counter-arguments.
  - This is an example; other modality dictionaries could be defined.
- A claim is:
  - Open: no arguments presented yet for it or against it.
  - Supported: at least one grounded argument presented for it.
  - Plausible: at least one consistent, grounded argument presented for it.
  - Probable: at least one consistent, grounded argument presented for it, and no rebuttals or undercuts presented.
  - Accepted: at least one consistent, grounded argument presented for it, and any rebuttals or undercuts have been attacked with counter-arguments.
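The label dictionary can be read as a function from the recorded state of the debate to the strongest applicable label. The sketch below is one interpretation, with assumed boolean summaries of the argument record; note that with no attacks on record, "Accepted" holds vacuously, which matches the snapshot in the worked example on slide 27.

```python
# Sketch of the Krause/Fox-style label dictionary as a function. The three
# boolean inputs are assumed summaries of the arguments recorded so far.

def claim_label(has_grounded_arg: bool,
                has_consistent_grounded_arg: bool,
                standing_attack: bool) -> str:
    """Return the strongest applicable uncertainty label for a claim.

    standing_attack: some rebuttal or undercut has no counter-argument.
    """
    if not has_grounded_arg:
        return "Open"
    if not has_consistent_grounded_arg:
        return "Supported"
    if standing_attack:
        return "Plausible"   # attacked, and some attack is unanswered
    return "Accepted"        # no attacks, or every attack counter-attacked

print(claim_label(False, False, False))  # Open
print(claim_label(True, True, False))    # Accepted
print(claim_label(True, True, True))     # Plausible
```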

19 Debating experimental tests of claims
- We also permit debate on:
  - The validity of experiments to test scientific claims.
  - The results of valid experiments.
- An experimental test of a claim is:
  - Open: no evidence either way.
  - Invalid test: the scientific experiment is not accepted by the participants as a valid test of the claim.
  - Inconclusive test: the test is accepted as valid, but the results are not accepted as statistically significant support for the claim or against it.
  - Disconfirming instance: the test is accepted as evidence against the claim.
  - Confirming instance: the test is accepted as evidence for the claim.

20 Experimental status of claims
- Claims are then assigned labels according to the extent to which debate in the Agora accepts the experimental evidence for and against them.
- A claim is:
  - Untested
  - Inconclusive
  - Refuted
  - Confirmed.
- Experimental evidence in favour of a claim can be presented as an argument for the claim.

21 Inference from the Agora
- We define a claim as (defeasibly) true at time t:
  - if and only if it is Accepted in the Agora at time t;
  - otherwise, it is not (defeasibly) true at time t.
- This notion of truth depends on the opinions of the participants in the Agora, which may change over time:
  - As more evidence is obtained and further arguments are presented to the Agora, the truth status of a claim may change.
  - Such changes may be non-monotonic.

22 Formal properties of the Agora
- The Agora dialogue-game rules comply with:
  - Alexy's discourse rules
  - 15 of Hitchcock's 18 principles.
- Acceptability of claims is a game-theoretic semantics (Hintikka 1968):
  - Truth of a proposition depends on a participant in the Agora having a strategy to defeat any opponent in the dialogue-game associated with the proposition.
- Inference from finite snapshots to the long run is well-founded:
  - We can place probabilistic bounds on the possibility of errors of inference from finite snapshots to values at infinity.
  - This is analogous to the Neyman-Pearson (1928) theory of statistical inference.

23 Agora debate
[Diagram: the status of claim P over time, moving through labels such as Open, Plausible, Probable and Accepted; snapshots taken along the way support inference from finite snapshots to the status at infinity. (With apologies to Jackson Pollock.)]

24 Theorem: stability of labels in the absence of new information
Let P be a claim. Suppose that:
1. A(P) is a consistent argument for P such that all rebuttals and undercuts against A(P) are themselves attacked by other arguments,
2. All arguments pertaining to P using the initial information and inference rules are eventually articulated by participants within the Agora, and
3. No new information concerning P is received by participants following commencement.
Then:
- The uncertainty label for P converges to Accepted as time goes to infinity.

25 Key Theorem: the probability of inference errors is bounded
Consider a claim P. Suppose that:
1. The uncertainty label for P converges to a limit at infinity,
2. A snapshot is taken at a time t after all relevant arguments related to P have been presented,
3. The uncertainty label of P at time t is Accepted, and
4. The probability of new information relevant to P arising after time t is less than ε, for some 0 < ε < 1.
Then:
- The probability that the uncertainty label for claim P at infinity is also Accepted is at least 1 - ε.
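The bound admits a one-line reading, sketched here under the theorem's own hypotheses: once all relevant arguments are on the table, the label can move away from Accepted only if new relevant information arrives after time t.

```latex
\Pr\bigl[\text{label}_\infty \neq \text{Accepted}\bigr]
  \;\le\; \Pr\bigl[\text{new information after } t\bigr]
  \;<\; \varepsilon
\quad\Longrightarrow\quad
\Pr\bigl[\text{label}_\infty = \text{Accepted}\bigr] \;\ge\; 1 - \varepsilon .
```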

26 Example
- Assumptions:
  - K1: the chemical X is produced by the human body naturally (it is endogenous).
  - K2: X is endogenous in rats.
  - K3: an endogenous chemical is not carcinogenic.
  - K4: bioassays of X on rats show significant carcinogenic effects.
- Rules of inference:
  - R1 (And-Introduction): from P and Q, infer (P & Q).
  - R2 (Modus Ponens): from P and (P implies Q), infer Q.
  - R3: if a chemical is carcinogenic in an animal species, infer that it is also carcinogenic in humans.

27 Example (cont.): a dialogue concerning the statement P = "X is carcinogenic to humans"
Snapshot status of claim P: Open
1. assert(Participant 1: (P, Confirmed))
2. query(Participant 2: assert(Participant 1: (P, Confirmed)))
3. show_arg(Participant 1: (K4, R3, P, (Confirmed, Valid, Confirmed)))
Snapshot status of claim P: Accepted
4. contest(Participant 2: assert(Participant 1: (P, Confirmed)))
5. query(Participant 3: contest(Participant 2: assert(Participant 1: (P, Confirmed))))
6. propose(Participant 2: (not-P, Plausible))
7. query(Participant 1: propose(Participant 2: (not-P, Plausible)))
8. show_arg(Participant 2: ((K1, K3), R2, not-P, (Confirmed, Probable, Valid, Plausible)))
Snapshot status of claim P: Plausible
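The slide's snapshot transitions can be replayed in a few lines. The status-update rule below is an assumption made for illustration: an argument for P with no standing attack yields Accepted, and an unanswered counter-argument demotes P to Plausible, which reproduces the three snapshots.

```python
# Toy replay of the slide's dialogue about P = "X is carcinogenic to humans",
# tracking the snapshot status after each relevant move. The update rule is
# an assumed simplification of the Agora's label dictionary.

def status(args_for_p, standing_attacks):
    """Snapshot label for P, given arguments for it and unanswered attacks."""
    if not args_for_p:
        return "Open"
    return "Plausible" if standing_attacks else "Accepted"

history = []
args_for_p, attacks = [], []
history.append(status(args_for_p, attacks))     # before any argument: Open

args_for_p.append(("K4", "R3", "P"))            # move 3: show_arg for P
history.append(status(args_for_p, attacks))     # unattacked argument: Accepted

attacks.append((("K1", "K3"), "R2", "not-P"))   # move 8: show_arg for not-P
history.append(status(args_for_p, attacks))     # standing attack: Plausible

print(history)  # ['Open', 'Accepted', 'Plausible']
```

The demotion at the last step also illustrates the non-monotonicity noted on slide 21: new arguments can retract a previously Accepted claim.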

28 What's next
- A model of a deliberation dialogue:
  - Dialogues about what action(s) to take.
  - We have proposed a model based on Wohlrapp's (1998) retroflexive argumentation, a model of non-deductive inference (joint work with David Hitchcock).
- Locutions specific to the regulatory domain:
  - We have proposed a first set using Habermas's (1981) Theory of Communicative Action.
- A means to combine different types of dialogue:
  - We have proposed a formalism using Parikh's (1985) Game Logic, a version of Dynamic Modal Logic (the modal logic of processes).
- A qualitative decision theory:
  - This will draw on Fox and Parsons (1998).

29 Other formal properties under exploration
- Can we automate these dialogues?
- Will automated dialogues ever terminate?
  - Under what circumstances?
  - After how many moves? (Computational complexity.)
- When are two dialogues the same?
- How do we assess the quality of a dialogue system?
- How sensitive is the framework to changes in the game rules?

30 Thanks to
- EPSRC:
  - Grant GR/L84117: Qualitative Decision Theory
  - Grant GR/N35441/01: Symposium on Argument and Computation
  - PhD studentship.
- European Union Information Society Technologies Programme (IST):
  - Sustainable Lifecycles in Information Ecosystems (SLIE) (IST-1999-10948).
- Trevor Bench-Capon, Computer Science Dept, University of Liverpool.
- John Fox, Advanced Computation Laboratory, Imperial Cancer Research Fund, London.
- David Hitchcock, Philosophy Dept, McMaster University, Hamilton, Ontario.
- Anonymous referees (UAI, GTDT, AMAI).

