Making Simple Decisions


Making Simple Decisions

Outline
- Combining Belief & Desire
- Basis of Utility Theory
- Utility Functions
- Multi-attribute Utility Functions
- Decision Networks
- Value of Information
- Decision-Theoretic Expert Systems

Combining Belief & Desire (1)
- A rational decision is based on both belief and desire, in situations where uncertainty and conflicting goals exist.
- Utility: a single number assigned to each state by some function, expressing the desirability of that state.
- Combining utilities with outcome probabilities gives an expected utility for each action.

Combining Belief & Desire (2)
Notation
- U(S): utility of state S
- S: a snapshot of the world (a state)
- A: an action of the agent
- Result_i(A): the i-th possible outcome state of doing A
- E: the available evidence
- Do(A): the proposition that action A is executed in the current state

Combining Belief & Desire (3)
- Expected utility: the probability-weighted average of the utilities of an action's possible outcomes.
- Maximum Expected Utility (MEU): choose the action that maximizes the agent's expected utility.
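Using the notation above, the expected utility of an action and the MEU principle can be written in the standard form:

    EU(A \mid E) = \sum_i P(Result_i(A) \mid Do(A), E) \, U(Result_i(A))

    action = \arg\max_A EU(A \mid E)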

Basis of Utility Theory (1)
- Rational preferences: the preferences of a rational agent must obey a set of constraints (axioms).
- If they do, the agent's behavior is describable as the maximization of expected utility.

Basis of Utility Theory (2)
Notation
- Lottery L: a decision scenario whose different outcomes are determined by chance, written L = [p, A; 1-p, B].
- A ≻ B: A is preferred to B
- A ∼ B: indifference between A and B
- A ≿ B: B is not preferred to A

Basis of Utility Theory (3)
Constraints on rational preferences (formal statements below):
- Orderability
- Transitivity
- Continuity
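In the lottery notation defined above, these first three axioms are usually stated as:

    Orderability:  (A \succ B) \lor (B \succ A) \lor (A \sim B)
    Transitivity:  (A \succ B) \land (B \succ C) \Rightarrow (A \succ C)
    Continuity:    A \succ B \succ C \Rightarrow \exists p \; [p, A; 1-p, C] \sim B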

Basis of Utility Theory (4)
Constraints (cont.), with formal statements below:
- Substitutability
- Monotonicity
- Decomposability
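The remaining three axioms, in the same notation:

    Substitutability:  A \sim B \Rightarrow [p, A; 1-p, C] \sim [p, B; 1-p, C]
    Monotonicity:      A \succ B \Rightarrow (p \ge q \Leftrightarrow [p, A; 1-p, B] \succsim [q, A; 1-q, B])
    Decomposability:   [p, A; 1-p, [q, B; 1-q, C]] \sim [p, A; (1-p)q, B; (1-p)(1-q), C]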

Basis of Utility Theory (5)
- Utility principle: preferences that satisfy the axioms can be represented by a utility function U.
- Maximum expected utility principle: the utility of a lottery is the expected utility of its outcomes.
- A utility function represents what the agent's actions are trying to achieve, and it can be constructed by observing the agent's preferences.
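Stated as formulas, the two principles read:

    Utility principle:  U(A) > U(B) \Leftrightarrow A \succ B  and  U(A) = U(B) \Leftrightarrow A \sim B
    Maximum expected utility principle:  U([p_1, S_1; \ldots; p_n, S_n]) = \sum_i p_i U(S_i)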

Utility Functions (1)
- Utility: a mapping from states to real numbers.
- Approach: compare an outcome A to a standard lottery L_p that yields
  - the best possible prize u⊤ with probability p
  - the worst possible catastrophe u⊥ with probability 1 - p
- Adjust p until the agent is indifferent between A and the standard lottery.
- e.g. an agent might be indifferent between $30 and the lottery [0.9, continue as usual; 0.1, instant death], giving U($30) = 0.9 on this scale.
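A minimal sketch of the p-adjustment step, assuming a hypothetical prefers_lottery(outcome, p) oracle that reports whether the agent prefers the standard lottery [p, u⊤; 1-p, u⊥] to the given outcome; the bisection and all names below are illustrative, not from the slides.

def elicit_utility(outcome, prefers_lottery, tol=1e-3):
    """Estimate the normalized utility of `outcome` as the indifference
    probability p of the standard lottery [p, best prize; 1-p, worst catastrophe]."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        p = (lo + hi) / 2
        if prefers_lottery(outcome, p):
            hi = p   # lottery still preferred, so U(outcome) < p: lower p
        else:
            lo = p   # outcome preferred, so U(outcome) > p: raise p
    return (lo + hi) / 2

# e.g. with the slide's example, elicit_utility("$30", agent_oracle) would return ~0.9
# if the agent is indifferent at [0.9, continue; 0.1, death].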

Utility Functions (2)
Utility scales
- Utilities are only determined up to a positive linear (affine) transformation (see below).
- Normalized utility: u⊤ = 1.0 for the best possible outcome, u⊥ = 0.0 for the worst.
- Micromort: a one-in-a-million chance of death; useful for analyzing Russian roulette, insurance premiums, etc.
- QALY: quality-adjusted life year.
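The transformation invariance can be written as

    U'(S) = k_1 U(S) + k_2,  with  k_1 > 0

i.e. rescaling and shifting a utility function in this way leaves the agent's behavior unchanged; normalized utilities simply fix the two endpoints.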

Utility Functions (3)
- Money does NOT behave as a utility function.
- Given a lottery L with expected monetary value EMV(L):
  - risk-averse: U(L) < U(S_EMV(L)), i.e. the agent prefers the sure amount (a concave utility curve)
  - risk-seeking: U(L) > U(S_EMV(L)), i.e. the agent prefers the gamble (a convex utility curve)
(Figure: utility U plotted against money $, showing the concave and convex cases.)
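An illustrative sketch (not from the slides) of why a concave utility of money is risk-averse: with u(x) = sqrt(x), a 50/50 lottery over $0 and $1,000,000 has lower expected utility than receiving its expected monetary value for sure.

import math

def expected_utility(lottery, u):
    """lottery: list of (probability, monetary outcome) pairs."""
    return sum(p * u(x) for p, x in lottery)

u = math.sqrt                                  # a concave, risk-averse utility of money
lottery = [(0.5, 0), (0.5, 1_000_000)]
emv = sum(p * x for p, x in lottery)           # expected monetary value = 500,000

print(expected_utility(lottery, u))            # ~500.0  (utility of the gamble)
print(u(emv))                                  # ~707.1  (utility of the sure EMV)
# u(EMV) > EU(lottery): the agent prefers the sure amount, i.e. it is risk-averse.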

Multi-attribute Utility Functions (1)
Multi-Attribute Utility Theory (MAUT)
- Outcomes are characterized by two or more attributes.
- e.g. siting a new airport: disruption caused by construction, cost of land, noise, safety, ...
- Approach: identify regularities in the agent's preference behavior and exploit them.

Multi-attribute Utility Functions (2)
Notation
- Attributes: X = X_1, ..., X_n
- Attribute value vector: x = <x_1, ..., x_n>
- Utility function: U(x_1, ..., x_n)

Multi-attribute Utility Functions (3)
Dominance
- Certain case (strict dominance, Fig. 1): e.g. airport site S1 costs less, generates less noise, and is safer than S2, so S1 strictly dominates S2 and S2 can be discarded.
- Uncertain case (Fig. 2): even when the attribute values are uncertain, strict dominance holds if every possible outcome of S1 dominates every possible outcome of S2.
(Figures 1 and 2 in the original slides illustrate the two cases; a sketch of the certain case follows.)
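A small illustrative check for strict dominance in the certain case, assuming for simplicity that every attribute has been converted so that larger values are better (e.g. negated cost and noise); this sketch and its example numbers are hypothetical.

def strictly_dominates(s1, s2):
    """True if attribute vector s1 is at least as good as s2 on every
    attribute and strictly better on at least one (larger is better)."""
    return all(a >= b for a, b in zip(s1, s2)) and any(a > b for a, b in zip(s1, s2))

# Hypothetical sites as (negated cost in $B, negated noise level, safety):
s1 = (-4.2, -60_000, 0.94)
s2 = (-4.6, -70_000, 0.90)
print(strictly_dominates(s1, s2))   # True: S1 strictly dominates S2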

Multi-attribute Utility Functions (4)
Dominance (cont.)
- Stochastic dominance is far more common in real-world problems than strict dominance.
- e.g. the cost of an airport site: S1 costs $3.7 billion on average with standard deviation $0.4 billion, while S2 costs $4.0 billion on average with standard deviation $0.35 billion. On cost (where lower is better), S1 stochastically dominates S2.

Multi-attribute Utility Functions (5)
Dominance (cont.)
- Formally, stochastic dominance is defined by comparing cumulative distributions over an attribute (see below).
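The standard definition, for an attribute where larger values are better: action A_1 stochastically dominates A_2 on attribute X iff

    \forall x \;\; \int_{-\infty}^{x} p_1(x') \, dx' \;\le\; \int_{-\infty}^{x} p_2(x') \, dx'

(for a cost-like attribute where smaller values are better, the inequality is reversed). If A_1 stochastically dominates A_2, then for any monotonically nondecreasing utility U(x) the expected utility of A_1 is at least that of A_2.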

Multi-attribute Utility Functions (6)
Preferences without uncertainty
- These are preferences between concrete outcome values.
- Preference structure: X_1 and X_2 are preferentially independent of X_3 iff the preference between <x_1, x_2, x_3> and <x_1', x_2', x_3> does not depend on the particular value x_3.
- e.g. airport site <Noise, Cost, Safety>: the preference between <20,000 people suffer, $4.6 billion, 0.06 deaths/mpm> and <70,000 people suffer, $4.2 billion, 0.06 deaths/mpm> does not depend on the shared safety value.

Multi-attribute Utility Functions (7)
Preferences without uncertainty (cont.)
- Mutual preferential independence (MPI): every pair of attributes is preferentially independent of its complement.
- e.g. airport site <Noise, Cost, Safety>: if Noise & Cost are P.I. of Safety, Noise & Safety are P.I. of Cost, and Cost & Safety are P.I. of Noise, then <Noise, Cost, Safety> exhibits MPI.
- Under MPI, the agent's preference behavior can be described by an additive value function (see below).
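If MPI holds, there is an additive value function of the form

    V(x_1, \ldots, x_n) = \sum_i V_i(x_i)

so the deterministic preference structure is captured by n single-attribute value functions.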

Multi-attribute Utility Functions (8)
Preferences with uncertainty
- Here the preferences are between lotteries over attribute values, so utilities are needed.
- Utility independence (U.I.): a set of attributes X is utility-independent of a set Y iff preferences over lotteries on X do not depend on the particular values of the attributes in Y.
- Mutual utility independence (MUI): every subset of attributes is U.I. of the remaining attributes.
- Under MUI, the agent's behavior can be described by a multiplicative utility function (shown below for three attributes).
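For three mutually utility-independent attributes with single-attribute utilities U_i and constants k_i, the multiplicative utility function has the form

    U = k_1 U_1 + k_2 U_2 + k_3 U_3
        + k_1 k_2 U_1 U_2 + k_2 k_3 U_2 U_3 + k_3 k_1 U_3 U_1
        + k_1 k_2 k_3 U_1 U_2 U_3

so an n-attribute MUI utility function can be built from n single-attribute utilities and n constants.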

Decision Networks
- A simple formalism for expressing and solving decision problems: belief networks extended with decision and utility nodes.
- Node types:
  - Chance nodes: random variables with conditional probability tables, as in belief networks
  - Decision nodes: points where the agent has a choice of actions
  - Utility nodes: the agent's utility function over the states they depend on

A Simple Decision Network
(Figure: a decision network for the airport-siting problem. The decision node Airport Site, together with the chance nodes Air Traffic, Litigation, and Construction, determines the chance nodes Death, Noise, and Cost via their CPTs; these in turn are the parents of the utility node U, which has a utility table.)

A Simplified Representation
(Figure: the same network with the outcome chance nodes omitted; the utility node U depends directly on Airport Site, Air Traffic, Litigation, and Construction through an action-utility table.)

Evaluating Decision Networks

function DN-EVAL(percept) returns an action
  static: D, a decision network
  set the evidence variables for the current state
  for each possible value of the decision node do
    set the decision node to that value
    calculate the posterior probabilities for the parent nodes of the utility node
    calculate the resulting expected utility for the action
  return the action with the highest expected utility
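A minimal, hypothetical Python sketch of the same loop, assuming an infer(evidence) function (e.g. backed by any belief-network library) that returns a posterior distribution over joint assignments of the utility node's parents; all names and the network interface are illustrative.

def evaluate_decision_network(decision_values, evidence, infer, utility):
    """Return the decision value with the highest expected utility.

    decision_values: the possible values of the single decision node
    evidence: dict of observed chance variables
    infer(evidence): dict mapping assignments of the utility node's parents to probabilities
    utility(assignment): utility of one joint assignment of those parents
    """
    best_value, best_eu = None, float("-inf")
    for d in decision_values:
        # Fix the decision node to this value and compute the posterior
        # over the parents of the utility node.
        posterior = infer({**evidence, "decision": d})
        eu = sum(p * utility(parents) for parents, p in posterior.items())
        if eu > best_eu:
            best_value, best_eu = d, eu
    return best_value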

Value of Information (1)
- Idea: compute the value of acquiring each possible piece of evidence before acting.
- Example: buying oil-drilling rights.
  - There are three blocks A, B, and C; exactly one contains oil, worth k.
  - The prior probability is 1/3 for each block (mutually exclusive).
  - The current price of each block is k/3.
  - A consultant offers an accurate survey of block A. What is a fair price for the survey?

Value of Information (2)
Solution: compute the expected value of the information
  = expected value of the best action given the information
  - expected value of the best action without the information.
The survey may report either "oil in A" or "no oil in A"; the calculation is worked out below.
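Working the example through (the standard calculation for this problem):

- With probability 1/3 the survey reports oil in A: buy block A for k/3, for an expected profit of k - k/3 = 2k/3.
- With probability 2/3 the survey reports no oil in A: buy B or C; the probability of oil in the chosen block is now 1/2, so the expected profit is k/2 - k/3 = k/6.
- Expected profit given the information = (1/3)(2k/3) + (2/3)(k/6) = k/3.
- Without the information, the expected profit of buying any block is k/3 - k/3 = 0.

So the information is worth k/3, and the consultant should charge at most k/3 for the survey.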

General Formula
Notation for the value of perfect information (VPI):
- Current evidence E, current best action α
- Possible action outcomes Result_i(A) = S_i
- Potential new evidence E_j
The value of perfect information VPI_E(E_j) is defined by the formulas below.
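With this notation, the standard definitions are:

    EU(\alpha \mid E) = \max_A \sum_i P(Result_i(A) \mid Do(A), E) \, U(Result_i(A))

    EU(\alpha_{e_j} \mid E, E_j = e_j) = \max_A \sum_i P(Result_i(A) \mid Do(A), E, E_j = e_j) \, U(Result_i(A))

    VPI_E(E_j) = \Big( \sum_k P(E_j = e_{jk} \mid E) \, EU(\alpha_{e_{jk}} \mid E, E_j = e_{jk}) \Big) - EU(\alpha \mid E)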

Properties of VPI (formal statements below)
- Nonnegative
- Nonadditive
- Order-independent
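Formally:

    Nonnegative:        \forall j, E \;\; VPI_E(E_j) \ge 0
    Nonadditive:        VPI_E(E_j, E_k) \ne VPI_E(E_j) + VPI_E(E_k)   (in general)
    Order-independent:  VPI_E(E_j, E_k) = VPI_E(E_j) + VPI_{E, E_j}(E_k) = VPI_E(E_k) + VPI_{E, E_k}(E_j)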

Three generic cases for the value of information
a) The choice is obvious: the information is worth little.
b) The choice is non-obvious: the information is worth a lot.
c) The choice is non-obvious, but the information is worth little.

Decision-Theoretic Expert Systems (1)
Decision analysis involves two roles:
- Decision maker: states preferences between outcomes.
- Decision analyst: enumerates the possible actions and outcomes, and elicits preferences from the decision maker to determine the best course of action.
Advantages:
- The recommendations reflect the preferences of the user, not of the system builder.
- It avoids confusing likelihood with importance.

Decision-Theoretic Expert Systems (2)
1. Determine the scope of the problem
2. Lay out the topology
3. Assign probabilities
4. Assign utilities
5. Enter available evidence
6. Evaluate the diagram
7. Obtain new evidence
8. Perform sensitivity analysis