Web-Mining Agents: Agents and Rational Behavior
Decision-Making under Uncertainty: Simple Decisions
Ralf Möller, Universität zu Lübeck, Institut für Informationssysteme

Literature Chapter 16 of Russell & Norvig. Material from Lise Getoor, Jean-Claude Latombe, Daphne Koller, and Stuart Russell

Decision Making Under Uncertainty Many environments have multiple possible outcomes. Some of these outcomes may be good; others may be bad. Some may be very likely; others unlikely. What's a poor agent going to do?

Non-Deterministic vs. Probabilistic Uncertainty
Non-deterministic model: possible outcomes {a, b, c}; pick the decision that is best for the worst case (~ adversarial search).
Probabilistic model: possible outcomes {a (p_a), b (p_b), c (p_c)}; pick the decision that maximizes expected utility.

Expected Utility Random variable X with n values x_1, …, x_n and distribution (p_1, …, p_n). E.g., X is the state reached after doing an action A under uncertainty. Function U of X, e.g., U is the utility of a state. The expected utility of A is EU[A=a] = Σ_{i=1..n} P(x_i | A=a) · U(x_i)
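A minimal Python sketch of this sum; the outcome probabilities and utilities below are assumed for illustration (chosen so the result matches the 62 on the next slide):

```python
# Expected utility of taking action a: EU[A=a] = sum_i P(x_i | A=a) * U(x_i)
def expected_utility(outcome_probs, utility):
    """outcome_probs: dict outcome -> P(outcome | action); utility: dict outcome -> U(outcome)."""
    return sum(p * utility[x] for x, p in outcome_probs.items())

# Illustrative numbers (assumed, not taken verbatim from the slide)
probs = {"s1": 0.2, "s2": 0.7, "s3": 0.1}
utils = {"s1": 100, "s2": 50, "s3": 70}
print(expected_utility(probs, utils))  # 0.2*100 + 0.7*50 + 0.1*70 = 62.0
```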

One State/One Action Example: taking action A in state s0 leads to states s1, s2, s3, each with a probability and a utility; weighting each outcome's utility by its probability gives U(s0) = 62.

One State/Two Actions Example: with a second action A2 available in s0 (which can also reach s4), U1(s0) = 62 and U2(s0) = 74, so U(s0) = max{U1(s0), U2(s0)} = 74.

Introducing Action Costs: with action costs subtracted, U1(s0) = 62 − 5 = 57 and U2(s0) = 74 − 25 = 49, so U(s0) = max{U1(s0), U2(s0)} = 57.
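The same comparison as a tiny Python sketch; the expected utilities 62 and 74 and the costs 5 and 25 come from the slides, the rest is scaffolding:

```python
# Pick the action whose expected utility, net of its cost, is highest
expected_utility = {"A1": 62, "A2": 74}   # from the previous slide
cost = {"A1": 5, "A2": 25}                # action costs introduced here

net = {a: expected_utility[a] - cost[a] for a in expected_utility}
best = max(net, key=net.get)
print(net, "->", best)                    # {'A1': 57, 'A2': 49} -> A1
```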

MEU Principle A rational agent should choose the action that maximizes the agent's expected utility. This is the basis of the field of decision theory. The MEU principle provides a normative criterion for rational choice of action.

Not quite… Must have a complete model of: Actions, Utilities, States. Even if you have a complete model, it might be computationally intractable. In fact, a truly rational agent also takes the utility of reasoning itself into account (bounded rationality). Nevertheless, great progress has been made in this area recently, and we are able to solve much more complex decision-theoretic problems than ever before.

We'll look at Decision-Theoretic Planning: simple decision making (ch. 16) and sequential decision making (ch. 17).

Preferences An agent chooses among prizes (A, B, …) and lotteries, i.e., situations with uncertain prizes: L = [p, A; 1−p, B]. A and B can be lotteries again; prizes are special lotteries: [1, X; 0, not X].

Rational preferences

Rational preferences contd.

Axioms of Utility Theory
Orderability ({≻, ≺, ∼} is jointly exhaustive and pairwise disjoint): exactly one of (A ≻ B), (A ≺ B), (A ∼ B) holds
Transitivity: (A ≻ B) ∧ (B ≻ C) ⇒ (A ≻ C)
Continuity: A ≻ B ≻ C ⇒ ∃p [p, A; 1−p, C] ∼ B
Substitutability: A ∼ B ⇒ [p, A; 1−p, C] ∼ [p, B; 1−p, C]
Monotonicity: A ≻ B ⇒ (p ≥ q ⇔ [p, A; 1−p, B] ≿ [q, A; 1−q, B])
Decomposability: [p, A; 1−p, [q, B; 1−q, C]] ∼ [p, A; (1−p)q, B; (1−p)(1−q), C] ("there is no fun in gambling")

And then there was utility

Utilities [slide body not captured; surviving fragment: "… and continue as before"]

Utility scales [slide body not captured; surviving fragment: "U(pay $30 …) ="]

Money

Money Versus Utility Money ≠ Utility: more money is better, but not always in a linear relationship to the amount of money. Expected Monetary Value (EMV). Risk-averse: U(L) < U(S_EMV(L)); risk-seeking: U(L) > U(S_EMV(L)); risk-neutral: U(L) = U(S_EMV(L)), where S_EMV(L) is the state of having the amount EMV(L) for sure.
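A small Python illustration of the risk-averse case; the square-root utility and the 50/50 lottery are assumptions made for this example, not values from the slides:

```python
import math

def U(x):
    # Concave utility of money (assumed: U(x) = sqrt(x)) models a risk-averse agent
    return math.sqrt(x)

lottery = [(0.5, 0), (0.5, 1000)]                  # illustrative 50/50 lottery
emv = sum(p * x for p, x in lottery)               # expected monetary value = 500
eu_lottery = sum(p * U(x) for p, x in lottery)     # expected utility of the lottery
print(eu_lottery < U(emv))  # True: U(L) < U(S_EMV(L)), i.e. risk-averse
```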

Value Functions A value function provides a ranking of alternatives, but not a meaningful metric scale. Also known as an "ordinal utility function".

Multiattribute Utility Theory A given state may have multiple utilities ...because of multiple evaluation criteria ...because of multiple agents (interested parties) with different utility functions

Strict dominance

Stochastic dominance Continuous random variable S1 stochastically dominates S2 iff ∀t. P(S1 ≥ t) ≥ P(S2 ≥ t)
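A quick numeric check of this definition; the two normal distributions and the evaluation grid are assumptions chosen for illustration:

```python
import numpy as np
from scipy.stats import norm

# S1 ~ N(10, 2) and S2 ~ N(8, 2): same spread, S1 shifted right (assumed example)
ts = np.linspace(-5, 25, 601)
p1 = 1 - norm.cdf(ts, loc=10, scale=2)   # P(S1 >= t)
p2 = 1 - norm.cdf(ts, loc=8, scale=2)    # P(S2 >= t)

# S1 stochastically dominates S2 iff P(S1 >= t) >= P(S2 >= t) for every t
print(bool(np.all(p1 >= p2)))            # True on this grid
```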

Stochastic Dominance

Preference structure: Deterministic If the attributes are mutually preferentially independent, preferences can be represented by an additive value function V(x_1, …, x_n) = Σ_i V_i(x_i)

Preference structure: Stochastic

Decision Networks Extend BNs to handle actions and utilities Also called influence diagrams Use BN inference methods to solve Perform Value of Information calculations

Decision Networks cont. Chance nodes: random variables, as in BNs. Decision nodes: actions that the decision maker can take. Utility/value nodes: the utility of the outcome state.

R&N example

Umbrella Network
Nodes: weather → forecast; umbrella (decision: take / don't take); happiness (utility of weather and umbrella).
P(rain) = 0.4
Forecast model P(f | w): P(sunny | rain) = 0.3, P(rainy | rain) = 0.7, P(sunny | no rain) = 0.8, P(rainy | no rain) = 0.2
Umbrella model: P(have umbrella | take) = 1.0, P(~have | ~take) = 1.0
Utilities: U(have, rain) = −25, U(have, ~rain) = 0, U(~have, rain) = −100, U(~have, ~rain) = 100

Evaluating Decision Networks
Set the evidence variables for the current state.
For each possible value of the decision node:
  Set the decision node to that value.
  Calculate the posterior probability of the parent nodes of the utility node, using BN inference.
  Calculate the resulting utility for the action.
Return the action with the highest utility.

Decision Making: Umbrella Network Same network and numbers as above (P(rain) = 0.4, forecast model P(f | w), umbrella model, and the utilities U(umbrella, weather)). Should I take my umbrella?
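A minimal Python sketch of the evaluation procedure two slides back, applied to this network; the probabilities and utilities are the ones on the slide, the function names are mine:

```python
# Umbrella decision network: weather -> forecast; (umbrella, weather) -> utility
P_rain = 0.4
P_f_given_w = {("rainy", "rain"): 0.7, ("sunny", "rain"): 0.3,
               ("rainy", "no rain"): 0.2, ("sunny", "no rain"): 0.8}
U = {("have", "rain"): -25, ("have", "no rain"): 0,
     ("no have", "rain"): -100, ("no have", "no rain"): 100}

def posterior_rain(forecast=None):
    """P(rain | forecast) by Bayes' rule; with no forecast observed it is just the prior."""
    if forecast is None:
        return P_rain
    joint_rain = P_f_given_w[(forecast, "rain")] * P_rain
    joint_dry = P_f_given_w[(forecast, "no rain")] * (1 - P_rain)
    return joint_rain / (joint_rain + joint_dry)

def best_action(forecast=None):
    """Try each decision value and return the one with the highest expected utility."""
    p = posterior_rain(forecast)
    eu = {"take":       p * U[("have", "rain")]    + (1 - p) * U[("have", "no rain")],
          "don't take": p * U[("no have", "rain")] + (1 - p) * U[("no have", "no rain")]}
    return max(eu, key=eu.get), eu

print(best_action())         # no forecast: EU(take) = -10, EU(don't take) = 20 -> don't take
print(best_action("rainy"))  # P(rain | rainy) = 0.7: EU(take) = -17.5, EU(don't take) = -40 -> take
```

With no forecast the agent leaves the umbrella at home; a rainy forecast flips the decision, which is exactly what makes the forecast worth observing (next slides).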

Value of information

General formula
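Assuming the general formula meant here is the standard value of perfect information from Russell & Norvig, VPI_E(E_j) = (Σ_k P(E_j = e_k | E) · EU(α_{e_k} | E, E_j = e_k)) − EU(α | E), the hedged sketch below (reusing P_rain, P_f_given_w, and best_action from the umbrella example above) computes the value of observing the forecast:

```python
# VPI(Forecast) = sum_f P(f) * EU(best action | f)  -  EU(best action with no observation)
P_f = {f: sum(P_f_given_w[(f, w)] * pw
              for w, pw in (("rain", P_rain), ("no rain", 1 - P_rain)))
       for f in ("rainy", "sunny")}

a0, eu0 = best_action()                  # best action with no forecast
eu_no_info = eu0[a0]
eu_with_forecast = 0.0
for f, pf in P_f.items():
    a, eu = best_action(f)               # best action after seeing forecast f
    eu_with_forecast += pf * eu[a]

print(eu_with_forecast - eu_no_info)     # 29 - 20 = 9: observing the forecast is worth 9 utility units
```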

Properties of VPI

Qualitative behaviors

Information Gathering Agent Ask questions Request(E_j) in a reasonable order. Avoid irrelevant questions. Take into account the importance of piece of information j in relation to Cost(E_j).
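One common way to operationalize this is the myopic information-gathering loop from Russell & Norvig; the sketch below is an assumed illustration, and vpi(e) and cost(e) are hypothetical helpers not defined on the slides:

```python
# Myopic information-gathering agent (sketch): request the observation whose value
# of information most exceeds its cost; act once no remaining question is worth asking.
def information_gathering_step(candidate_evidence, vpi, cost, best_action):
    best_e = max(candidate_evidence, key=lambda e: vpi(e) - cost(e), default=None)
    if best_e is not None and vpi(best_e) > cost(best_e):
        return ("Request", best_e)       # ask the most valuable affordable question first
    return ("Act", best_action())        # otherwise act on the current beliefs
```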