1 Information Lecture 19 November 2, 2005 12-706 / 19-702 / 73-359

2 Admin Issues
- Landfill Gas Projects
  - Great job. Range 75-98%, median/mean 92.
- HW 5 due next Wednesday.
- This week's office hours: Thurs 4pm, Fri 1:30pm.
  - Next week back to normal (Pauli Mon AM).
- Schedule set for the rest of the semester.

3 Agenda
- Value of Information
- Facility Feasibility Case Study
- Comments on Doing Sensitivity Analysis

4 Value of Information
- We have been doing decision analysis with best guesses of probabilities.
  - We have been building trees with chance and decision nodes and finding expected values.
- It is relevant and interesting to determine how important information might be in our decision problems.
  - The information could come from paying an expert, a fortune teller, etc. The goal is to reduce or eliminate uncertainty in the decision problem.

5 Willingness to Pay = EVPI
- We're interested in knowing our WTP for (perfect) information about our decision.
- The book shows this with Bayesian probabilities, but think of it this way:
  - We consider the advice of "an expert who is always right."
  - If they say it will happen, it will.
  - If they say it will not happen, it will not.
  - They are never wrong.
- Bottom line: receiving their advice means we have eliminated the uncertainty about the event.

6-8 [Figure slides, not captured in the transcript: the two decision trees compared on the next slide, i.e., the investment decision solved without and then with the perfect-information expert.]

9 Discussion
- The difference between the two trees (decision scenarios) is the EVPI:
  - $1000 - $580 = $420.
  - That is the amount up to which you would be willing to pay for advice on how to invest.
  - If you pay less than $420, you would expect to come out ahead, net of the cost of the information.
  - If you pay $425 for the information, you would expect to lose $5 overall!
- Finding EVPI is really simple to do with the @RISK / PrecisionTree plug-in (not so for TreePlan!). A code sketch of the same calculation follows below.
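The tree figures are missing from the transcript, but the $580 and $1000 expected values match the course text's (Clemen's) investment example: a risky stock paying $1500 / $100 / -$1000 in an up / flat / down market with prior probabilities 0.5 / 0.3 / 0.2, against a savings account paying a sure $500. Treat those payoffs and priors as reconstructed assumptions; the EVPI recipe itself is general. A minimal sketch in Python:

```python
# EVPI sketch. Payoffs and priors are assumptions reconstructed from the
# $580 and $1000 expected values quoted on the Discussion slide.
priors = {"up": 0.5, "flat": 0.3, "down": 0.2}
payoffs = {
    "stock":   {"up": 1500, "flat": 100, "down": -1000},  # assumed
    "savings": {"up": 500,  "flat": 500, "down": 500},    # assumed
}

# Expected value of each action under the prior (no information),
# then choose the best action: $580 (stock).
ev = {a: sum(priors[s] * p[s] for s in priors) for a, p in payoffs.items()}
ev_no_info = max(ev.values())

# With perfect information we learn the state first, pick the best
# action for that state, and take the expectation over states: $1000.
ev_perfect = sum(priors[s] * max(p[s] for p in payoffs.values())
                 for s in priors)

print(ev_no_info, ev_perfect, ev_perfect - ev_no_info)  # 580.0 1000.0 420.0
```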

10 Similar: EVII
- Imperfect, rather than perfect, information (because information is rarely perfect).
- Example: our expert acknowledges she is not always right, so we use conditional probabilities (rather than the assumption that she is 100% correct all the time) to solve the trees.
  - Ideally, she is "almost always right" and "almost never wrong."
  - e.g., P(Up Predicted | Up) is less than but close to 1.
  - P(Up Predicted | Down) is greater than but close to 0.

11 Assessing the Expert

12 Expert side of EVII tree
This is more complicated than EVPI because we do not know whether the expert is right or not. We have to decide whether to believe her.

13 Use Bayes' Theorem
- "Flip" the probabilities: we know P("Up" | Up) but instead need P(Up | "Up").
- P(Up | "Up") = P("Up" | Up) P(Up) / [sum over states s of P("Up" | s) P(s)]
              = (0.8)(0.5) / [(0.8)(0.5) + (0.15)(0.3) + (0.2)(0.2)]
              = 0.400 / 0.485 = 0.8247
- (A code version of this flip is sketched below.)
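Here is the same flip in code. The prediction row P("Up" | state) = 0.8 / 0.15 / 0.2 and the priors 0.5 / 0.3 / 0.2 come straight from the slide's arithmetic; the state names up / flat / down for the three market outcomes are an assumption.

```python
# Bayes' theorem "flip": from P(prediction | state) to P(state | prediction).
priors = {"up": 0.5, "flat": 0.3, "down": 0.2}       # P(state), from the slide
p_up_given = {"up": 0.8, "flat": 0.15, "down": 0.2}  # P("Up" | state), from the slide

# Normalizing constant P("Up") = sum over s of P("Up" | s) * P(s) = 0.485.
p_up = sum(p_up_given[s] * priors[s] for s in priors)

# Posterior P(state | "Up") for every state.
posterior = {s: p_up_given[s] * priors[s] / p_up for s in priors}
print(posterior["up"])  # 0.8247..., matching the slide
```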

14 EVII Tree Excerpt

15 Rolling Back to the Top
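Rolling the full EVII tree back needs the expert's entire conditional table, but only the P("Up" | state) row appears in the transcript; the "Flat"- and "Down"-prediction rows below (and the payoffs, as before) are illustrative assumptions chosen to describe a mostly-reliable expert. A sketch of the rollback:

```python
# EVII rollback sketch. Only the "Up" prediction row is from slide 13;
# the other two rows and the payoffs are assumptions.
priors = {"up": 0.5, "flat": 0.3, "down": 0.2}
payoffs = {
    "stock":   {"up": 1500, "flat": 100, "down": -1000},  # assumed
    "savings": {"up": 500,  "flat": 500, "down": 500},    # assumed
}
likelihood = {  # P(prediction | state); each state's column sums to 1
    "Up":   {"up": 0.80, "flat": 0.15, "down": 0.20},  # from slide 13
    "Flat": {"up": 0.10, "flat": 0.70, "down": 0.20},  # assumed
    "Down": {"up": 0.10, "flat": 0.15, "down": 0.60},  # assumed
}

# For each possible prediction: flip to the posterior, pick the best
# action given that posterior, then weight by P(prediction).
ev_with_expert = 0.0
for pred, row in likelihood.items():
    p_pred = sum(row[s] * priors[s] for s in priors)             # P(prediction)
    posterior = {s: row[s] * priors[s] / p_pred for s in priors}
    best_ev = max(sum(posterior[s] * p[s] for s in priors)
                  for p in payoffs.values())
    ev_with_expert += p_pred * best_ev

ev_no_info = 580.0  # best expected value without information, from above
print(ev_with_expert - ev_no_info)  # EVII; about $242 with these assumptions
```

As expected, the imperfect expert is worth less than the $420 EVPI upper bound.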

16 Final Thoughts on Plugins
- You can combine (in @RISK) the decision trees and the sensitivity plugins.
  - You can probably do this in TreePlan; I haven't tried it.
- Do "Sensitivity of Expected Values" by varying the probabilities (see the end of Chapter 5); a hand-rolled version is sketched below.
- You can also do EVPI/EVII with @RISK.
- You don't need to do everything by hand!
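For anyone without the Excel plug-ins, "sensitivity of expected values by varying the probabilities" amounts to the loop below, shown with the same assumed investment numbers: sweep P(up) and watch where the preferred action flips. This is a sketch, not the plug-in's actual output.

```python
# One-way sensitivity sketch: vary P(up), keep the flat/down split at
# its original 0.3 : 0.2 ratio, and find where the best action changes.
payoffs = {
    "stock":   {"up": 1500, "flat": 100, "down": -1000},  # assumed
    "savings": {"up": 500,  "flat": 500, "down": 500},    # assumed
}

for i in range(11):
    p_up = i / 10
    priors = {"up": p_up, "flat": 0.6 * (1 - p_up), "down": 0.4 * (1 - p_up)}
    ev = {a: sum(priors[s] * p[s] for s in priors) for a, p in payoffs.items()}
    best = max(ev, key=ev.get)
    print(f"P(up)={p_up:.1f}  EV(stock)={ev['stock']:8.1f}  best: {best}")
```

With these numbers the preferred action flips from savings to the stock at about P(up) = 0.46.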

17 Transition
- Speaking of information...
- How valuable would it be to have had better knowledge of the costs/revenues of a facility project after it's been around for a while?
- Would it change our decision?

