1 Reasoning under Uncertainty Eugene Fink LTI Seminar November 16, 2007

2 Challenges The available knowledge about the real world is inherently uncertain. We usually make decisions based on incomplete and partially inaccurate data.

3 Challenges Representation of uncertainty. Fast reasoning based on uncertain knowledge. Learning of reasonable default assumptions. Elicitation of critical additional data. Contingency reasoning.

4 Projects RADAR / Space-Time (2003–2008) “Reflective Agent with Distributed Adaptive Reasoning” Scheduling and resource allocation under uncertainty. RAPID (2007–2011) “Representation and Analysis of Probabilistic Intelligence Data” Analysis of uncertain military-intelligence data and planning of future data collection.

5 Outline Representation of uncertainty Reasoning based on uncertain knowledge Elicitation of missing data Future research challenges

6 Alternative representations Approximations: Mary’s weight is about 150 lb. Mary’s cell phone is probably in her purse. Ranges or sets of possible values: Mary’s weight is between 140 and 160 lb. Mary’s cell phone may be in her purse, office, home, or car. Probability distributions: Phone location: 95% purse, 2% home, 2% office, 1% car. Weight: a probability density over 140–160 lb.
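To make the three representations concrete, here is a minimal Python sketch (illustrative only, not code from the talk) of how each one might be stored for the weight and phone-location examples above; all variable names are hypothetical.

```python
# 1. Approximation: a single best-guess value.
weight_approx = 150.0                 # pounds

# 2. Range or set of possible values.
weight_range = (140.0, 160.0)         # lower and upper bounds, pounds
phone_locations = {"purse", "office", "home", "car"}

# 3. Probability distribution over the possible values.
phone_distribution = {"purse": 0.95, "home": 0.02, "office": 0.02, "car": 0.01}
assert abs(sum(phone_distribution.values()) - 1.0) < 1e-9
# A continuous weight distribution would be a probability density over [140, 160].
```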

7 Approximations Simple and intuitive approach, which usually does not require changes to standard algorithms. BUT… We assume that small input changes do not cause large output changes, and we may need to modify standard algorithms to ensure that they do not violate this assumption.

8 Approximations Example: Selecting an amount of medication. [Plot: amount of medication as a function of patient weight.] Since small input changes translate into small output changes, we can use an approximate weight value.

9 Approximations Example: Loading an elevator. [Plot: chance of overloading as a function of load weight; items labeled 155 lb and 140 lb.] We can adapt this procedure to the use of approximate weights by subtracting a safety margin from the weight limit.
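A minimal sketch of the safety-margin adaptation described on this slide; the capacity and margin values, and the function name, are hypothetical illustrations rather than numbers from the talk.

```python
WEIGHT_LIMIT = 2000.0    # elevator capacity in pounds (hypothetical)
SAFETY_MARGIN = 200.0    # allowance for errors in the approximate weights (hypothetical)

def can_add(current_load: float, approx_item_weight: float) -> bool:
    """Allow loading only while the approximate total stays below the
    limit reduced by the safety margin."""
    return current_load + approx_item_weight <= WEIGHT_LIMIT - SAFETY_MARGIN

print(can_add(1500.0, 150.0))   # True
print(can_add(1700.0, 150.0))   # False: too close to the true limit
```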

10 Approximations Example: Playing the “exact weight” game. If your weight is exactly 150 lb, you are a winner! [Plot: prize as a function of player weight.] If we use approximate weight values, we cannot determine the chances of winning.

11 Ranges or sets of possible values Explicit representation of a margin of error. Moderate changes to standard algorithms. BUT… We may lose accuracy of computation, and we cannot evaluate the probabilities of different possible values.

12 Ranges or sets of possible values Example: Selecting an amount of medication. [Plot: amount of medication as a function of patient weight, with the input range mapped to an output range.] We obtain a range that includes the correct amount of medication. If the range width is within the acceptable margin of error, we can use it to select an appropriate amount.
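A minimal sketch of propagating a weight range through a dosage rule, assuming a monotonic (here linear) dosage function; the formula, the acceptable error, and all names are hypothetical.

```python
def dose(weight_lb: float) -> float:
    """Hypothetical monotonic dosage rule: 1 mg per pound."""
    return 1.0 * weight_lb

def dose_range(weight_range):
    """For a monotonic rule, the output range comes from the interval endpoints."""
    low, high = weight_range
    return dose(low), dose(high)

low_mg, high_mg = dose_range((140.0, 160.0))
acceptable_error_mg = 25.0                      # hypothetical margin of error
print(high_mg - low_mg <= acceptable_error_mg)  # True: the range is narrow enough to use
```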

13 Ranges or sets of possible values Example: Loading an elevator. [Plot: chance of overloading as a function of load weight.] We identify the danger of overloading, but we cannot determine its probability.

14 Ranges or sets of possible values Example: Playing the “exact weight” game. [Plot: prize as a function of player weight.] We still cannot determine the chances of winning.

15 Probability distributions Accurate analysis of possible values and their probabilities. BUT… Major changes to standard algorithms and a major increase in running time.

16 Probability distributions Example: Playing the “exact weight” game. [Plot: prize as a function of player weight.] We can determine possible outcomes and evaluate their probabilities.

17 RADAR / RAPID approach to uncertainty representation A compromise between ranges or sets of values and full probability distributions: ranges or sets with probabilities. We approximate a probability density function by a set of uniform distributions, and represent it as a set of ranges with probabilities. Example: Weight: 0.1 chance: [140..145], 0.8 chance: [145..155], 0.1 chance: [155..160]. [Plot: the corresponding probability density over 140–160 lb.]
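The following sketch illustrates the ranges-with-probabilities representation; the class and method names are assumptions, not the actual RADAR / RAPID library. It also shows how such a value answers the “exact weight” game from the previous slide, within a small tolerance.

```python
class UncertainValue:
    """A numeric value given as uniform pieces: (probability, low, high)."""

    def __init__(self, pieces):
        assert abs(sum(p for p, _, _ in pieces) - 1.0) < 1e-9
        self.pieces = pieces

    def mean(self) -> float:
        return sum(p * (low + high) / 2.0 for p, low, high in self.pieces)

    def prob_between(self, a: float, b: float) -> float:
        """Probability that the value falls in the interval [a, b]."""
        total = 0.0
        for p, low, high in self.pieces:
            overlap = max(0.0, min(b, high) - max(a, low))
            total += p * overlap / (high - low)
        return total

weight = UncertainValue([(0.1, 140, 145), (0.8, 145, 155), (0.1, 155, 160)])
print(weight.mean())                   # approximately 150.0
print(weight.prob_between(149, 151))   # approximately 0.16: winning within +/- 1 lb of 150
```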

18 Uncertain data Nominal values An uncertain nominal value is a set of possible values and their probabilities. Example: Phone location: 0.95 chance: purse, 0.02 chance: home, 0.02 chance: office, 0.01 chance: car.

19 Uncertain data Nominal values Integers and reals An uncertain numeric value is a probability-density function represented by a set of uniform distributions. Example: Weight: 0.1 chance: [140..145], 0.8 chance: [145..155], 0.1 chance: [155..160]. [Plot: the corresponding probability density over 140–160 lb.]

20 Uncertain data Nominal values Integers and reals Strings An uncertain string is a regular expression with probabilities.
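A minimal sketch of an uncertain string as probability-weighted regular-expression alternatives; the example patterns and the matching rule (sum the probabilities of alternatives that fully match) are illustrative assumptions.

```python
import re

# Probability-weighted spellings of a partially known surname (illustrative data).
uncertain_surname = [
    (0.6, r"Smith"),
    (0.3, r"Smythe?"),     # "Smyth" or "Smythe"
    (0.1, r"Schmid?t"),    # "Schmit" or "Schmidt"
]

def match_probability(observed: str) -> float:
    """Probability that the uncertain string equals the observed spelling."""
    return sum(p for p, pattern in uncertain_surname
               if re.fullmatch(pattern, observed))

print(match_probability("Smythe"))   # 0.3
print(match_probability("Jones"))    # 0
```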

21 Uncertain data Nominal values Integers and reals Strings Spatial regions An uncertain region is a set of rectangular regions and their probabilities. [Diagram: two rectangles in the x–y plane with probabilities 0.8 and 0.1.]
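A minimal sketch of an uncertain location given as rectangles with probabilities; the query operation and the assumption of a uniform distribution inside each rectangle are illustrative, not taken from the talk.

```python
# Each entry: (probability, x_min, y_min, x_max, y_max)  -- illustrative data.
region = [
    (0.8, 0.0, 0.0, 2.0, 1.0),
    (0.1, 2.0, 0.0, 3.0, 2.0),
    # The remaining 0.1 probability mass may be "elsewhere or unknown".
]

def prob_in_box(qx0, qy0, qx1, qy1):
    """Probability that the location falls inside the query box, assuming a
    uniform distribution within each rectangle."""
    total = 0.0
    for p, x0, y0, x1, y1 in region:
        dx = max(0.0, min(qx1, x1) - max(qx0, x0))
        dy = max(0.0, min(qy1, y1) - max(qy0, y0))
        total += p * (dx * dy) / ((x1 - x0) * (y1 - y0))
    return total

print(prob_in_box(0.0, 0.0, 1.0, 1.0))   # 0.4: half of the 0.8 rectangle
```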

22 Uncertain data Nominal values Integers and reals Strings Spatial regions Functions An uncertain function is a piecewise-linear function with uncertain y-coordinates, or a set of possible functions and their probabilities. [Plot: amount of medication as a function of patient weight, with two candidate curves labeled 0.8 chance and 0.2 chance.]
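A minimal sketch of the second variant, a set of possible functions with probabilities; the dosage curves and names are hypothetical. Applying the uncertain function yields an uncertain output.

```python
# Two candidate dosage curves with probabilities (hypothetical functions).
def low_dose(weight_lb: float) -> float:
    return 0.8 * weight_lb

def high_dose(weight_lb: float) -> float:
    return 1.2 * weight_lb

uncertain_dosage = [(0.8, low_dose), (0.2, high_dose)]

def apply_uncertain(uncertain_fn, x: float):
    """Applying an uncertain function gives an uncertain output:
    a list of (probability, value) pairs."""
    return [(p, f(x)) for p, f in uncertain_fn]

print(apply_uncertain(uncertain_dosage, 150.0))   # [(0.8, 120.0), (0.2, 180.0)]
```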

23 Outline Representation of uncertainty Reasoning based on uncertain knowledge Elicitation of missing data Future research challenges

24 Uncertainty arithmetic We have developed a library of basic operations on uncertain data, which input and output uncertain values: arithmetic operations; logical operations (≤, ≠, ¬); function application; analysis of distributions (μ, σ).
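A minimal sketch of one such operation, addition of two independent uncertain numeric values in the ranges-with-probabilities representation; approximating each pairwise sum by a single uniform piece is an illustrative simplification, not the library's actual method.

```python
def add(a, b):
    """Sum of two independent uncertain values, each a list of uniform pieces
    (probability, low, high).  Each pairwise sum is approximated by one
    uniform piece over the combined range."""
    return [(pa * pb, la + lb, ha + hb)
            for pa, la, ha in a
            for pb, lb, hb in b]

weight_a = [(0.1, 140, 145), (0.8, 145, 155), (0.1, 155, 160)]
weight_b = [(0.5, 120, 130), (0.5, 130, 140)]

total = add(weight_a, weight_b)
print(len(total))                                       # 6 pieces
print(abs(sum(p for p, _, _ in total) - 1.0) < 1e-9)    # True: probabilities still sum to 1
```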

25 Uncertainty arithmetic Allows extension of standard algorithms to reasoning with uncertain values. Supports control of the trade-off between speed and accuracy. BUT… Approximate and relatively slow. Assumes that all probability distributions are independent.

26 RADAR application Scheduling and resource allocation based on uncertain knowledge of scheduling constraints, preferences, and available resources. Uncertain room and event properties Uncertain resource availability and prices Uncertain utility functions We use an optimization algorithm that searches for a schedule with the greatest expected quality.
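A minimal sketch of the idea of optimizing expected schedule quality under uncertainty; the toy quality measure, the scenario data, and the exhaustive search (standing in for RADAR's optimization algorithm) are all hypothetical.

```python
import itertools

events = {"talk": 80, "posters": 40}          # expected attendance (hypothetical)
rooms = ["room1", "room2"]

# Scenarios: (probability, {room: capacity}); the capacities are uncertain.
scenarios = [(0.7, {"room1": 100, "room2": 50}),
             (0.3, {"room1": 100, "room2": 30})]

def quality(assignment, capacities):
    """Toy quality measure: zero if two events share a room, otherwise the
    fraction of events whose room is large enough."""
    if len(set(assignment.values())) < len(assignment):
        return 0.0
    fits = sum(1 for event, room in assignment.items()
               if capacities[room] >= events[event])
    return fits / len(events)

def expected_quality(assignment):
    """Quality averaged over the weighted scenarios."""
    return sum(p * quality(assignment, caps) for p, caps in scenarios)

# Exhaustive search over assignments stands in for RADAR's optimizer.
best = max((dict(zip(events, combo))
            for combo in itertools.product(rooms, repeat=len(events))),
           key=expected_quality)
print(best, expected_quality(best))   # {'talk': 'room1', 'posters': 'room2'} 0.85
```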

27 RADAR results Scheduling of conference events. [Charts: schedule quality of manual vs. automated scheduling for three problem sizes (5 rooms / 32 events, 9 rooms / 62 events, 13 rooms / 84 events), with reported quality values of 0.78, 0.80, 0.83, 0.72, and 0.63; and schedule quality vs. search time (1–10 seconds) for the 13-room / 84-event problem, without and with uncertainty.]

28 RAPID application Analysis of military intelligence, which usually includes uncertain and partially inaccurate data. Relational database with uncertain data Retrieval of approximate and probabilistic matches for given queries Automated inferences, verification of given hypotheses, and search for novel patterns

29 Outline Representation of uncertainty Reasoning based on uncertain knowledge Elicitation of missing data Future research challenges

30 Elicitation challenge Identification of critical missing data Analysis of the trade-off between the cost of data acquisition and the expected performance improvements Planning of effective data collection

31 RADAR / RAPID approach to elicitation of additional data For each candidate question, estimate the probabilities of possible answers. For each possible answer, compute its cost, as well as its impact on the utility of reasoning or optimization. For each question, compute its expected impact on the overall utility, and select questions with the best expected impacts.
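A minimal sketch of this question-selection step; the questions, utilities, probabilities, and costs are hypothetical numbers chosen for illustration and loosely echo the conference-scheduling example later in the talk.

```python
current_utility = 0.68   # schedule quality with no further questions (hypothetical)

candidate_questions = {
    # question: (asking cost, [(answer probability, utility if that answer), ...])
    "Does the poster session need a projector?": (0.01, [(0.5, 0.80), (0.5, 0.68)]),
    "Does the invited talk need a projector?":   (0.01, [(0.9, 0.68), (0.1, 0.68)]),
}

def expected_impact(cost, outcomes):
    """Expected utility over the possible answers, minus current utility and cost."""
    return sum(p * u for p, u in outcomes) - current_utility - cost

best_question = max(candidate_questions,
                    key=lambda q: expected_impact(*candidate_questions[q]))
print(best_question)   # the poster-session question; the other cannot change the utility
```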

32 RADAR / RAPID approach to elicitation of additional data [Architecture diagram: Model Construction, Model Evaluation, Question Selection, and Data Collection operate under Top-Level Control; the current model is passed to Reasoning or Optimization, the model's utility and limitations inform question selection, and questions are sent out while answers flow back into the model.]

33 RADAR application Elicitation of additional data about scheduling constraints, preferences, and available resources. The system identifies critical missing knowledge, sends related questions to the user, and improves the world model based on the user’s answers.

34 RADAR application Elicitation of additional data about scheduling constraints, preferences, and available resources. [Architecture diagram: the user interacts through a graphical user interface; a parser processes new information, an optimizer updates the resource allocation, and an info elicitor chooses and sends questions, all under top-level control and learning.]

35 RADAR example: Initial schedule Requests: Invited talk, 9–10am: needs a large room. Poster session, 9–11am: needs a room. Available rooms: Room 1: 2,000 sq. feet, projector; Room 2: 1,000 sq. feet, no projector; Room 3: 1,000 sq. feet, projector. Missing info: Invited talk: projector need. Poster session: room size, projector need. Assumptions: Invited talk: needs a projector. Poster session: a small room is OK, needs no projector. [Chart: initial schedule of Talk and Posters across rooms 1–3.]

36 RADAR example: Choice of questions Requests: Invited talk, 9–10am: needs a large room. Poster session, 9–11am: needs a room. [Chart: initial schedule of Talk and Posters across rooms 1–3.] Candidate questions: Invited talk: needs a projector? (useless info: there are no large rooms w/o a projector). Poster session: needs a larger room? (useless info: there are no unoccupied larger rooms). Poster session: needs a projector? (potentially useful info).

37 RADAR example: Improved schedule Requests: Invited talk, 9–10am: needs a large room. Poster session, 9–11am: needs a room. [Chart: initial schedule of Talk and Posters across rooms 1–3.] Info elicitation: System: Does the poster session need a projector? User: A projector may be useful, but not really necessary. [Chart: new schedule of Talk and Posters across rooms 1–3.]

38 RADAR results Repairing a conference schedule after a “crisis” loss of rooms. Schedule quality of manual and auto repair: after crisis 0.50; manual repair 0.61; auto repair without elicitation 0.68; auto repair with elicitation 0.72. [Chart: schedule quality (0.68–0.72 scale) as a function of the number of questions (10–50).]

39 RAPID application Proactive collection of military intelligence. Identification of critical uncertainties, based on given tasks and priorities. Planning of intelligence collection, based on the analysis of cost/benefit trade-offs and related risks.

40 RAPID application Proactive collection of military intelligence. [Architecture diagram with components and data: knowledge entry and editing; uncertain facts; uncertain and learned inference rules; goals, queries, and hypotheses; query matches; inferred facts; evaluation of hypotheses; critical uncertainties; prioritized plans for proactive data collection.]

41 Outline Representation of uncertainty Reasoning based on uncertain knowledge Elicitation of missing data Future research challenges

42 Future work Learning of defaults and “common-sense” rules Contingency reasoning Theory of proactive learning

43 Default assumptions Learning to make reasonable common-sense assumptions in the absence of specific data. Example assumptions: Almost all people weigh less than 500 lb. Tall people usually weigh more than short people. For people under eighteen years old, the expected weight increases with age.

44 Default assumptions Learning to make reasonable common-sense assumptions in the absence of specific data. Representation of general uncertain assumptions, context-based assumptions, and uncertain dependencies. Passive and active learning of these assumptions and dependencies. Unsupervised learning of relevant contexts.

45 Contingency reasoning Analysis of possible future developments and preparation for likely developments. Identification of critical uncertainties and their discretization into specific scenarios. Compact representation of scenario spaces. Construction of related contingency plans.

46 Proactive learning General theory for the development and analysis of related learning techniques. Integration of learning with follow-up reasoning: integration of learning algorithms with reasoning engines that use the learned knowledge. [Architecture diagram: the same model-construction, model-evaluation, question-selection, and data-collection loop shown for the elicitation approach.]

47 Proactive learning General theory for the development and analysis of related learning techniques. Integration of learning with follow-up reasoning. Automated selection of learning examples: active selection of examples based on the trade-off among their cost, expected accuracy, and impact on the utility of the learned knowledge. [Same architecture diagram.]

48 Proactive learning General theory for the development and analysis of related learning techniques. Integration of learning with follow-up reasoning. Automated selection of learning examples. Automated selection of high-level strategies: intelligent choice and guidance of learning strategies, with the purpose of reducing the cost and time of learning. [Same architecture diagram.]

49 Proactive learning General theory for the development and analysis of related learning techniques. Integration of learning with follow-up reasoning. Automated selection of learning examples. Automated selection of high-level strategies. Proactive analysis of future needs: automated evaluation of future needs for the learned knowledge, and adaptation of the learning process to both expected and sudden changes in these needs. [Same architecture diagram.]
