Presentation on theme: "Introduction to Artificial Intelligence CS 438 Spring 2008 Today –AIMA, Ch. 13 –Reasoning with Uncertainty Tuesday –AIMA, Ch. 14."— Presentation transcript:

1 Introduction to Artificial Intelligence CS 438 Spring 2008 Today –AIMA, Ch. 13 –Reasoning with Uncertainty Tuesday –AIMA, Ch. 14

2 Knowledge Based Agents
Expert systems
–make expertise available to decision makers and technicians who need answers quickly
“Weak AI” methods
–Do not require significant amounts of knowledge specific to the domain
“Strong AI” methods
–Depend on having a lot of very specific knowledge about the domain

3 Example of Domain-Specific Knowledge
Production rules for diagnosing a car problem:
If the engine does not turn over and the headlights do not turn on
Then the problem is the battery or battery cables
If the engine does not turn over and the gas gauge reads “Empty”
Then the problem is an empty gas tank
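The production rules above can be sketched in code. This is a minimal illustration, not the lecture's actual system: rules are stored as (conditions, conclusion) pairs and matched against observed symptoms, with all symptom and rule names chosen here for illustration.

```python
# Hypothetical rule base: each rule is (conditions, conclusion).
# Symptom names are assumptions for this sketch, not from the lecture.
RULES = [
    ({"engine_turns_over": False, "headlights_on": False},
     "battery or battery cables"),
    ({"engine_turns_over": False, "gas_gauge": "Empty"},
     "empty gas tank"),
]

def diagnose(symptoms):
    """Return the conclusion of every rule whose conditions all hold."""
    return [conclusion for conditions, conclusion in RULES
            if all(symptoms.get(k) == v for k, v in conditions.items())]

print(diagnose({"engine_turns_over": False, "headlights_on": False}))
```

Note that the diagnostic knowledge lives entirely in the `RULES` data; the `diagnose` control loop knows nothing about cars, which previews the separation-of-knowledge point made on the next slides.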

4 Knowledge Based Systems
Separate the “knowledge” from the program control
–Example: “If the engine does not turn over and the headlights do not turn on, Then the problem is the battery or battery cables”
–This rule cannot be used to diagnose a non-working flashlight even though it is the same concept
Knowledge Based Systems avoid “hard coding” domain knowledge into a program

5 Separation of Knowledge & Control
Simple Prolog example:
1. print_color(rose) :- write('Red').
   print_color(sky) :- write('Yellow').
   print_color(_) :- write('Unknown').
2. print_color(X) :- color(X, Y), write(Y).
   print_color(_) :- write('Unknown').
   color(rose, 'Red').
   color(sky, 'Yellow').
#2 separates knowledge from program control; the KB becomes modular and modifiable


7 KRL
KRL = representation + inference method
–Language for representing facts
–Method(s) for establishing new facts from known facts
Some KRLs:
FOL, fuzzy logic, production rules, semantic nets, case-based reasoning, model-based reasoning

8 KRL: Syntax and Semantics
Syntax
–Rules for constructing valid sentences: the valid configurations of symbols that constitute a sentence
x >= y   valid math sentence
xy >=    invalid math sentence

9 KRL: Syntax and Semantics
Semantics
–The “meaning” of a sentence: determines the facts in the world to which the sentence refers; what an agent “believes”
x >= y   false if y is bigger than x; true otherwise

10 Where does meaning come from? The Symbol Grounding Problem
A symbol system is a set of symbols that can be operated on by a set of rules
2 + 2 = x
x = 4
There is no need to make use of the meaning, only the syntax
–So where does the meaning come from?


12 Wumpus World PEAS description
Performance measure
–gold +1000, death -1000
–-1 per step, -10 for using the arrow
Environment
–Squares adjacent to wumpus are smelly
–Squares adjacent to pit are breezy
–Glitter iff gold is in the same square
–Shooting kills wumpus if you are facing it
–Shooting uses up the only arrow
–Grabbing picks up gold if in same square
–Releasing drops the gold in same square
Sensors: Stench, Breeze, Glitter, Bump, Scream
Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot
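The environment rules above can be sketched as a small percept function. This is a rough illustration on an assumed 4x4 grid; the wumpus, pit, and gold coordinates below are the layout commonly used with this example, but treat them as assumptions of the sketch.

```python
# Assumed layout on a 4x4 grid, (column, row) coordinates.
WUMPUS = (1, 3)
PITS = {(3, 1), (3, 3), (4, 4)}
GOLD = (2, 3)

def adjacent(a, b):
    """Squares are adjacent when they differ by 1 in exactly one axis."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1]) == 1

def percept(square):
    """Local percepts per the environment rules on the slide."""
    return {
        "Stench": adjacent(square, WUMPUS),       # adjacent to wumpus
        "Breeze": any(adjacent(square, p) for p in PITS),  # adjacent to a pit
        "Glitter": square == GOLD,                # glitter iff gold here
    }

print(percept((2, 1)))  # breezy: next to the pit at (3, 1)
```

The fact that percepts are purely local is what makes the agent's reasoning problem interesting on the following slides.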

13 Wumpus world characterization
Fully observable? No – only local perception
Deterministic? Yes – outcomes exactly specified
Episodic? No – sequential at the level of actions
Static? Yes – wumpus and pits do not move
Discrete? Yes
Single-agent? Yes – the wumpus is essentially a natural feature

14 Exploring a wumpus world

15–21 (diagram slides stepping through the exploration; images not included in the transcript)

22 What about this situation?

23 What is uncertainty? Incomplete information Incomplete knowledge Where does uncertainty come from?

24 Rational Agents and Uncertainty
Rational agents “do the right thing”
–In FOL the right thing is to act on knowing what conditions are true or false
What does it mean to “do the right thing” if an agent does not “know” enough to infer the facts of a situation?
–Under uncertainty the right thing depends on the importance of the goal and the likelihood that it will be achieved by some action

25 Probability
Probabilistic assertions summarize effects of
–laziness: failure to enumerate exceptions, qualifications, etc.
–ignorance: lack of relevant facts, initial conditions, etc.
Subjective probability: probabilities relate propositions to the agent's own state of knowledge
P(disease = ebola) = 0.001
–Assigns a degree of belief
–These are not assertions about the world
Probabilities of propositions change with new evidence:
P(disease = ebola | travel = African jungle) = 0.03

26 Syntax
Basic element: random variable
Possible worlds defined by assignment of values to random variables
Boolean random variables, e.g., Cavity (do I have a cavity?)
Discrete random variables, e.g., Weather is one of <sunny, rainy, cloudy, snow>
Domain values must be exhaustive and mutually exclusive
Elementary proposition constructed by assignment of a value to a random variable: e.g., Weather = sunny, Cavity = false
Complex propositions formed from elementary propositions and standard logical connectives, e.g., Weather = sunny ∧ Cavity = false

27 Axioms of probability
For any propositions A, B
–0 ≤ P(A) ≤ 1
–P(true) = 1 and P(false) = 0
–P(A ∨ B) = P(A) + P(B) - P(A ∧ B)
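The third axiom (inclusion–exclusion) can be checked numerically. A small sketch with made-up probabilities over the four truth assignments of two propositions A and B:

```python
# Made-up probabilities for the four possible worlds over (A, B).
worlds = {(True, True): 0.3, (True, False): 0.2,
          (False, True): 0.4, (False, False): 0.1}

# Probability of a proposition = sum over the worlds where it holds.
P_A = sum(p for (a, b), p in worlds.items() if a)
P_B = sum(p for (a, b), p in worlds.items() if b)
P_A_and_B = worlds[(True, True)]
P_A_or_B = sum(p for (a, b), p in worlds.items() if a or b)

# Inclusion-exclusion: P(A or B) = P(A) + P(B) - P(A and B).
assert abs(P_A_or_B - (P_A + P_B - P_A_and_B)) < 1e-12
```

Any assignment of nonnegative world probabilities summing to 1 satisfies this identity, which is why it can serve as an axiom rather than an empirical claim.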

28 Prior probability
Prior or unconditional probabilities of propositions, e.g., P(Cavity = true) = 0.1 and P(Weather = sunny) = 0.72, correspond to belief prior to arrival of any (new) evidence
Probability distribution gives values for all possible assignments:
P(Weather) = <0.72, 0.1, 0.08, 0.1> (normalized, i.e., sums to 1)
Joint probability distribution for a set of random variables gives the probability of every atomic event on those random variables
P(Weather, Cavity) = a 4 × 2 matrix of values:

                 sunny   rainy   cloudy   snow
Cavity = true    0.144   0.02    0.016    0.02
Cavity = false   0.576   0.08    0.064    0.08

Every question about a domain can be answered by the joint distribution
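The claim that every question can be answered by the joint distribution is easy to demonstrate in code. A sketch: store the Weather × Cavity table above as a dict of atomic events, then answer any query by summing the matching entries.

```python
# The joint distribution P(Weather, Cavity) from the slide,
# keyed by atomic event (weather value, cavity truth value).
joint = {("sunny", True): 0.144, ("rainy", True): 0.02,
         ("cloudy", True): 0.016, ("snow", True): 0.02,
         ("sunny", False): 0.576, ("rainy", False): 0.08,
         ("cloudy", False): 0.064, ("snow", False): 0.08}

def P(pred):
    """Probability of any proposition: sum over the atomic events
    (w, c) for which the proposition holds."""
    return sum(p for event, p in joint.items() if pred(*event))

print(P(lambda w, c: c))             # marginal P(Cavity = true)
print(P(lambda w, c: w == "sunny"))  # marginal P(Weather = sunny)
```

Marginals, conjunctions, and disjunctions are all just different predicates over atomic events; the cost is that the table grows exponentially with the number of variables, which motivates Ch. 14.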

29 Conditional probability
Conditional or posterior probabilities, e.g., P(cavity | toothache) = 0.8, i.e., given that toothache is all I know
If we know more, e.g., cavity is also given, then we have P(cavity | toothache, cavity) = 1
New evidence may be irrelevant, allowing simplification, e.g., P(cavity | toothache, sunny) = P(cavity | toothache) = 0.8

30 Conditional probability
Definition of conditional probability: P(a | b) = P(a ∧ b) / P(b) if P(b) > 0
Product rule gives an alternative formulation: P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)

