
1 Artificial Intelligence Knowledge Representation Problem


5 Reasoning With Uncertainty
[Figure: the concept “strong fever” over temperatures 37.2°C, 38°C, 38.7°C, 39.3°C, 40.1°C, 41.4°C, 42°C, contrasted under conventional (crisp) set theory and fuzzy set theory.]

6 Uncertainty
Let action A_t = leave for the airport t minutes before the flight. Will A_t get me there on time?
Problems:
- partial observability (road state, other drivers' plans, etc.)
- noisy sensors (traffic reports)
- uncertainty in action outcomes (flat tire, etc.)
Hence: “A_25 will get me there on time if there's no accident on the bridge, and it doesn't rain, and my tires remain intact, etc.”

7 Making decisions under uncertainty
Suppose I believe the following:
P(A_25 gets me there on time | …) = 0.04
P(A_90 gets me there on time | …) = 0.70
P(A_120 gets me there on time | …) = 0.95
P(A_1440 gets me there on time | …) = 0.9999
Which action should I choose? It depends on my preferences for missing the flight vs. time spent waiting, etc. Utility theory is used to represent and infer preferences; decision theory = probability theory + utility theory.
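
As a sketch of "decision theory = probability theory + utility theory", the Python below picks the action maximizing expected utility. Only the probabilities come from the slide; the utility numbers (u_miss, u_wait) are hypothetical, invented purely to illustrate the trade-off between missing the flight and waiting.

```python
# P(A_t gets me there on time | ...), copied from the slide
p_on_time = {"A25": 0.04, "A90": 0.70, "A120": 0.95, "A1440": 0.9999}

# Hypothetical utilities: missing the flight is very bad; long waits
# at the airport also cost something (these numbers are invented).
u_miss = -1000
u_wait = {"A25": -5, "A90": -20, "A120": -30, "A1440": -300}

def expected_utility(action):
    """EU = P(on time) * u(wait) + P(late) * u(miss)."""
    p = p_on_time[action]
    return p * u_wait[action] + (1 - p) * u_miss

best = max(p_on_time, key=expected_utility)
print(best)  # A120: with these utilities it maximizes expected utility
```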

8 Uncertainty in Expert Systems
From correct premises and sound rules we get correct conclusions, but sometimes we have to:
- manage uncertain information
- encode uncertain pieces of knowledge
- model parallel firing of inference rules
- tackle ambiguity
There are a number of models of uncertain reasoning:
- Bayesian reasoning – the classical statistical approach
- Dempster-Shafer theory of evidence
- Stanford certainty algebra – MYCIN

9 Bayesian Reasoning
p(a ∧ b) = p(a) · p(b), given that a and b are independent
p(a ∧ b) = p(a|b) · p(b), given that a depends on b
Prior probability (unconditional): p(hypothesis)
Posterior probability (conditional): p(hypothesis | evidence)
Bayes' rule: p(h|e) = p(e|h) · p(h) / p(e)
Examples: PROSPECTOR, dice.
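
To make the slide's quantities concrete, here is a small numeric sketch of Bayes' rule, p(h|e) = p(e|h)·p(h)/p(e). All the numbers are hypothetical, chosen only so the arithmetic is easy to follow.

```python
p_h = 0.1          # prior: p(hypothesis)
p_e_given_h = 0.9  # likelihood: p(evidence | hypothesis)
p_e_given_not_h = 0.2

# Total probability: p(e) = p(e|h) p(h) + p(e|~h) p(~h)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: p(hypothesis | evidence)
p_h_given_e = p_e_given_h * p_h / p_e
print(round(p_h_given_e, 3))  # 0.333: the evidence tripled our belief
```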

10 What is Fuzzy Logic?
Fuzzy logic is a superset of conventional (Boolean) logic. It was created by Dr. Lotfi Zadeh in the 1960s to model the vagueness inherent in natural language. In fuzzy logic it is possible to have partial truth values.

11 Fuzzy Sets
Lotfi A. Zadeh, the founder of fuzzy logic.
L. A. Zadeh, “Fuzzy sets,” Information and Control, vol. 8, pp. 338–353, 1965.


13 Fuzzy-logic-driven picture generation: the Massive engine, used to animate the battle scenes in The Lord of the Rings.

14 “Fuzzy”
The word “fuzzy” can be defined as “imprecisely defined, confused, vague.” Humans represent and manage natural language terms (data) which are vague. Almost all answers to questions raised in everyday life are within some proximity of the absolute truth: degrees of truth.

15 Reasoning With Uncertainty

16 Fuzzy Set Theory Basics
There is a strong correlation between Boolean logic and classical set theory. Likewise, there is a very strong correlation between fuzzy logic and fuzzy set theory. In fuzzy set theory one deals with a set S, which determines a universe of discourse, and a fuzzy subset F, which assigns degrees of membership describing the relationship between the two sets.

17 Crisp Sets
Classical sets are called crisp sets: either an element belongs to a set or it does not. The membership function of a crisp set A is
μ_A(x) = 1 if x ∈ A, or μ_A(x) = 0 if x ∉ A.

18 Sets
{Live dinosaurs in the British Museum} = ∅ (the empty set).

19 Fuzzy Sets
Categorization of elements x_i into a set S is described through a membership function μ_S : X → [0, 1], which associates each element x_i with a degree of membership in S: 0 means no membership, 1 means full membership, and values in between indicate how strongly an element is affiliated with the set.

20 Fuzzy Sets
Formal definition: a fuzzy set A in X is expressed as a set of ordered pairs
A = {(x, μ_A(x)) | x ∈ X},
where X is the universe (or universe of discourse) and μ_A is the membership function (MF). A fuzzy set is totally characterized by its membership function.

21 Fuzzy Sets with Discrete Universes
Fuzzy set C = “desirable city to live in”:
X = {SF, Boston, LA}
C = {(SF, 0.9), (Boston, 0.8), (LA, 0.6)}
Fuzzy set A = “sensible number of children”:
X = {0, 1, 2, 3, 4, 5, 6}
A = {(0, 0.1), (1, 0.3), (2, 0.7), (3, 1.0), (4, 0.6), (5, 0.2), (6, 0.1)}
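
Since a fuzzy set over a discrete universe is just a mapping from elements to membership degrees, it can be sketched directly as a Python dict; this reuses the “sensible number of children” figures from the slide.

```python
# Fuzzy set A = "sensible number of children" from the slide,
# as a mapping element -> degree of membership.
A = {0: 0.1, 1: 0.3, 2: 0.7, 3: 1.0, 4: 0.6, 5: 0.2, 6: 0.1}

def membership(fuzzy_set, x):
    """Degree to which x belongs to the fuzzy set (0 if unlisted)."""
    return fuzzy_set.get(x, 0.0)

print(membership(A, 3))  # 1.0: three children is fully "sensible" here
print(membership(A, 6))  # 0.1: six children only marginally so
```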

22 Fuzzy Sets
Sets with fuzzy boundaries: A = set of tall people.
[Figure: over the heights axis, the crisp set A jumps from 0 to 1.0 at 5'10''; the fuzzy set A's membership function rises gradually, e.g. μ = 0.5 at 5'10'' and μ = 0.9 at 6'2''.]

23 Fuzzy Sets
Formal definition (as on slide 20): a fuzzy set A in X is a set of ordered pairs A = {(x, μ_A(x)) | x ∈ X}, where X is the universe of discourse and μ_A is the membership function (MF); a fuzzy set is totally characterized by its MF.

24 Possibility vs. Probability
Possibility refers to allowed values; probability expresses expected occurrences of events.
Example: rolling two dice. X is an integer in U = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.
Probabilities: p(X = 7) = 6/36 = 1/6, since 7 = 1+6 = 2+5 = 3+4 = 4+3 = 5+2 = 6+1.
Possibilities: Poss{X = 7} = 1, the same as for all other numbers in U.
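
A quick check of the dice arithmetic by enumerating the 36 outcomes (a sketch added here, not part of the original slides):

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs
sevens = [o for o in outcomes if sum(o) == 7]    # (1,6), (2,5), ..., (6,1)
print(len(sevens), "/", len(outcomes))           # 6 / 36, i.e. 1/6

# Possibility, by contrast, just records which sums are attainable:
possible = {sum(o) for o in outcomes}
print(possible == set(range(2, 13)))  # True: Poss{X = s} = 1 for every s in U
```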

25 Set-Theoretic Operations
Subset: A ⊆ B ⟺ μ_A(x) ≤ μ_B(x) for all x
Complement: μ_¬A(x) = 1 − μ_A(x)
Union: μ_{A∪B}(x) = max(μ_A(x), μ_B(x))
Intersection: μ_{A∩B}(x) = min(μ_A(x), μ_B(x))
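
Under these standard (Zadeh max/min) definitions the operations are a few lines of Python. C is the “desirable city” set from slide 21; D is a hypothetical second fuzzy set added for the demonstration.

```python
def f_complement(a):
    """Zadeh complement: mu_notA(x) = 1 - mu_A(x)."""
    return {x: 1.0 - m for x, m in a.items()}

def f_union(a, b):
    """Zadeh union: mu(x) = max of the two memberships."""
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) | set(b)}

def f_intersection(a, b):
    """Zadeh intersection: mu(x) = min of the two memberships."""
    return {x: min(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) | set(b)}

C = {"SF": 0.9, "Boston": 0.8, "LA": 0.6}  # "desirable city", slide 21
D = {"SF": 0.4, "Boston": 0.9, "LA": 0.7}  # hypothetical second fuzzy set

print(f_union(C, D))         # SF: 0.9, Boston: 0.9, LA: 0.7
print(f_intersection(C, D))  # SF: 0.4, Boston: 0.8, LA: 0.6
print(f_complement(C))       # SF: 0.1 (approx., float), Boston: 0.2, LA: 0.4
```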

26 Set Operations
[Figure: six panels (a)–(f) illustrating operations on two fuzzy sets A and B, such as union, intersection, and complements.]

27 Boolean OR

28 Fuzzy OR

29 Logics in General

Language            | Ontological Commitment           | Epistemological Commitment
--------------------+----------------------------------+---------------------------
Propositional logic | facts                            | true / false / unknown
First-order logic   | facts, objects, relations        | true / false / unknown
Temporal logic      | facts, objects, relations, times | true / false / unknown
Probability theory  | facts                            | degree of belief
Fuzzy logic         | facts + degree of truth          | known interval value

30 Rough Set Theory
Rough set theory was developed by Zdzislaw Pawlak in the early 1980s. Representative publications:
Z. Pawlak, “Rough Sets,” International Journal of Computer and Information Sciences, vol. 11, pp. 341–356 (1982).
Z. Pawlak, Rough Sets – Theoretical Aspects of Reasoning about Data, Kluwer Academic Publishers (1991).

31 Introduction (2)
The main goal of rough set analysis is the induction of approximations of concepts. Rough sets constitute a sound basis for KDD, offering mathematical tools to discover patterns hidden in data. Rough set methods can be used for feature selection, feature extraction, data reduction, decision rule generation, and pattern extraction (templates, association rules), etc. They identify partial or total dependencies in data, eliminate redundant data, and provide an approach to null values, missing data, dynamic data, and more.

32 Information Systems/Tables
An information system IS is a pair (U, A), where U is a non-empty finite set of objects and A is a non-empty finite set of attributes such that a : U → V_a for every a ∈ A; V_a is called the value set of a.

     Age    LEMS
x1   16-30  50
x2   16-30  0
x3   31-45  1-25
x4   31-45  1-25
x5   46-60  26-49
x6   16-30  26-49
x7   46-60  26-49

33 Decision Systems/Tables
A decision system is DS = (U, A ∪ {d}), where d ∉ A is the decision attribute (instead of one, we can consider more decision attributes). The elements of A are called condition attributes.

     Age    LEMS   Walk
x1   16-30  50     yes
x2   16-30  0      no
x3   31-45  1-25   no
x4   31-45  1-25   yes
x5   46-60  26-49  no
x6   16-30  26-49  yes
x7   46-60  26-49  no

34 Indiscernibility
Let IS = (U, A) be an information system. With any B ⊆ A there is an associated equivalence relation
IND_IS(B) = {(x, x') ∈ U × U | a(x) = a(x') for every a ∈ B},
called the B-indiscernibility relation. If (x, x') ∈ IND_IS(B), then objects x and x' are indiscernible from each other by the attributes from B. The equivalence classes of the B-indiscernibility relation are denoted [x]_B.

35 An Example of Indiscernibility
For the decision table on slide 33, the non-empty subsets of the condition attributes are {Age}, {LEMS}, and {Age, LEMS}.
IND({Age}) = {{x1, x2, x6}, {x3, x4}, {x5, x7}}
IND({LEMS}) = {{x1}, {x2}, {x3, x4}, {x5, x6, x7}}
IND({Age, LEMS}) = {{x1}, {x2}, {x3, x4}, {x5, x7}, {x6}}
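
As a minimal sketch, the following Python reproduces these indiscernibility classes from the slide-33 table (attribute values kept as the interval strings shown there):

```python
# Condition-attribute part of the slide-33 decision table.
table = {
    "x1": {"Age": "16-30", "LEMS": "50"},
    "x2": {"Age": "16-30", "LEMS": "0"},
    "x3": {"Age": "31-45", "LEMS": "1-25"},
    "x4": {"Age": "31-45", "LEMS": "1-25"},
    "x5": {"Age": "46-60", "LEMS": "26-49"},
    "x6": {"Age": "16-30", "LEMS": "26-49"},
    "x7": {"Age": "46-60", "LEMS": "26-49"},
}

def ind(attrs):
    """Partition the objects into B-indiscernibility classes:
    objects fall in the same class iff they agree on every attribute in attrs."""
    classes = {}
    for obj, row in table.items():
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(obj)
    return list(classes.values())

print(ind(["Age"]))          # [{x1,x2,x6}, {x3,x4}, {x5,x7}]
print(ind(["Age", "LEMS"]))  # [{x1}, {x2}, {x3,x4}, {x5,x7}, {x6}]
```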

36 Set Approximation
Let T = (U, A), B ⊆ A, and X ⊆ U. We can approximate X using only the information contained in B by constructing the B-lower and B-upper approximations of X, denoted B_*(X) and B^*(X) respectively, where
B_*(X) = {x | [x]_B ⊆ X} and B^*(X) = {x | [x]_B ∩ X ≠ ∅}.

37 Set Approximation (2)
The B-boundary region of X, BN_B(X) = B^*(X) − B_*(X), consists of those objects that we cannot decisively classify into X on the basis of B. The B-outside region of X, U − B^*(X), consists of those objects that can with certainty be classified as not belonging to X. A set is said to be rough if its boundary region is non-empty; otherwise the set is crisp.

38 An Example of Set Approximation
Let W = {x | Walk(x) = yes} = {x1, x4, x6} in the slide-33 table. Using B = {Age, LEMS}:
B_*(W) = {x1, x6}, B^*(W) = {x1, x3, x4, x6}, so BN_B(W) = {x3, x4}.
The decision class Walk is rough, since the boundary region is not empty.

39 An Example of Set Approximation (2)
[Figure: the approximation of W drawn over the classes — yes (certainly in W): {x1}, {x6}; yes/no (boundary): {x3, x4}; no (certainly outside): {x2}, {x5, x7}.]
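
Continuing the ind(...) sketch from slide 35, the lower and upper approximations reduce to two set comprehensions; this reproduces the slide-38/39 result.

```python
def lower(partition, X):
    """B-lower approximation: union of classes wholly contained in X."""
    return {x for c in partition if c <= X for x in c}

def upper(partition, X):
    """B-upper approximation: union of classes that intersect X."""
    return {x for c in partition if c & X for x in c}

W = {"x1", "x4", "x6"}        # Walk = yes in the slide-33 table
part = ind(["Age", "LEMS"])   # partition from the slide-35 sketch

print(lower(part, W))                   # {'x1', 'x6'}
print(upper(part, W))                   # {'x1', 'x3', 'x4', 'x6'}
print(upper(part, W) - lower(part, W))  # boundary region: {'x3', 'x4'}
```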

40 Lower & Upper Approximations
[Figure: the universe U partitioned into U/R, where R is a subset of attributes, with a set X overlaid on the grid of equivalence classes.]

41 Lower & Upper Approximations (2)
Lower approximation: R_*(X) = {x ∈ U | [x]_R ⊆ X}
Upper approximation: R^*(X) = {x ∈ U | [x]_R ∩ X ≠ ∅}

42 Lower & Upper Approximations (3)
The indiscernibility classes defined by R = {Headache, Temp.} are {u1}, {u2}, {u3}, {u4}, {u5, u7}, {u6, u8}.
X1 = {u | Flu(u) = yes} = {u2, u3, u6, u7}
R_*(X1) = {u2, u3}; R^*(X1) = {u2, u3, u5, u6, u7, u8}
X2 = {u | Flu(u) = no} = {u1, u4, u5, u8}
R_*(X2) = {u1, u4}; R^*(X2) = {u1, u4, u5, u6, u7, u8}

43 Lower & Upper Approximations (4)
(Sets and approximations as on slide 42, with R = {Headache, Temp.} and U/R = {{u1}, {u2}, {u3}, {u4}, {u5, u7}, {u6, u8}}.)
[Figure: the same example drawn as regions X1 and X2 over the classes; the mixed classes {u5, u7} and {u6, u8} straddle the X1/X2 boundary.]

44 Four Basic Classes of Rough Sets
X is roughly B-definable iff B_*(X) ≠ ∅ and B^*(X) ≠ U
X is internally B-undefinable iff B_*(X) = ∅ and B^*(X) ≠ U
X is externally B-undefinable iff B_*(X) ≠ ∅ and B^*(X) = U
X is totally B-undefinable iff B_*(X) = ∅ and B^*(X) = U

45 An Example of Reducts & Core
A reduct is a minimal subset of attributes that preserves the classification; the core is the intersection of all reducts.
Reduct1 = {Muscle-pain, Temp.}
Reduct2 = {Headache, Temp.}
CORE = {Headache, Temp.} ∩ {Muscle-pain, Temp.} = {Temp.}
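
The flu table behind this example is not reproduced in the transcript, so as a hedged stand-in the sketch below runs a brute-force positive-region reduct search on the Walk table from slide 33 instead, reusing ind(...) and table from the slide-35 sketch. For that table neither attribute alone preserves the classification, so {Age, LEMS} is the only reduct and therefore also the core.

```python
from itertools import combinations

decision = {"x1": "yes", "x2": "no", "x3": "no", "x4": "yes",
            "x5": "no", "x6": "yes", "x7": "no"}   # Walk column

def pos(attrs):
    """Positive region: objects whose indiscernibility class is
    decision-consistent (all members share one decision value)."""
    return {x for c in ind(attrs)
            if len({decision[o] for o in c}) == 1 for x in c}

attrs = ["Age", "LEMS"]
full = pos(attrs)
candidates = [set(s) for r in range(1, len(attrs) + 1)
              for s in combinations(attrs, r)
              if pos(list(s)) == full]
reducts = [s for s in candidates if not any(q < s for q in candidates)]

print(reducts)                     # [{'Age', 'LEMS'}]: neither attribute alone suffices
print(set.intersection(*reducts))  # the core = intersection of all reducts
```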

46 Soft Techniques for KDD
[Figure: diagram relating Probability, Logic, and Set theory.]

47 Soft Techniques for KDD (2)
[Figure: diagram placing stochastic processes, belief networks, connectionist networks, GDT, rough sets, and fuzzy sets against the reasoning modes deduction, induction, and abduction.]

48 A Hybrid Model
[Figure: a hybrid model combining GDT, GrC, RS&ILP, RS, and TM, spanning deduction, induction, and abduction.]

49 Exercises
(a) Solve the problem using depth-first search.
(b) Solve the problem using greedy best-first search.

50 Given the following rules:
1. IF (lecturing X) AND (marking-practicals X) THEN ADD (overworked X)
2. IF (month february) THEN ADD (lecturing ali)
3. IF (month february) THEN ADD (marking-practicals ali)
4. IF (overworked X) OR (slept-badly X) THEN ADD (bad-mood X)
5. IF (bad-mood X) THEN DELETE (happy X)
6. IF (lecturing X) THEN DELETE (researching X)
Assume that initially the working memory contains the following facts:
(month february), (happy ali), (researching ali)
Apply forward chaining. A minimal interpreter sketch follows below.
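
As a sketch of one plausible reading of the exercise (not the course's reference solution): the six rules are hand-transcribed as Python conditions, every applicable rule fires on each pass with no conflict resolution, and iteration stops at a fixed point.

```python
# Working memory: facts as tuples.
wm = {("month", "february"), ("happy", "ali"), ("researching", "ali")}
people = {"ali"}  # the only individual mentioned in the facts

def step(wm):
    """Fire every rule whose condition holds in wm; return the new memory."""
    new = set(wm)
    for x in people:
        if ("lecturing", x) in wm and ("marking-practicals", x) in wm:
            new.add(("overworked", x))          # rule 1
        if ("month", "february") in wm:
            new.add(("lecturing", x))           # rule 2 (rule names ali)
            new.add(("marking-practicals", x))  # rule 3 (rule names ali)
        if ("overworked", x) in wm or ("slept-badly", x) in wm:
            new.add(("bad-mood", x))            # rule 4
        if ("bad-mood", x) in wm:
            new.discard(("happy", x))           # rule 5
        if ("lecturing", x) in wm:
            new.discard(("researching", x))     # rule 6
    return new

while True:                  # forward chaining to a fixed point
    nxt = step(wm)
    if nxt == wm:
        break
    wm = nxt

for fact in sorted(wm):
    print(fact)
```

With these rules the final working memory is (month february), (lecturing ali), (marking-practicals ali), (overworked ali), (bad-mood ali): ali ends up neither happy nor researching.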

