Presentation on theme: "Machine Learning -Ramya Karri -Rushin Barot. Machine learning Rough Set Theory in Machine Learning? Knower’s knowledge – Closed World Assumption – Open."— Presentation transcript:

1 Machine Learning -Ramya Karri -Rushin Barot

2 Machine learning Rough Set Theory in Machine Learning? Knower's knowledge – Closed World Assumption – Open World Assumption How is the learner's knowledge affected by the knowledge of the knower?

3 Learning from Examples Two agents – Knower – Learner Closed World Assumption – Universe of discourse 'U' – The knower has complete knowledge about the universe – The universe is closed, i.e. nothing exists besides U

4 Quality of learning The learner's knowledge consists of attributes of objects. Can the learner's knowledge match the knower's knowledge? Is the learner able to learn the concepts demonstrated by the knower?

5 Quality of learning (contd.) Quality of learning can be defined as the degree of dependency between the knower's and the learner's sets of attributes, i.e. how exactly the knower's knowledge can be learned.

6 Example

U   a  b  c  d  e
1   1  2  0  1  1
2   1  2  0  1  1
3   2  0  0  1  0
4   0  0  1  2  1
5   2  1  0  2  1
6   0  0  1  2  2
7   2  0  0  1  0
8   0  1  2  2  1
9   2  1  0  2  2
10  2  0  0  1  0

7 Example (contd.) B = {a, b, c, d} is the set of learner's attributes and e is the knower's attribute. The knower's knowledge consists of the following concepts: – |e=0| = {3, 7, 10} = X0 – |e=1| = {1, 2, 4, 5, 8} = X1 – |e=2| = {6, 9} = X2

8 Example (contd.) The learner's knowledge consists of the following basic concepts (the B-elementary sets read off the table): – {1, 2} (a=1, b=2, c=0, d=1) – {3, 7, 10} (a=2, b=0, c=0, d=1) – {4, 6} (a=0, b=0, c=1, d=2) – {5, 9} (a=2, b=1, c=0, d=2) – {8} (a=0, b=1, c=2, d=2)

9 To learn the knower's knowledge means to express each of the knower's basic concepts by means of the learner's basic concepts. Compute the approximations of the knower's concepts in terms of the learner's concepts: – B-lower(X0) = {3, 7, 10}, B-upper(X0) = {3, 7, 10} – B-lower(X1) = {1, 2, 8}, B-upper(X1) = {1, 2, 4, 5, 6, 8, 9} – B-lower(X2) = ∅, B-upper(X2) = {4, 5, 6, 9}
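The approximations above can be computed mechanically. This is a minimal sketch (not from the slides) run on the slide-6 table; rows map each object U to its tuple of attribute values (a, b, c, d, e):

```python
table = {
    1: (1, 2, 0, 1, 1), 2: (1, 2, 0, 1, 1), 3: (2, 0, 0, 1, 0),
    4: (0, 0, 1, 2, 1), 5: (2, 1, 0, 2, 1), 6: (0, 0, 1, 2, 2),
    7: (2, 0, 0, 1, 0), 8: (0, 1, 2, 2, 1), 9: (2, 1, 0, 2, 2),
    10: (2, 0, 0, 1, 0),
}

def elementary_sets(rows, attrs):
    """Partition objects into classes that agree on the given attribute indices."""
    classes = {}
    for u, vals in rows.items():
        classes.setdefault(tuple(vals[i] for i in attrs), set()).add(u)
    return list(classes.values())

def approximations(rows, attrs, concept):
    """B-lower and B-upper approximation of a concept (a set of objects)."""
    lower, upper = set(), set()
    for cls in elementary_sets(rows, attrs):
        if cls <= concept:
            lower |= cls      # wholly inside: certainly in the concept
        if cls & concept:
            upper |= cls      # overlaps: possibly in the concept
    return lower, upper

B = (0, 1, 2, 3)              # column indices of the learner's attributes a, b, c, d
X1 = {1, 2, 4, 5, 8}          # the knower's concept |e=1|
lo, up = approximations(table, B, X1)
print(sorted(lo), sorted(up))   # [1, 2, 8] [1, 2, 4, 5, 6, 8, 9]
```

The same two calls with X0 = {3, 7, 10} and X2 = {6, 9} reproduce the other approximations: X0 comes back with equal lower and upper sets (exact), while X2's lower approximation is empty.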

10 Inferences Concept X0 is exact and can be learned fully. Concept X1 is roughly B-definable: only instances 1, 2 and 8 can certainly be learned by the learner; instances 3, 7 and 10 certainly do not belong to the concept; for instances 4, 5, 6 and 9 the learner cannot decide whether they belong to X1 or not. Concept X2 is internally B-undefinable, since there are no certain positive instances of the concept.

11 Derive the quality of learning POS_B({e}) = the instances properly classified by the learner = {1, 2, 3, 7, 8, 10}. Therefore the quality of learning is γ_B({e}) = |POS_B({e})| / |U| = 6/10 = 0.6.
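The positive region and the quality of learning can be sketched directly from the table. In this self-contained example (assumed layout: each row is (a, b, c, d, e), with the first four columns the learner's attributes B), an object is in POS_B({e}) exactly when its B-elementary set is decision-consistent:

```python
table = {
    1: (1, 2, 0, 1, 1), 2: (1, 2, 0, 1, 1), 3: (2, 0, 0, 1, 0),
    4: (0, 0, 1, 2, 1), 5: (2, 1, 0, 2, 1), 6: (0, 0, 1, 2, 2),
    7: (2, 0, 0, 1, 0), 8: (0, 1, 2, 2, 1), 9: (2, 1, 0, 2, 2),
    10: (2, 0, 0, 1, 0),
}

def positive_region(rows):
    """Objects whose B-elementary set maps to a single decision value."""
    classes = {}
    for u, (*b, e) in rows.items():
        classes.setdefault(tuple(b), []).append((u, e))
    pos = set()
    for members in classes.values():
        if len({e for _, e in members}) == 1:   # class is decision-consistent
            pos |= {u for u, _ in members}
    return pos

pos = positive_region(table)
gamma = len(pos) / len(table)
print(sorted(pos), gamma)   # [1, 2, 3, 7, 8, 10] 0.6
```

Objects 4, 6 and 5, 9 fall outside the positive region because each pair shares learner-attribute values but carries different decisions.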

12 Decision algorithm Another decision algorithm

13 Are all the instances necessary to learn the knower's knowledge? Answer: some instances are crucial for concept learning, but some are not. After removing instance 10, the table is as follows.

14

U  a  b  c  d  e
1  1  2  0  1  1
2  1  2  0  1  1
3  2  0  0  1  0
4  0  0  1  2  1
5  2  1  0  2  1
6  0  0  1  2  2
7  2  0  0  1  0
8  0  1  2  2  1
9  2  1  0  2  2
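A hypothetical check (not shown on the slides) of why instance 10 is superfluous: its row duplicates the condition-to-decision pattern of rows 3 and 7, so removing it leaves the set of distinct patterns in the table unchanged:

```python
table = {
    1: (1, 2, 0, 1, 1), 2: (1, 2, 0, 1, 1), 3: (2, 0, 0, 1, 0),
    4: (0, 0, 1, 2, 1), 5: (2, 1, 0, 2, 1), 6: (0, 0, 1, 2, 2),
    7: (2, 0, 0, 1, 0), 8: (0, 1, 2, 2, 1), 9: (2, 1, 0, 2, 2),
    10: (2, 0, 0, 1, 0),
}

def patterns(rows):
    """Distinct (condition values, decision) pairs occurring in the table."""
    return {(vals[:4], vals[4]) for vals in rows.values()}

reduced = {u: v for u, v in table.items() if u != 10}
print(patterns(table) == patterns(reduced))   # True
```

An instance whose pattern occurs nowhere else (such as row 8) would fail this test, which is what makes it crucial for the learning process.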

15 Let us remove instances 4 and 8

U   a  b  c  d  e
1   1  2  0  1  1
2   1  2  0  1  1
3   2  0  0  1  0
5   2  1  0  2  1
6   0  0  1  2  2
7   2  0  0  1  0
9   2  1  0  2  2
10  2  0  0  1  0

16 Case of an Imperfect Teacher How would a lack of knowledge on the knower's part affect the learner's ability to learn? Would the learner be able to discover the knower's deficiency?

17

U  a  b  C
1  0  2  +
2  0  1  +
3  1  0  +
4  0  1  0
5  1  0  0
6  1  1  -
7  2  1  -
8  0  1  -
9  1  0  -

18 B = {a, b} is the set of learner's attributes and C is the knower's attribute. There are two concepts, X+ and X−, denoted by the values + and −; the value 0 marks instances the knower could not classify (the set X0). Compute whether the sets X+, X− and X0 are definable in terms of attributes a and b.

19 X+ = {1, 2, 3}, X− = {6, 7, 8, 9}, X0 = {4, 5}. From the table: B-lower(X+) = {1}, B-upper(X+) = {1, 2, 3, 4, 5, 8, 9}; B-lower(X−) = {6, 7}, B-upper(X−) = {2, 3, 4, 5, 6, 7, 8, 9}; B-lower(X0) = ∅, B-upper(X0) = {2, 3, 4, 5, 8, 9}.

20 For every substitution of the value 0 in attribute C by + or −, the boundary region remains unchanged, so the knower's lack of knowledge is inessential: the fact that he failed to classify examples 4 and 5 does not disturb the learning process.

21

U  a  b  C
1  0  2  +
2  0  1  +
3  1  0  +
4  0  1  +
5  1  0  0
6  1  1  0
7  2  1  -
8  0  1  -
9  1  0  -

22 X+ = {1, 2, 3, 4}, X− = {7, 8, 9}, X0 = {5, 6}

23 The learner can discover that the knower is unable to classify object 6: its B-elementary set {6} (a = 1, b = 1) lies entirely inside X0.
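This discovery can be checked mechanically. A sketch on the slide-21 table (rows assumed to map object U to (a, b, C), with '0' marking the knower's unclassified objects): the lower approximation of X0 contains exactly the objects the learner can certainly identify as the knower's gaps:

```python
table = {
    1: (0, 2, '+'), 2: (0, 1, '+'), 3: (1, 0, '+'),
    4: (0, 1, '+'), 5: (1, 0, '0'), 6: (1, 1, '0'),
    7: (2, 1, '-'), 8: (0, 1, '-'), 9: (1, 0, '-'),
}
classes = {}
for u, (a, b, c) in table.items():
    classes.setdefault((a, b), set()).add(u)

x0 = {u for u, (_, _, c) in table.items() if c == '0'}   # unclassified objects
lower_x0 = set()
for cls in classes.values():
    if cls <= x0:                 # elementary set wholly unclassified
        lower_x0 |= cls
print(sorted(lower_x0))           # [6]
```

Object 5 is not discovered this way: it shares its elementary set {3, 5, 9} with classified objects, so the learner cannot single it out.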

24 Inductive Learning Assumption – U is not constant and changes during the learning process. – Every new instance is classified by the knower, and the learner is supposed to classify it too, on the basis of his current knowledge.

25 Open World Assumption (OWA) The whole concept is unknown to the knower; only certain instances of the concept are known.


29 Possibilities: – The new instance confirms actual knowledge. – The new instance contradicts actual knowledge. – The new instance is a completely new case.

30 New instance confirms actual knowledge

31 New instance contradicts actual knowledge The quality of learning decreases.
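A toy sketch (hypothetical instances, not from the slides) of how the quality of learning moves as new instances arrive under the inductive-learning assumption, starting from the slide-6 table: a confirming instance repeats an existing consistent pattern, a contradicting one carries the same condition values with a different decision.

```python
def gamma(rows):
    """Quality of learning: fraction of objects in decision-consistent classes."""
    classes = {}
    for u, (*b, e) in rows.items():
        classes.setdefault(tuple(b), []).append((u, e))
    pos = sum(len(m) for m in classes.values() if len({e for _, e in m}) == 1)
    return pos / len(rows)

base = {
    1: (1, 2, 0, 1, 1), 2: (1, 2, 0, 1, 1), 3: (2, 0, 0, 1, 0),
    4: (0, 0, 1, 2, 1), 5: (2, 1, 0, 2, 1), 6: (0, 0, 1, 2, 2),
    7: (2, 0, 0, 1, 0), 8: (0, 1, 2, 2, 1), 9: (2, 1, 0, 2, 2),
    10: (2, 0, 0, 1, 0),
}
confirm = {**base, 11: (1, 2, 0, 1, 1)}     # matches the pattern of rows 1 and 2
contradict = {**base, 11: (1, 2, 0, 1, 0)}  # same conditions, different decision
# gamma rises on confirmation and falls on contradiction relative to 0.6
print(gamma(base), gamma(confirm), gamma(contradict))
```

The contradicting instance drags rows 1 and 2 out of the positive region along with itself, which is why the drop is larger than one object's worth.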

32 Conclusion If the decision table is consistent, it provides the highest quality of learning. If the decision table is inconsistent, a new confirming instance increases the learner's knowledge, while a new contradicting instance decreases the quality of learning.

