Slide 1: A Probabilistic Exemplar Based Model for Case-Based Reasoning
Andrés Rodríguez, Sunil Vadera, Enrique Sucar
Ins. Inv. Eléctricas, University of Salford, ITESM Morelos
MICAI 2000, April 2000

Outline:
1. Introduction: Background; Key questions
2. Probabilistic Exemplar Based Model: Representation; Classification process; Learning process
3. Empirical Evaluation
4. Related Work

Slide 2: Introduction
Case-Based Reasoning paradigm: new cases can be solved by adapting solutions that were used to solve similar cases in the past.
[Diagram: the Case-Based Reasoning cycle around a case base: Retrieval, Adaptation, Evaluation, Retention.]
Open questions at each stage:
- Representation: how are cases represented?
- Retrieval: how do we assess similarity?
- Adaptation: how do we adapt the old solution?
- Evaluation: is the proposal a likely solution?
- Retention: which cases do we retain?

Slide 3: Background
[Diagram: categories A, B, and C, first shown as sets of individual cases, then as categories summarised by exemplars around a prototypical case.]
The exemplar-based view is interesting when:
- categories are not defined by necessary/sufficient conditions
- the data is unstructured
- categories are not disjoint
- not all the data exists in advance
- uncertainty is involved

Slide 4: Key Questions
[Diagram: categories A and B as sets of exemplars, with a new exemplar to be classified.]
The objective:
- What is a good representation for an exemplar-based model (EBM)?
- What notion of similarity can be adopted?
- How can a new case be classified?
- How can it be learned incrementally?

Slide 5: Representation
[Diagram: a Bayesian network with category nodes C1, ..., Cw at the top, exemplar nodes e1, ..., ei, ..., ek, ..., eq below them, and feature nodes f1, f2, ..., fj, ..., fm, ..., fn at the bottom.]
- Each exemplar is linked to its category with a parameter P(e | C), e.g. P(e1 | C1), P(ei | C1), P(eq | Cw).
- Each feature is linked to the exemplars that exhibit it, with a conditional distribution P(fi | parents(fi)), e.g. P(f1 | parents(f1)), P(fn | parents(fn)).
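A minimal sketch of how this category/exemplar/feature structure might be held in code; the class name ExemplarModel and its fields are illustrative, not from the paper.

```python
from collections import defaultdict

class ExemplarModel:
    """Illustrative container for the two-layer structure:
    category nodes -> exemplar nodes -> feature nodes."""

    def __init__(self):
        # P(e | C): link from a category to one of its exemplars
        self.p_exemplar_given_category = {}                 # (category, exemplar) -> prob
        # P(f | e): per-link strength, later combined by noisy-OR
        self.p_feature_given_exemplar = defaultdict(dict)   # exemplar -> {feature: prob}
        self.exemplars_of = defaultdict(list)               # category -> [exemplar, ...]

    def add_exemplar(self, category, exemplar, p_e_given_c, feature_probs):
        """Register an exemplar under a category with its link parameters."""
        self.exemplars_of[category].append(exemplar)
        self.p_exemplar_given_category[(category, exemplar)] = p_e_given_c
        self.p_feature_given_exemplar[exemplar].update(feature_probs)
```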

Slide 6: Classification Process
[Diagram: a new case with observed features fa, ..., fh is matched against the network of categories C1, ..., Cm, their exemplars, and their features; the selected category's sub-network (C1 with exemplars e1, ..., ek and features f1, ..., fm) is shown separately.]
Stage 1: Rank the categories. Rank(ei) is computed from the posterior P(ei | fa, ..., fh) and the link strengths P(f | ei) over the features f ∈ ei.
Stage 2: Determination of an exemplar within the best-ranked category.
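A sketch of the two-stage process, assuming categories are ranked through their exemplars' support for the observed features and the exemplar is then chosen within the winning category; the scoring function below is a simplified stand-in for the slide's Rank(ei), not the paper's formula.

```python
def classify(model, observed_features):
    """Two-stage classification sketch over an ExemplarModel (see Slide 5 sketch)."""

    def exemplar_score(exemplar):
        # Average strength of the observed features under this exemplar.
        probs = model.p_feature_given_exemplar[exemplar]
        return sum(probs.get(f, 0.0) for f in observed_features) / max(len(observed_features), 1)

    # Stage 1: rank the categories via their best-supported exemplar.
    def category_score(category):
        return max(model.p_exemplar_given_category[(category, e)] * exemplar_score(e)
                   for e in model.exemplars_of[category])

    best_category = max(model.exemplars_of, key=category_score)

    # Stage 2: determine the exemplar within the winning category.
    best_exemplar = max(model.exemplars_of[best_category], key=exemplar_score)
    return best_category, best_exemplar
```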

Slide 7: Learning Process
[Diagram: a category C with exemplars e1, e2, e3 before and after processing a new training case.]
A new training case is first passed through the classification process; the model then decides whether to retain it ("Retain?"), and if so adds it as a new exemplar.
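A sketch of this incremental loop, building on the ExemplarModel and classify sketches above; the retention rule (add a new exemplar when the case is misclassified or poorly matched by its best exemplar, using the 0.75 threshold mentioned in the evaluation) is an assumption about how "Retain?" is decided.

```python
def learn(model, case_id, case_features, category, threshold=0.75):
    """Incremental learning sketch: classify the new case, then decide retention.
    The retention criterion below is assumed, not the paper's exact rule."""
    if not model.exemplars_of:
        # Empty model: the first case becomes the first exemplar.
        model.add_exemplar(category, case_id, 1.0, {f: 1.0 for f in case_features})
        return

    predicted_category, best_exemplar = classify(model, case_features)

    probs = model.p_feature_given_exemplar[best_exemplar]
    match = sum(probs.get(f, 0.0) for f in case_features) / max(len(case_features), 1)

    if predicted_category != category or match < threshold:
        # Poorly represented by existing exemplars: retain as a new exemplar.
        model.add_exemplar(
            category,
            exemplar=case_id,
            p_e_given_c=1.0 / (len(model.exemplars_of[category]) + 1),  # placeholder prior
            feature_probs={f: 1.0 for f in case_features},
        )
```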

Slide 8: What Makes a Good Exemplar?
A prototypical member [Rosch and Mervis (1974)] has:
1. High family resemblance within its region.
2. Low family resemblance with other regions.
The summary representation of an exemplar is a Bayesian net consisting of the features of all the cases represented by that exemplar.
[Diagram: a new case ei relative to the prototypical case of category C, annotated with focality, peripherality, and prototypicality.]
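A sketch of a family-resemblance score in this spirit, treating each case as a set of features; the Jaccard overlap and the subtraction used for prototypicality are illustrative choices, not the paper's measure.

```python
def family_resemblance(case_features, other_cases):
    """Mean feature overlap (Jaccard) between one case and a collection of cases."""
    if not other_cases:
        return 0.0
    overlaps = [len(case_features & other) / max(len(case_features | other), 1)
                for other in other_cases]
    return sum(overlaps) / len(overlaps)

def prototypicality(case_features, same_category_cases, other_category_cases):
    """High resemblance inside the region, low resemblance with other regions."""
    return (family_resemblance(case_features, same_category_cases)
            - family_resemblance(case_features, other_category_cases))

# Example: a case sharing most features with its own category scores higher.
own = [{"wings", "beak", "feathers"}, {"wings", "feathers", "small"}]
other = [{"fins", "scales"}, {"fur", "four_legs"}]
print(prototypicality({"wings", "feathers", "beak"}, own, other))  # 0.75
```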

Slide 9: Estimating the Parameters
[Diagram: the Bayesian network of categories C1, ..., Cw, exemplars e1, ..., eq, and features f1, ..., fn from Slide 5.]
- Need to estimate P(fi | parents(fi)); a full table requires 2^(n+1) values for n parents, and the intersection of parents may not have many examples.
- A noisy-OR model is used instead, which assumes:
  - Exception independence: the absence of fi given e1 is independent of the absence of the feature given e2.
  - Accountability: if a case is not represented by any of the exemplars, then it does not have any of the exemplars' features.
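The noisy-OR needs only one parameter per exemplar-feature link instead of a full conditional table; a minimal sketch of the standard noisy-OR combination (helper name and data layout are illustrative):

```python
def noisy_or(link_probs, present_parents):
    """Noisy-OR: P(f = true | parents) = 1 - prod over present parents e of (1 - p_e),
    where p_e = P(f | only exemplar e present)."""
    prob_absent = 1.0
    for parent in present_parents:
        prob_absent *= 1.0 - link_probs.get(parent, 0.0)
    return 1.0 - prob_absent

# Example: a feature with two exemplar parents.
# P(f | e1 alone) = 0.8, P(f | e2 alone) = 0.6
print(noisy_or({"e1": 0.8, "e2": 0.6}, {"e1", "e2"}))  # 1 - 0.2*0.4 = 0.92
```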

Slide 10: Virtual Exemplar
[Diagram: category A with exemplars e6, e8, e9, features f1, ..., fn, and a virtual exemplar Ve that is an additional parent of every feature; parameters shown include P(e6 | A), P(e9 | A), P(f1 | e6, Ve), P(fn | e9, Ve).]
P(f | Ve) is estimated by a decay formula whose inputs are n, the number of cases in the category, and two parameters that determine the rate of decay.
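As an illustration of the decay idea only: a simple exponential form in which P(f | Ve) shrinks as the category accumulates cases. The functional form, parameter names, and the values 0.6 and 0.1 (borrowed from the evaluation settings on the next slide) are assumptions, not the paper's formula.

```python
import math

def p_feature_given_virtual_exemplar(n_cases, base=0.6, rate=0.1):
    """Assumed exponential decay: starts near `base` and shrinks toward 0
    as the category accumulates cases (illustrative form only)."""
    return base * math.exp(-rate * n_cases)

# The more cases a category has, the less probability mass is left
# for features explained only by the virtual exemplar.
for n in (1, 5, 20):
    print(n, round(p_feature_given_virtual_exemplar(n), 3))
```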

Slide 11: Empirical Evaluation
- Tested on the Votes, Zoo, and Audiology data sets.
- Decay parameters set to 0.6 and 0.1; retention threshold = 0.75.
- 70/30 training/testing split.
- Good accuracy for Votes (89%) and Zoo (92%); poor for Audiology (50%).
[Chart: compression and accuracy (percent) per Audiology category.]

Slide 12: Related Work
[Diagram: a map of related systems grouped by approach.]
- Case-based models: PROTOS, CASEY, REMIND, IBL, CBR-Express, PEBM.
- Inductive models with supervised learning: OC1, C4.5.
- Inductive models with unsupervised learning: COBWEB, AutoClass.
- Bayesian models: Naive Bayes, Heckerman's, Tirri's.
Protos:
- Uses remindings, censors, and difference links.
- Learns from failure by user explanation.
- Uses many heuristics.
Tirri and Myllymäki's model:
- Uses all cases, not exemplars.
- Assumes cases are mutually exclusive.
- Assumes features are independent given the case.

Slide 13: Conclusion
Developed a model that:
- uses probabilistic exemplars
- has foundations in Bayesian nets
- is incremental
- shows promising results
Future work:
- develop quicker propagation
- test on more data sets
- evaluate it relative to other approaches
- investigate the decay parameters and the retention threshold
- handle multilevel features
- handle dependent features
