
Building Global Models from Local Patterns A.J. Knobbe.


1 Building Global Models from Local Patterns A.J. Knobbe

2 Feature continuum: attributes → (constructed) features → patterns → classifiers → target concept

3 Two-phased process. Break discovery up into two phases, transforming a complex problem into a simpler one. Pattern Discovery phase: frequent patterns, correlated patterns, interesting subgroups, decision boundaries, … Pattern Combination phase: redundancy reduction, dependency modeling, global model building, …, yielding Pattern Teams, pattern networks, global predictive models, …

4 Task: Subgroup Discovery. Subgroup Discovery: find subgroups that show a substantially different distribution of the target concept. Top-down search for patterns; inductive constraints (sometimes monotonic); evaluation measures: novelty, χ², information gain. Also known as rule discovery or correlated pattern mining.
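
A minimal, library-free Python sketch of the top-down search described above: it exhaustively enumerates conjunctions of attribute=value conditions and scores each candidate subgroup by novelty. The toy dataset, the attribute names, and the absence of beam search or monotone pruning are simplifying assumptions for illustration only.

```python
from itertools import combinations

def novelty(subgroup_mask, target_mask):
    """Weighted relative accuracy: p(ST) - p(S) * p(T)."""
    n = len(subgroup_mask)
    p_st = sum(s and t for s, t in zip(subgroup_mask, target_mask)) / n
    return p_st - (sum(subgroup_mask) / n) * (sum(target_mask) / n)

def subgroup_discovery(rows, attributes, target, max_depth=2, top_k=5):
    """Exhaustive top-down search over conjunctions of attribute=value
    conditions, scored by novelty (no beam search or pruning)."""
    target_mask = [row[target] for row in rows]
    conditions = [(a, v) for a in attributes
                  for v in sorted({row[a] for row in rows})]
    results = []
    for depth in range(1, max_depth + 1):
        for conj in combinations(conditions, depth):
            if len({a for a, _ in conj}) < depth:   # one condition per attribute
                continue
            mask = [all(row[a] == v for a, v in conj) for row in rows]
            if sum(mask) == 0:                      # minimum-support constraint
                continue
            results.append((novelty(mask, target_mask), conj))
    return sorted(results, reverse=True)[:top_k]

# invented toy data: does a customer respond to a mailing?
rows = [
    {"age": "young", "income": "low",  "responded": True},
    {"age": "young", "income": "high", "responded": True},
    {"age": "old",   "income": "low",  "responded": False},
    {"age": "old",   "income": "high", "responded": True},
    {"age": "old",   "income": "low",  "responded": False},
    {"age": "young", "income": "low",  "responded": True},
]
for score, conj in subgroup_discovery(rows, ["age", "income"], "responded"):
    print(round(score, 3), conj)
```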

5 Novelty. Also known as weighted relative accuracy: a balance between coverage and unexpectedness. nov(S,T) = p(ST) − p(S)·p(T), ranging between −.25 and .25, where 0 means uninteresting. Example contingency table (subgroup S in rows, target T in columns):

              T      F
    S = T   .42    .13    .55
    S = F   .12    .33    .45
            .54    .46    1.0

nov(S → T) = p(ST) − p(S)·p(T) = .42 − .55 × .54 = .42 − .297 = .123
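
A quick check of the slide's arithmetic, using the marginals from the table above:

```python
# p(ST) = .42, p(S) = .55 (subgroup marginal), p(T) = .54 (target marginal)
p_st, p_s, p_t = 0.42, 0.55, 0.54
print(round(p_st - p_s * p_t, 3))   # 0.123, well inside the -.25 .. .25 range
```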

6 Demo: Subgroup Discovery. Redundancy exists in the set of local patterns.

7 Demo: Subgroup Discovery

8 Pattern Combination phase: feature selection / redundancy reduction (Pattern Teams); dependency modeling (Bayesian networks, association rules); global modeling (classifiers, regression models).

9 Pattern Teams & Pattern Networks

10 Pattern Teams. Pattern Discovery typically produces very many patterns with high levels of redundancy → report a small, informative subset with specific properties. Promote dissimilarity of the reported patterns and the additional value of individual patterns. Consider the extent of patterns: treat patterns as binary features/items.

11 Intuitions: no two patterns should cover the same set of examples; no pattern should cover the complement of another pattern; no pattern should cover a logical combination of two or more other patterns; patterns should be mutually exclusive; the pattern set should lead to the best-performing classifier; patterns should lie on the convex hull in ROC-space.

12 Quality measures for pattern sets. Judge pattern sets on the basis of a quality function. Unsupervised: Joint Entropy (as in maximally informative k-itemsets, "miki"), Exclusive Coverage. Supervised: wrapper accuracy, Area Under Curve in ROC-space, Bayesian Dirichlet equivalent uniform.
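
To make the unsupervised side concrete, here is a small Python sketch (the pattern names and extents are invented): it computes the joint entropy of a set of binary pattern extents and greedily selects a pattern team by forward selection on that measure. This illustrates the joint-entropy criterion only; the other measures listed above would slot in as alternative quality functions.

```python
from collections import Counter
from math import log2

def joint_entropy(patterns, extents):
    """Joint entropy (bits) of a set of patterns, each treated as a binary
    feature over the same examples."""
    n = len(next(iter(extents.values())))
    profiles = Counter(tuple(extents[p][i] for p in patterns) for i in range(n))
    return -sum((c / n) * log2(c / n) for c in profiles.values())

def greedy_pattern_team(extents, k):
    """Forward selection: repeatedly add the pattern that most increases
    the joint entropy of the team."""
    team = []
    while len(team) < k:
        best = max((p for p in extents if p not in team),
                   key=lambda p: joint_entropy(team + [p], extents))
        team.append(best)
    return team

# toy extents: which of 8 examples each discovered pattern covers (invented)
extents = {
    "p1": [1, 1, 1, 1, 0, 0, 0, 0],
    "p2": [1, 1, 0, 0, 1, 1, 0, 0],
    "p3": [1, 1, 1, 1, 0, 0, 0, 1],   # nearly redundant with p1
    "p4": [1, 0, 1, 0, 1, 0, 1, 0],
}
print(greedy_pattern_team(extents, 2))   # ['p1', 'p2']: a team carrying 2 bits
```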

13 Pattern Teams: 82 subgroups discovered, 4 subgroups in the pattern team.

14 Pattern Network. Again, treat patterns as binary features. Bayesian networks capture the conditional independence of patterns, explain relationships between patterns, and explain the role of patterns in the Pattern Team.
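
The slides use Bayesian networks for the pattern network; as a rough, library-free stand-in for the underlying question (which patterns are statistically dependent?), the sketch below computes pairwise mutual information between binary pattern extents. This is an assumption-driven simplification, not the network learner used in the demo.

```python
from math import log2

def mutual_information(x, y):
    """Mutual information (bits) between two binary pattern extents."""
    n = len(x)
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = sum(1 for xi, yi in zip(x, y) if xi == a and yi == b) / n
            p_a = sum(1 for xi in x if xi == a) / n
            p_b = sum(1 for yi in y if yi == b) / n
            if p_ab > 0:
                mi += p_ab * log2(p_ab / (p_a * p_b))
    return mi

# invented extents: p1 and p3 overlap strongly, p1 and p2 are independent
extents = {
    "p1": [1, 1, 1, 1, 0, 0, 0, 0],
    "p2": [1, 1, 0, 0, 1, 1, 0, 0],
    "p3": [1, 1, 1, 1, 0, 0, 0, 1],
}
names = list(extents)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        a, b = names[i], names[j]
        print(a, b, round(mutual_information(extents[a], extents[b]), 3))
```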

15 Demo: Pattern Team & Network. Redundancy removed to find truly diverse patterns, in this case by maximizing joint entropy.

16 Demo: Pattern Team & Network. The pattern team and related patterns can be presented in a Bayesian network. [Chart annotations: peaks around 16k, 39k, and 89k.]

17 Properties of the Subgroup Discovery phase in Pattern Combination. What knowledge about Subgroup Discovery parameters can be exploited in the Combination phase? Interestingness: are interesting subgroups diverse? Are interesting subgroups correlated? Information content; support of patterns.

18 Joint entropy of 2 interesting subgroups: if the subgroups are very novel, they carry 1 bit of information; if they are only relatively novel, up to 2 bits of information.

19 Correlation of interesting subgroups: subgroups that are novel may still be independent; subgroups that are very novel correlate.

20 Building Classifiers from Local Patterns

21 Combination strategies. How to interpret a pattern set? Conjunctive (intersection of patterns), disjunctive (union of patterns), majority vote (equal-weight linear separator), …, contingencies/classifiers.
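
A minimal sketch of the three listed interpretations, treating each example as the vector of binary pattern values that hold for it (data invented):

```python
def conjunctive(extent_rows):
    """Intersection: positive only if all patterns hold."""
    return [all(row) for row in extent_rows]

def disjunctive(extent_rows):
    """Union: positive if any pattern holds."""
    return [any(row) for row in extent_rows]

def majority_vote(extent_rows):
    """Equal-weight linear separator: positive if more than half the patterns hold."""
    return [sum(row) > len(row) / 2 for row in extent_rows]

# each row is one example described by three binary patterns
rows = [(1, 1, 1), (1, 0, 1), (0, 0, 1), (0, 0, 0)]
print(conjunctive(rows))    # [True, False, False, False]
print(disjunctive(rows))    # [True, True, True, False]
print(majority_vote(rows))  # [True, True, False, False]
```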

22 Decision Table Majority (DTM). Treat every truth assignment as a contingency; classification is based on the conditional probability of the class; use the majority class for empty contingencies. Only works with a Pattern Team (otherwise overfitting).
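
A minimal sketch of a Decision Table Majority classifier over binary pattern features, following the description above; the class and variable names are illustrative, not from the talk.

```python
from collections import Counter, defaultdict

class DecisionTableMajority:
    """One contingency per truth assignment of the patterns; overall
    majority class as fallback for assignments never seen in training."""
    def fit(self, X, y):
        self.default = Counter(y).most_common(1)[0][0]   # overall majority class
        votes = defaultdict(Counter)
        for row, label in zip(X, y):
            votes[tuple(row)][label] += 1
        # per-contingency majority class (argmax of the conditional probability)
        self.table = {key: c.most_common(1)[0][0] for key, c in votes.items()}
        return self

    def predict(self, X):
        return [self.table.get(tuple(row), self.default) for row in X]

X = [(1, 0), (1, 0), (1, 1), (0, 1), (0, 0)]   # invented binary pattern data
y = [1, 1, 0, 0, 0]
clf = DecisionTableMajority().fit(X, y)
print(clf.predict([(1, 0), (0, 1), (1, 1)]))   # [1, 0, 0]
```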

23 Support Vector Machine (SVM). SVM with a linear kernel on the binary data; all dimensions have the same scale; works with large pattern sets. Subgroup discovery has removed XOR-like dependencies, and interesting subgroups correlate.
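
A sketch of the linear-kernel SVM over binary pattern features; this assumes scikit-learn (not named in the talk) and invented toy data.

```python
# Pattern extents form a binary matrix, so all dimensions share the same scale.
from sklearn.svm import SVC

X = [[1, 0, 1],
     [1, 1, 1],
     [0, 0, 1],
     [0, 1, 0],
     [0, 0, 0]]          # rows = examples, columns = binary patterns
y = [1, 1, 0, 0, 0]      # target concept

clf = SVC(kernel="linear").fit(X, y)
print(clf.coef_)                 # one weight per pattern
print(clf.predict([[1, 0, 0]]))  # class for a new example
```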

24 XOR-like dependencies

25 [Figure: examples plotted in the (p1, p2) plane.]

26 [Figure: the (p1, p2) plane with the four truth assignments (0,0), (1,0), (0,1), (1,1), illustrating an XOR-like dependency.]
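
A small worked example of such an XOR-like dependency: if the target equals p1 XOR p2 and the four truth assignments are equally frequent (an assumption for the example), each pattern on its own has zero novelty, while their combination is maximally novel.

```python
def novelty(s, t):
    """Weighted relative accuracy: p(ST) - p(S) * p(T)."""
    n = len(s)
    p_st = sum(si and ti for si, ti in zip(s, t)) / n
    return p_st - (sum(s) / n) * (sum(t) / n)

# four equally frequent truth assignments; target = p1 XOR p2
p1 = [0, 0, 1, 1]
p2 = [0, 1, 0, 1]
t  = [0, 1, 1, 0]

print(novelty(p1, t), novelty(p2, t))   # 0.0 0.0  -> neither pattern is novel alone
xor = [a ^ b for a, b in zip(p1, p2)]
print(novelty(xor, t))                  # 0.25     -> their combination is maximally novel
```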

27 Division of labour between the 2 phases. Subgroup Discovery phase: feature selection; decision boundary finding/thresholding; multivariate dependencies (XOR). Pattern Combination phase: pattern selection; combination (XOR?); class assignment.

28 Combination-aware Subgroup Discovery: a better global model; superficially uninteresting patterns can be reported; pruning of the search space (new rule measures). In the example, the individual subgroups are not novel, yet the team is optimal.

29 Combination-aware Subgroup Discovery. Subgroup Discovery++: find a set of subgroups that shows a substantially different distribution of the target concept. Considerations: support of patterns, diversity of patterns, …

30 Conclusions. A less hasty approach to model building. Interesting patterns serve two purposes: understandable knowledge, and building blocks of a global model. Pattern discovery without combination is limited. Information exchange between the phases matters, and integration of the two phases is non-trivial.

