
1 Using Bayesian Network in the Construction of a Bi-level Multi-classifier. A Case Study Using Intensive Care Unit Patients Data
B. Sierra, N. Serrano, P. Larrañaga, et al.
Artificial Intelligence in Medicine, vol. 22, no. 3, June 2001
Cho, Dong-Yeon

2 Introduction
- Combining the predictions of a set of classifiers can be more accurate than any of the component classifiers.
- The open questions are how to create and how to combine such an ensemble of classifiers.
- A new multi-classifier construction methodology based on the stacked generalization paradigm:
  - a number of classifier layers, where upper-layer classifiers receive the classes predicted by the immediately preceding layer as input;
  - a Bayesian network structure as the combining model.

3 Multi-classifier Schemata
- Multi-classifier structure: stacked generalization.
  - Each layer of classifiers combines the predictions of the classifiers of its preceding layer.
  - A single classifier at the top-most level outputs the ultimate prediction.
- Two-level system (see the sketch below).
  - A Bayesian network performs a consensus vote over the predictions of the level-0 single classifiers.
  - It can identify the conditional independencies and dependencies existing between the results obtained by the level-0 classifiers.
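Not part of the original slides: a minimal Python sketch of the two-level stacking idea, using scikit-learn classifiers and synthetic data as stand-ins for the paper's level-0 models and the ICU data. The level-1 combiner here is a plain Naive Bayes model rather than the learned Bayesian network.

```python
# Two-level stacking sketch: level-0 classifiers produce class predictions,
# and a level-1 model is trained on those predictions.
# Classifier choices and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

level0 = [DecisionTreeClassifier(random_state=0),
          KNeighborsClassifier(n_neighbors=5),
          GaussianNB()]

# Level-1 training set: each column holds the cross-validated predictions
# of one level-0 classifier (their predicted classes become new features).
meta_features = np.column_stack(
    [cross_val_predict(clf, X, y, cv=10) for clf in level0])

# A Naive Bayes model plays the role of the level-1 combiner here;
# the paper instead learns a Bayesian network over these predictions.
level1 = GaussianNB().fit(meta_features, y)

# To classify a new case: fit the level-0 models on all data, stack their
# predictions, and let the level-1 model output the final class.
for clf in level0:
    clf.fit(X, y)
new_case = X[:1]
stacked = np.column_stack([clf.predict(new_case) for clf in level0])
print(level1.predict(stacked))
```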

5 Multi-classifier Construction
- Leave-one-out sequence: for each case j, the level-0 classifiers are trained on the other n-1 training examples.
- The learned models are then tested with the j-th case.
- The results obtained are used as the training set for the Bayesian network construction.
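A sketch (not from the paper) of the leave-one-out construction described above: each case is predicted by level-0 classifiers trained on the remaining n-1 cases, and the collected predictions plus the true class form the training file for the Bayesian network. Classifiers and data are illustrative placeholders.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=100, n_features=8, random_state=1)
level0 = [DecisionTreeClassifier(random_state=0),
          KNeighborsClassifier(n_neighbors=3),
          GaussianNB()]

rows = []
n = len(y)
for j in range(n):
    # Train every level-0 classifier on the other n-1 cases, predict case j.
    train_idx = np.array([i for i in range(n) if i != j])
    row = [clf.fit(X[train_idx], y[train_idx]).predict(X[j:j + 1])[0]
           for clf in level0]
    rows.append(row + [y[j]])  # predictions of each classifier + true class

level1_data = np.array(rows)   # input to the level-1 (BN) learning step
print(level1_data[:5])
```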

6 Layer-0 Composite Classifiers
- Decision trees: avoiding overfitting.
  - Prepruning: weighting the discriminant capability of the selected attribute and thus possibly discarding a further splitting of the dataset.
  - Postpruning: first allowing a large expansion of the tree, then removing branches and leaves.
  - ID3 uses only prepruning; C4.5 uses both pruning techniques.
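ID3 and C4.5 are not in scikit-learn, so the following sketch (an assumption, not the paper's setup) uses CART-style trees just to illustrate the prepruning/postpruning distinction.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Prepruning: stop splitting early by limiting depth / minimum leaf size.
pre = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10, random_state=0)

# Postpruning: grow a large tree, then prune it back
# (here via cost-complexity pruning, controlled by ccp_alpha).
post = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0)

for name, clf in [("prepruned", pre), ("postpruned", post)]:
    clf.fit(X_tr, y_tr)
    print(name, clf.get_n_leaves(), clf.score(X_te, y_te))
```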

7 Instance-based Learning
- k-nearest neighbour (k-NN) algorithm: classification by a similarity function over stored instances.
- IB4 keeps a classification performance record for each saved instance and removes saved instances believed to be noisy, using a significance test.
- In IB4, the weight of each attribute reflects its relative importance for classification: weights are increased for attributes with similar values in correct classifications and for attributes with different values in incorrect classifications, and decreased otherwise.
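A minimal weighted k-NN sketch, with per-attribute weights of the kind IB4 maintains; the weight values, data, and similarity function here are invented for illustration and are not the paper's exact algorithm.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3, weights=None):
    """Predict the majority class among the k nearest neighbours of x,
    using a weighted Euclidean distance over the attributes."""
    if weights is None:
        weights = np.ones(X_train.shape[1])
    d = np.sqrt(((X_train - x) ** 2 * weights).sum(axis=1))
    nearest = np.argsort(d)[:k]
    classes, counts = np.unique(y_train[nearest], return_counts=True)
    return classes[np.argmax(counts)]

X_train = np.array([[0.1, 5.0], [0.2, 4.8], [0.9, 0.5], [1.0, 0.7]])
y_train = np.array([0, 0, 1, 1])
# Up-weighting the first attribute (as an IB4-style scheme might learn)
# changes which neighbours dominate the vote.
print(knn_predict(X_train, y_train, np.array([0.5, 2.0]), k=3,
                  weights=np.array([5.0, 0.2])))
```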

8 Rule Induction
- CN2 rule induction program.
  - Designed to induce short, simple, comprehensible rules in domains where a poor description language and/or noise may be present.
  - Rules are searched in a general-to-specific way.
  - Strict match.
- OneR: a very simple rule inductor that searches for and applies only the single best rule in the data file.
- RIPPER: a fast rule inductor.
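OneR is simple enough to sketch directly: for every attribute, map each of its values to the majority class and keep the single attribute whose rule makes the fewest training errors. The toy data below is made up for illustration.

```python
from collections import Counter

def one_r(X, y):
    """X: list of discrete attribute-value lists, y: list of class labels.
    Returns (best_attribute_index, value -> class rule)."""
    best = None
    for a in range(len(X[0])):
        rule = {}
        for value in set(row[a] for row in X):
            classes = [y[i] for i, row in enumerate(X) if row[a] == value]
            rule[value] = Counter(classes).most_common(1)[0][0]
        errors = sum(rule[row[a]] != y[i] for i, row in enumerate(X))
        if best is None or errors < best[0]:
            best = (errors, a, rule)
    return best[1], best[2]

X = [["high", "yes"], ["high", "no"], ["low", "yes"], ["low", "no"]]
y = ["survive", "survive", "die", "survive"]
attr, rule = one_r(X, y)
print(attr, rule)   # the single best attribute and its value -> class rule
```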

9 Naive Bayes (NB) Classifiers
- Assumption of independence between the occurrences of feature values given the class.
- NB classifier.
- NBTree classifier: builds a decision tree and applies a Naive Bayes classifier at the leaves of the tree.
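For reference, the decision rule that the independence assumption leads to (not written out on the slide):

c* = argmax_c  P(C = c) * Π_j P(X_j = x_j | C = c)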

10 Layer-1 Classifier: Bayesian Network
- Bayesian networks (BNs) are directed acyclic graphs (DAGs) built on the concept of conditional independence among variables.
- A BN constitutes an efficient device for performing probabilistic inference.
- The problem of building such a network from data remains.
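Not from the slides: a tiny Python sketch of how a BN factorises a joint distribution over its DAG and how probabilistic inference then follows by enumeration. The two-node network and its probability tables are invented for illustration.

```python
# Network: Survival -> Test (Survival has no parents)
p_survival = {True: 0.8, False: 0.2}                      # P(Survival)
p_test = {                                                # P(Test | Survival)
    (True,): {"normal": 0.9, "abnormal": 0.1},
    (False,): {"normal": 0.3, "abnormal": 0.7},
}

def joint(survival, test):
    """Joint probability from the chain-rule factorisation of the DAG:
    P(survival, test) = P(survival) * P(test | survival)."""
    return p_survival[survival] * p_test[(survival,)][test]

# Inference by enumeration: P(Survival = True | Test = "abnormal")
evidence = "abnormal"
num = joint(True, evidence)
den = joint(True, evidence) + joint(False, evidence)
print(num / den)   # posterior probability of survival given the evidence
```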

11 Bayesian Networks as Classifiers
- Naive Bayes approach.
  - Assumes independence among all the predictor variables given the class.
  - The Bayesian network structure is fixed, with all predictor variables as children of the variable to be predicted.

12 Markov Blanket (MB) Approach
- In a BN, any variable is influenced only by its Markov blanket, that is, its parent variables, its children variables, and the parents of its children.
- The search is performed over the set of structures that form a Markov blanket of the variable to be classified.
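A small sketch (illustrative graph, not the paper's) of reading off the Markov blanket of a node in a DAG: its parents, its children, and the other parents of its children.

```python
def markov_blanket(node, edges):
    """edges: list of (parent, child) pairs describing the DAG."""
    parents = {p for p, c in edges if c == node}
    children = {c for p, c in edges if p == node}
    spouses = {p for p, c in edges if c in children and p != node}
    return parents | children | spouses

edges = [("A", "C"), ("B", "C"), ("C", "D"), ("E", "D")]
print(markov_blanket("C", edges))   # {'A', 'B', 'D', 'E'}
```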

13 Genetic Algorithm
begin AGA
  Make initial population at random
  WHILE NOT stop DO
  BEGIN
    Select parents from the population
    Produce children from the selected parents
    Mutate the individuals
    Extend the population by adding the children to it
    Reduce the extended population
  END
  Output the best individual found
end AGA
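A runnable Python sketch of the abstract GA loop above. The bit-string individuals and the toy fitness function (count of ones) are stand-ins for the paper's BN structures and their scoring metric.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=50,
                      mut_rate=0.02, seed=0):
    rng = random.Random(seed)
    # Make the initial population at random.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Select parents (here: the fitter of two random individuals).
        def pick():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Mutate the individual gene by gene.
            child = [1 - g if rng.random() < mut_rate else g for g in child]
            children.append(child)
        # Extend the population with the children, then reduce it back.
        pop = sorted(pop + children, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)   # output the best individual found

best = genetic_algorithm(fitness=sum)
print(best, sum(best))
```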

14 Notation and Representation
- Two ways of representing the individuals (BN structures):
  - assuming an ordering between the nodes;
  - without assuming an ordering between the nodes.
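One common encoding, sketched here as an assumption about what "assuming an ordering" buys: with the nodes ordered, only arcs from earlier to later nodes are allowed, so an individual can be the flattened upper triangle of the connectivity matrix. The paper's exact representation may differ.

```python
import numpy as np

def ordered_dag_to_bits(adj):
    """adj[i, j] = 1 iff there is an arc x_i -> x_j; with a fixed node
    ordering, only entries above the diagonal can be 1."""
    n = adj.shape[0]
    return [int(adj[i, j]) for i in range(n) for j in range(i + 1, n)]

adj = np.array([[0, 1, 1],
                [0, 0, 1],
                [0, 0, 0]])       # x1 -> x2, x1 -> x3, x2 -> x3
print(ordered_dag_to_bits(adj))   # [1, 1, 1]
```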

15 Obtained Model
- Bayesian networks with a Markov blanket structure with respect to the class variable are induced automatically by means of GAs.
- Each individual in the GA is a BN structure, and all the predictor variables form the Markov blanket of the variable to be classified.

16 Experimental Results
- Data file: 1210 ICU patients.
  - Survival: 996 cases (82.31%).
  - Non-survival: 214 cases (17.69%).
- Evaluation: 10-fold cross-validation.
- ICU data file variables (listed in a table on the slide).
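A minimal sketch of the 10-fold cross-validation protocol on a class distribution like the one above (roughly 82% survivors); the synthetic features and the single classifier are placeholders for the ICU variables and the multi-classifier.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in: 1210 cases with ~82% of them in the majority class.
X, y = make_classification(n_samples=1210, n_features=20,
                           weights=[0.8231], random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(GaussianNB(), X, y, cv=cv)
print(scores.mean())   # mean accuracy over the 10 folds
```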

17 Results
- Standard medical methods.
- ML standard approaches and the multi-classifier.

18 Conclusion and Further Work
- A new multi-classifier construction method.
- It outperforms existing standard machine learning methods by combining them for predicting the survival of patients at the ICU.
- As further work, the method will be applied taking into account the specificity and sensitivity of the data, and it will be applied to larger databases.

