1 Using Bayesian Network for combining classifiers Leonardo Nogueira Matos Departamento de Computação Universidade Federal de Sergipe.


2 Agenda: Why combine classifiers? Bayesian network principles. Bayesian network as an ensemble of classifiers. Experimental results. Future work and conclusions.

3 Why combine classifiers? Classifiers can collaborate with each other; combining them minimizes the computational effort for training and maximizes the global recognition rate.

4 Why not do so? Because combining individual predictions can be as difficult as devising a robust single classifier.

5 Why not do so? Because combining individual predictions can be as difficult as devising a robust single classifier. (Figure: Classifiers -> Combiner -> Decision.)

6 Approaches for combining classifiers: L1. Data Level; L2. Feature Level; L3. Decision Level; fixed rules vs. trainable rules.

7 Approaches for combining classifiers: L1. Data Level; L2. Feature Level; L3. Decision Level; fixed rules vs. trainable rules.

8 Why not do so? Because combining individual predictions can be as difficult as devising a robust single classifier. (Figure: Classifiers -> Combiner -> Decision; each classifier outputs p(w|x).)
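
To make the decision-level picture concrete, here is a minimal sketch of a fixed combination rule, assuming each statistical classifier emits a posterior vector p(w|x) over the classes: the combiner averages the vectors and picks the most probable class (the mean rule). The arrays below are illustrative numbers, not results from the talk.

```python
import numpy as np

def mean_rule_combiner(posteriors):
    """Fixed decision-level fusion: average the posterior vectors p(w|x)
    produced by the individual classifiers and pick the class with the
    highest combined probability."""
    stacked = np.vstack(posteriors)      # shape: (n_classifiers, n_classes)
    combined = stacked.mean(axis=0)      # the fixed "mean rule"
    return int(combined.argmax()), combined

# Illustrative posteriors from three classifiers for one pattern x
p1 = np.array([0.70, 0.20, 0.10])
p2 = np.array([0.60, 0.30, 0.10])
p3 = np.array([0.20, 0.50, 0.30])

label, p_w_given_x = mean_rule_combiner([p1, p2, p3])
print(label, p_w_given_x)   # class 0 wins, combined posterior ~[0.50, 0.33, 0.17]
```

A trainable rule would instead learn the combination from data, which is where the Bayesian-network combiner discussed later comes in.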

9 Approaches for combining classifiers: L1. Data Level; L2. Feature Level; L3. Decision Level; fixed rules vs. trainable rules.

10 Existing scenarios. (Figure labels: pattern space, patterns, classifiers.)

11 Our scenario. (Figure labels: pattern space, classifiers.)

12 A closer look

13 A closer look – discriminant function

14 A closer look – using multiple classifiers

15 A closer look – using multiple classifiers. The challenges: How can we combine the classifiers' outputs? How can we identify regions in pattern space?

16 Agenda: Why combine classifiers? Bayesian network principles. Bayesian network as an ensemble of classifiers. Experimental results. Future work and conclusions.

17 Bayesian network principles A B C Those circles represent binary random variables

18 Bayesian network principles A B C Those circles represent binary random variables

19 Bayesian network principles A B C Those circles represent binary random variables. (Figure: a dataset.)

20 Bayesian network principles A B C Those circles represent binary random variables. (Figure: an instance.)

21 Bayesian network principles A B C Joint probability inference is a combinatorial problem. (Figure: 2 possibilities, 4 possibilities.)

22 Bayesian network principles A B C Joint probability inference is a combinatorial problem; independence makes the computation simpler.
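
A worked illustration of slides 21 and 22, assuming the arcs A -> B and A -> C (the exact arcs are not recoverable from the transcript): the full joint table over three binary variables needs 2^3 - 1 = 7 independent numbers, while the factored form needs only 1 + 2 + 2 = 5, and the saving grows quickly with more variables.

```latex
% Chain rule over the three binary variables A, B, C
\[ P(A,B,C) = P(A)\,P(B \mid A)\,P(C \mid A,B) \]
% If C is conditionally independent of B given A (assumed graph A -> B, A -> C):
\[ P(A,B,C) = P(A)\,P(B \mid A)\,P(C \mid A) \]
% Independent parameters: full joint table 2^3 - 1 = 7 versus 1 + 2 + 2 = 5
% General Bayesian-network factorization over X_1, ..., X_n:
\[ P(X_1,\dots,X_n) = \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{parents}(X_i)\bigr) \]
```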

23 Bayesian network principles A B C Edge – indicates statistical dependence between variables

24 Bayesian network principles A B C Arc – represents causality

25 Bayesian network principles A B C A Bayesian network is a DAG (Directed Acyclic Graph) whose nodes represent random variables and whose arcs represent causal relationships.

26 Bayesian network principles A B C There are polynomial-time algorithms to perform inference in a BN.

27 Bayesian network principles A B C There are polynomial-time algorithms to perform inference in a BN. (Figure: evidence entered at a node.)

28 Bayesian network principles A B C There are polynomial-time algorithms to perform inference in a BN. (Figure: evidence entered at a node; messages propagated through the network.)

29 Bayesian network principles A B C There are polynomial-time algorithms to perform inference in a BN. (Figure: evidence entered at a node.)
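
The transcript does not preserve the figure, so the sketch below assumes the same three binary variables with hypothetical CPTs and arcs A -> B, A -> C, and computes the posterior of A after observing evidence on C by summing out the hidden variable B. Brute-force enumeration like this is exponential in general; the polynomial-time behaviour the slide refers to comes from message-passing algorithms on singly connected networks, which this toy example reproduces only in its result, not in its mechanism.

```python
# Hypothetical CPTs for binary variables A, B, C with arcs A -> B and A -> C.
# All numbers are illustrative; they are not taken from the slides.
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # P_B_given_A[a][b]
P_C_given_A = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}   # P_C_given_A[a][c]

def joint(a, b, c):
    """Factored joint distribution P(A,B,C) = P(A) P(B|A) P(C|A)."""
    return P_A[a] * P_B_given_A[a][b] * P_C_given_A[a][c]

def posterior_A_given_C(c_observed):
    """P(A | C = c_observed), obtained by summing the joint over the hidden
    variable B and normalizing over the values of A."""
    unnormalized = {a: sum(joint(a, b, c_observed) for b in (0, 1)) for a in (0, 1)}
    z = sum(unnormalized.values())
    return {a: p / z for a, p in unnormalized.items()}

print(posterior_A_given_C(1))   # evidence C=1 shifts belief towards A=1: {0: 0.2, 1: 0.8}
```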

30 Agenda: Why combine classifiers? Bayesian network principles. Bayesian network as an ensemble of classifiers. Experimental results. Future work and conclusions.

31 A Fundamental Goal

32 Another insight: from a statistical point of view, a Bayesian network is also a graphical model that represents a complex, factored probability distribution function.

33 Another insight: from a statistical point of view, a Bayesian network is also a graphical model that represents a complex, factored probability distribution function.

34 Another insight: from a statistical point of view, a Bayesian network is also a graphical model that represents a complex, factored probability distribution function. The challenges: How can we combine the classifiers' outputs? How can we identify regions in pattern space?

35 How can we combine the classifiers' outputs? We use a BN as a graphical model of the pdf P(w|x); we assume that the classifiers participate in computing that function; each classifier must be a statistical classifier.
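
Slide 35 only states that the BN models P(w|x) and that every base classifier is statistical. One way to write such a combination, given purely as an illustration (the region variable R and this particular decomposition are assumptions, not necessarily the formulation used in the talk), is to treat the class posterior as a mixture over regions of the pattern space, with the classifier responsible for a region supplying the region-conditional posterior:

```latex
% Illustrative decomposition, not taken verbatim from the slides:
% R ranges over regions of the pattern space; the classifier assigned to
% region r supplies the region-conditional posterior P(w | x, R = r).
\[ P(w \mid x) = \sum_{r} P(w \mid x, R = r)\, P(R = r \mid x) \]
```

Under this reading, "identifying regions in pattern space" amounts to estimating P(R|x), and the BN provides the machinery for computing the sum.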

36 How can we identify regions in pattern space?

37 Splitting pattern space

38 Defining a region

39 Patterns in a region

40 Algorithm

41 Bayesian Network Structure

42 Bayesian networks for combining classifiers

43 Agenda: Why combine classifiers? Bayesian network principles. Bayesian network as an ensemble of classifiers. Experimental results. Future work and conclusions.

44 Results with UCI databases

45 Results with NIST database

46 System I classifiers

47 Preliminaries

48 Results with the complete dataset

49 Agenda: Why combine classifiers? Bayesian network principles. Bayesian network as an ensemble of classifiers. Experimental results. Future work and conclusions.

50 Future work

51 Future work

52 Future work

53 Future work

54 Future work. (Figure labels: pattern space, patterns, classifiers.)

55 Conclusions: We have developed a method for combining classifiers using a Bayesian network. The BN acts as a trainable ensemble of statistical classifiers. The method is not suitable for small datasets; experimental results show good performance on a large dataset. As future work we intend to use a similar approach to split the feature vector and combine classifiers specialized on each piece of it.

56 Thank you!