
Lower-Bound Estimate for Cost-sensitive Decision Trees. Mikhail Goubko, senior researcher, Trapeznikov Institute of Control Sciences of the Russian Academy of Sciences.





2 Lower-Bound Estimate for Cost-sensitive Decision Trees
Mikhail Goubko, senior researcher, Trapeznikov Institute of Control Sciences of the Russian Academy of Sciences, Moscow, Russia
18th IFAC World Congress, Milan, August 31, 2011

3 SUMMARY
1. A new lower-bound estimate is suggested for decision trees with case-dependent test costs.
2. Unlike known estimates, it performs well when the number of classes is small.
3. The estimate is calculated in about n²·m operations on average for n examples and m tests.
4. The estimate can be used to evaluate the absolute losses of heuristic decision tree algorithms.
5. The estimate can be used in the split criteria of greedy top-down decision tree construction algorithms.
6. Experiments on real data sets show these algorithms give results comparable to popular cost-sensitive heuristics (IDX, CS-ID3, EG2).
7. The suggested algorithms perform better on small data sets with a lack of tests.

Introduction. A decision tree is a popular classification tool for:
- machine learning
- pattern recognition
- fault detection
- medical diagnostics
- situational control
A decision is made from a series of tests of attributes; the next attribute tested depends on the results of the previous tests. Decision trees are learned from data. Compact trees are good: the expected length of the path in a tree is the most popular measure of tree size.
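The size measure named above can be sketched in code. This is an illustrative encoding of my own (a tree is either `('leaf', cases)` or `('node', children)`), not the paper's notation: the expected path length is each leaf's depth weighted by the probability mass of the cases reaching it.

```python
def expected_path_length(tree, prob, depth=0):
    """Expected root-to-leaf path length of a decision tree.

    `tree` is either ('leaf', case_list) or ('node', child_list);
    `prob[x]` is the probability of case x.  Illustrative sketch only.
    """
    kind, payload = tree
    if kind == 'leaf':
        # A leaf contributes its depth times the probability of its cases.
        return depth * sum(prob[x] for x in payload)
    # An internal node sums the contributions of its subtrees, one level deeper.
    return sum(expected_path_length(child, prob, depth + 1)
               for child in payload)
```

For a tree with one case at depth 1 (probability 0.5) and two cases at depth 2 (probability 0.25 each), the expected path length is 0.5·1 + 0.25·2 + 0.25·2 = 1.5.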

4 [Diagram: an example classification problem. Set D of decisions: chest cold, influenza, healthy. Set of cases N: π1, π2, …, πn. Set of attributes M: Cough (Yes/No), temperature t0 (High/Low), Chronic illness (Yes/No), Wheezing (Yes/No).]



7 [Diagram: a decision tree for the example. Internal nodes test Cough, temperature t0, Chronic illness, and Wheezing; leaves assign decisions from D (chest cold, influenza, healthy) to cases π1, …, πn.]

8 Costs. Tests differ in measurement costs (e.g., a general blood analysis is much cheaper than a computed tomography procedure). The decision tree is grown to minimize the expected cost of classification given a zero misclassification rate on a learning set. Other types of costs relevant to decision tree growing (Turney, 2000), i.e., misclassification, teaching, and intervention costs, are not considered.
Measurement costs (see Turney, 2000) may depend on:
- the individual case
- the true class of the case
- side effects (values of hidden attributes)
- prior tests
- prior results (other attributes' values)
- the correct answer to the current question
Slide callouts mark these dependencies as "Studied!", "Covered!", and "Can be accounted!".
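The objective above, the expected cost of classification, can be sketched by extending the path-length idea: sum the costs of the tests on each case's root-to-leaf path, weighted by the case probability. A minimal sketch under my own encoding (`('leaf', cases)` / `('test', t, children)`); note it uses case-independent costs `cost[t]`, while the paper allows case-dependent test costs:

```python
def expected_cost(tree, prob, cost):
    """Expected classification cost of a decision tree.

    `tree` is ('leaf', case_list) or ('test', t, child_list);
    `prob[x]` is the probability of case x, `cost[t]` the price of test t.
    Illustrative sketch: costs here do not vary per case.
    """
    def walk(node, acc):
        if node[0] == 'leaf':
            # Accumulated test cost on this path, weighted by leaf mass.
            return acc * sum(prob[x] for x in node[1])
        _, t, children = node
        # Paying for test t before descending into any branch.
        return sum(walk(child, acc + cost[t]) for child in children)
    return walk(tree, 0.0)
```

For example, if half the cases stop after a cost-1 test and the rest also take a cost-2 test, the expected cost is 0.5·1 + 0.5·3 = 2.0.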


10 Literature. Hyafil and Rivest (1976), and also Zantema and Bodlaender (2000), have shown the decision tree growing problem to be NP-hard. Sieling (2008) has shown that the size of an optimal tree is hard to approximate up to any constant factor. Quinlan (1979) developed the information-gain-based heuristic ID3; it is now commercial and has reached version 5.
Heuristic algorithms for cost-sensitive tree construction:
- CS-ID3: Tan (1993)
- IDX: Norton (1989)
- EG2: Núñez (1991)
Lower-bound estimates for cost-sensitive decision trees:
- entropy-based estimate: Ohta and Kanaya (1991)
- Huffman code length: Biasizzo, Žužek, and Novak (1998)
- combinatory estimate: Bessiere, Hebrard, and O'Sullivan (2009)
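The three cost-sensitive heuristics named above differ only in how they trade information gain ΔI against test cost C. A sketch using the forms usually cited for them (IDX: ΔI/C; CS-ID3: ΔI²/C; EG2: (2^ΔI − 1)/(C + 1)^w); the entropy and gain helpers are my own minimal versions, not the original implementations:

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def info_gain(labels, values):
    """Information gain of splitting `labels` by attribute `values`."""
    n = len(labels)
    groups = {}
    for y, v in zip(labels, values):
        groups.setdefault(v, []).append(y)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

def idx(gain, cost):
    # Norton (1989): gain per unit cost.
    return gain / cost

def cs_id3(gain, cost):
    # Tan (1993): squared gain per unit cost.
    return gain * gain / cost

def eg2(gain, cost, w=1.0):
    # Núñez (1991): w in [0, 1] tunes how strongly cost matters.
    return (2 ** gain - 1) / (cost + 1) ** w
```

A greedy grower would evaluate every remaining attribute with one of these criteria and split on the maximizer.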

11 Definitions.
Definition 1. A subset of tests Q ⊆ M isolates case w in a subset of cases S ⊆ N (w ∈ S) iff the sequence of tests Q assures the proper decision f(w) given initial uncertainty S when w is the real state of the world.
Definition 2. The optimal set of questions Q(w, S) ⊆ M is the cheapest of the sets of questions that isolate case w in set S; define also the corresponding minimum cost.
The lower-bound estimate. Unlike known estimates, it performs well when:
1) the number of classes is small compared to the number of examples;
2) there is a small number of examples.

12 The problem of classifying an unknown case is replaced by the problem of proving the true case to a third party.
Initial problem: "What is the true case?" / "Test this."
Simplified problem: "I know the true case!" / "Try testing this."

13 Calculation. Average time is about n²·m operations (n is the number of cases, m is the number of tests). Calculation of the estimate reduces to a number of set-covering problems and is NP-hard in the worst case, but experiments show good performance for the estimate and its linear programming relaxation.
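The set-covering flavor of the problem can be illustrated as follows (a hypothetical sketch, not the paper's exact reduction): to isolate w in S, every case in S with a different decision must be distinguished from w by at least one chosen test, and a greedy weighted set cover approximates the cheapest such test set; the LP relaxation of the same cover gives a lower bound on it.

```python
def greedy_isolation_cost(w, S, tests, cost, f):
    """Greedy set cover approximating the cheapest test set isolating w in S.

    `tests[t][x]` is the outcome of test t on case x, `cost[t]` its price,
    `f[x]` the decision for case x.  Hypothetical illustration only.
    Returns (chosen_tests, total_cost), or None if w cannot be isolated.
    """
    # Cases that must be separated from w: different decision, same uncertainty set.
    to_separate = {u for u in S if u != w and f[u] != f[w]}
    chosen, total = [], 0.0
    while to_separate:
        # Pick the test separating the most remaining cases per unit cost.
        best, best_ratio, best_sep = None, 0.0, set()
        for t, outcomes in tests.items():
            if t in chosen:
                continue
            sep = {u for u in to_separate if outcomes[u] != outcomes[w]}
            if sep and len(sep) / cost[t] > best_ratio:
                best, best_ratio, best_sep = t, len(sep) / cost[t], sep
        if best is None:
            return None  # remaining cases are indistinguishable from w
        chosen.append(best)
        total += cost[best]
        to_separate -= best_sep
    return chosen, total
```

The greedy cover is an upper approximation of the true minimum isolation cost, which is why the lower-bound estimate itself relies on the exact cover or its LP relaxation.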

14 1. Use the lower-bound estimate to evaluate the extra costs due to the imperfection of heuristic tree growing algorithms. The quality of the lower-bound estimate on real data sets varies from 60 to 95%, depending on the data set.

15 2. Use the lower-bound estimate to build new tree growing algorithms. Comparing the new algorithms with known heuristics (IDX, CS-ID3, EG2) on different data sets (tree costs are shown):
- the new algorithms perform better on small data sets;
- the new algorithms work worse in the presence of "dummy" tests;
- the closeness of the results shows that the heuristic trees are nearly optimal.

16 Bonus: general methods of hierarchy optimization.

17 Model. A cost function is defined on hierarchies; the problem is to find the hierarchy that minimizes it.
Allowed:
- an arbitrary number of layers
- asymmetric hierarchies
- multiple subordination
- several top nodes
Not allowed:
- cycles
- unconnected parts
- subordination to the "worker" nodes (shown in black)

18 Methods.
Sectional cost function:
1. Covers most applied problems of hierarchy optimization.
2. Analytical methods characterize when the tallest or the flattest hierarchy is optimal, when an optimal tree exists, and when an optimal hierarchy has the shape of a conveyor.
3. Algorithms: optimal hierarchy search, optimal tree search, and building an optimal conveyor-belt hierarchy.
4. The general problem of hierarchy optimization is complex.
Homogeneous cost function:
1. Also has numerous applications.
2. Closed-form solution for the optimal hierarchy problem.
3. Efficient algorithms for nearly-optimal tree construction.

19 Applications of hierarchy optimization methods:
- Manufacturing planning: assembly line balancing.
- Network design: communication and computing networks, data collection networks, the structure of hierarchical cellular networks.
- Computational mathematics: optimal coding, the structure of algorithms, real-time computation and aggregation, hierarchical parallel computing.
- User interface design: optimizing hierarchical menus, building compact and informative taxonomies.
- Data mining: decision tree growing, structuring database indexes.
- Organization design: org chart re-engineering, theoretical models of a hierarchical firm.

