
1
RULE-BASED MULTICRITERIA DECISION SUPPORT USING ROUGH SET APPROACH
Roman Slowinski
Laboratory of Intelligent Decision Support Systems, Institute of Computing Science, Poznan University of Technology, 2006

2
2 Plan
- Rough Set approach to preference modeling
- Classical Rough Set Approach (CRSA)
- Dominance-based Rough Set Approach (DRSA) for multiple-criteria sorting
  - Granular computing with dominance cones
  - Induction of decision rules from dominance-based rough approximations
  - Examples of multiple-criteria sorting
  - Decision rule preference model vs. utility function and outranking relation
- DRSA for multicriteria choice and ranking
  - Dominance relation for pairs of objects
  - Induction of decision rules from dominance-based rough approximations
  - Examples of multiple-criteria ranking
- Extensions of DRSA dealing with preference-ordered data
- Conclusions

3
3 Inconsistencies in data – Rough Set Theory
Zdzisław Pawlak (1926 – 2006)

Student | Mathematics (M) | Physics (Ph) | Literature (L) | Overall class
S1 | good | medium | bad | bad
S2 | medium | medium | bad | medium
S3 | medium | medium | medium | medium
S4 | good | good | medium | good
S5 | good | medium | good | good
S6 | good | good | good | good
S7 | bad | bad | bad | bad
S8 | bad | bad | medium | bad

4
4 Inconsistencies in data – Rough Set Theory
Objects with the same description are indiscernible and create blocks (student table as above)


9
9 Inconsistencies in data – Rough Set Theory
Objects with the same description are indiscernible and create granules (student table as above)

10
10 Inconsistencies in data – Rough Set Theory
Additional information assigns objects to some classes (sets, concepts) (student table as above)


13
13 Inconsistencies in data – Rough Set Theory
The granules of indiscernible objects are used to approximate classes (student table as above)

14
14 Inconsistencies in data – Rough Set Theory
Lower approximation of class „good” (student table as above)

15
15 Inconsistencies in data – Rough Set Theory
Lower and upper approximation of class „good” (student table as above)

16
16 CRSA – decision rules induced from rough approximations
Certain decision rule – supported by objects from the lower approximation of Cl_t (discriminant rule)
Possible decision rule – supported by objects from the upper approximation of Cl_t (partly discriminant rule)
R. Słowiński, D. Vanderpooten: A generalized definition of rough approximations based on similarity. IEEE Transactions on Data and Knowledge Engineering, 12 (2000) no. 2

17
17 CRSA – decision rules induced from rough approximations
Certain decision rule – supported by objects from the lower approximation of Cl_t (discriminant rule)
Possible decision rule – supported by objects from the upper approximation of Cl_t (partly discriminant rule)
Approximate decision rule – supported by objects from the boundary of Cl_t, where Cl_t, Cl_s,..., Cl_u are the classes to which the inconsistent objects supporting this rule belong

18
18 Rough Set approach to preference modeling
The preference information has the form of decision examples, and the preference model is a set of decision rules.
“People make decisions by searching for rules that provide good justification of their choices” (Slovic, 1975)
Rules give evidence of a decision policy and can be used both for explanation of past decisions and for recommendation of future decisions.
Construction of decision rules is done by induction (a kind of regression).
Induction is a paradigm of AI: machine learning, data mining, knowledge discovery.
The regression approach has also been used for other preference models:
- utility (value) function, e.g. UTA methods, MACBETH
- outranking relation, e.g. ELECTRE TRI Assistant

19
19 Rough Set approach to preference modeling
Advantages of preference modeling by induction from examples:
- requires less cognitive effort from the agent,
- agents are more confident exercising their decisions than explaining them,
- it is concordant with the principle of posterior rationality (March, 1988)
Problem: inconsistencies in the set of decision examples, due to:
- uncertainty of information – hesitation, unstable preferences,
- incompleteness of the family of attributes and criteria,
- granularity of information

20
20 Rough Set approach to preference modeling Inconsistency w.r.t. dominance principle (semantic correlation)

21
21 Rough Set approach to preference modeling
Example of inconsistencies in preference information:

Student | Mathematics (M) | Physics (Ph) | Literature (L) | Overall class
S1 | good | medium | bad | bad
S2 | medium | medium | bad | medium
S3 | medium | medium | medium | medium
S4 | good | good | medium | good
S5 | good | medium | good | good
S6 | good | good | good | good
S7 | bad | bad | bad | bad
S8 | bad | bad | medium | bad

Examples of classification of S1 and S2 are inconsistent: S1 is at least as good as S2 on all three criteria, yet S1 is assigned to a worse overall class.

22
22 Rough Set approach to preference modeling
Handling these inconsistencies is of crucial importance for knowledge discovery and decision support
Rough set theory (RST), proposed by Pawlak (1982), provides an excellent framework for dealing with inconsistency in knowledge representation
The philosophy of RST is based on the observation that information about objects (actions) is granular, thus their representation needs a granular approximation
In the context of multicriteria decision support, the concept of granular approximation has to be adapted so as to handle semantic correlation between condition attributes and decision classes
Greco, S., Matarazzo, B., Słowiński, R.: Rough sets theory for multicriteria decision analysis. European J. of Operational Research, 129 (2001) no. 1, 1-47

23
23 Classical Rough Set Approach (CRSA)
Let U be a finite universe of discourse composed of objects (actions) described by a finite set of attributes
Sets of objects indiscernible w.r.t. attributes create granules of knowledge (elementary sets)
Any subset X ⊆ U may be expressed in terms of these granules:
- either precisely – as a union of the granules,
- or roughly – by two ordinary sets, called lower and upper approximations
The lower approximation of X consists of all the granules included in X
The upper approximation of X consists of all the granules having a non-empty intersection with X
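The approximation scheme above can be sketched in a few lines of Python (an illustrative sketch, not the author's software; the firm names and attributes are invented for the example):

```python
# CRSA sketch: granules are groups of objects with identical descriptions;
# a set X is approximated from below (granules included in X) and from
# above (granules intersecting X).

def granules(objects, attrs):
    """Partition object ids into indiscernibility classes w.r.t. attrs."""
    blocks = {}
    for oid, desc in objects.items():
        key = tuple(desc[a] for a in attrs)
        blocks.setdefault(key, set()).add(oid)
    return list(blocks.values())

def lower_upper(objects, attrs, X):
    """Lower and upper approximation of the set X of object ids."""
    lower, upper = set(), set()
    for g in granules(objects, attrs):
        if g <= X:
            lower |= g          # granule fully included in X
        if g & X:
            upper |= g          # granule intersects X
    return lower, upper

# Toy data: F1 and F2 are indiscernible but only F1 belongs to X
objects = {
    "F1": {"inv": "high", "sales": "high"},
    "F2": {"inv": "high", "sales": "high"},
    "F3": {"inv": "low",  "sales": "low"},
}
X = {"F1", "F3"}
lo, up = lower_upper(objects, ["inv", "sales"], X)
print(lo, up)   # F3 is certain; F1/F2 fall in the boundary
```

The boundary region of X is simply `up - lo`.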

24
24 CRSA – formal definitions
Approximation space:
U = finite set of objects (universe)
C = set of condition attributes
D = set of decision attributes
C ∩ D = ∅
X_C = ∏_{q∈C} V_q – condition attribute space (V_q = value set of attribute q)
X_D = ∏_{q∈D} V_q – decision attribute space

25
25 CRSA – formal definitions
Indiscernibility relation in the approximation space:
x is indiscernible with y by P ⊆ C in X_P iff x_q = y_q for all q ∈ P
x is indiscernible with y by R ⊆ D in X_R iff x_q = y_q for all q ∈ R
I_P(x), I_R(x) – equivalence classes including x
I_D makes a partition of U into decision classes Cl = {Cl_t, t = 1,...,m}
Granules of knowledge are bounded sets: I_P(x) in X_P and I_R(x) in X_R (P ⊆ C and R ⊆ D)
Classification patterns to be discovered are functions representing granules I_R(x) by granules I_P(x)

26
26 CRSA – illustration of formal definitions
Example: objects = firms described by Investments and Sales, classified by Effectiveness (High / Medium / Low)

Investments | Sales | Effectiveness
40 | 17.8 | High
35 | 30 | High
31 | 35 | High
27 | 25 | Medium
21 | 9.5 | Medium
17.5 | 5 | Low
11 | 2 | Low
10 | 9 | Low
5 | 13 | Low

[several further High and Medium rows were lost in extraction]

27
27 CRSA – illustration of formal definitions
Objects in condition attribute space [plot over attribute 1 (Investment) and attribute 2 (Sales)]

28
28 CRSA – illustration of formal definitions
Indiscernibility sets [plot in attribute space (a1, a2)]
Quantitative attributes are discretized according to the perception of the user

29
29 CRSA – illustration of formal definitions
Granules of knowledge are bounded sets I_P(x) [plot in attribute space (a1, a2)]

30
30 CRSA – illustration of formal definitions
Lower approximation of class High [plot]

31
31 CRSA – illustration of formal definitions
Upper approximation of class High [plot]

32
32 CRSA – illustration of formal definitions
Lower approximation of class Medium [plot]

33
33 CRSA – illustration of formal definitions
Upper approximation of class Medium [plot]

34
34 CRSA – illustration of formal definitions
Boundary sets of classes High and Medium [plot]

35
35 CRSA – illustration of formal definitions
Lower = upper approximation of class Low [plot]

36
36 CRSA – formal definitions
Basic properties of rough approximations
Accuracy measures:
- accuracy and quality of approximation of X ⊆ U by attributes P ⊆ C
- quality of approximation of classification Cl = {Cl_t, t = 1,...,m} by attributes P ⊆ C
- rough membership of x ∈ U to X ⊆ U, given P ⊆ C
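A minimal sketch of these measures (the formulas accuracy = |lower| / |upper| and quality = |lower| / |U| are the standard Pawlak definitions assumed here; the data is invented):

```python
# CRSA accuracy/quality sketch for a single target set X.

def indisc_class(objects, attrs, x):
    """Equivalence class of object x w.r.t. attrs."""
    key = tuple(objects[x][a] for a in attrs)
    return {o for o, d in objects.items()
            if tuple(d[a] for a in attrs) == key}

def measures(objects, attrs, X):
    lower, upper = set(), set()
    for x in objects:
        g = indisc_class(objects, attrs, x)
        if g <= X:
            lower |= g
        if g & X:
            upper |= g
    accuracy = len(lower) / len(upper)   # alpha_P(X)
    quality = len(lower) / len(objects)  # gamma_P(X)
    return accuracy, quality

objects = {
    "S1": {"M": "good"}, "S2": {"M": "good"},
    "S3": {"M": "bad"},  "S4": {"M": "bad"},
}
X = {"S1", "S3", "S4"}          # hypothetical target class
acc, q = measures(objects, ["M"], X)
print(acc, q)                    # lower = {S3, S4}, upper = all four
```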

37
37 CRSA – formal definitions
Cl-reduct of P ⊆ C, denoted by RED_Cl(P), is a minimal subset P' of P which keeps the quality of classification Cl unchanged, i.e. γ_P'(Cl) = γ_P(Cl)
Cl-core is the intersection of all the Cl-reducts of P

38
38 CRSA – decision rules induced from rough approximations
Certain decision rule – supported by objects from the lower approximation of Cl_t (discriminant rule)
Possible decision rule – supported by objects from the upper approximation of Cl_t (partly discriminant rule)
Approximate decision rule – supported by objects from the boundary of Cl_t, where Cl_t, Cl_s,..., Cl_u are the classes to which the inconsistent objects supporting this rule belong

39
39 CRSA – summary of useful results
- Characterization of decision classes (even in case of inconsistency) in terms of chosen attributes by lower and upper approximations
- Measure of the quality of approximation, indicating how good the chosen set of attributes is for approximation of the classification
- Reduction of knowledge contained in the table to the description by relevant attributes belonging to reducts
- The core of attributes, indicating indispensable attributes
- Decision rules induced from lower and upper approximations of decision classes, showing classification patterns existing in data

40
40 Dominance-based Rough Set Approach (DRSA) to MCDA
Sets of condition (C) and decision (D) criteria are semantically correlated
≿_q – weak preference relation (outranking) on U w.r.t. criterion q ∈ C ∪ D (complete preorder)
x_q ≿_q y_q : “x_q is at least as good as y_q on criterion q”
x D_P y : x dominates y with respect to P ⊆ C in condition space X_P if x_q ≿_q y_q for all criteria q ∈ P
D_P is a partial preorder
Analogously, we define x D_R y in decision space X_R, R ⊆ D

41
41 Dominance-based Rough Set Approach (DRSA)
For simplicity: D = {d}
I_d makes a partition of U into decision classes Cl = {Cl_t, t = 1,...,m}
[x ∈ Cl_r, y ∈ Cl_s, r > s] ⇒ x ≻ y (x ≿ y and not y ≿ x)
In order to handle semantic correlation between condition and decision criteria:
Cl_t^≥ = ∪_{s≥t} Cl_s – upward union of classes, t = 2,...,m („at least” class Cl_t)
Cl_t^≤ = ∪_{s≤t} Cl_s – downward union of classes, t = 1,...,m-1 („at most” class Cl_t)
Cl_t^≥ and Cl_t^≤ are positive and negative dominance cones in X_D, with D reduced to the single dimension d

42
42 Dominance-based Rough Set Approach (DRSA)
Granular computing with dominance cones
Granules of knowledge are open sets in condition space X_P (P ⊆ C):
D_P^+(x) = {y ∈ U: y D_P x} : P-dominating set
D_P^-(x) = {y ∈ U: x D_P y} : P-dominated set
P-dominating and P-dominated sets are positive and negative dominance cones in X_P
Sorting patterns (preference model) to be discovered are functions representing granules I_d(x) by granules D_P^+(x), D_P^-(x)
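The dominance cones D_P^+(x) and D_P^-(x) can be sketched as follows (an illustrative sketch assuming gain-type criteria with grades coded as integers; the data is invented):

```python
# DRSA dominance cones: who dominates x, and whom does x dominate.

def dominates(x, y, crits):
    """x P-dominates y: x at least as good as y on every criterion."""
    return all(x[q] >= y[q] for q in crits)

def dominating_set(objects, crits, x):      # D_P^+(x)
    return {o for o, d in objects.items() if dominates(d, objects[x], crits)}

def dominated_set(objects, crits, x):       # D_P^-(x)
    return {o for o, d in objects.items() if dominates(objects[x], d, crits)}

# grades coded: bad = 0, medium = 1, good = 2
objects = {
    "S1": {"M": 2, "L": 0},
    "S2": {"M": 1, "L": 0},
    "S3": {"M": 1, "L": 1},
}
print(dominating_set(objects, ["M", "L"], "S2"))   # all three dominate S2
print(dominated_set(objects, ["M", "L"], "S2"))    # S2 dominates only itself
```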

43
43 DRSA – illustration of formal definitions
Example: the same firms (Investments, Sales → Effectiveness) as on slide 26

44
44 DRSA – illustration of formal definitions
Objects in condition criteria space [plot over criterion 1 (Investment) and criterion 2 (Sales)]

45
45 DRSA – illustration of formal definitions
Granular computing with dominance cones [plot of the cones around an object x in criteria space (c1, c2)]

46
46 DRSA – illustration of formal definitions
Granular computing with dominance cones [plot in criteria space (c1, c2)]

47
47 DRSA – illustration of formal definitions
Lower approximation of the upward union of class High [plot]

48
48 DRSA – illustration of formal definitions
Upper approximation and the boundary of the upward union of class High [plot]

49
49 DRSA – illustration of formal definitions
Lower = upper approximation of the upward union of class Medium [plot]

50
50 DRSA – illustration of formal definitions
Lower = upper approximation of the downward union of class Low [plot]

51
51 DRSA – illustration of formal definitions
Lower approximation of the downward union of class Medium [plot]

52
52 DRSA – illustration of formal definitions
Upper approximation and the boundary of the downward union of class Medium [plot]

53
Dominance-based Rough Set Approach vs. Classical RSA
Comparison of CRSA and DRSA [side-by-side plots in attribute space (a1, a2) and criteria space (c1, c2)]

54
54 Variable-Consistency DRSA (VC-DRSA)
Variable-Consistency DRSA for consistency level l ∈ (0,1], e.g. l = 0.8 [plot]
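A sketch of the VC-DRSA idea for upward unions (assumed consistency test: x enters the lower approximation if at least a fraction l of the objects dominating x belong to the union; the names and data are invented):

```python
# VC-DRSA sketch: relaxed lower approximation of an upward union of classes.

def vc_lower_upward(objects, classes, crits, good, l):
    """objects: id -> criteria dict; classes: id -> class label;
    good: set of labels forming the upward union Cl_t>=."""
    lower = set()
    for x, dx in objects.items():
        if classes[x] not in good:
            continue
        dom = [o for o, d in objects.items()
               if all(d[q] >= dx[q] for q in crits)]   # D_P^+(x)
        ratio = sum(classes[o] in good for o in dom) / len(dom)
        if ratio >= l:                                  # consistency level
            lower.add(x)
    return lower

objects = {"A": {"q": 3}, "B": {"q": 2}, "C": {"q": 2}, "D": {"q": 1}}
classes = {"A": "good", "B": "good", "C": "bad", "D": "bad"}
print(vc_lower_upward(objects, classes, ["q"], {"good"}, 1.0))   # strict DRSA
print(vc_lower_upward(objects, classes, ["q"], {"good"}, 0.6))   # relaxed
```

At l = 1.0 this reduces to the ordinary DRSA lower approximation; lowering l admits objects that are only mostly consistent.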

55
55 DRSA – formal definitions
Basic properties of rough approximations, for t = 2,…,m
Identity of boundaries, for t = 2,…,m
Quality of approximation of sorting Cl = {Cl_t, t = 1,...,m} by criteria P ⊆ C
Cl-reducts and Cl-core of P ⊆ C


57
57 DRSA – induction of decision rules from rough approximations
certain D≥-decision rules, supported by objects without ambiguity:
if x_q1 ≿_q1 r_q1 and x_q2 ≿_q2 r_q2 and … x_qp ≿_qp r_qp, then x ∈ Cl_t^≥
possible D≥-decision rules, supported by objects with or without ambiguity:
if x_q1 ≿_q1 r_q1 and x_q2 ≿_q2 r_q2 and … x_qp ≿_qp r_qp, then x possibly ∈ Cl_t^≥
approximate D≥≤-decision rules, supported by objects from Cl_s ∪ Cl_s+1 ∪ … ∪ Cl_t without possibility of discerning to which class they belong:
if x_q1 ≿_q1 r_q1 and ... x_qk ≿_qk r_qk and x_qk+1 ≾_qk+1 r_qk+1 and ... x_qp ≾_qp r_qp, then x ∈ Cl_s ∪ Cl_s+1 ∪ … ∪ Cl_t

58
58 DRSA – decision rules
Certain D≥-decision rule for the class High [plot; ● – base object]
If f(x,c1) ≥ 35, then x belongs to class „High”

59
59 DRSA – decision rules
Possible D≥-decision rule for the class High [plot; ● – base object]
If f(x,c1) ≥ 22.5 & f(x,c2) ≥ 20, then x possibly belongs to class „High”

60
60 DRSA – decision rules
Approximate D≥≤-decision rule for the class Medium or High [plot; ● – base objects]
If f(x,c1) ∈ [22.5, 27] & f(x,c2) ∈ [20, 25], then x belongs to class „Medium” or „High”

61
61 Measures characterizing decision rules in system S = ⟨U, C, D⟩
Support of a rule Φ → Ψ: supp(Φ,Ψ) = card({x ∈ U: x satisfies Φ and Ψ})
Strength of a rule: σ(Φ,Ψ) = supp(Φ,Ψ) / card(U)
Certainty (or confidence) factor of a rule (Łukasiewicz, 1913): cer(Φ,Ψ) = supp(Φ,Ψ) / card({x ∈ U: x satisfies Φ})
Coverage factor of a rule: cov(Φ,Ψ) = supp(Φ,Ψ) / card({x ∈ U: x satisfies Ψ})
Relation between certainty, coverage and Bayes theorem: cer(Φ,Ψ) = Pr(Ψ|Φ) and cov(Φ,Ψ) = Pr(Φ|Ψ), linked by Pr(Ψ|Φ) = Pr(Φ|Ψ)·Pr(Ψ)/Pr(Φ)
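These four measures can be sketched directly from their frequency definitions (an illustrative sketch with invented data; `premise` and `conclusion` are hypothetical predicates standing for Φ and Ψ):

```python
# Rule quality measures: support, strength, certainty, coverage.

def rule_measures(objects, premise, conclusion):
    U = len(objects)
    p = {o for o, d in objects.items() if premise(d)}       # satisfies phi
    c = {o for o, d in objects.items() if conclusion(d)}    # satisfies psi
    supp = len(p & c)
    return {
        "support": supp,
        "strength": supp / U,
        "certainty": supp / len(p),   # Pr(psi | phi)
        "coverage": supp / len(c),    # Pr(phi | psi)
    }

# grades coded: bad = 0, medium = 1, good = 2
objects = {
    1: {"M": 2, "cls": "good"},
    2: {"M": 2, "cls": "medium"},
    3: {"M": 1, "cls": "medium"},
    4: {"M": 0, "cls": "bad"},
}
m = rule_measures(objects,
                  premise=lambda d: d["M"] >= 2,           # "if M >= good"
                  conclusion=lambda d: d["cls"] != "bad")  # "then not bad"
print(m)
```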

62
62 Measures characterizing decision rules in system S = ⟨U, C, D⟩
For a given decision table S, the probability (frequency) is calculated as relative support in S
With respect to data from decision table S, the concepts of a priori & a posteriori probability are no longer meaningful
Bayesian confirmation measure adapted to decision rules: c(Φ,Ψ) > 0 iff „Ψ is verified more often when Φ is verified, rather than when Φ is not verified”
Monotonicity: c(Φ,Ψ) = F[supp_S(Φ,Ψ), supp_S(¬Φ,Ψ), supp_S(Φ,¬Ψ), supp_S(¬Φ,¬Ψ)] is a function non-decreasing with respect to supp_S(Φ,Ψ) & supp_S(¬Φ,¬Ψ) and non-increasing with respect to supp_S(¬Φ,Ψ) & supp_S(Φ,¬Ψ)
S. Greco, Z. Pawlak, R. Słowiński: Can Bayesian confirmation measures be useful for rough set decision rules? Engineering Applications of Artificial Intelligence (2004)

63
63 DRSA – decision rules
Certain D≥-decision rules for at least Medium [plot]

64
64 DRSA – decision rules
Decision rule with a hyperplane for at least Medium [plot]
If f(x,c1) ≥ 9.75 and f(x,c2) ≥ 9.5 and 1.5·f(x,c1) + 1.0·f(x,c2) ≥ 22.5, then x is at least Medium

65
65 DRSA – application of decision rules
Application of decision rules: the recommendation for an object x is obtained as the „intersection” of the conclusions of all rules matching x
Final assignment: the resulting (union of) classes
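A minimal sketch of this "intersection" scheme, assuming each matched rule contributes an interval of possible classes (classes coded 0 = bad, 1 = medium, 2 = good; the rules are patterned after the student example in this deck):

```python
# Intersecting the class intervals of all rules matched by an object.

def classify(rules, obj):
    """Each rule is (premise, (lo, hi)); intersect intervals of matched rules."""
    lo, hi = 0, 2                        # initially every class is possible
    for premise, (rlo, rhi) in rules:
        if premise(obj):
            lo, hi = max(lo, rlo), min(hi, rhi)
    return lo, hi                        # final assignment: classes lo..hi

# M = Mathematics, L = Literature, grades coded 0/1/2
rules = [
    (lambda d: d["M"] >= 2 and d["L"] >= 1, (2, 2)),   # at least good
    (lambda d: d["M"] >= 1 and d["L"] >= 1, (1, 2)),   # at least medium
    (lambda d: d["M"] <= 1, (0, 1)),                   # at most medium
]
print(classify(rules, {"M": 1, "L": 1}))   # matches rules 2 and 3
print(classify(rules, {"M": 2, "L": 1}))   # matches rules 1 and 2
```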

66
66 Rough Set approach to multiple-criteria sorting
Example of preference information about students:

Student | Mathematics (M) | Physics (Ph) | Literature (L) | Overall class
S1 | good | medium | bad | bad
S2 | medium | medium | bad | medium
S3 | medium | medium | medium | medium
S4 | good | good | medium | good
S5 | good | medium | good | good
S6 | good | good | good | good
S7 | bad | bad | bad | bad
S8 | bad | bad | medium | bad

Examples of classification of S1 and S2 are inconsistent

67
67 Rough Set approach to multiple-criteria sorting
If we eliminate Literature, then more inconsistencies appear:
Examples of classification of S1, S2, S3 and S5 are inconsistent (student table as above)

68
68 Rough Set approach to multiple-criteria sorting
Elimination of Mathematics does not increase inconsistencies:
Subset of criteria {Ph,L} is a reduct of {M,Ph,L} (student table as above)

69
69 Rough Set approach to multiple-criteria sorting
Elimination of Physics also does not increase inconsistencies:
Subset of criteria {M,L} is a reduct of {M,Ph,L}
Intersection of reducts {M,L} and {Ph,L} gives the core {L} (student table as above)

70
70 Rough Set approach to multiple-criteria sorting
Let us represent the students in condition space {M,L} [plot: S1(good,bad), S2(medium,bad), S3(medium,medium), S4(good,medium), S5,S6(good,good), S7(bad,bad), S8(bad,medium)]

71
71 Rough Set approach to multiple-criteria sorting
Dominance cones in condition space {M,L}: P = {M,L} [plot as on slide 70]


74
74 Rough Set approach to multiple-criteria sorting
Lower approximation of at least good students: P = {M,L} [plot]

75
75 Rough Set approach to multiple-criteria sorting
Upper approximation of at least good students: P = {M,L} [plot]

76
76 Rough Set approach to multiple-criteria sorting
Lower approximation of at least medium students: P = {M,L} [plot]

77
77 Rough Set approach to multiple-criteria sorting
Upper approximation of at least medium students: P = {M,L} [plot]

78
78 Rough Set approach to multiple-criteria sorting
Boundary region of at least medium students: P = {M,L} [plot]

79
79 Rough Set approach to multiple-criteria sorting
Lower approximation of at most medium students: P = {M,L} [plot]

80
80 Rough Set approach to multiple-criteria sorting
Upper approximation of at most medium students: P = {M,L} [plot]

81
81 Rough Set approach to multiple-criteria sorting
Lower approximation of at most bad students: P = {M,L} [plot]

82
82 Rough Set approach to multiple-criteria sorting
Upper approximation of at most bad students: P = {M,L} [plot]

83
83 Rough Set approach to multiple-criteria sorting
Boundary region of at most bad students: P = {M,L} [plot]

84
84 Rough Set approach to multiple-criteria sorting
Decision rules in terms of {M,L}: D≥ – certain rule [plot]
If M ≥ good & L ≥ medium, then student ≥ good

85
85 Rough Set approach to multiple-criteria sorting
Decision rules in terms of {M,L}: D≥ – certain rule [plot]
If M ≥ medium & L ≥ medium, then student ≥ medium

86
86 Rough Set approach to multiple-criteria sorting
Decision rules in terms of {M,L}: D≥≤ – approximate rule [plot]
If M ≥ medium & L ≤ bad, then student is bad or medium

87
87 Rough Set approach to multiple-criteria sorting
Decision rules in terms of {M,L}: D≤ – certain rule [plot]
If M ≤ medium, then student ≤ medium

88
88 Rough Set approach to multiple-criteria sorting
Decision rules in terms of {M,L}: D≤ – certain rule [plot]
If L ≤ bad, then student ≤ medium

89
89 Rough Set approach to multiple-criteria sorting
Decision rules in terms of {M,L}: D≤ – certain rule [plot]
If M ≤ bad, then student ≤ bad

90
90 Set of decision rules in terms of {M,L} representing preferences:
If M ≥ good & L ≥ medium, then student ≥ good {S4,S5,S6}
If M ≥ medium & L ≥ medium, then student ≥ medium {S3,S4,S5,S6}
If M ≥ medium & L ≤ bad, then student is bad or medium {S1,S2}
If M ≤ medium, then student ≤ medium {S2,S3,S7,S8}
If L ≤ bad, then student ≤ medium {S1,S2,S7}
If M ≤ bad, then student ≤ bad {S7,S8}

91
91 Set of decision rules in terms of {M,Ph,L} representing preferences:
If M ≥ good & L ≥ medium, then student ≥ good {S4,S5,S6}
If M ≥ medium & L ≥ medium, then student ≥ medium {S3,S4,S5,S6}
If M ≥ medium & L ≤ bad, then student is bad or medium {S1,S2}
If Ph ≤ medium & L ≤ medium, then student ≤ medium {S1,S2,S3,S7,S8}
If M ≤ bad, then student ≤ bad {S7,S8}
The preference model involving all three criteria is more concise

92
92 Rough Set approach to multiple-criteria sorting
Importance and interaction among criteria
Quality of approximation of sorting γ_P(Cl) (P ⊆ C) is a fuzzy measure with the property of a Choquet capacity (γ_∅(Cl) = 0, and γ_R(Cl) ≤ γ_P(Cl) ≤ γ_C(Cl) for any R ⊆ P ⊆ C)
Such a measure can be used to calculate the Shapley value or Banzhaf index, i.e. an average „contribution” of criterion q in all coalitions of criteria, q ∈ {1,…,m}
Fuzzy measure theory permits, moreover, to calculate interaction indices (Murofushi & Soneda, Grabisch or Roubens) for pairs (or larger subsets) of criteria, i.e. an average „added value” resulting from putting together q and q’ in all coalitions of criteria, q, q’ ∈ {1,…,m}

93
93 Rough Set approach to multiple-criteria sorting
Quality of approximation of sorting students: γ_C(Cl) = [8 - card({S1,S2})]/8 = 0.75
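The value 0.75 can be reproduced by checking dominance inconsistencies directly (a sketch; grades coded bad = 0, medium = 1, good = 2, using the student table of this example):

```python
# Quality of approximation of sorting: an object is dominance-inconsistent
# if it dominates an object assigned to a better class; gamma = consistent/|U|.

students = {
    "S1": (2, 1, 0, 0), "S2": (1, 1, 0, 1), "S3": (1, 1, 1, 1),
    "S4": (2, 2, 1, 2), "S5": (2, 1, 2, 2), "S6": (2, 2, 2, 2),
    "S7": (0, 0, 0, 0), "S8": (0, 0, 1, 0),
}   # (Mathematics, Physics, Literature, Overall class)

def quality(data):
    incons = set()
    for x, vx in data.items():
        for y, vy in data.items():
            dominates = all(a >= b for a, b in zip(vx[:3], vy[:3]))
            if dominates and vx[3] < vy[3]:   # dominates, yet worse class
                incons |= {x, y}
    return (len(data) - len(incons)) / len(data), incons

gamma, incons = quality(students)
print(gamma, incons)   # gamma = 0.75; inconsistent objects: S1, S2
```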

94
94 DRSA – preference modeling by decision rules
A set of (D≥, D≤, D≥≤)-rules induced from rough approximations represents a preference model of a Decision Maker
Traditional preference models: utility function (e.g. additive, multiplicative, associative, Choquet integral, Sugeno integral), binary relation (e.g. outranking relation, fuzzy relation)
The decision rule model is the most general model of preferences: a general utility function, Sugeno or Choquet integral, or outranking relation exists if and only if there exists the decision rule model
Słowiński, R., Greco, S., Matarazzo, B.: “Axiomatization of utility, outranking and decision-rule preference models for multiple-criteria classification problems under partial inconsistency with the dominance principle”, Control and Cybernetics, 31 (2002) no. 4

95
95 DRSA – preference modeling by decision rules
Representation axiom (cancellation property): for every dimension i = 1,…,n, for every evaluation x_i, y_i ∈ X_i and a_-i, b_-i ∈ X_-i, and for every pair of decision classes Cl_r, Cl_s ∈ {Cl_1,…,Cl_m}:
[(x_i, a_-i) ∈ Cl_r and (y_i, b_-i) ∈ Cl_s] ⇒ [(y_i, a_-i) ∈ Cl_r^≥ or (x_i, b_-i) ∈ Cl_s^≥]
The above axiom constitutes a minimal condition that makes the weak preference relation ≿_i a complete preorder
This axiom does not require pre-definition of criteria scales g_i, nor of the dominance relation, in order to derive 3 preference models: general utility function, outranking relation, set of decision rules D≥ or D≤
Greco, S., Matarazzo, B., Słowiński, R.: Conjoint measurement and rough set approach for multicriteria sorting problems in presence of ordinal criteria. [In]: A. Colorni, M. Paruccini, B. Roy (eds.), A-MCD-A: Aide Multi Critère à la Décision – Multiple Criteria Decision Aiding, European Commission Report, Joint Research Centre, Ispra, 2001

96
96 Comparison of decision rule preference model and utility function
Value-driven methods: the preference model is a utility function U and a set of thresholds z_t, t = 1,…,p-1, on U, separating the decision classes Cl_t, t = 1,…,p
A value of the utility function U[g_1(a),…,g_n(a)] is calculated for each action a ∈ A and compared with the thresholds z_1,…,z_p-1
e.g. a ∈ Cl_2, d ∈ Cl_p-1 [figure]

97
97 Comparison of decision rule preference model and outranking relation
ELECTRE TRI: decision classes Cl_t are characterized by limit profiles b_t, t = 0,1,…,p
The preference model, i.e. the outranking relation S, is constructed for each couple (a, b_t), for every a ∈ A and b_t, t = 0,1,…,p
[figure: actions a, d and profiles b_0,…,b_p on criteria g_1,…,g_n]

98
98 Comparison of decision rule preference model and outranking relation
ELECTRE TRI: decision classes Cl_t are characterized by limit profiles b_t, t = 0,1,…,p
Compare action a successively to each profile b_t, t = p-1,…,1,0; if b_t is the first profile such that aSb_t, then a ∈ Cl_t+1
e.g. a ∈ Cl_1, d ∈ Cl_p-1 [figure: comparison of action a to profiles b_t]

99
99 Comparison of decision rule preference model and outranking relation
Rule-based classification: the preference model is a set of decision rules for upward unions Cl_t^≥, t = 2,…,p
A decision rule compares an action profile to a partial profile using a dominance relation
e.g. a ∈ Cl_2^≥, because the profile of a dominates the partial profiles of rules r_2 and r_3 [figure: partial profiles r_1, r_2, r_3 on criteria g_1,…,g_n]

100
100 An impact of DRSA on Knowledge Discovery from databases
Example of diagnostic data: 76 buses (objects), 8 symptoms (attributes)
Decision = technical state: 3 – good state (in use), 2 – minor repair, 1 – major repair (out of use)
Discover patterns = find relationships between symptoms and the technical state
Patterns explain expert’s decisions and support diagnosis of new buses

101
101 DRSA – example of technical diagnostics 76 vehicles (objects) 8 attributes with preference scale decision = technical state: 3 – good state (in use) 2 – minor repair 1 – major repair (out of use) all criteria are semantically correlated with the decision inconsistent objects: 11, 12, 39

[slides 102–104 contain only images of the diagnostic data table and induced rules]

105
105 DRSA for multiple-criteria choice and ranking – Pairwise Comparison Table (PCT)
Input (preference) information: reference actions x, y, u,…, z evaluated on criteria q_1, q_2,..., q_m, together with their rank order (1st, 2nd, 3rd,…, k-th)
For each pair of reference actions, the PCT stores the differences of evaluations Δ_1, Δ_2,…, Δ_m on the criteria and the preference relation stated by the DM: S – outranking (e.g. xSy, ySu) or S^c – non-outranking (e.g. yS^c x, vS^c z)

106
106 DRSA for multiple-criteria choice and ranking
Dominance for pairs of actions (x,y), (w,z) ∈ U×U
For a cardinal criterion q ∈ C: (x,y) D_q (w,z) if x is preferred to y in grade h_q and w is preferred to z in grade k_q, where h_q ≥ k_q
For an ordinal criterion q ∈ C: the definition is analogous, based directly on the ordinal evaluations
For a subset P ⊆ C of criteria, the P-dominance relation on pairs of actions: (x,y) D_P (w,z) if (x,y) D_q (w,z) for all q ∈ P, i.e., if x is preferred to y at least as much as w is preferred to z for all q ∈ P
D_q is reflexive, transitive, but not necessarily complete (partial preorder); D_P is a partial preorder on A×A
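For cardinal criteria, the pair-dominance test can be sketched as a comparison of evaluation differences (an illustrative sketch; the criteria names and values are invented):

```python
# Pair dominance in a PCT: (x,y) D_P (w,z) iff on every criterion the
# advantage of x over y is at least the advantage of w over z.

def pair_dominates(x, y, w, z, crits):
    return all(x[q] - y[q] >= w[q] - z[q] for q in crits)

a = {"q1": 5.0, "q2": 3.0}
b = {"q1": 2.0, "q2": 1.0}
c = {"q1": 4.0, "q2": 2.5}
d = {"q1": 3.0, "q2": 2.0}
print(pair_dominates(a, b, c, d, ["q1", "q2"]))   # 3 >= 1 and 2 >= 0.5
print(pair_dominates(c, d, a, b, ["q1", "q2"]))   # fails on q1: 1 >= 3
```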

107
107 DRSA for multiple-criteria choice and ranking
Let B ⊆ U×U be a set of pairs of reference objects in a given PCT
Granules of knowledge: positive and negative dominance cones of a pair of objects

108
108 DRSA for multiple-criteria choice and ranking – formal definitions
P-lower and P-upper approximations of the outranking relation S (P ⊆ C)
P-lower and P-upper approximations of the non-outranking relation S^c
P-boundaries of S and S^c

109
109 DRSA for multiple-criteria choice and ranking – formal definitions
Basic properties
Quality of approximation of S and S^c
(S,S^c)-reduct and (S,S^c)-core
Variable-consistency rough approximations of S and S^c (consistency level l ∈ (0,1])

110
110 DRSA for multiple-criteria choice and ranking – decision rules
Decision rules (for criteria with cardinal scales):
D≥-decision rules: if Δ_q1(x,y) ≥ d_q1 and Δ_q2(x,y) ≥ d_q2 and ..., then xSy
D≤-decision rules: if Δ_q1(x,y) ≤ d_q1 and Δ_q2(x,y) ≤ d_q2 and ..., then xS^c y
D≥≤-decision rules: if Δ_q1(x,y) ≥ d_q1 and ... and Δ_qp(x,y) ≤ d_qp, then xSy or xS^c y

111
111 Example of application of DRSA
Acquiring reference objects, e.g. x: q_1 = 2.3, q_2 = 18.0, …, q_m = 3.0; y: q_1 = 1.2, q_2 = 19.0, …, q_m = 5.7; u: q_1 = 4.7, q_2 = 14.0, …, q_m = 7.1; z: q_1 = 0.5, q_2 = 12.0, …, q_m = 9.0
Preference information on reference objects: making a ranking or pairwise comparison of the objects (xSy) or (xS^c y)
Building the Pairwise Comparison Table (PCT), e.g.:
(x,y): Δ_1(x,y) = -1.1, …, Δ_n(x,y) = 2.7, xS^c y
(y,z): Δ_1(y,z) = -2.4, …, Δ_n(y,z) = -4.1, ySz
(y,u): Δ_1(y,u) = 1.8, …, Δ_n(y,u) = -6.0, ySu
(u,z): Δ_1(u,z) = -4.2, …, Δ_n(u,z) = 1.9, uS^c z
Inducing rules from rough approximations of relations S and S^c

112 Induction of decision rules from rough approximations of S and S^c

if Δ_q1(a,b) ≥ -2.4 & Δ_q2(a,b) ≥ 4.0, then aSb
if Δ_q1(a,b) ≤ -1.1 & Δ_q2(a,b) ≤ 1.0, then aS^c b
if Δ_q1(a,b) ≤ 2.4 & Δ_q2(a,b) ≤ -4.0, then aS^c b
if Δ_q1(a,b) ≥ 4.1 & Δ_q2(a,b) ≥ 1.9, then aSb

[Figure: the rules shown as regions in the (Δ_q1, Δ_q2) plane, covering pairs such as y S x, y S z, y S u, u S z and x S^c y, z S^c y, u S^c y, u S^c z, t S^c q]
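The four induced rules can be encoded directly as threshold conditions on the evaluation differences; the encoding below (`RULES`, `conclusions`) is an illustrative sketch, not part of any DRSA tool:

```python
# The four example rules: ('>='-conditions conclude S, '<='-conditions S^c).
RULES = [
    ({"q1": (">=", -2.4), "q2": (">=", 4.0)}, "S"),
    ({"q1": ("<=", -1.1), "q2": ("<=", 1.0)}, "Sc"),
    ({"q1": ("<=", 2.4),  "q2": ("<=", -4.0)}, "Sc"),
    ({"q1": (">=", 4.1),  "q2": (">=", 1.9)}, "S"),
]

def conclusions(deltas):
    """Conclusions of all rules matched by a pair with the given differences."""
    ops = {">=": lambda v, t: v >= t, "<=": lambda v, t: v <= t}
    return {concl for conds, concl in RULES
            if all(ops[op](deltas[q], t) for q, (op, t) in conds.items())}

print(conclusions({"q1": 0.0, "q2": 5.0}))   # {'S'}: only the first rule fires
print(conclusions({"q1": 3.0, "q2": 1.5}))   # set(): no rule fires for this pair
```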

113 Application of decision rules for decision support

Application of decision rules (preference model) to the whole set A induces a specific preference structure on A

Any pair of objects (x,y) ∈ A×A can match the decision rules in one of four ways:
xSy and not xS^c y, that is, true outranking (xS^T y)
xS^c y and not xSy, that is, false outranking (xS^F y)
xSy and xS^c y, that is, contradictory outranking (xS^K y)
not xSy and not xS^c y, that is, unknown outranking (xS^U y)

The 4-valued outranking underlines the presence or absence of positive and negative reasons for outranking
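The four cases above amount to a simple mapping from the set of matched conclusions to the 4-valued outranking; a minimal sketch:

```python
def outranking_status(concls):
    """Map the rule conclusions for a pair (x,y) to the 4-valued outranking."""
    s, sc = "S" in concls, "Sc" in concls
    if s and not sc:
        return "S^T"   # true outranking: positive reasons only
    if sc and not s:
        return "S^F"   # false outranking: negative reasons only
    if s and sc:
        return "S^K"   # contradictory outranking: both kinds of reasons
    return "S^U"       # unknown outranking: no reasons at all

print(outranking_status({"S"}))   # S^T
print(outranking_status(set()))   # S^U
```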

114 DRSA for multicriteria choice and ranking – Net Flow Score

NFS(x) = strength(x) – weakness(x)

S provides a positive argument, S^c a negative argument; the four combinations (+,+), (+,–), (–,+), (–,–) record the presence of arguments for the strength and for the weakness of x

[Figure: arguments building the strength and the weakness of x with respect to objects y, z, u, v]

115 DRSA for multiple-criteria choice and ranking – final recommendation

Exploitation of the preference structure by the Net Flow Score procedure: for each action x ∈ A, NFS(x) = strength(x) – weakness(x)

Final recommendation:
ranking: complete preorder determined by NFS(x) on A
best choice: action(s) x* ∈ A such that NFS(x*) = max over x ∈ A of NFS(x)

116 DRSA for multiple-criteria choice and ranking – example Decision table with reference objects (warehouses)

117 DRSA for multicriteria choice and ranking – example Pairwise comparison table (PCT)

118 DRSA for multicriteria choice and ranking – example

Quality of approximation of S and S^c by criteria from set C is 0.44

RED_PCT = CORE_PCT = {A1, A2, A3}

D≥-decision rules and D≤-decision rules

119 DRSA for multicriteria choice and ranking – example

D≥≤-decision rules

120 DRSA for multicriteria choice and ranking – example

Ranking of warehouses for sale by decision rules and the NFS procedure

Final ranking: (2',6') ≻ (8') ≻ (9') ≻ (1') ≻ (4') ≻ (5') ≻ (3') ≻ (7',10')

Best choice: select warehouses 2' and 6', having the maximum score (11)

121 Other extensions of DRSA

DRSA for choice and ranking with graded preference relations
DRSA for choice and ranking with Lorenz dominance relation
DRSA for decision with multiple decision makers
DRSA with missing values of attributes and criteria
DRSA for hierarchical decision making
Fuzzy set extensions of DRSA
DRSA for decision under risk
Discovering association rules in preference-ordered data sets
Building a strategy of efficient intervention based on decision rules

122 Conclusions

Preference information is given in terms of decision examples on reference actions

The preference model induced from rough approximations of unions of decision classes (or of the preference relations S and S^c) is expressed in a natural and comprehensible language of "if..., then..." decision rules

Decision rules do not convert ordinal information into numeric information and can represent inconsistent preferences

Heterogeneous information (attributes, criteria) and scales of preference (ordinal, cardinal) can be processed within DRSA

DRSA supplies useful elements of knowledge about the decision situation:
certain and doubtful knowledge distinguished by lower and upper approximations
relevance of particular attributes or criteria and information about their interaction
reducts of attributes or criteria conveying important knowledge contained in data
the core of indispensable attributes and criteria

Decision rules can be used both for explanation of past decisions and for decision support

123 Free software available

ROSE – ROugh Set data Explorer

4eMka – Dominance-based Rough Set Approach to Multicriteria Classification

JAMM – A New Decision Support Tool for Analysis and Solving of Multicriteria Classification Problems
