
1 Rule Selection as Submodular Function
Wentao Ding

2 Rule Selection Problem
The objective is a combination of two weighted coverage functions:

maximize H(Pos(Y), Neg(Y))
s.t. ∑_{i : t_j ∈ R_i} x_i ≥ y_j,  y_j ∈ {0, 1},  x_i ∈ {0, 1}

(Figure: rules R1–R4 covering a set of positive and negative examples.)
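As an illustration (not part of the original slides), the following Python sketch sets up the two weighted coverage functions on a hypothetical toy instance; the rule-to-tuple mapping and the labels are invented for the example.

```python
# A minimal sketch (assumed toy data, not from the slides) of the two weighted
# coverage functions behind rule selection: Pos(Y) counts positive tuples
# covered by the chosen rules, Neg(Y) the negative ones.

rules = {            # hypothetical rule -> covered-tuple mapping
    "R1": {1, 2, 3},
    "R2": {3, 4},
    "R3": {4, 5, 6},
    "R4": {6, 7},
}
positives = {1, 2, 3, 4}   # assumed ground-truth labels
negatives = {5, 6, 7}

def covered(selected):
    """Union of tuples covered by the selected rules (a coverage function)."""
    out = set()
    for r in selected:
        out |= rules[r]
    return out

def pos(selected):
    """Pos(Y): positive tuples covered."""
    return len(covered(selected) & positives)

def neg(selected):
    """Neg(Y): negative tuples covered."""
    return len(covered(selected) & negatives)

print(pos({"R1", "R2"}), neg({"R1", "R2"}))   # 4 0
```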

3 Submodular Function
Def 1: ∀A ⊆ B, ∀x ∉ B: f(A ∪ {x}) − f(A) ≥ f(B ∪ {x}) − f(B)
Def 2: ∀A, B: f(A) + f(B) ≥ f(A ∪ B) + f(A ∩ B)
A coverage function is a monotone submodular function.
Maximizing a submodular function is usually NP-hard; there is a ½-approximation if the function is symmetric and an (e−1)/e-approximation if it is monotone.
Combinations of submodular functions: closed under non-negative linear combinations.
Difference of submodular functions: inapproximable.
Ratio of submodular functions: depends.
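To make the greedy guarantee concrete, here is a minimal Python sketch of the classical greedy algorithm for maximizing a monotone submodular function under a cardinality constraint, the usual setting of the (e−1)/e bound; the constraint k and the use of the toy coverage function are assumptions for illustration.

```python
# Sketch of the classic greedy for maximizing a monotone submodular function
# under a cardinality constraint (the setting of the (e-1)/e guarantee).
# `f` is assumed monotone submodular, e.g. the coverage function `pos` above.

def greedy_max(f, ground_set, k):
    selected = set()
    for _ in range(k):
        best, best_gain = None, 0.0
        for v in ground_set - selected:
            gain = f(selected | {v}) - f(selected)   # marginal gain f(v | S)
            if gain > best_gain:
                best, best_gain = v, gain
        if best is None:                             # no positive gain left
            break
        selected.add(best)
    return selected

# e.g. greedy_max(pos, set(rules), k=2) on the toy instance above
```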

4 Rule Selection Problem
Let #positive / #all = C, Neg(Y) / #all = f(Y), Pos(Y) / #all = g(Y). Then
TP = g(Y), FP = f(Y), TN = 1 − C − f(Y), FN = C − g(Y)
Precision = TP / (TP + FP) = g(Y) / (f(Y) + g(Y))
Recall = TP / (TP + FN) = g(Y) / C
Accuracy = TP + TN = 1 − C + g(Y) − f(Y)
F1 = 2TP / (2TP + FP + FN) = 2g(Y) / (C + f(Y) + g(Y)) ≈ g(Y) / f(Y)
(Maximizing F1 is equivalent to minimizing (C + f(Y)) / g(Y), a ratio of submodular functions.)
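The identities above can be checked numerically; the sketch below reuses the toy pos/neg coverage functions from the slide-2 sketch (invented data, not from the slides).

```python
# Numerical check of the identities above, reusing the toy pos/neg coverage
# functions from the slide-2 sketch (invented data).

n_all = len(positives) + len(negatives)     # total number of tuples
C = len(positives) / n_all                  # C = #positive / #all

def metrics(selected):
    g = pos(selected) / n_all               # g(Y) = Pos(Y) / #all
    f = neg(selected) / n_all               # f(Y) = Neg(Y) / #all
    precision = g / (f + g) if f + g else 0.0
    recall = g / C
    accuracy = 1 - C + g - f                # TP + TN
    f1 = 2 * g / (C + f + g)
    return precision, recall, accuracy, f1

print(metrics({"R1", "R3"}))                # ~ (0.667, 1.0, 0.714, 0.8)
```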

5 Ratio of submodular functions
min N(Y) / P(Y): fixing a threshold C and requiring P(Y) ≥ C reduces it to min N(Y) / C, i.e. minimizing N(Y) subject to P(Y) ≥ C.
max P(Y) / N(Y): fixing a budget B and requiring N(Y) ≤ B reduces it to max P(Y) / B, i.e. maximizing P(Y) subject to N(Y) ≤ B.
These are instances of Submodular Cover (SCSC) and Submodular Knapsack (SCSK):
SCSC: min f(X) s.t. X ⊆ V, g(X) ≥ C (f, g submodular)
SCSK: max g(X) s.t. X ⊆ V, f(X) ≤ B (f, g submodular)
Both can be approximated by a ratio-based greedy method.
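A rough Python sketch of a ratio-based greedy in this spirit is shown below; the element-selection rule and the best-prefix step are illustrative assumptions, not necessarily the exact algorithm of the cited papers.

```python
# A rough sketch of a ratio-based greedy for minimizing f(Y)/g(Y) with f, g
# submodular (e.g. f = neg, g = pos from the earlier sketch). The selection
# rule and the best-prefix step are illustrative assumptions.

def greed_ratio(f, g, ground_set):
    selected, order = set(), []
    while True:
        candidates = [v for v in ground_set - selected
                      if g(selected | {v}) - g(selected) > 0]
        if not candidates:
            break
        # add the element with the smallest marginal cost per marginal gain
        v = min(candidates,
                key=lambda u: (f(selected | {u}) - f(selected)) /
                              (g(selected | {u}) - g(selected)))
        selected.add(v)
        order.append(v)
    # return the prefix of the greedy order with the best ratio
    best, best_ratio, prefix = set(), float("inf"), set()
    for v in order:
        prefix.add(v)
        ratio = f(prefix) / g(prefix)
        if ratio < best_ratio:
            best, best_ratio = set(prefix), ratio
    return best, best_ratio

# e.g. greed_ratio(neg, pos, set(rules)) on the toy instance above
```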

6 Curvature of submodular function
κ_f = 1 − min_{v ∈ V} f(v | V ∖ {v}) / f({v}) ∈ [0, 1], where f(v | A) = f(A ∪ {v}) − f(A).
Approximation ratio of the simple greedy algorithm:
f(Y_G) / g(Y_G) ≤ 1 / (1 − e^(κ_f − 1)) · f(Y*) / g(Y*)
(If κ_P = 0, there exists an ε-bounded scheme.)
(Example with rules R1–R4: κ_g = 1/2, κ_f = 1.)
A (1+ε)-approximation for RS minimization can be obtained in O(log(1/ε)) calls to the subroutine.
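The curvature definition can be evaluated by brute force on a small ground set; the helper below is a hypothetical sketch following the formula above, not code from the slides.

```python
# Brute-force evaluation of the total curvature from the definition above,
# kappa_f = 1 - min_v f(v | V \ {v}) / f({v}); a hypothetical helper that
# assumes f({v}) > 0 for every singleton.

def curvature(f, ground_set):
    V = set(ground_set)
    kappa = 0.0
    for v in V:
        gain_last = f(V) - f(V - {v})    # f(v | V \ {v})
        gain_first = f({v})              # f({v})
        kappa = max(kappa, 1 - gain_last / gain_first)
    return kappa

# e.g. curvature(pos, set(rules)) and curvature(neg, set(rules)) on the toy data
```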

7 Optimizing RS
Ellipsoid Approximation: O(n log n)-approx.
Related learning problems: submodular optimization, feature selection, data subset selection (document summarization), …

8 References
George L. Nemhauser, Laurence A. Wolsey, Marshall L. Fisher: An analysis of approximations for maximizing submodular set functions - I. Math. Program. 14(1), 1978.
Michele Conforti, Gérard Cornuéjols: Submodular set functions, matroids and the greedy algorithm: Tight worst-case bounds and some generalizations of the Rado-Edmonds theorem. Discrete Applied Mathematics 7(3), 1984.
Mukund Narasimhan, Jeff A. Bilmes: A Submodular-supermodular Procedure with Applications to Discriminative Structure Learning. UAI 2005.
Rishabh K. Iyer, Jeff A. Bilmes: Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints. NIPS 2013.
Rishabh K. Iyer, Jeff A. Bilmes: Algorithms for Approximate Minimization of the Difference Between Submodular Functions, with Applications. UAI 2012.
Wenruo Bai, Rishabh K. Iyer, Kai Wei, Jeff A. Bilmes: Algorithms for Optimizing the Ratio of Submodular Functions. ICML 2016.

9 Thanks for listening Q & A

