Submodular Set Function Maximization via the Multilinear Relaxation & Dependent Rounding Chandra Chekuri Univ. of Illinois, Urbana-Champaign.


Max weight independent set
N: a finite ground set; w : N → R+ weights on N
I ⊆ 2^N is an independence family of subsets
I is downward closed: A ∈ I and B ⊂ A ⇒ B ∈ I
max w(S) s.t. S ∈ I

Independence families
- stable sets in graphs
- matchings in graphs and hypergraphs
- matroids and intersections of matroids
- packing problems: feasible {0,1} solutions to Ax ≤ b where A is a non-negative matrix

Max weight independent set
- max weight stable set in graphs
- max weight matchings
- max weight independent set in a matroid
- max weight independent set in the intersection of two matroids
- max profit knapsack, etc.
max w(S) s.t. S ∈ I

This talk
f is a non-negative submodular set function on N
Motivation: several applications; mathematical interest
max f(S) s.t. S ∈ I

Submodular Set Functions
A function f : 2^N → R+ is submodular if
f(A+j) – f(A) ≥ f(B+j) – f(B) for all A ⊂ B, j ∈ N\B
Equivalently (decreasing marginal values):
f(A+j) – f(A) ≥ f(A+i+j) – f(A+i) for all A ⊆ N, i, j ∈ N\A

Submodular Set Functions
A function f : 2^N → R+ is submodular if
f(A+j) – f(A) ≥ f(B+j) – f(B) for all A ⊂ B, j ∈ N\B
f(A+j) – f(A) ≥ f(A+i+j) – f(A+i) for all A ⊆ N, i, j ∈ N\A
Equivalently: f(A) + f(B) ≥ f(A ∪ B) + f(A ∩ B) for all A, B ⊆ N
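On small ground sets, the equivalent lattice inequality can be checked by brute force. A minimal Python sketch; the 4-cycle cut function used here is an illustrative example, not one from the slides:

```python
from itertools import combinations

def is_submodular(f, ground):
    """Brute-force check of f(A) + f(B) >= f(A | B) + f(A & B) for all A, B."""
    subsets = [frozenset(c) for r in range(len(ground) + 1)
               for c in combinations(ground, r)]
    return all(f(A) + f(B) >= f(A | B) + f(A & B)
               for A in subsets for B in subsets)

# Cut function of a 4-cycle: f(S) = |delta(S)|, edges crossing the cut
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
def cut(S):
    return sum(1 for u, v in edges if (u in S) != (v in S))

print(is_submodular(cut, {0, 1, 2, 3}))  # True
```

The check is exponential in |N|, so it is only a sanity check; note that supermodular functions such as S ↦ |S|² fail it.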

Cut functions in graphs
G = (V,E) undirected graph
f : 2^V → R+ where f(S) = |δ(S)|

Coverage in Set Systems
X_1, X_2, ..., X_n subsets of a set U
f : 2^{1,2,...,n} → R+ where f(A) = |∪_{i ∈ A} X_i|
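The coverage function is easy to code directly; the small set system below is hypothetical, chosen only to show the decreasing marginal gains that make it submodular:

```python
# Hypothetical small set system (illustrative, not from the slides)
X = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d", "e"}}

def coverage(A):
    """f(A) = size of the union of the chosen sets X_i, i in A."""
    return len(set().union(*(X[i] for i in A))) if A else 0

print(coverage({1, 3}))  # 5: covers a, b, c, d, e
# Marginal gain of adding set 2 shrinks as the base set grows:
# coverage({1,2}) - coverage({1}) = 1, but coverage({1,2,3}) - coverage({1,3}) = 0
```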

Submodular Set Functions
Non-negative submodular set functions: f(A) ≥ 0 for all A ⇒ f(A) + f(B) ≥ f(A ∪ B) (sub-additive)
Monotone submodular set functions: f(∅) = 0 and f(A) ≤ f(B) for all A ⊆ B
Symmetric submodular set functions: f(A) = f(N\A) for all A

Other examples
- Cut functions in hypergraphs (symmetric, non-negative)
- Cut functions in directed graphs (non-negative)
- Rank functions of matroids (monotone)
- Generalizations of coverage in set systems (monotone)
- Entropy / mutual information of a set of random variables
- ...

Example: Max-Cut
f is the cut function of a given graph G = (V,E)
I = 2^V: unconstrained
NP-Hard
max f(S) s.t. S ∈ I

Example: Max k-Coverage
X_1, X_2, ..., X_n subsets of U and an integer k
N = {1,2,...,n}; f is the set coverage function (monotone)
I = { A ⊆ N : |A| ≤ k } (cardinality constraint)
NP-Hard
max f(S) s.t. S ∈ I

Approximation Algorithms
A is an approximation algorithm for a maximization problem if:
- A runs in polynomial time
- for all instances I of the problem, A(I) ≥ α OPT(I)
α (≤ 1) is the worst-case approximation ratio of A

Techniques
f is a non-negative submodular set function on N; max f(S) s.t. S ∈ I
- Greedy
- Local Search
- Multilinear relaxation and rounding

Greedy and Local-Search
[Nemhauser-Wolsey-Fisher'78, Fisher-Nemhauser-Wolsey'78]
Work well for "combinatorial" constraints: matroids, intersections of matroids, and generalizations
Recent work shows applicability to non-monotone functions [Feige-Mirrokni-Vondrak'07] [Lee-Mirrokni-Nagarajan-Sviridenko'08] [Lee-Sviridenko-Vondrak'09] [Gupta et al.'10]

Motivation for the mathematical programming approach
- Quest for optimal results
- Greedy/local search not so easy to adapt to packing constraints of the form Ax ≤ b
- Known advantages of geometric and continuous optimization methods and the polyhedral approach

Math. Programming approach
max w(S) s.t. S ∈ I  →  max w · x s.t. x ∈ P(I)
x_i ∈ [0,1]: indicator variable for i
Exact algorithm: P(I) = convexhull({ 1_S : S ∈ I })

Math. Programming approach
max w(S) s.t. S ∈ I  →  max w · x s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
Exact algorithm: P(I) = convexhull({ 1_S : S ∈ I })
Approx. algorithm: P(I) ⊇ convexhull({ 1_S : S ∈ I })
P(I) solvable: can do linear optimization over it

Math. Programming approach
max f(S) s.t. S ∈ I  →  max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
P(I) ⊇ convexhull({ 1_S : S ∈ I }) and solvable

Math. Programming approach
max f(S) s.t. S ∈ I  →  max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
- What is the continuous extension F?
- How do we optimize with objective F?
- How do we round?

Some results
[Calinescu-C-Pal-Vondrak'07] + [Vondrak'08] = [CCPV'09]
Theorem: There is a randomized (1-1/e)-approximation for maximizing a monotone f subject to any matroid constraint.
[C-Vondrak-Zenklusen'09]
Theorem: There is a (1-1/e-ε)-approximation for monotone f subject to a matroid and a constant number of packing/knapsack constraints.

What is special about 1-1/e?
- Greedy gives a (1-1/e)-approximation for the problem max { f(S) : |S| ≤ k } when f is monotone [NWF'78]
- Obtaining a (1-1/e+ε)-approximation requires exponentially many value queries to f [FNW'78]
- Unless P=NP, there is no (1-1/e+ε)-approximation for the special case of Max k-Coverage [Feige'98]
The new results give (1-1/e) for any matroid constraint, improving on 1/2. Moreover, the algorithm is interesting and the techniques have been quite useful.
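The greedy algorithm in the first bullet is short enough to sketch in full. A minimal Python version run on a toy max 2-coverage instance (the set system is illustrative, not from the talk):

```python
def greedy_max(f, ground, k):
    """Plain greedy: k times, add the element of largest marginal gain.
    For monotone submodular f this is the (1-1/e)-approximation of
    [NWF'78] for max { f(S) : |S| <= k }."""
    S = set()
    for _ in range(k):
        rest = ground - S
        if not rest:
            break
        gains = {e: f(S | {e}) - f(S) for e in rest}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        S.add(best)
    return S

# Max 2-coverage on a toy set system (hypothetical data)
X = {1: {"a", "b"}, 2: {"a", "b", "c"}, 3: {"d"}}
f = lambda A: len(set().union(*(X[i] for i in A))) if A else 0
S = greedy_max(f, set(X), 2)
print(f(S))  # 4: greedy picks X_2 first, then X_3
```

Each iteration makes O(n) value-oracle queries, so the whole run uses O(nk) queries.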

Submodular Welfare Problem
n items/goods (N) to be allocated to k players
Each player i has a submodular utility function f_i; f_i(A_i) is the utility to player i if A_i is allocated to i
Goal: maximize the welfare of the allocation, Σ_i f_i(A_i)
Can be reduced to a single f and a (partition) matroid constraint, and hence admits a (1-1/e)-approximation

Some more results
[C-Vondrak-Zenklusen'11]
- Extend the approach to non-monotone f
- Rounding framework via contention resolution schemes
- Several results from the framework, including the ability to handle intersections of different types of constraints

Math. Programming approach
max f(S) s.t. S ∈ I  →  max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
- What is the continuous extension F?
- How do we optimize with objective F?
- How do we round?

Multilinear extension of f
[CCPV'07], inspired by [Ageev-Sviridenko]
For f : 2^N → R+ define F : [0,1]^N → R+ as follows:
for x = (x_1, x_2, ..., x_n) ∈ [0,1]^N, let R be a random set that includes each i independently with probability x_i; then
F(x) = E[f(R)] = Σ_{S ⊆ N} f(S) Π_{i ∈ S} x_i Π_{i ∈ N\S} (1 - x_i)
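Since the sum has 2^n terms, in practice F(x) is evaluated by sampling, exactly as the probabilistic definition suggests. A minimal Monte Carlo sketch (the function name and parameters are my own):

```python
import random

def F_estimate(f, x, samples=2000, seed=0):
    """Estimate the multilinear extension F(x) = E[f(R)], where R
    includes each element i independently with probability x[i]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        R = {i for i, xi in x.items() if rng.random() < xi}
        total += f(R)
    return total / samples

# Sanity check: f(S) = |S| is modular, so F(x) = sum_i x_i exactly
x = {"a": 0.5, "b": 0.25}
print(F_estimate(lambda S: len(S), x))  # close to 0.75
```

At an integral point x = 1_S the estimator returns f(S) exactly; elsewhere the standard-deviation of the estimate shrinks like 1/sqrt(samples).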

Why the multilinear extension?
Ideally, a concave extension to maximize
Could choose the ("standard") concave closure f+ of f
But evaluating f+(x) is NP-Hard!

Properties of F
- F(x) can be evaluated (approximately) by random sampling
- F is a smooth submodular function: ∂²F/∂x_i∂x_j ≤ 0 for all i, j
  (recall f(A+j) – f(A) ≥ f(A+i+j) – f(A+i) for all A, i, j)
- F is concave along any non-negative direction vector
- ∂F/∂x_i ≥ 0 for all i if f is monotone

Math. Programming approach
max f(S) s.t. S ∈ I  →  max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
- What is the continuous extension F? ✔
- How do we optimize with objective F?
- How do we round?

Maximizing F
max { F(x) : Σ_i x_i ≤ k, x_i ∈ [0,1] } is NP-Hard

Approximately maximizing F
[Vondrak'08]
Theorem: For any monotone f, there is a (1-1/e)-approximation for the problem max { F(x) : x ∈ P } where P ⊆ [0,1]^N is any solvable polytope.
Algorithm: Continuous-Greedy
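Continuous-Greedy moves x from 0 along directions obtained by linear optimization over P against the (estimated) gradient of F. The following is a rough sketch under simplifying assumptions — cardinality polytope only, sampled gradients, my own parameter choices — not Vondrak's actual implementation:

```python
import random

def continuous_greedy(f, ground, k, steps=20, samples=400, seed=0):
    """Continuous greedy sketch for the cardinality polytope
    P = { x : sum_i x_i <= k }.  Gradients of the multilinear extension
    are estimated by sampling; the linear-optimization step over P just
    picks the k coordinates with the largest estimated marginal gain."""
    rng = random.Random(seed)
    x = {i: 0.0 for i in ground}
    dt = 1.0 / steps
    for _ in range(steps):
        # Estimate E[f(R + i) - f(R)] for each i, with R sampled from x
        grad = {i: 0.0 for i in ground}
        for _ in range(samples):
            R = {i for i in ground if rng.random() < x[i]}
            fR = f(R)
            for i in ground:
                grad[i] += f(R | {i}) - fR
        # Direction v = indicator of the best k coordinates (a vertex of P)
        for i in sorted(ground, key=lambda j: grad[j], reverse=True)[:k]:
            x[i] = min(1.0, x[i] + dt)
    return x

# Toy coverage instance, k = 1 (hypothetical data, just for illustration);
# the fractional mass concentrates on X_2, with some spilling to the
# disjoint X_3 once X_2's estimated marginal value drops
X = {1: {"a", "b"}, 2: {"a", "b", "c"}, 3: {"d"}}
f = lambda A: len(set().union(*(X[i] for i in A))) if A else 0
x = continuous_greedy(f, set(X), k=1)
```

The returned x stays in P because it is a convex combination of vertices of P (one per step); a rounding step is still needed to obtain an integral solution.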

Approximately maximizing F
[C-Vondrak-Zenklusen'11]
Theorem: For any non-negative f, there is a 1/4-approximation for the problem max { F(x) : x ∈ P } where P ⊆ [0,1]^n is any down-closed solvable polytope.
Remark: improved approximation ratios can be obtained
Algorithm: Local-Search variants

Local-Search based algorithm
Problem: max { F(x) : x ∈ P }, P is down-monotone
x* = a local optimum of F in P
Q = { z ∈ P : z ≤ 1 - x* }
y* = a local optimum of F in Q
Output the better of x* and y*

Local-Search based algorithm
Problem: max { F(x) : x ∈ P }, P is down-monotone
x* = a local optimum of F in P
Q = { z ∈ P : z ≤ 1 - x* }
y* = a local optimum of F in Q
Output the better of x* and y*
Theorem: The above algorithm gives a 1/4-approximation.

Math. Programming approach
max f(S) s.t. S ∈ I  →  max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
- What is the continuous extension F? ✔
- How do we optimize with objective F? ✔
- How do we round?

Rounding
Rounding and approximation depend on I and P(I)
Two results:
- For the matroid polytope, a special rounding
- A general approach via contention resolution schemes

Rounding in Matroids
Matroid M = (N, I)
Independence polytope: P(M) = convexhull({ 1_S : S ∈ I }), given by the following system [Edmonds]:
Σ_{i ∈ S} x_i ≤ rank_M(S) for all S ⊆ N
x ∈ [0,1]^N

Rounding in Matroids
[Calinescu-C-Pal-Vondrak'07]
Theorem: Given any point x in P(M), there is a randomized polynomial-time algorithm to round x to a vertex x* (hence an independent set of M) such that
E[x*] = x and F(x*) ≥ F(x)
[C-Vondrak-Zenklusen'09] A different rounding with additional properties and applications.

Rounding
max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
F(x*) = E[f(R)] where R is obtained by independently rounding each i with probability x*_i
R is unlikely to be in I

Rounding
max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
F(x*) = E[f(R)] where R is obtained by independently rounding each i with probability x*_i
R is unlikely to be in I
Obtain R' ⊆ R s.t. R' ∈ I and E[f(R')] ≥ c · f(R)

A simple question?
x is a convex combination of spanning trees
R: pick each e ∈ E independently with probability x_e
Question: what is the expected size of a maximal forest in R? (n − # of connected components)

A simple question?
x is a convex combination of spanning trees of G
R: pick each e ∈ E independently with probability x_e
Question: what is the expected size of a maximal forest in R? (n − # of connected components)
Answer: ≥ (1-1/e)(n-1)
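The (1-1/e)(n-1) bound is easy to sanity-check by simulation. A small sketch on the triangle graph, where x_e = 2/3 for every edge is a uniform convex combination of the three spanning trees (the instance and helper names are mine):

```python
import math
import random

def maximal_forest_size(n, edges):
    """Size of a maximal forest = n - (# components), via union-find."""
    parent = list(range(n))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u
    size = 0
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            size += 1
    return size

def expected_forest(n, edges, x, trials=20000, seed=0):
    """Monte Carlo estimate of E[size of a maximal forest in R], where R
    keeps each edge e independently with probability x[e]."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += maximal_forest_size(n, [e for e in edges if rng.random() < x[e]])
    return total / trials

# Triangle: the exact expectation works out to 46/27 ~ 1.70,
# comfortably above the bound (1-1/e)(n-1) ~ 1.26
edges = [(0, 1), (1, 2), (2, 0)]
est = expected_forest(3, edges, {e: 2 / 3 for e in edges})
print(est >= (1 - 1 / math.e) * (3 - 1))  # True
```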

Related question
x is a convex combination of spanning trees of G
R: pick each e ∈ E independently with probability x_e
Want a (random) forest R' ⊆ R s.t. for every edge e, Pr[e ∈ R' | e ∈ R] ≥ c

Related question
x is a convex combination of spanning trees of G
R: pick each e ∈ E independently with probability x_e
Want a (random) forest R' ⊆ R s.t. for every edge e, Pr[e ∈ R' | e ∈ R] ≥ c
⇒ there is a forest of expected size Σ_e c·x_e = c(n-1) in R

Related question
x is a convex combination of spanning trees of G
R: pick each e ∈ E independently with probability x_e
Want a (random) forest R' ⊆ R s.t. for every edge e, Pr[e ∈ R' | e ∈ R] ≥ c
Theorem: c = (1-1/e) is achievable & optimal [CVZ'11] (true for any matroid)

Contention Resolution Schemes
I: an independence family on N
P(I): a relaxation for I, and x ∈ P(I)
R: random set from independent rounding of x
CR scheme for P(I): given x and R, outputs R' ⊆ R s.t.
1. R' ∈ I
2. for all i, Pr[i ∈ R' | i ∈ R] ≥ c
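To make the interface concrete, here is a deliberately naive CR scheme for the cardinality constraint I = { S : |S| ≤ k }. It is not the optimal scheme from the talk, and its constant c is only estimated empirically on one hypothetical instance:

```python
import random

def cardinality_cr(R, k, rng):
    """Toy CR scheme for I = { S : |S| <= k }: keep R if it is already
    feasible, otherwise keep a uniformly random k-subset of R.
    The output R' is always in I and always a subset of R."""
    R = list(R)
    return set(R) if len(R) <= k else set(rng.sample(R, k))

# Empirically estimate c = Pr[i in R' | i in R] for x = (k/n, ..., k/n)
n, k = 6, 2
rng = random.Random(0)
in_R = in_Rp = 0
for _ in range(20000):
    R = {i for i in range(n) if rng.random() < k / n}  # independent rounding
    Rp = cardinality_cr(R, k, rng)
    if 0 in R:
        in_R += 1
        in_Rp += 0 in Rp
print(in_Rp / in_R)  # roughly 0.78 on this instance
```

By symmetry every element has the same conditional retention probability, so the single-element estimate already gives this scheme's c for this x.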

Rounding and CR schemes
max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
Theorem: A monotone CR scheme for P(I) can be used to round such that E[f(S*)] ≥ c · F(x*)
Proof via the FKG inequality

Remarks
[CVZ'11]
- Several existing rounding schemes are CR schemes
- CR schemes for different constraints can be combined for their intersection
- CR schemes through the correlation gap and LP duality

Math. Programming approach
max f(S) s.t. S ∈ I  →  max F(x) s.t. x ∈ P(I)  →  Round x* ∈ P(I) to S* ∈ I
The problem is reduced to finding a good relaxation P(I) and a contention resolution scheme for P(I)

Concluding Remarks
- Substantial progress on submodular function maximization problems in the last few years
- New tools and connections, including a general framework via the multilinear relaxation
- Increased awareness and more applications
- Several open problems still remain

Thanks!