1/30 Constructive Algorithms for Discrepancy Minimization. Nikhil Bansal (IBM)


2/30 Discrepancy: What is it? The study of gaps in approximating the continuous by the discrete. Problem: how uniformly can you distribute n points in a grid? "Uniform": for every axis-parallel rectangle R, |(# points in R) − (area of R)| should be low. Discrepancy: max over rectangles R of |(# points in R) − (area of R)|.

3/30 Distributing points in a grid. Problem: how uniformly can you distribute n points in a grid? "Uniform": for every axis-parallel rectangle R, |(# points in R) − (area of R)| should be low. (Figure, n = 64 points: a uniform grid has n^(1/2) discrepancy; a random set has n^(1/2)(log log n)^(1/2); the van der Corput set has O(log n) discrepancy!)

4/30 Discrepancy: Example 2. Input: n points placed arbitrarily in a grid. Color them red/blue such that each rectangle is colored as evenly as possible. Discrepancy: max over rectangles R of |(# red in R) − (# blue in R)|. Continuous: color each element half red and half blue (0 discrepancy). Discrete: a random coloring has about O(n^(1/2) log^(1/2) n). Can achieve O(log^2.5 n).

5/30 Applications. CS: computational geometry, combinatorial optimization, Monte Carlo simulation, machine learning, complexity, pseudorandomness, … Math: dynamical systems, combinatorics, mathematical finance, number theory, Ramsey theory, algebra, measure theory, …

6/30 Combinatorial discrepancy. Universe: U = {1, …, n}. Subsets: S_1, S_2, …, S_m. Color the elements red/blue so that each set is colored as evenly as possible. That is, find χ : [n] → {−1, +1} to minimize disc(χ) = max_S |Σ_{i∈S} χ(i)|. For simplicity, consider m = n henceforth. (Figure: sets S_1, S_2, S_3, S_4 over the universe.)
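The quantity being minimized is easy to state in code. A small Python illustration (the set system and coloring below are made up for the example):

```python
def discrepancy(sets, chi):
    """disc(chi) = max over sets S of |sum_{i in S} chi(i)|."""
    return max(abs(sum(chi[i] for i in S)) for S in sets)

# Toy instance: 4 elements, 4 sets; chi alternates colors.
sets = [{0, 1}, {1, 2}, {2, 3}, {0, 3}]
chi = {0: +1, 1: -1, 2: +1, 3: -1}
print(discrepancy(sets, chi))  # 0: each set gets one +1 and one -1
```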

7/30 Best known algorithm. Random: color each element i independently as x(i) = +1 or −1 with probability ½ each. Thm: discrepancy = O((n log n)^(1/2)). Pf: for each set, expect O(n^(1/2)) discrepancy. Standard tail bounds: Pr[|Σ_{i∈S} x(i)| ≥ c n^(1/2)] ≈ e^(−c²). Union bound + choose c ≈ (log n)^(1/2). Analysis tight: random actually incurs Ω((n log n)^(1/2)).
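A quick empirical check of the random-coloring bound. The random sets and all constants here are illustrative choices, not from the talk:

```python
import math
import random

rng = random.Random(0)
n = 256
# n random sets, each element included independently with probability 1/2
sets = [[i for i in range(n) if rng.random() < 0.5] for _ in range(n)]

trials = 20
discs = []
for _ in range(trials):
    chi = [rng.choice((-1, 1)) for _ in range(n)]
    discs.append(max(abs(sum(chi[i] for i in S)) for S in sets))
avg = sum(discs) / trials

# Theory: a random coloring incurs Theta((n log n)^(1/2))
print(avg, math.sqrt(n * math.log(n)))
```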

8/30 Better colorings exist! [Spencer '85] (six standard deviations suffice): there always exists a coloring with discrepancy ≤ 6n^(1/2). (In general, for arbitrary m, discrepancy = O(n^(1/2) log^(1/2)(m/n)).) Tight: for m = n, cannot beat 0.5 n^(1/2) (Hadamard matrix, "orthogonal" sets). Inherently non-constructive proof (pigeonhole principle on an exponentially large universe). Challenge: can we find such a coloring algorithmically? Certain algorithms do not work [Spencer]. Conjecture [Alon-Spencer]: may not be possible.

9/30 Beck-Fiala theorem. U = {1, …, n}; sets S_1, S_2, …, S_m. Suppose each element lies in at most t sets (t ≪ n). [Beck-Fiala '81]: discrepancy ≤ 2t − 1 (elegant linear-algebraic argument, algorithmic result). Beck-Fiala conjecture: O(t^(1/2)) discrepancy possible. Other results (non-constructive): O(t^(1/2) log t log n) [Beck]; O(t^(1/2) log n) [Srinivasan]; O(t^(1/2) log^(1/2) n) [Banaszczyk].

10/30 Approximating discrepancy. Question: if a set system has low discrepancy (say ≪ n^(1/2)), can we find a good low-discrepancy coloring? [Charikar, Newman, Nikolov '11]: even distinguishing 0 vs. O(n^(1/2)) is NP-hard. (Matousek): what if the system has low hereditary discrepancy? herdisc(U, S) = max_{U' ⊆ U} disc(U', S|_{U'}). A robust measure of discrepancy (often the same as the discrepancy). Widely used: TU set systems, geometry, …

11/30 Our results. Thm 1: can get Spencer's bound constructively; that is, O(n^(1/2)) discrepancy for m = n sets. Thm 2: if each element lies in at most t sets, get a bound of O(t^(1/2) log n) constructively (Srinivasan's bound). Thm 3: for any set system, can find a coloring with discrepancy ≤ O(log(mn)) × hereditary discrepancy. Other problems with constructive bounds (matching the current best): the k-permutation problem [Spencer, Srinivasan, Tetali]; geometric problems, …

12/30 Relaxations: LPs and SDPs. The linear program is useless: can color each element ½ red and ½ blue, so the discrepancy of each set is 0! SDPs (LPs over the inner products v_i · v_j; cannot control the dimension of the v's): |Σ_{i∈S} v_i|² ≤ n for all S; |v_i|² = 1. Intended solution: v_i = (+1, 0, …, 0) or (−1, 0, …, 0). But trivially feasible: v_i = e_i (all v_i's orthogonal). Not clear how to use. Yet, SDPs will be a major tool.

13/30 Punch line. The SDP is very helpful if "tighter" bounds are needed for some sets: |Σ_{i∈S} v_i|² ≤ 2n for all S; |Σ_{i∈S'} v_i|² ≤ n/log n (tighter bound for S'); |v_i|² ≤ 1. Not a priori clear why one can do this: the entropy method. The algorithm will construct the coloring over time and use several SDPs in the process.

14/30 Talk outline. Introduction. The method: low hereditary discrepancy ⇒ good coloring. Additional ideas: Spencer's O(n^(1/2)) bound.

15/30 Our Approach

16/30 Algorithm (at a high level). Work in the cube {−1, +1}^n: each dimension is an element, each vertex is a coloring. Algorithm: a "sticky" random walk from the start (the origin) to the finish (a vertex). Each step is generated by rounding a suitable SDP; the moves in the various dimensions are correlated, e.g. γ_1^t + γ_2^t ≈ 0. Analysis: few steps suffice to reach a vertex (the walk has high variance), while disc(S_i) does a random walk with low variance.

17/30 An SDP. Hereditary discrepancy ≤ λ ⇒ the following SDP is feasible. SDP: low discrepancy: |Σ_{i∈S_j} v_i|² ≤ λ²; |v_i|² = 1. Obtain vectors v_i ∈ R^n. Rounding: pick a random Gaussian g = (g_1, g_2, …, g_n), each coordinate g_i iid N(0,1). For each i, consider η_i = g · v_i.

18/30 Properties of the rounding. Lemma: if g ∈ R^n is a random Gaussian, then for any v ∈ R^n, g · v is distributed as N(0, |v|²). Pf: N(0, a²) + N(0, b²) = N(0, a² + b²), so g · v = Σ_i v(i) g_i ~ N(0, Σ_i v(i)²). Recall η_i = g · v_i, and the SDP constraints |v_i|² = 1 and |Σ_{i∈S} v_i|² ≤ λ². So: 1. Each η_i ~ N(0, 1). 2. For each set S, Σ_{i∈S} η_i = g · (Σ_{i∈S} v_i) ~ N(0, ≤ λ²), i.e. standard deviation ≤ λ. The η's mimic a low-discrepancy coloring (but are not {−1, +1}).
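A minimal numpy sketch of this rounding step, using the trivially feasible solution v_i = e_i from slide 12 (so each η_i is exactly a standard Gaussian); the point checked is the linearity identity the lemma rests on:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
V = np.eye(n)                 # v_i = e_i: |v_i|^2 = 1, all orthogonal
g = rng.standard_normal(n)    # g_i iid N(0,1)
eta = V @ g                   # eta_i = g . v_i ~ N(0, |v_i|^2) = N(0, 1)

# Linearity: for any set S, sum_{i in S} eta_i = g . (sum_{i in S} v_i),
# so its variance is |sum_{i in S} v_i|^2 -- the quantity the SDP bounds.
S = [0, 1, 2]
assert np.isclose(eta[S].sum(), g @ V[S].sum(axis=0))
print(eta[S].sum())
```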

19/30 Algorithm overview. Construct the coloring iteratively. Initially: start with the coloring x_0 = (0, 0, 0, …, 0) at t = 0. At time t: update the coloring as x_t = x_{t−1} + γ(η_1^t, …, η_n^t) (γ tiny: 1/n suffices). So x_t(i) = γ(η_i^1 + η_i^2 + … + η_i^t): the color of element i does a random walk over time with step size ≈ γ, and is fixed once it reaches −1 or +1. Set S: x_t(S) = Σ_{i∈S} x_t(i) does a random walk with steps γ · N(0, ≤ λ²).
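A sketch of this outer loop, with one important simplification flagged in the comments: in the actual algorithm each step comes from rounding an SDP (so set discrepancies have low variance), while here independent Gaussians stand in for the SDP-rounded increments.

```python
import random

def sticky_walk(n, gamma=0.05, seed=0):
    """Sticky random walk toward a vertex of [-1,1]^n.
    Simplification: independent Gaussian steps instead of SDP-rounded,
    correlated ones -- enough to see the 'few steps to a vertex' behavior."""
    rng = random.Random(seed)
    x = [0.0] * n
    alive = set(range(n))
    steps = 0
    while alive:
        steps += 1
        for i in list(alive):
            x[i] += gamma * rng.gauss(0, 1)    # step size ~ gamma
            if abs(x[i]) >= 1:                 # color is fixed at -1 or +1
                x[i] = 1.0 if x[i] > 0 else -1.0
                alive.discard(i)
    return x, steps

x, steps = sticky_walk(16)
assert all(abs(v) == 1.0 for v in x)
print(steps)   # on the order of (1/gamma^2) * log n
```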

20/30 Analysis. Consider time T = O(1/γ²). Claim 1: with probability ½, at least n/2 elements reach −1 or +1. Pf: each element is doing a random walk with step size ≈ γ. Recall: a random walk with step size 1 is ≈ O(t^(1/2)) away after t steps. A trouble: the various element updates are correlated. Consider the basic walk x(t+1) = x(t) ± 1 with probability ½ each. Define the energy Φ(t) = x(t)². Then E[Φ(t+1)] = ½(x(t)+1)² + ½(x(t)−1)² = x(t)² + 1 = Φ(t) + 1. Expected energy = n at t = n. Claim 2: each set has O(λ) discrepancy in expectation. Pf: for each S, x_t(S) is doing a random walk with step size ≈ γλ.

21/30 Analysis (contd.) Consider time T = O(1/γ²). Claim 1: with probability ½, at least n/2 variables reach −1 or +1 ⇒ everything is colored in O(log n) rounds. Claim 2: each set has O(λ) discrepancy in expectation per round ⇒ the expected discrepancy of a set at the end is O(λ log n), where λ is the hereditary discrepancy. Thm: obtain a coloring with discrepancy O(λ log(mn)). Pf: by Chernoff, the probability that disc(S) ≥ 2·expectation + O(λ log m) = O(λ log(mn)) is tiny (poly(1/m)).

22/30 Recap. At each step of the walk, formulate an SDP on the unfixed variables. Use some (existential) property to argue that the SDP is feasible. Rounding the SDP solution gives a step of the walk. Properties of the walk: high variance ⇒ quick convergence; low variance for the discrepancy on sets ⇒ low discrepancy.

23/30 Refinements: Spencer's six-standard-deviations result. Goal: obtain O(n^(1/2)) discrepancy for any set system on m = O(n) sets. A random coloring has n^(1/2)(log n)^(1/2) discrepancy. The previous approach seems useless: the expected discrepancy of a set is O(n^(1/2)), but some random walks will deviate by up to a (log n)^(1/2) factor. Need an additional idea to prevent this.

24/30 Spencer's O(n^(1/2)) result. Partial coloring lemma: for any system with m sets, there exists a coloring on ≥ n/2 elements with discrepancy O(n^(1/2) log^(1/2)(2m/n)). [For m = n, disc = O(n^(1/2)).] Algorithm for a total coloring: repeatedly apply the partial coloring lemma. Total discrepancy: O(n^(1/2) log^(1/2) 2) [phase 1] + O((n/2)^(1/2) log^(1/2) 4) [phase 2] + O((n/4)^(1/2) log^(1/2) 8) [phase 3] + … = O(n^(1/2)).
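The phase bounds above form a geometrically decaying series; spelled out (constants absorbed into the O):

```latex
\sum_{k \ge 0} O\!\left(\Bigl(\frac{n}{2^k}\Bigr)^{1/2} \log^{1/2} 2^{\,k+1}\right)
  \;=\; O(n^{1/2}) \sum_{k \ge 0} \frac{(k+1)^{1/2}}{2^{k/2}}
  \;=\; O(n^{1/2}).
```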

25/30 Proving the partial coloring lemma. A beautiful counting argument (entropy method + pigeonhole). Idea: there are too many colorings (2^n), but few "discrepancy profiles". Key lemma: there exist k = 2^(4n/5) colorings X_1, …, X_k such that every two X_i, X_j are "similar" for every set S_1, …, S_n. Some X_1, X_2 differ on ≥ n/2 positions; consider X = (X_1 − X_2)/2. Pf: X(S) = (X_1(S) − X_2(S))/2 ∈ [−10 n^(1/2), 10 n^(1/2)]. Example: X_1 = (1, −1, 1, …, 1, −1, −1), X_2 = (−1, −1, −1, …, 1, 1, 1), X = (1, 0, 1, …, 0, −1, −1).
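The difference trick, run on the concrete vectors from the slide (a toy check, not part of the proof):

```python
X1 = [ 1, -1,  1,  1, -1, -1]
X2 = [-1, -1, -1,  1,  1,  1]
X = [(a - b) // 2 for a, b in zip(X1, X2)]
print(X)   # [1, 0, 1, 0, -1, -1]

# Wherever X1 and X2 differ, X is +-1 -- those elements get colored;
# and X(S) = (X1(S) - X2(S))/2 for every set S, by linearity.
colored = sum(1 for v in X if v != 0)
assert colored >= len(X) // 2
```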

26/30 A useful generalization. There exists a partial coloring with a non-uniform discrepancy bound Δ_S for each set S, even if Δ_S ≪ n^(1/2) in some average sense.

27/30 An SDP. Suppose there exists a partial coloring X: 1. on ≥ n/2 elements; 2. each set S has |X(S)| ≤ Δ_S. SDP: low discrepancy: |Σ_{i∈S_j} v_i|² ≤ Δ_{S_j}²; many colors: Σ_i |v_i|² ≥ n/2; |v_i|² ≤ 1. Obtain vectors v_i ∈ R^n. Rounding: pick a random Gaussian g = (g_1, g_2, …, g_n), each coordinate g_i iid N(0,1); for each i, consider η_i = g · v_i.

28/30 Algorithm. Initially write the SDP with Δ_S = c n^(1/2). Each set S does a random walk and expects to reach discrepancy O(Δ_S) = O(n^(1/2)). Some sets will become problematic: reduce their Δ_S on the fly. There are not many problematic sets, and the entropy penalty is low. (Figure: danger zones at 20n^(1/2), 30n^(1/2), 35n^(1/2), …, labeled Danger 1, Danger 2, Danger 3, …)

29/30 Concluding remarks. Construct the coloring over time by solving a sequence of SDPs (guided by existence results). Works quite generally. Can be derandomized [Bansal-Spencer] (use the entropy method itself for derandomizing, plus the usual techniques). E.g., the deterministic six-standard-deviations result can be viewed as a way to derandomize something stronger than Chernoff bounds.

30/30 Thank You!


33/30 Rest of the talk (backup). 1. How to generate the η_i with the required properties. 2. How to update the Δ_S over time. Show an n^(1/2)(log log log n)^(1/2) bound.

34/30 Why so few algorithms? Often algorithms rely on continuous relaxations. – The linear program is useless: can color each element ½ red and ½ blue. Improved results of Spencer, Beck, Srinivasan, … are based on clever counting (the entropy method). – Pigeonhole principle on exponentially large systems (seems inherently non-constructive).

35/30 Partial coloring lemma. Suppose we have a discrepancy bound Δ_S for each set S. Consider the 2^n possible colorings. Signature of a coloring X: (b(S_1), b(S_2), …, b(S_m)), where b(S_i) is the bucket in which X(S_i) lies. Want a partial coloring with signature (0, 0, 0, …, 0).

36/30 Progress condition. The energy increases at each step: E(t) = Σ_i x_i(t)². Initially the energy is 0, and it can be at most n. Expected value: E[E(t)] = E(t−1) + Σ_i γ_i(t)². Conclude via Markov's inequality.
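For the basic ±1 walk of slide 20, the same identity gives E[E(t)] = t exactly. A small Monte Carlo check (trial count and horizon are arbitrary):

```python
import random

rng = random.Random(1)
trials, T = 4000, 50
mean_energy = 0.0
for _ in range(trials):
    x = 0
    for _ in range(T):
        x += rng.choice((-1, 1))   # basic +-1 step
    mean_energy += x * x / trials

# Each step adds exactly 1 to the expected energy, so E[x(T)^2] = T.
print(mean_energy)   # close to T = 50
```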

37/30 Missing steps. 1. How to generate the η_i. 2. How to update Δ_S over time.

38/30 Partial coloring. If there exist two colorings X_1, X_2 with: 1. the same signature (b_1, b_2, …, b_m); 2. differing in at least n/2 positions; then consider X = (X_1 − X_2)/2. 1. X is −1 or +1 on at least n/2 positions, i.e. a partial coloring. 2. X has signature (0, 0, 0, …, 0): X(S) = (X_1(S) − X_2(S))/2, so |X(S)| ≤ Δ_S for all S. Can show that there are 2^(4n/5) colorings with the same signature, so some two will differ on ≥ n/2 positions (pigeonhole). Example: X_1 = (1, −1, 1, …, 1, −1, −1), X_2 = (−1, −1, −1, …, 1, 1, 1).


40/30 Spencer's O(n^(1/2)) result (recalled). Partial coloring lemma: for any system with m sets, there exists a coloring on ≥ n/2 elements with discrepancy O(n^(1/2) log^(1/2)(2m/n)). [For m = n, disc = O(n^(1/2)).] Algorithm for a total coloring: repeatedly apply the partial coloring lemma. Total discrepancy: O(n^(1/2) log^(1/2) 2) [phase 1] + O((n/2)^(1/2) log^(1/2) 4) [phase 2] + O((n/4)^(1/2) log^(1/2) 8) [phase 3] + … = O(n^(1/2)). Let us prove the lemma for m = n.

41/30 Proving the partial coloring lemma. Pf: associate with each coloring X the signature (b_1, b_2, …, b_n), where b_i is the bucket in which X(S_i) lies (buckets of width 20n^(1/2): [−10n^(1/2), 10n^(1/2)] is bucket 0, then ±1, ±2, …). Wish to show: there exist 2^(4n/5) colorings with the same signature. Choose X randomly: this induces a distribution μ on signatures. Entropy(μ) ≤ n/5 implies some signature has probability ≥ 2^(−n/5). Entropy(μ) ≤ Σ_i Entropy(b_i) [subadditivity of entropy]. Here b_i = 0 w.p. ≈ 1 − 2e^(−50); = ±1 w.p. ≈ e^(−50); = ±2 w.p. ≈ e^(−450); …; so Ent(b_i) ≤ 1/5.
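A direct numerical check of the per-set entropy bound. Treating X(S_i) as N(0, n) (since |S_i| ≤ n) and bucketing at ±10n^(1/2), ±30n^(1/2), …, the bucket distribution is, in units of n^(1/2), a standard Gaussian cut at 10, 30, 50, …:

```python
import math

def tail(a):
    """P(Z > a) for Z ~ N(0,1)."""
    return 0.5 * math.erfc(a / math.sqrt(2))

probs = [1 - 2 * tail(10)]          # bucket 0: |Z| <= 10
b = 10
while tail(b) > 0:                  # buckets +-1, +-2, ... until underflow
    probs.append(2 * (tail(b) - tail(b + 20)))
    b += 20

entropy = sum(-p * math.log2(p) for p in probs if p > 0)
print(entropy)
assert entropy < 0.2                # well within the 1/5-per-set budget
```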

42/30 A useful generalization. Partial coloring with a non-uniform discrepancy bound Δ_S for each set S. For each set S, consider the bucketing with boundaries at ±Δ_S, ±2Δ_S, … (bucket 0 in the middle, then ±1, ±2, …). It suffices to have Σ_S Ent(b_S) ≤ n/5. Or, if Δ_S = λ_S n^(1/2), then Σ_S g(λ_S) ≤ n/5, where g(λ) ≈ e^(−λ²/2) for λ > 1 and ≈ ln(1/λ) for λ < 1. A bucket of width n^(1/2)/100 has penalty ≈ ln(100).

43/30 Recap. Partial coloring: Δ_S ≈ 10 n^(1/2) gives low entropy ⇒ 2^(4n/5) colorings exist with the same signature ⇒ some X_1, X_2 with large Hamming distance ⇒ (X_1 − X_2)/2 gives the desired partial coloring. Trouble: 2^(4n/5)/2^n is an exponentially small fraction. If only we could find the partial coloring efficiently…

