Pseudorandom Generators for Combinatorial Shapes
Parikshit Gopalan (MSR SVC), Raghu Meka (UT Austin), Omer Reingold (MSR SVC), David Zuckerman (UT Austin)

PRGs for Small Space? Is RL = L?
- Polynomial-width ROBPs: Nisan 90, INW 94 give the best known PRGs (the basis of Saks-Zhou).
- For special cases we can do seed length O(log n): small-bias spaces, combinatorial rectangles, modular sums, 0/1 halfspaces.
- Combinatorial shapes: a class that unifies and generalizes all of these.

What are Combinatorial Shapes?
A combinatorial shape is a function f(x1, ..., xn) = h(1[x1 in A1] + ... + 1[xn in An]), where each xi ranges over [m], each Ai is a subset of [m], and h is a symmetric function of the count.

Fooling Linear Forms
Question: can we generate X1, ..., Xn "pseudorandomly", from a short seed, so that linear forms with 0/1 coefficients are distributed almost as under true randomness?

Why Fool Linear Forms?
- Special case: small-bias spaces.
- Symmetric functions on subsets of the coordinates.
- Previous best: Nisan 90, INW 94. It has been difficult to beat the Nisan-INW barrier even for natural special cases.

Combinatorial Rectangles
What about conjunctions: accept x iff xi is in Ai for every coordinate i?
Applications: volume estimation, integration.
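As a toy illustration of the volume-estimation application (the sets, modulus, and sample count below are my own illustrative choices, not from the talk), one can estimate the acceptance probability of a combinatorial rectangle by random sampling and compare against the exact product of densities; a PRG with a short seed would let us enumerate all seeds and derandomize this estimate:

```python
import random

# A combinatorial rectangle accepts x iff x_i is in A_i for every coordinate i.
# Its "volume" (acceptance probability under uniform x) is the product of the
# densities |A_i| / m. The sets below are arbitrary illustrative choices.
m = 5
sets = [{0, 1}, {2, 3, 4}, {1}, {0, 2, 4}]

exact = 1.0
for A in sets:
    exact *= len(A) / m          # 2/5 * 3/5 * 1/5 * 3/5 = 0.0288

# Monte Carlo estimate with truly random samples. A PRG with a short seed
# would let us enumerate all seeds instead of sampling.
rnd = random.Random(0)
trials = 200_000
hits = sum(all(rnd.randrange(m) in A for A in sets) for _ in range(trials))
print(exact, hits / trials)      # the two numbers agree closely
```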

Combinatorial Shapes


PRGs for Combinatorial Shapes
Unifies and generalizes:
- Combinatorial rectangles: the symmetric function h is AND.
- Small-bias spaces: m = 2, h is parity.
- 0-1 halfspaces: m = 2, h is a shifted majority.
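The slide's formal definition did not survive extraction, but the special cases above pin it down. Here is a minimal sketch (names and parameters are mine, not the paper's notation) of evaluating a combinatorial shape h applied to the count of coordinates landing in their sets, with the three special cases:

```python
def comb_shape(x, sets, h):
    """A combinatorial shape: apply the symmetric function h to the number
    of coordinates x_i that land in their set A_i."""
    count = sum(1 for xi, Ai in zip(x, sets) if xi in Ai)
    return h(count)

# The three special cases from the slide, over m = 2 with every A_i = {1}:
n = 4
AND    = lambda c: int(c == n)          # combinatorial rectangle
PARITY = lambda c: c % 2                # small-bias / parity test
MAJ    = lambda c: int(c > n // 2)      # shifted majority -> 0-1 halfspace

x, sets = (1, 0, 1, 1), [{1}] * n
print(comb_shape(x, sets, AND), comb_shape(x, sets, PARITY), comb_shape(x, sets, MAJ))  # → 0 1 1
```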

Previous Results
Thm (this work): PRG for (m, n)-combinatorial shapes with polylogarithmic seed length.

Reference                 Function class
Nis90, INW94              All shapes
LLSZ92                    Comb. rectangles (hitting sets)
EGL+92, ASWZ96, Lu02      Comb. rectangles
NN93, LRTV09, MZ09        Modular sums
M.-Zuckerman 10           Halfspaces
Our results               All shapes

Discrete Central Limit Theorem
A sum of independent random variables is approximately Gaussian.

Discrete Central Limit Theorem
Thm: a sum of independent integer-valued random variables is close in statistical distance to the binomial distribution with matching mean and variance.
The known proof of the optimal error is analytic, via Stein's method (Barbour-Xia 98).
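The theorem's formulas were lost in transcription, but its content can be checked numerically. As an illustrative experiment (the particular bias values are my own choice), we compute the exact law of a sum of independent, non-identical Bernoullis by dynamic programming and its statistical (TV) distance to the mean-matched binomial:

```python
import math

def sum_dist(ps):
    # Exact distribution of a sum of independent Bernoulli(p_i)'s,
    # by dynamic programming over the coordinates (a simple convolution).
    dist = [1.0]
    for p in ps:
        new = [0.0] * (len(dist) + 1)
        for k, q in enumerate(dist):
            new[k] += q * (1 - p)
            new[k + 1] += q * p
        dist = new
    return dist

def binomial_dist(n, p):
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def tv(d1, d2):
    # Total variation (statistical) distance between two pmfs on {0, ..., n}.
    return 0.5 * sum(abs(a - b) for a, b in zip(d1, d2))

# Heterogeneous Bernoullis vs. the binomial with the same mean.
ps = [0.4, 0.45, 0.5, 0.55, 0.6] * 6        # n = 30 coordinates
n, mean = len(ps), sum(ps)
print(tv(sum_dist(ps), binomial_dist(n, mean / n)))   # small
```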

This Talk
1. PRGs for CShapes with m = 2: illustrates the main ideas for the general case.
2. PRG for general CShapes.
3. Proof of the discrete central limit theorem.

Fooling CShapes for m = 2
Question: generate X1, ..., Xn pseudorandomly.
Fooling CShapes for m = 2 is essentially fooling 0/1 linear forms in total variation (TV) distance.

Fooling Linear Forms in TV
1. Fool linear forms with small test sets: bounded independence, hashing.
2. Fool 0-1 linear forms in cdf distance: the PRG for halfspaces (Thm MZ10: a PRG for halfspaces with short seed).
3. PRG on n/2 variables + PRG fooling in cdf distance, combined, give a PRG for linear forms with large test sets.
Convolution Lemma: close in cdf implies close in TV.
- Analysis of the recursion also yields an elementary proof of the discrete CLT.

Recursion Step for 0-1 Linear Forms
For intuition, consider splitting X1, ..., Xn into the halves X1, ..., Xn/2 and Xn/2+1, ..., Xn.
(Figure: each half is filled by a PRG fooling in TV; the true randomness joining them is replaced by a PRG fooling in cdf distance.)

Recursion Step: Convolution Lemma
Lem: if Y and Y' are close in cdf distance and Z is an independent variable with sufficient variance, then Y + Z and Y' + Z are close in TV distance.

Convolution Lemma
- Problem: Y could be supported on even integers and Z on odd integers.
- Approach: define a smoothed Y'.
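The parity obstruction can be seen concretely in a small numeric experiment (the toy distributions below are my own, not from the talk): two distributions can be close in cdf distance yet at TV distance 1, and convolving both with an independent spread-out variable collapses the TV distance as the lemma promises:

```python
# Y: uniform on the evens {0, 2, ..., 18}; Y': uniform on the odds {1, ..., 19}.
# Their cdfs differ by at most 0.1 everywhere, yet their TV distance is 1.
def dist_of(values, size=21):
    d = [0.0] * size
    for v in values:
        d[v] += 1 / len(values)
    return d

Y, Yp = dist_of(range(0, 20, 2)), dist_of(range(1, 20, 2))

def tv(d1, d2):
    return 0.5 * sum(abs(a - b) for a, b in zip(d1, d2))

def cdf_dist(d1, d2):
    c1 = c2 = best = 0.0
    for a, b in zip(d1, d2):
        c1, c2 = c1 + a, c2 + b
        best = max(best, abs(c1 - c2))
    return best

print(tv(Y, Yp), cdf_dist(Y, Yp))        # TV distance 1.0, cdf distance 0.1

# Convolving both with an independent, spread-out Z (uniform on {0, ..., 9})
# smooths out the parity mismatch: the TV distance collapses as well.
def convolve_uniform(d, zsize=10):
    out = [0.0] * (len(d) + zsize)
    for k, q in enumerate(d):
        for zval in range(zsize):
            out[k + zval] += q / zsize
    return out

print(tv(convolve_uniform(Y), convolve_uniform(Yp)))   # much smaller than 1
```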


By convexity, it is enough to study the extreme cases.

Recursion for the General Case
- Problem: the test set can be skewed to the first half.
- Solution: do the partitioning randomly, so the test set splits evenly between the halves.
- But we can't afford fresh random bits for every recursion step.

Recursion for the General Case
- Analysis: induction; balance out the test set.
- Final touch: use Nisan-INW across the recursion levels.
(Figure: the coordinates are split into geometrically decreasing blocks via pairwise-independent permutations; MZ is run on n/2 variables, then n/4, and so on, down to a small truly random part.)
Result: fool 0-1 linear forms in TV with short seed.

This Talk
1. PRGs for CShapes with m = 2: illustrates the main ideas for the general case.
2. PRG for general CShapes.
3. Proof of the discrete central limit theorem.

From Shapes to Sums

From m = 2 to General m
- For m = 2, split by test-set size, large vs. small. Large test set: the true sum is close to binomial. Small test set: k-wise independence.
- For general m, split by variance, high vs. low. High variance: use shift-invariance. Low variance: k-wise independence.

PRGs for CShapes
1. PRG fooling low-variance CSums: sandwiching polynomials, bounded independence.
2. PRG fooling high-variance CSums in cdf distance: same generator, similar analysis.
3. PRG on n/2 variables + PRG fooling in cdf give a PRG for high-variance CSums, hence PRGs for CShapes.
Convolution Lemma:
- Work with shift-invariance.
- Balance out variances (analogous to balancing test-set sizes).

Low-Variance Combinatorial Sums
- We need to look at the generator for halfspaces.
- Notation: a pairwise-independent hash family (mapping coordinates to buckets) and a k-wise independent generator (filling each bucket).

Core Generator
(Figure: hash the coordinates x1, ..., xn into t buckets using a pairwise-independent hash; fill each bucket from its own k-wise independent string; run INW on top to choose the strings z1, ..., zt.)
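The slide's diagram can be sketched in code. This is a simplified illustration of the bucketed construction (the field size, seeds, and bit extraction are my own choices, and the strings are drawn independently here rather than via INW as on the slide):

```python
import random

P = 101  # a prime larger than the domain; all parameters here are illustrative

def pairwise_hash(t, seed=None):
    """h(x) = ((a*x + b) mod P) mod t, drawn from the classic
    pairwise-independent family of affine maps over GF(P)."""
    rnd = random.Random(seed)
    a, b = rnd.randrange(1, P), rnd.randrange(P)
    return lambda x: ((a * x + b) % P) % t

def kwise_bits(k, n, seed=None):
    """k-wise independent values from a uniformly random degree-(k-1)
    polynomial over GF(P), reduced to bits; the bit extraction is
    simplified (and slightly biased) for illustration."""
    rnd = random.Random(seed)
    coeffs = [rnd.randrange(P) for _ in range(k)]
    def poly(x):
        acc = 0
        for c in coeffs:        # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [poly(i) % 2 for i in range(n)]

# Core generator sketch: hash the n coordinates into t buckets
# pairwise-independently, then fill bucket j from its own k-wise
# independent string z_j.
n, t, k = 16, 4, 4
h = pairwise_hash(t, seed=0)
z = [kwise_bits(k, n, seed=j + 1) for j in range(t)]
x = [z[h(i)][i] for i in range(n)]
print(x)
```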

Low-Variance Combinatorial Sums
Why is this easy for m = 2?
- Low variance ~ small test set.
- The test set is well spread out: no bucket gets more than O(1) of its elements.
- So O(1)-independence suffices.

Low-Variance Combinatorial Sums
For general m the coordinates can have small biases: each coordinate contributes a non-zero but small bias.

Low-Variance Combinatorial Sums
The total variance is split across the buckets, so the variance inside each bucket is small. Let's exploit that.

Low Variance Combinatorial Sums 32  Use 22-wise independence in each bucket.  Union bound across buckets.  Proof of lemma: sandwiching polynomials.

Summary of the PRG for CSums
1. PRGs for low-variance CSums: bounded independence, hashing, sandwiching polynomials.
2. PRGs for high-variance CSums in cdf distance: the PRG for halfspaces.
3. PRG on n/2 variables + PRG in cdf give a PRG for high-variance CSums, hence a PRG for CSums.

This Talk
1. PRGs for CShapes with m = 2: illustrates the main ideas for the general case.
2. PRG for general CShapes.
3. Proof of the discrete central limit theorem.

Discrete Central Limit Theorem
Thm: a sum of independent random variables is close in statistical distance to the binomial distribution with matching mean and variance.

Convolution Lemma
Lem: closeness in cdf distance, plus convolution with an independent spread-out variable, gives closeness in TV distance.

Discrete Central Limit Theorem
Split the sum into parts with the same mean and variance; all four parts have approximately the same means and variances.

Discrete Central Limit Theorem
All parts have similar means and variances.
- By the CLT: the cdf distance is small.
- By unimodality: the parts are shift-invariant.
Hence proved! The general integer-valued case is similar.

Open Problems
- Optimal dependence on the error rate? Non-explicit constructions do better; solve this first for halfspaces.
- More general or better notions of symmetry?
- Capture "order-oblivious" small space.
- Better PRGs for small space?

Thank You