Ryan O'Donnell, Carnegie Mellon University

Talk Outline
1. Describe some TCS results requiring variants of the Central Limit Theorem.
2. Show a flexible proof of the CLT with error bounds.
3. Open problems and an advertisement.


Linear Threshold Functions: f : {−1,1}^n → {−1,1}, f(x) = sgn(c_1 x_1 + ··· + c_n x_n − θ).
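As a concrete toy example (mine, not from the talk), here is a minimal LTF in Python; with all-equal weights it is just the Majority function:

# A toy linear threshold function (illustrative sketch; the weights and inputs are arbitrary).
import numpy as np

def ltf(x, c, theta=0.0):
    # f(x) = sgn(c_1 x_1 + ... + c_n x_n - theta), with sgn(0) taken to be +1
    return 1 if np.dot(x, c) - theta >= 0 else -1

x = np.array([1, -1, 1, 1, -1])
c = np.ones(5)            # all-equal weights: this LTF is Majority on 5 bits
print(ltf(x, c))          # prints 1, since three of the five inputs are +1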

Learning Theory [O-Servedio08] Thm: Can learn LTFs f in poly(n) time, just from the correlations E[f(x) x_i]. Key: c_1 x_1 + ··· + c_n x_n ≈ G ~ N(0,1) when all the |c_i| are small.

Property Testing [Matulef-O-Rubinfeld-Servedio09] Thm: Can test whether f : {−1,1}^n → {−1,1} is ε-close to an LTF with poly(1/ε) queries. Key: c_1 x_1 + ··· + c_n x_n ≈ N(0,1) when all the |c_i| are small.

Derandomization [Meka-Zuckerman10] Thm: PRG for LTFs with seed length O(log n · log(1/ε)). Key: c_1 x_1 + ··· + c_n x_n ≈ N(0,1) even when the x_i's are not fully independent.

Multidimensional CLT? For several sums S_j = c_{j,1} x_1 + ··· + c_{j,n} x_n, the joint distribution of (S_1, …, S_k) ≈ a multidimensional Gaussian when all the coefficients |c_{j,i}| are small compared to the total variances.

Derandomization+ [Gopalan-O-Wu-Zuckerman10] Thm: PRG for functions of O(1) LTFs with seed length O(log n · log(1/ε)). Key: a derandomized multidimensional CLT.

Property Testing+ [Blais-O10] Thm: Testing whether f is a Majority of k bits requires k^{Ω(1)} queries. Key: X_1 + ··· + X_n ≈ Y_1 + ··· + Y_n assuming E[X_i] = E[Y_i], Var[X_i] = Var[Y_i], and some other conditions. (Actually, a multidimensional version.)

Social Choice, Inapproximability [Mossel-O-Oleszkiewicz05] Thm: a) Among voting schemes in which no voter has unduly large influence, Majority is the most robust to noise. b) Max-Cut is UG-hard to .878-approximate. Key: P(X_1, …, X_n) ≈ P(G_1, …, G_n) if P is a low-degree multilinear polynomial with small coefficients on each coordinate.
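To make this "Key" concrete, here is a rough numerical sketch (my own illustration; the polynomial, dimensions, and sample sizes are arbitrary) of the invariance phenomenon: a degree-2 multilinear polynomial whose coefficient mass is spread over all coordinates has nearly the same distribution on uniform ±1 inputs and on Gaussian inputs.

# Numerical illustration of the invariance principle (sketch, arbitrary parameters).
import numpy as np

rng = np.random.default_rng(0)
n, trials = 50, 100_000

# Random degree-2 multilinear polynomial P(x) = sum_{i<j} A_ij x_i x_j, normalized so Var[P] = 1.
A = np.triu(rng.standard_normal((n, n)), k=1)
A /= np.sqrt((A ** 2).sum())

def P(x):                               # x has shape (trials, n)
    return ((x @ A) * x).sum(axis=1)    # sum_{i<j} A_ij x_i x_j, row by row

X_pm = rng.choice([-1.0, 1.0], size=(trials, n))    # uniform +-1 inputs
X_g = rng.standard_normal((trials, n))              # standard Gaussian inputs

u = np.linspace(-3, 3, 601)
cdf_pm = np.searchsorted(np.sort(P(X_pm)), u) / trials
cdf_g = np.searchsorted(np.sort(P(X_g)), u) / trials
print("max CDF gap between P(+-1 inputs) and P(Gaussian inputs):", np.abs(cdf_pm - cdf_g).max())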

Talk Outline
1. Describe some TCS results requiring variants of the Central Limit Theorem.
2. Show a flexible proof of the CLT with error bounds.
3. Open problems and an advertisement.

Gaussians
Standard Gaussian: G ~ N(0,1); mean 0, variance 1.
a + bG is also Gaussian: N(a, b²).
A sum of independent Gaussians is Gaussian: if G ~ N(a, b²) and H ~ N(c, d²) are independent, then G + H ~ N(a+c, b² + d²).
Anti-concentration: Pr[G ∈ [u, u+ε]] ≤ O(ε).
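A small numerical sanity check of these facts (my own sketch; the constants a, b, c, d, u, ε are arbitrary):

# Sanity-checking the Gaussian facts above with NumPy/SciPy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
N = 1_000_000

# a + b*G is Gaussian with mean a and variance b^2
a, b = 2.0, 3.0
G = rng.standard_normal(N)
print(np.mean(a + b * G), np.var(a + b * G))             # approximately a and b^2

# Sum of independent Gaussians: N(a, b^2) + N(c, d^2) = N(a + c, b^2 + d^2)
c, d = -1.0, 0.5
H = c + d * rng.standard_normal(N)
print(np.mean(a + b * G + H), np.var(a + b * G + H))     # approximately a + c and b^2 + d^2

# Anti-concentration: Pr[G in [u, u + eps]] <= eps / sqrt(2*pi),
# since the N(0,1) density is at most 1/sqrt(2*pi)
u, eps = 0.3, 0.01
print(stats.norm.cdf(u + eps) - stats.norm.cdf(u), eps / np.sqrt(2 * np.pi))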

Central Limit Theorem (CLT)
X_1, X_2, X_3, … independent, identically distributed, mean 0, variance σ². Then (X_1 + ··· + X_n)/(σ√n) → N(0,1) in distribution as n → ∞.

CLT with error bounds
X_1, X_2, …, X_n independent, identically distributed, mean 0, variance 1/n. Then X_1 + ··· + X_n is close to N(0,1), assuming the X_i are not too wacky (e.g., not extremely skewed or heavy-tailed).

Niceness of random variables
Say E[X] = 0, stddev[X] = σ.
def: X is C-nice if E[|X|³] ≤ C·σ³.
e.g.: a uniform ±1 bit, N(0,1), and Uniform on [−a, a] are all O(1)-nice. Not nice: a variable whose third moment is huge compared to σ³ (very skewed / heavy-tailed).

Berry-Esseen Theorem
Say Y is ε-close to Z if |Pr[Y ≤ u] − Pr[Z ≤ u]| ≤ ε for every threshold u.
X_1, X_2, …, X_n independent, identically distributed, mean 0, variance 1/n. Then X_1 + ··· + X_n is ε-close to N(0,1) with ε = O(C/√n), assuming each X_i is C-nice. [Shevtsova07]: the constant can be taken to be .7056.
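As an illustration (my own sketch, not from the slides): take X_i = ±1/√n and measure the CDF distance of X_1 + ··· + X_n to N(0,1) as n grows; it decays roughly like 1/√n, matching the theorem (here C = 1, since E[|±1/√n|³] = stddev³).

# Empirical Berry-Esseen rate for X_i = +-1/sqrt(n) (i.i.d., mean 0, variance 1/n).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
trials = 200_000
u = np.linspace(-4, 4, 801)

for n in [4, 16, 64, 256]:
    # sum of n independent +-1 variables = 2*Binomial(n, 1/2) - n; then rescale by 1/sqrt(n)
    S = (2.0 * rng.binomial(n, 0.5, size=trials) - n) / np.sqrt(n)
    S.sort()
    emp_cdf = np.searchsorted(S, u, side="right") / trials    # empirical Pr[S <= u]
    dist = np.abs(emp_cdf - stats.norm.cdf(u)).max()
    print(f"n = {n:3d}   CDF distance ~ {dist:.4f}   1/sqrt(n) = {1 / np.sqrt(n):.4f}")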

General Case
X_1, X_2, …, X_n independent (not necessarily identically distributed), mean 0, with Var[X_1] + ··· + Var[X_n] = 1. Then X_1 + ··· + X_n is ε-close to N(0,1) with ε = O(C · Σ_i stddev[X_i]³), assuming each X_i is C-nice. [Shiganov86]: the constant can be taken to be .7915.

Berry-Esseen: how to prove it?
1. Characteristic functions.
2. Stein's method.
3. Replacement method = think like a cryptographer.
Setup: X_1, X_2, …, X_n independent, mean 0; S = X_1 + ··· + X_n; G ~ N(0,1). Want: S is ε-close to G.

Indistinguishability of random variables
S is ε-close to G: for every threshold u, |Pr[S ≤ u] − Pr[G ≤ u]| ≤ ε. Equivalently, no threshold test t ↦ 1[t ≤ u] distinguishes S from G with advantage more than ε.

Replacement method
To show S is ε-close to G, compare E[ψ(S)] and E[ψ(G)] for smooth test functions ψ that approximate each threshold test 1[· ≤ u], smoothed over a window of width δ.

Replacement method
X_1, X_2, …, X_n independent, mean 0; S = X_1 + ··· + X_n; G ~ N(0,1). For smooth test functions ψ, want to show E[ψ(S)] ≈ E[ψ(G)].

Replacement method
X_1, X_2, …, X_n independent, mean 0; S = X_1 + ··· + X_n. Write G = G_1 + ··· + G_n, where the G_i ~ N(0, Var[X_i]) are independent Gaussians. For smooth ψ, compare E[ψ(S)] with E[ψ(G)] by a hybrid argument: swap the X_i out for the G_i one at a time.
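A rough simulation of this hybrid experiment (my own sketch; the test function ψ and all parameters are arbitrary): build each hybrid Z_i by swapping Gaussians in for the first i summands and watch E[ψ(Z_i)] barely move with each swap.

# Hybrid/replacement experiment: Z_i = G_1 + ... + G_i + X_{i+1} + ... + X_n.
import numpy as np

rng = np.random.default_rng(0)
n, trials = 32, 200_000

def psi(t):
    # a smoothed version of the threshold test 1[t <= 0.5] (an arbitrary smooth test)
    return 1.0 / (1.0 + np.exp(8.0 * (t - 0.5)))

X = rng.choice([-1.0, 1.0], size=(trials, n)) / np.sqrt(n)   # X_j = +-1/sqrt(n)
G = rng.standard_normal((trials, n)) / np.sqrt(n)            # G_j ~ N(0, 1/n)

vals = []
for i in range(n + 1):
    Z_i = G[:, :i].sum(axis=1) + X[:, i:].sum(axis=1)        # the i-th hybrid
    vals.append(psi(Z_i).mean())

print("E[psi(S)] ~", vals[0])       # Z_0 = X_1 + ... + X_n
print("E[psi(G)] ~", vals[-1])      # Z_n = G_1 + ... + G_n ~ N(0,1)
print("largest single-swap change:", max(abs(vals[i + 1] - vals[i]) for i in range(n)))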

Invariance principle
X_1, X_2, …, X_n and Y_1, Y_2, …, Y_n independent, mean 0, with Var[X_i] = Var[Y_i] for each i. Let S_X = X_1 + ··· + X_n and S_Y = Y_1 + ··· + Y_n. For smooth ψ: E[ψ(S_X)] ≈ E[ψ(S_Y)].

Hybrid argument
X_1, X_2, …, X_n, Y_1, Y_2, …, Y_n independent, with matching means and variances. Compare S_X = X_1 + ··· + X_n vs. S_Y = Y_1 + ··· + Y_n.
Def: Z_i = Y_1 + ··· + Y_i + X_{i+1} + ··· + X_n. Then S_X = Z_0 and S_Y = Z_n.

Hybrid argument
X_1, X_2, …, X_n, Y_1, Y_2, …, Y_n independent, matching means and variances; Z_i = Y_1 + ··· + Y_i + X_{i+1} + ··· + X_n.
Goal: |E[ψ(Z_{i−1})] − E[ψ(Z_i)]| is small for each i; summing the n telescoping terms then bounds |E[ψ(S_X)] − E[ψ(S_Y)]|.

Z_i = Y_1 + ··· + Y_i + X_{i+1} + ··· + X_n

Z_{i−1} = U + X_i and Z_i = U + Y_i, where U = Y_1 + ··· + Y_{i−1} + X_{i+1} + ··· + X_n. Note: U, X_i, Y_i are independent.
Goal: |E[ψ(U + X_i)] − E[ψ(U + Y_i)]| is small.

Taylor-expand ψ around U: ψ(U + X_i) = ψ(U) + ψ′(U)·X_i + ½ψ″(U)·X_i² + (third-order error), and similarly with Y_i in place of X_i. Taking expectations, E[ψ′(U)·X_i] = E[ψ′(U)]·E[X_i] and E[ψ″(U)·X_i²] = E[ψ″(U)]·E[X_i²] = the corresponding terms for Y_i, by independence and matching means/variances! Only the third-order error terms, controlled by E[|X_i|³] and E[|Y_i|³], survive.
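Written out (in my own notation; the constant 1/6 is just Taylor's theorem with a third-order remainder, not copied from the slide), the single-swap estimate and the telescoping sum are:

\[
\bigl|\,\mathbf{E}[\psi(U+X_i)] - \mathbf{E}[\psi(U+Y_i)]\,\bigr|
\;\le\; \frac{\|\psi'''\|_\infty}{6}\,\bigl(\mathbf{E}|X_i|^3 + \mathbf{E}|Y_i|^3\bigr)
\;\le\; O\!\bigl(C\,\|\psi'''\|_\infty\bigr)\,\sigma_i^{3},
\qquad \sigma_i = \mathrm{stddev}[X_i],
\]
\[
\bigl|\,\mathbf{E}[\psi(S_X)] - \mathbf{E}[\psi(S_Y)]\,\bigr|
\;\le\; \sum_{i=1}^{n} \bigl|\,\mathbf{E}[\psi(Z_{i-1})] - \mathbf{E}[\psi(Z_i)]\,\bigr|
\;\le\; O\!\bigl(C\,\|\psi'''\|_\infty\bigr)\sum_{i=1}^{n}\sigma_i^{3}.
\]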

Variant Berry-Esseen
Say each X_i and Y_i is C-nice. If X_1, X_2, …, X_n and Y_1, Y_2, …, Y_n are independent and have matching means and variances, then for every smooth test function ψ,
|E[ψ(S_X)] − E[ψ(S_Y)]| ≤ O(C·‖ψ‴‖_∞) · Σ_i stddev[X_i]³.

Hack
Usual Berry-Esseen wants: if X_1, X_2, …, X_n are independent with mean 0, then S is close to G under the sharp threshold tests 1[· ≤ u], not just under smooth tests. Trick: sandwich each threshold at u between smooth functions that transition over a window [u, u+δ] (so ‖ψ‴‖_∞ = O(1/δ³)), then use the Gaussian's anti-concentration Pr[G ∈ [u, u+δ]] ≤ O(δ) to pass back to sharp thresholds.

Variant Berry-Esseen + Hack
Usual Berry-Esseen: if X_1, X_2, …, X_n are independent with mean 0, the replacement argument recovers the usual Berry-Esseen conclusion, except with error O(ε^{1/4}) instead of O(ε), where ε = C·Σ_i stddev[X_i]³ is (up to constants) the usual Berry-Esseen error.
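In my own notation (assuming, as is standard, a smoothed threshold with ‖ψ‴‖_∞ = O(1/δ³), and writing ε = C·Σ_i stddev[X_i]³ as above), the trade-off behind the O(ε^{1/4}) loss is:

\[
\text{CDF error} \;\le\; O\!\left(\frac{\varepsilon}{\delta^{3}}\right) + O(\delta),
\]

where the first term is the smooth-test error from the Variant Berry-Esseen (with ‖ψ‴‖_∞ = O(1/δ³)) and the second is the anti-concentration loss from smoothing each threshold over a window of width δ. Choosing δ = ε^{1/4} balances the two terms and gives overall error O(ε^{1/4}).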

Extensions are easy!
Vector-valued version: use the multidimensional Taylor theorem.
Derandomized version: if X_1, …, X_m are C-nice and 3-wise independent, then X_1 + ··· + X_m is O(C)-nice.
Higher-degree version: if X_1, …, X_m are C-nice and independent, and Q is a degree-d polynomial, then Q(X_1, …, X_m) is O(C)^d-nice.

Talk Outline
1. Describe some TCS results requiring variants of the Central Limit Theorem.
2. Show a flexible proof of the CLT with error bounds.
3. Open problems, advertisement, anecdote?

Open problems
1. Recover the usual Berry-Esseen theorem via the Replacement method.
2. Vector-valued: get the correct dependence on the test sets K. (Gaussian surface area?)
3. Higher-degree: improve (?) the exponential dependence on the degree d.
4. Find more applications in TCS.

Do you like LTFs and PTFs? Do you like probability and geometry?