
1
Ryan O'Donnell, Carnegie Mellon University

2
Talk Outline 1. Describe some TCS results requiring variants of the Central Limit Theorem. 2. Show a flexible proof of the CLT with error bounds. 3. Open problems and an advertisement.

3
Talk Outline 1. Describe some TCS results requiring variants of the Central Limit Theorem. 2. Show a flexible proof of the CLT with error bounds. 3. Open problems and an advertisement.

4
Linear Threshold Functions

6
Learning Theory [O-Servedio08] Thm: Can learn LTFs f in poly(n) time, just from the correlations E[f(x)x_i]. Key: c_1x_1 + ··· + c_nx_n ≈ G ~ N(0,1) when all |c_i| are small.

7
Property Testing [Matulef-O-Rubinfeld-Servedio09] Thm: Can test whether f is ε-close to an LTF with poly(1/ε) queries. Key: c_1x_1 + ··· + c_nx_n ≈ G ~ N(0,1) when all |c_i| are small.

8
Derandomization [Meka-Zuckerman10] Thm: PRG for LTFs with seed length O(log(n)·log(1/ε)). Key: c_1x_1 + ··· + c_nx_n ≈ G even when the x_i's are not fully independent.

9
Multidimensional CLT? For several linear forms at once: (c⁽¹⁾·x, …, c⁽ᵏ⁾·x) ≈ a multivariate Gaussian, when all coefficients are small compared to the total variances.

10
Derandomization+ [Gopalan-O-Wu-Zuckerman10] Thm: PRG for functions of O(1) LTFs with seed length O(log(n)·log(1/ε)). Key: Derandomized multidimensional CLT.

11
Property Testing+ [Blais-O10] Thm: Testing whether f is a Majority of k bits needs k^Ω(1) queries. Key: X_1 + ··· + X_n ≈ Y_1 + ··· + Y_n, assuming E[X_i] = E[Y_i], Var[X_i] = Var[Y_i], and some other conditions. (Actually, a multidimensional version.)

12
Social Choice, Inapproximability [Mossel-O-Oleszkiewicz05] Thm: a) Among voting schemes in which no voter has unduly large influence, Majority is the most robust to noise. b) Max-Cut is UG-hard to 0.878-approximate. Key: P(X_1, …, X_n) ≈ P(G_1, …, G_n) if P is a low-degree multilinear polynomial, assuming P has small coefficients on each coordinate.

13
Talk Outline 1. Describe some TCS results requiring variants of the Central Limit Theorem. 2. Show a flexible proof of the CLT with error bounds. 3. Open problems and an advertisement.

14
Gaussians. Standard Gaussian: G ~ N(0,1). Mean 0, Var 1. a + bG is also Gaussian: N(a, b²). Sum of independent Gaussians is Gaussian: if G ~ N(a, b²) and H ~ N(c, d²) are independent, then G + H ~ N(a+c, b²+d²). Anti-concentration: Pr[G ∈ [u, u+ε]] ≤ O(ε).
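The Gaussian facts above can be sanity-checked numerically. A minimal sketch (not part of the talk) using seeded sampling with loose tolerances; numpy is assumed to be available:

```python
# Numerically check three Gaussian facts with a fixed seed.
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# a + b*G is N(a, b^2): check mean and variance of 3 + 2*G.
g = rng.standard_normal(N)
x = 3 + 2 * g
assert abs(x.mean() - 3) < 0.05 and abs(x.var() - 4) < 0.1

# Sum of independent Gaussians: N(1, 2^2) + N(-1, 1^2) ~ N(0, 5).
s = rng.normal(1, 2, N) + rng.normal(-1, 1, N)
assert abs(s.mean()) < 0.05 and abs(s.var() - 5) < 0.1

# Anti-concentration: Pr[G in [u, u+eps]] <= eps * max density < eps.
eps = 0.1
frac = np.mean((g >= 0.5) & (g <= 0.5 + eps))
assert frac <= eps  # N(0,1) density is at most 1/sqrt(2*pi) < 1
```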

15
Central Limit Theorem (CLT): X_1, X_2, X_3, … independent, identically distributed, mean 0, variance σ². Then (X_1 + ··· + X_n) / (σ√n) → N(0,1) in distribution as n → ∞.

16
CLT with error bounds: X_1, X_2, …, X_n independent, identically distributed, mean 0, variance 1/n. Then X_1 + ··· + X_n is close to N(0,1), assuming the X_i are not too wacky.

17
Niceness of random variables. Say E[X] = 0, stddev[X] = σ. def: X is nice if E[|X|³] ≤ O(σ³). e.g.: ±1, N(0,1), Unif on [-a,a]. Not nice: X with third moment large compared to σ³.

18
Niceness of random variables. Say E[X] = 0, stddev[X] = σ. def: X is C-nice if E[|X|³] ≤ C·σ³. e.g.: ±1, N(0,1), Unif on [-a,a]. Not nice: X with third moment large compared to σ³.

19
Berry-Esseen Theorem. Y ε-close to Z means: |Pr[Y ≤ u] − Pr[Z ≤ u]| ≤ ε for all u. X_1, X_2, …, X_n independent, identically distributed, mean 0, variance 1/n. Then X_1 + ··· + X_n is ε-close to N(0,1) with ε ≤ C₀·C/√n, assuming each X_i is C-nice. [Shevtsova07]: C₀ ≤ .7056.
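The Berry-Esseen bound can be verified exactly (no sampling) for fair ±1/√n coin flips, since the CDF of the sum is binomial. A small check, not from the talk; here each X_i has E[|X_i|³] = n^(−3/2), so the bound is .7056/√n:

```python
# Exact Berry-Esseen check for S = sum of n iid +-1/sqrt(n) coin flips:
# sup_u |Pr[S <= u] - Phi(u)| should be at most 0.7056/sqrt(n).
from math import comb, erf, sqrt

def phi_cdf(u):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(u / sqrt(2)))

def berry_esseen_gap(n):
    """Exact sup-distance between the CDF of S and Phi, checked at every
    jump of the (binomial) CDF of S, from below and at each atom."""
    gap = 0.0
    cdf = 0.0
    for k in range(n + 1):          # S = (2k - n)/sqrt(n) w.p. C(n,k)/2^n
        s = (2 * k - n) / sqrt(n)
        gap = max(gap, abs(cdf - phi_cdf(s)))   # just below the atom
        cdf += comb(n, k) / 2 ** n
        gap = max(gap, abs(cdf - phi_cdf(s)))   # at the atom
    return gap

for n in (4, 16, 64, 256):
    assert berry_esseen_gap(n) <= 0.7056 / sqrt(n)
```

The gap shrinks like 1/√n, so the Shevtsova constant holds with room to spare in this symmetric case.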

20
General Case. X_1, X_2, …, X_n independent (not necessarily identically distributed), mean 0, total variance 1. Then X_1 + ··· + X_n is ε-close to N(0,1) with ε ≤ C₀·Σᵢ E[|X_i|³], assuming each X_i is C-nice. [Shiganov86]: C₀ ≤ .7915.

21
Berry-Esseen: How to prove? 1. Characteristic functions. 2. Stein's method. 3. Replacement = think like a cryptographer: X_1, X_2, …, X_n indep., mean 0, S = X_1 + ··· + X_n, G ~ N(0,1); show S is ε-close to G.

22
Indistinguishability of random variables. S ε-close to G: |Pr[S ≤ u] − Pr[G ≤ u]| ≤ ε for every threshold u; i.e., no threshold test distinguishes S from G with advantage more than ε.


26
Replacement method. To show S is ε-close to G: replace each threshold test at u by a smoothed version of width δ, and show |E[ψ(S)] − E[ψ(G)]| is small for these smooth tests ψ.

27
Replacement method. X_1, X_2, …, X_n indep., mean 0, S = X_1 + ··· + X_n, G ~ N(0,1). For smooth ψ: want |E[ψ(S)] − E[ψ(G)]| small.

28
Replacement method. X_1, X_2, …, X_n indep., mean 0, S = X_1 + ··· + X_n. Write G = G_1 + ··· + G_n as a sum of independent Gaussians with matching variances. For smooth ψ: compare E[ψ(S)] to E[ψ(G)] by a Hybrid argument.

29
Invariance principle. X_1, X_2, …, X_n indep., mean 0; Y_1, Y_2, …, Y_n indep., mean 0, with Var[X_i] = Var[Y_i] for each i. For smooth ψ: compare S_X = X_1 + ··· + X_n with S_Y = Y_1 + ··· + Y_n.

30
Hybrid argument. X_1, X_2, …, X_n, Y_1, Y_2, …, Y_n independent, matching means and variances. Compare S_X = X_1 + ··· + X_n vs. S_Y = Y_1 + ··· + Y_n. Def: Z_i = Y_1 + ··· + Y_i + X_{i+1} + ··· + X_n. Then S_X = Z_0, S_Y = Z_n.

31
Hybrid argument. X_1, X_2, …, X_n, Y_1, Y_2, …, Y_n independent, matching means and variances. Z_i = Y_1 + ··· + Y_i + X_{i+1} + ··· + X_n. Goal: bound |E[ψ(Z_{i−1})] − E[ψ(Z_i)]| for each i; the differences telescope.
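The hybrid argument can be run exactly on a toy instance (an illustration, not from the talk): take X_i uniform on {−1, +1} and Y_i equal to ±√3 with probability 1/6 each and 0 otherwise, so both have mean 0 and variance 1, and enumerate all outcomes; the test function ψ is an arbitrary smooth choice:

```python
# Exact hybrid argument on n = 4 variables: the step differences
# E[psi(Z_i)] - E[psi(Z_{i-1})] telescope to E[psi(S_Y)] - E[psi(S_X)],
# and each individual step is tiny.
from itertools import product
from math import cos, isclose, prod, sqrt

n = 4
X = [(-1.0, 0.5), (1.0, 0.5)]                      # +-1 coin: mean 0, var 1
Y = [(-sqrt(3), 1/6), (0.0, 2/3), (sqrt(3), 1/6)]  # also mean 0, var 1
psi = lambda t: cos(t / 2 + 0.7)                   # an arbitrary smooth test

def expect_psi(dists):
    """Exact E[psi(sum)] by enumerating the finite product distribution."""
    return sum(psi(sum(v for v, _ in w)) * prod(p for _, p in w)
               for w in product(*dists))

# Hybrid Z_i replaces the first i of the X's by Y's.
E = [expect_psi([Y] * i + [X] * (n - i)) for i in range(n + 1)]
steps = [E[i] - E[i - 1] for i in range(1, n + 1)]

assert isclose(sum(steps), E[n] - E[0], abs_tol=1e-12)  # telescoping
assert all(abs(d) < 0.01 for d in steps)  # each swap barely moves E[psi]
```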

32
Z_i = Y_1 + ··· + Y_i + X_{i+1} + ··· + X_n, while Z_{i−1} = Y_1 + ··· + Y_{i−1} + X_i + ··· + X_n: they differ only in the i-th summand.

33
Write Z_{i−1} = U + X_i and Z_i = U + Y_i, where U = Y_1 + ··· + Y_{i−1} + X_{i+1} + ··· + X_n. Note: U, X_i, Y_i independent. Goal: |E[ψ(U + X_i)] − E[ψ(U + Y_i)]| small.

34
Taylor-expand: ψ(U + X_i) ≈ ψ(U) + ψ′(U)·X_i + ½ψ″(U)·X_i². In expectation, the degree-0, 1, 2 terms are equal for X_i and Y_i, by independence and matching means/variances! Only the third-order error term survives.
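That cancellation can be seen exactly on small discrete distributions (again an illustration, not from the talk): for a degree-≤2 test the two expectations match exactly, while for a smooth non-polynomial test the gap is nonzero but bounded by the third moments:

```python
# E[psi(U + X)] vs E[psi(U + Y)] when X, Y have matching mean/variance
# and U is an independent "rest of the sum", all computed exactly.
from itertools import product
from math import cos, isclose, prod, sqrt

X = [(-1.0, 0.5), (1.0, 0.5)]                      # mean 0, variance 1
Y = [(-sqrt(3), 1/6), (0.0, 2/3), (sqrt(3), 1/6)]  # mean 0, variance 1
U = [(-2.0, 0.25), (0.5, 0.5), (1.0, 0.25)]        # arbitrary independent rest

def expect(psi, *dists):
    """Exact E[psi(sum of one draw from each dist)] by enumeration."""
    return sum(psi(sum(v for v, _ in w)) * prod(p for _, p in w)
               for w in product(*dists))

# Degree <= 2 test: the expectations agree EXACTLY, since they depend
# only on the (matching) means and variances.
quad = lambda t: 3 * t * t - 2 * t + 5
assert isclose(expect(quad, U, X), expect(quad, U, Y))

# Smooth non-polynomial test: low-degree Taylor terms still cancel, so
# the gap is nonzero but at most sup|psi'''|/6 * (E|X|^3 + E|Y|^3),
# which here is (1/8)/6 * (1 + sqrt(3)).
smooth = lambda t: cos(t / 2)
gap = abs(expect(smooth, U, X) - expect(smooth, U, Y))
assert 0 < gap < (1 / 48) * (1 + sqrt(3))
```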

35
Variant Berry-Esseen: Say ψ is smooth. If X_1, X_2, …, X_n & Y_1, Y_2, …, Y_n are indep. and have matching means/variances, then |E[ψ(S_X)] − E[ψ(S_Y)]| ≤ O(sup|ψ‴|) · Σᵢ (E[|X_i|³] + E[|Y_i|³]).

36
Usual Berry-Esseen: If X_1, X_2, …, X_n indep., mean 0. Hack: the threshold test at u is not smooth; replace it by a smooth approximation of width δ, paying an extra Pr[G ∈ [u, u+δ]] ≤ O(δ) via anti-concentration.

37
Usual Berry-Esseen: If X_1, X_2, …, X_n indep., mean 0. Variant Berry-Esseen + Hack (optimizing over δ) gives the usual Berry-Esseen, except with error O((·)^{1/4}), the fourth root of the optimal bound.

38
Extensions are easy! Vector-valued version: use the multidimensional Taylor theorem. Derandomized version: if X_1, …, X_m are C-nice and 3-wise indep., then X_1 + ··· + X_m is O(C)-nice. Higher-degree version: if X_1, …, X_m are C-nice and indep., and Q is a degree-d polynomial, then Q(X_1, …, X_m) is O(C)^d-nice.

39
Talk Outline 1. Describe some TCS results requiring variants of the Central Limit Theorem. 2. Show a flexible proof of the CLT with error bounds. 3. Open problems, advertisement, anecdote?

40
Open problems. 1. Recover the usual Berry-Esseen bound via the Replacement method. 2. Vector-valued: get the correct dependence on the test sets K. (Gaussian surface area?) 3. Higher-degree: improve (?) the exponential dependence on the degree d. 4. Find more applications in TCS.

41
Do you like LTFs and PTFs? Do you like probability and geometry?
