
Primer on Fourier Analysis


1 Primer on Fourier Analysis
Dana Moshkovitz Princeton University and The Institute for Advanced Study

2 Fourier Analysis in Theoretical Computer Science

3 Fourier Analysis in Theoretical Computer Science (Unofficial List)
Polynomial multiplication (FFT)
List decoding [AGS03]
Collective coin flipping [BL, KKL]
Analysis of expansion/sampling (e.g., [MR06])
Computational learning [KM]
Linearity testing [BLR]
Analysis of threshold phenomena
Hardness of approximation (dictator testing) [H97]
Voting/social choice schemes
Quantum computing

4 “The Fourier Magic”
Fourier Analysis: “something that looks scary to analyze” → “a bunch of (in)equalities”

5 Today: Explain the “Fourier Magic”
Why is it useful? What is it? What does it do? When to use it? What do we know about it?

6 It’s Just a Different Way to Look at Functions

7 It’s Changing Basis
Background: Real/complex functions form a vector space.
Idea: Represent functions in the Fourier basis – the common eigenbasis of the shift operators (representation by frequency).
Advantage: Convolution (a complicated “global” operation on functions) becomes simple (“local”) in the Fourier basis.
Generality: Here we will only consider the Boolean case – a very special case.

8 Fourier Basis (Boolean Cube Case)
Boolean cube: the additive group Z_2^n. Space of functions: f : Z_2^n → ℝ, an inner product space with ⟨f,g⟩ = E_x[f(x)g(x)]. Characters: the functions χ : Z_2^n → {±1} satisfying χ(x+y) = χ(x)·χ(y).

9 Foundations
Claim (Characterization): The characters are the eigenvectors of the shift operators S_s : f(x) ↦ f(x+s).
Corollary (Basis): The characters form an orthonormal basis.
Claim (Explicit): The characters are the functions χ_S(x) = (−1)^(Σ_{i∈S} x_i) for S ⊆ [n].
Shift operators occur “everywhere” [try to think of examples; some will be given in the sequel]. In particular, convolution involves shifting, so analysis in the eigenvector basis seems “natural”. Notice: in the Boolean case, the characters correspond to the linear functions ⟨s,x⟩ for binary vectors s; this is what is used for “linearity testing”. [The (−1) translates addition into multiplication.]
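As a quick illustration (not part of the original slides), here is a small Python sketch that builds the characters on Z_2^n and checks the two claims above – orthonormality and the eigenvector property under a shift. The choice of n, the shift s, and the helper name chi are just for the demo.

```python
# Build the characters chi_S(x) = (-1)^{sum_{i in S} x_i} on Z_2^n and check that
# they are orthonormal and are eigenvectors of a shift operator.
import itertools
import numpy as np

n = 4
cube = list(itertools.product([0, 1], repeat=n))          # all points of Z_2^n
subsets = list(itertools.product([0, 1], repeat=n))       # S encoded as an indicator vector

def chi(S, x):
    """Character chi_S(x) = (-1)^{sum_{i in S} x_i}."""
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

# Orthonormality: <chi_S, chi_T> = E_x[chi_S(x) chi_T(x)] = 1 if S == T, else 0.
for S in subsets:
    for T in subsets:
        ip = np.mean([chi(S, x) * chi(T, x) for x in cube])
        assert abs(ip - (1.0 if S == T else 0.0)) < 1e-12

# Eigenvector property: (S_s chi_S)(x) = chi_S(x + s) = chi_S(s) * chi_S(x),
# i.e. chi_S is an eigenvector of the shift by s with eigenvalue chi_S(s).
s = (1, 0, 1, 0)
for S in subsets:
    for x in cube:
        shifted = tuple((xi + si) % 2 for xi, si in zip(x, s))
        assert chi(S, shifted) == chi(S, s) * chi(S, x)
print("characters are orthonormal and diagonalize the shifts")
```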

10 Fourier Transform = Polynomial Expansion
Fourier coefficients: f^(S) = ⟨f, χ_S⟩. Note: f^(∅) = E_x[f(x)].
Polynomial expansion: substituting y_i = (−1)^(x_i),
f(y_1, …, y_n) = Σ_{S⊆[n]} f^(S) · Π_{i∈S} y_i
Fourier transform: f ↦ f^
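For concreteness, a minimal sketch (not from the slides) that computes the coefficients f^(S) = ⟨f, χ_S⟩ by brute force for a random Boolean function and verifies the polynomial expansion above; the value of n and the random seed are arbitrary.

```python
# Compute Fourier coefficients f^(S) = <f, chi_S> by brute force and verify
# the expansion f(x) = sum_S f^(S) * prod_{i in S} (-1)^{x_i}.
import itertools
import numpy as np

n = 4
cube = list(itertools.product([0, 1], repeat=n))
subsets = list(itertools.product([0, 1], repeat=n))

def chi(S, x):
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

rng = np.random.default_rng(0)
f = {x: rng.choice([-1.0, 1.0]) for x in cube}            # an arbitrary Boolean function

fhat = {S: np.mean([f[x] * chi(S, x) for x in cube]) for S in subsets}

# f^(emptyset) is the mean of f.
assert abs(fhat[(0,) * n] - np.mean(list(f.values()))) < 1e-12

# Polynomial expansion in the variables y_i = (-1)^{x_i}.
for x in cube:
    value = sum(fhat[S] * chi(S, x) for S in subsets)     # prod_{i in S} y_i == chi_S(x)
    assert abs(value - f[x]) < 1e-9
print("Fourier expansion reproduces f")
```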

11 The Fourier Spectrum
[Diagram: the Fourier coefficients arranged by level |S| = 0, 1, …, n/2, …, n−1, n.]

12 Degree-k Polynomial
[Diagram: for a degree-k polynomial, all the Fourier weight sits on levels |S| ≤ k.]

13 k-Junta
[Diagram: for a k-junta, the Fourier weight is supported on sets S contained in the k relevant coordinates; in particular it sits on levels |S| ≤ k.]
A k-junta is a degenerate case – instead of considering n variables, we could have considered k.

14 Orthonormal Bases
Parseval Identity (generalized Pythagorean Thm): For any f, Σ_S (f^(S))² = E_x[(f(x))²]
So, for Boolean f : {±1}^n → {±1}, we have: Σ_S (f^(S))² = 1
In general, for any f, g: ⟨f,g⟩ = Σ_S f^(S)·g^(S) = 2^n · ⟨f^, g^⟩
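A minimal numerical check of Parseval/Plancherel (illustration only, not from the slides); f is a random ±1-valued function, g an arbitrary real-valued one, and n is small so everything is computed exactly.

```python
# Check Parseval: for +-1-valued f the squared coefficients sum to 1,
# and <f, g> = sum_S f^(S) g^(S) for arbitrary f, g.
import itertools
import numpy as np

n = 4
cube = list(itertools.product([0, 1], repeat=n))
subsets = list(itertools.product([0, 1], repeat=n))
chi = lambda S, x: (-1) ** sum(s * xi for s, xi in zip(S, x))

rng = np.random.default_rng(1)
f = {x: rng.choice([-1.0, 1.0]) for x in cube}
g = {x: rng.normal() for x in cube}                       # g need not be Boolean

fhat = {S: np.mean([f[x] * chi(S, x) for x in cube]) for S in subsets}
ghat = {S: np.mean([g[x] * chi(S, x) for x in cube]) for S in subsets}

assert abs(sum(c * c for c in fhat.values()) - 1.0) < 1e-9   # Parseval for Boolean f
lhs = np.mean([f[x] * g[x] for x in cube])                   # <f, g>
rhs = sum(fhat[S] * ghat[S] for S in subsets)                # = 2^n <f^, g^>
assert abs(lhs - rhs) < 1e-9
print("Parseval/Plancherel hold")
```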

15 Convolution
Convolution: (f∗g)(x) = E_y[f(y)·g(x−y)]
Example (weighted average): (f∗w)(0) = E_y[f(y)·w(y)] (since −y = y in Z_2^n)

16 Convolution in Fourier Basis
Claim: For any f, g: (f∗g)^ = f^ · g^ (i.e., (f∗g)^(S) = f^(S)·g^(S) for every S).
Proof: By expanding according to the definition.
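A small sketch (not from the slides) verifying the claim for random f and g on a small cube: the convolution is computed directly from the definition and its coefficients compared to the pointwise products.

```python
# Verify (f*g)^(S) = f^(S) g^(S) for the normalized convolution
# (f*g)(x) = E_y[f(y) g(x - y)] on Z_2^n.
import itertools
import numpy as np

n = 4
cube = list(itertools.product([0, 1], repeat=n))
subsets = list(itertools.product([0, 1], repeat=n))
chi = lambda S, x: (-1) ** sum(s * xi for s, xi in zip(S, x))
sub = lambda x, y: tuple((xi - yi) % 2 for xi, yi in zip(x, y))   # x - y = x + y in Z_2^n

rng = np.random.default_rng(2)
f = {x: rng.normal() for x in cube}
g = {x: rng.normal() for x in cube}

conv = {x: np.mean([f[y] * g[sub(x, y)] for y in cube]) for x in cube}

coeff = lambda h, S: np.mean([h[x] * chi(S, x) for x in cube])
for S in subsets:
    assert abs(coeff(conv, S) - coeff(f, S) * coeff(g, S)) < 1e-9
print("convolution becomes pointwise multiplication in the Fourier basis")
```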

17 Things You Can Do with Convolution

18 Parts of the Spectrum
Variance: Var_x[f(x)] = E_x[f(x)²] − E_x[f(x)]² = Σ_{S≠∅} f^(S)²
Influence of the i’th variable: Inf_i(f) = P_x[f(x) ≠ f(x+e_i)] = Σ_{S∋i} f^(S)²
Intuition: Variance = the non-constant part of the function; Influence of i = the part of the spectrum that depends on i.
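A minimal sketch (illustration only, not from the slides) comparing the combinatorial definitions of variance and influence with the spectral formulas above, for a random ±1-valued f; all names and parameters are just for the demo.

```python
# Compare Var[f] and Inf_i(f) with their spectral formulas for a +-1-valued f.
import itertools
import numpy as np

n = 4
cube = list(itertools.product([0, 1], repeat=n))
subsets = list(itertools.product([0, 1], repeat=n))
chi = lambda S, x: (-1) ** sum(s * xi for s, xi in zip(S, x))

rng = np.random.default_rng(3)
f = {x: rng.choice([-1.0, 1.0]) for x in cube}
fhat = {S: np.mean([f[x] * chi(S, x) for x in cube]) for S in subsets}

# Variance = Fourier weight on the non-empty sets.
var = np.mean([f[x] ** 2 for x in cube]) - np.mean(list(f.values())) ** 2
assert abs(var - sum(fhat[S] ** 2 for S in subsets if any(S))) < 1e-9

# Influence of coordinate i = probability that flipping x_i changes f
#                           = Fourier weight of the sets containing i.
for i in range(n):
    flip = lambda x: tuple((xj + (1 if j == i else 0)) % 2 for j, xj in enumerate(x))
    inf_i = np.mean([1.0 if f[x] != f[flip(x)] else 0.0 for x in cube])
    spectral = sum(fhat[S] ** 2 for S in subsets if S[i] == 1)
    assert abs(inf_i - spectral) < 1e-9
print("variance and influences match their spectral formulas")
```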

Smoothening f
Perturbation: x ∼_δ y means that, independently for each i,
y_i = x_i with probability 1−δ; y_i = 1−x_i otherwise.
Noise operator: T_δ f(x) = E_{y ∼_δ x}[f(y)]
Convolution: T_δ f = f ∗ P_δ (P_δ = the distribution of the δ-noise vector, suitably normalized)
Fourier: (T_δ f)^(S) = (1−2δ)^{|S|} · f^(S)
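A short sketch (not from the slides) applying T_δ by exact averaging over the noise and checking the Fourier formula above; n, δ, and the random test function are arbitrary.

```python
# Apply T_delta by averaging over the noise, then check
# (T_delta f)^(S) = (1 - 2*delta)^{|S|} f^(S).
import itertools
import numpy as np

n = 4
delta = 0.2
cube = list(itertools.product([0, 1], repeat=n))
subsets = list(itertools.product([0, 1], repeat=n))
chi = lambda S, x: (-1) ** sum(s * xi for s, xi in zip(S, x))

rng = np.random.default_rng(4)
f = {x: rng.normal() for x in cube}
fhat = {S: np.mean([f[x] * chi(S, x) for x in cube]) for S in subsets}

def T(f, x):
    """T_delta f(x): average of f over y obtained by flipping each bit of x with prob delta."""
    total = 0.0
    for noise in itertools.product([0, 1], repeat=n):
        p = np.prod([delta if e else 1 - delta for e in noise])
        y = tuple((xi + e) % 2 for xi, e in zip(x, noise))
        total += p * f[y]
    return total

Tf = {x: T(f, x) for x in cube}
for S in subsets:
    lhs = np.mean([Tf[x] * chi(S, x) for x in cube])
    rhs = (1 - 2 * delta) ** sum(S) * fhat[S]
    assert abs(lhs - rhs) < 1e-9
print("noise attenuates level |S| by a factor (1-2*delta)^|S|")
```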

20 Smoothed Function is Close to Low Degree!
Tail: the part of |T_δ f|_2² on levels ≥ k is ≤ (1−2δ)^{2k}·|f|_2² ≤ e^{−cδk}·|f|_2²
Hence, the weight is ≤ ε on levels ≥ C · (1/δ) · log(1/ε)
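To spell out the calculation behind this bound (a short derivation, not in the slides, using the elementary inequality 1 − t ≤ e^{−t}):

```latex
\[
\sum_{|S|\ge k} \widehat{T_\delta f}(S)^2
 = \sum_{|S|\ge k} (1-2\delta)^{2|S|}\,\hat f(S)^2
 \le (1-2\delta)^{2k} \sum_{S}\hat f(S)^2
 = (1-2\delta)^{2k}\,\|f\|_2^2
 \le e^{-4\delta k}\,\|f\|_2^2 ,
\]
so for Boolean $f$ (where $\|f\|_2^2 = 1$) the tail is at most $\varepsilon$ once
$k \ge \tfrac{1}{4\delta}\ln\tfrac{1}{\varepsilon}$, i.e.\ $k \ge C\cdot\tfrac{1}{\delta}\log\tfrac{1}{\varepsilon}$.
```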

21 Hypercontractivity
Theorem (Bonami, Gross): For every f and 1 ≤ p ≤ q, if the noise rate δ satisfies 1−2δ ≤ √((p−1)/(q−1)), then |T_δ f|_q ≤ |f|_p
Roughly, and incorrectly ;-): “T_δ f is much smoother than f – even when measured in a ‘tougher’ norm”
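A small numerical sanity check (illustration only, not a proof and not from the slides) of the (2,4) case: with p = 2, q = 4 the condition is 1−2δ ≤ 1/√3, which holds for δ = 0.25; the trial count and seed are arbitrary.

```python
# Sanity-check the (2,4) hypercontractive inequality ||T_delta f||_4 <= ||f||_2
# for delta = 0.25 (so 1 - 2*delta = 0.5 <= 1/sqrt(3)).
import itertools
import numpy as np

n = 5
delta = 0.25
cube = list(itertools.product([0, 1], repeat=n))
subsets = list(itertools.product([0, 1], repeat=n))
chi = lambda S, x: (-1) ** sum(s * xi for s, xi in zip(S, x))

rng = np.random.default_rng(5)
for trial in range(100):
    f = {x: rng.normal() for x in cube}
    fhat = {S: np.mean([f[x] * chi(S, x) for x in cube]) for S in subsets}
    # Apply T_delta on the Fourier side: multiply level |S| by (1-2*delta)^|S|.
    Tf = {x: sum((1 - 2 * delta) ** sum(S) * fhat[S] * chi(S, x) for S in subsets)
          for x in cube}
    norm4 = np.mean([abs(v) ** 4 for v in Tf.values()]) ** 0.25
    norm2 = np.mean([v ** 2 for v in f.values()]) ** 0.5
    assert norm4 <= norm2 + 1e-9
print("no violation of ||T_delta f||_4 <= ||f||_2 found")
```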

22 Noise Sensitivity and Stability
NS_δ(f) = P_{x ∼_δ y}[f(x) ≠ f(y)]
Correlation (for f with values in {0,1}): NS_δ(f) = 2(E[f] − ⟨f, T_δ f⟩)
Stability: set δ := 1/2 − ρ/2; S_ρ(f) = ⟨f, T_δ f⟩
Fourier: S_ρ(f) = ⟨f^, ρ^{|S|}·f^⟩ = Σ_S ρ^{|S|}·f^(S)²
NS_δ(f) = 2·f^(∅) − 2·Σ_S (1−2δ)^{|S|}·f^(S)²
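A minimal sketch (not from the slides) checking the last Fourier formula for a random {0,1}-valued function by exact averaging over the input and the noise pattern; n, δ, and the seed are arbitrary choices.

```python
# For a {0,1}-valued f, check
# NS_delta(f) = P[f(x) != f(y)] = 2*f^(empty) - 2*sum_S (1-2*delta)^|S| f^(S)^2.
import itertools
import numpy as np

n = 4
delta = 0.3
cube = list(itertools.product([0, 1], repeat=n))
subsets = list(itertools.product([0, 1], repeat=n))
chi = lambda S, x: (-1) ** sum(s * xi for s, xi in zip(S, x))

rng = np.random.default_rng(6)
f = {x: float(rng.integers(0, 2)) for x in cube}          # f takes values in {0, 1}
fhat = {S: np.mean([f[x] * chi(S, x) for x in cube]) for S in subsets}

# Exact noise sensitivity: average P[f(x) != f(y)] over uniform x and y ~_delta x.
ns = 0.0
for x in cube:
    for noise in itertools.product([0, 1], repeat=n):
        p = np.prod([delta if e else 1 - delta for e in noise])
        y = tuple((xi + e) % 2 for xi, e in zip(x, noise))
        ns += (1.0 / len(cube)) * p * (1.0 if f[x] != f[y] else 0.0)

spectral = 2 * fhat[(0,) * n] - 2 * sum(
    (1 - 2 * delta) ** sum(S) * fhat[S] ** 2 for S in subsets)
assert abs(ns - spectral) < 1e-9
print("noise sensitivity matches its Fourier formula")
```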

23 Thresholds Are Stablest and Hardness of Approximation
What is it? An isoperimetric inequality for noise stability [MOO05]. Applications to hardness of approximation (e.g., Max-Cut [KKMO04]). Derived from the “Invariance Principle” (an extended Central Limit Theorem), used by the [R08] extension of [KKMO04]. Isoperimetry = largest area for a specified boundary / [here:] smallest boundary for a given area. For more applications, look at [MOO]; the applications go far beyond hardness of approximation.

24 Thresholds Are Stablest
Theorem [MOO’05]: Fix 0 < ρ < 1. For balanced f (i.e., E[f] = 0) where Inf_i(f) ≤ ε for all i,
S_ρ(f) ≤ (2/π)·arcsin ρ + O(loglog(1/ε) / log(1/ε))
≈ the noise stability of threshold functions t(x) = sign(Σ a_i·x_i), Σ a_i² = 1
Balanced is [somewhat] necessary: a constant function is perfectly stable! One can “play” with that – consider a somewhat non-constant function. In hardness of approximation it is usually easy to ensure the function is balanced. Also, hardness constructions commonly consider negative ρ.
“All influences are small” = no variable determines the function to a large extent. One can “play” with that too – first remove the influential variables. But in hardness constructions this is exactly what we exploit: stable balanced functions have an influential variable!
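As a rough illustration (not from the slides), the sketch below estimates the noise stability of Majority on n bits by Monte Carlo and prints it next to the (2/π)·arcsin ρ benchmark; Majority approaches the benchmark from above as n grows, while a dictator (which has an influential variable) has the larger stability ρ. The values of n, ρ, the sample count, and the seed are arbitrary.

```python
# Estimate S_rho(Majority_n) by Monte Carlo and compare with (2/pi)*arcsin(rho).
import numpy as np

n = 11                                                 # odd, so Majority is balanced
rho = 0.5
delta = 0.5 - rho / 2                                  # flip probability giving correlation rho

rng = np.random.default_rng(7)
samples = 200_000
x = rng.integers(0, 2, size=(samples, n))              # uniform points of Z_2^n
y = (x + (rng.random((samples, n)) < delta)) % 2       # each bit flipped with prob delta
maj = lambda z: np.where(z.sum(axis=1) > n / 2, 1.0, -1.0)
stability = float(np.mean(maj(x) * maj(y)))            # S_rho(Maj_n) = E[f(x) f(y)]

print(f"S_rho(Maj_{n}) ~ {stability:.3f}  vs  (2/pi)*arcsin(rho) = "
      f"{2 / np.pi * np.arcsin(rho):.3f}")
```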

25 More Material There are excellent courses on Fourier Analysis available on the homepages of: Irit Dinur and Ehud Friedgut, Guy Kindler, Subhash Khot, Elchanan Mossel, Ryan O’Donnell, Oded Regev.

