Coding, Complexity and Sparsity workshop

Pseudo-randomness. Shachar Lovett (IAS). Coding, Complexity and Sparsity workshop, August 2011.

Overview: Pseudo-randomness – what? why? Concrete examples: local independence, expander graphs, small space computations.

What is pseudo-randomness? Explicit (mathematical) objects that “look like” random objects. Random objects? “Look like”? Explicit?

What is pseudo-randomness? Explicit (mathematical) objects that “look like” random objects. Random object: a random function, graph, … “Look like”: some functions (distinguishers / tests) cannot distinguish a random object from a pseudorandom one. Explicit: can be generated efficiently.

What is pseudo-randomness? Example: pseudo-random bits. Random object: uniform bits U_n ∈ {0,1}^n. Pseudo-random bits: a random variable X ∈ {0,1}^n. Explicit: a pseudorandom generator X = G(U_r), where G : {0,1}^r → {0,1}^n is “easy to compute” and the seed length satisfies r << n. Tests: a family of functions f : {0,1}^n → {0,1} that X must fool, i.e. |Pr[f(X)=1] − Pr[f(U_n)=1]| ≤ ε.

What is pseudo-randomness good for? Derandomization of randomized algorithms: replace the random bits given to the algorithm by “pseudo-random bits”. “Look like”: the algorithm cannot distinguish random bits from pseudo-random bits. Explicit: we can generate the pseudo-random bits efficiently. Derandomization: enumerate all possibilities for the pseudo-random bits.
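A minimal Python sketch of this enumeration step (not from the slides; `A` and `G` are hypothetical placeholders for a randomized decision procedure and a PRG with seed length r):

```python
from itertools import product

def derandomize(A, G, r, x):
    """Deterministically simulate a randomized algorithm A(x, bits) by running
    it on the pseudorandom string G(seed) for every seed in {0,1}^r and taking
    a majority vote. Runs A exactly 2^r times."""
    votes = 0
    for seed in product([0, 1], repeat=r):
        bits = G(list(seed))          # pseudo-random bits that "look random" to A
        votes += 1 if A(x, bits) else -1
    return votes > 0
```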

What is pseudo-randomness good for? Succinct randomized data structures. Randomized data structures rely on randomness. Problem: true randomness cannot be compressed. Solution: use pseudo-randomness instead. “Look like”: the data structure still works. Pseudo-randomness can be compressed.

What is pseudo-randomness good for? Tool in extremal combinatorics: reduce proving theorems to the case of “random-looking” objects and the case of “structured” objects. Example: Szemerédi's theorem. For every k and ε>0, and n large enough, any subset of {1,…,n} of size εn contains an arithmetic progression of length k. Proof [Gowers]: 1. Define a good notion of pseudo-randomness for sets. 2. Prove the theorem for pseudo-random sets (easy for truly random sets). 3. Non pseudo-random sets are structured: exploit the structure.
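In standard notation, the statement on this slide reads (a transcription of the slide's claim, not a new result):

```latex
\forall k \in \mathbb{N},\ \forall \varepsilon > 0\ \exists n_0\ \forall n \ge n_0:\quad
A \subseteq \{1,\dots,n\},\ |A| \ge \varepsilon n
\;\Longrightarrow\; A \text{ contains an arithmetic progression of length } k.
```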

What is pseudo-randomness good for? Computational complexity: a benchmark of how well we understand a computational model. Computational model: a family of functions. The following are usually in increasing order of hardness: find functions outside the computational model (i.e. that cannot be computed exactly); find functions which cannot be approximated by the computational model; construct pseudo-random generators for the model.

Local independence: pair-wise and k-wise independence

Pair-wise independence. Family of functions H = {h : U → [m]}. If h ∈ H is chosen uniformly, then for any x ≠ y in U, h(x), h(y) are uniform and independent, i.e. Pr[h(x)=a, h(y)=b] = 1/m² for all a, b. Pair-wise independent bits: U = {1,…,n}, m = 2.

Pair-wise independence: construction. Pair-wise independent bits: n = 2^m; identify [n] = {0,1}^m. Construction (all operations modulo 2): h_{a,b}(x) = <a,x> + b, with a ∈ {0,1}^m, b ∈ {0,1}. Proof: fix x ≠ y and choose h (i.e. a, b) uniformly. If z ≠ 0^m, the random variable <a,z> ∈ {0,1} is uniform. Setting z = x+y: <a,x> + <a,y> is uniform. So (<a,x>+b, <a,y>+b) ∈ {0,1}² is uniform.
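A short Python sketch of this construction (function and variable names are mine): n = 2^m pairwise independent bits from only m+1 truly random bits.

```python
import random

def pairwise_independent_bits(m):
    """Output 2^m bits b_x = <a,x> + b (mod 2), x in {0,1}^m, using the m+1
    truly random bits a in {0,1}^m and b in {0,1}. Any two output bits are
    uniform and independent; the output as a whole clearly is not uniform."""
    a = [random.randrange(2) for _ in range(m)]
    b = random.randrange(2)
    out = []
    for x in range(2 ** m):
        ip = sum(a[i] * ((x >> i) & 1) for i in range(m)) % 2   # <a,x> over GF(2)
        out.append((ip + b) % 2)
    return out
```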

Application: succinct dictionaries [Fredman-Komlós-Szemerédi’84]. Problem: a dictionary with n entries and O(1) retrieval time. Simple solution: hash to O(n²) locations. FKS: two-step hashing using only O(n) locations. These are randomized procedures, so the randomness must be stored. Pair-wise independence is crucial: it suffices to bound collisions (so the algorithms still work), and it can be represented succinctly (i.e. “compressed”).
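A compressed Python sketch of the two-level scheme (the constants and the hash family x ↦ ((a·x + b) mod p) mod m are illustrative choices on my part, not the exact parameters of the FKS paper):

```python
import random

def _rand_hash(p, m):
    """One function from the (almost) pairwise independent family
    x -> ((a*x + b) mod p) mod m, for a prime p."""
    a, b = random.randrange(1, p), random.randrange(p)
    return lambda x: ((a * x + b) % p) % m

def build_fks(keys, p=2_000_003):
    """Two-level hashing: level 1 splits n keys into n buckets with few
    collisions; level 2 hashes each bucket of size s injectively into s^2
    slots. Assumes distinct integer keys smaller than the prime p."""
    n = len(keys)
    while True:                                   # level 1: retry until few collisions
        h = _rand_hash(p, n)
        buckets = [[] for _ in range(n)]
        for k in keys:
            buckets[h(k)].append(k)
        if sum(len(b) ** 2 for b in buckets) <= 4 * n:
            break
    tables = []
    for b in buckets:                             # level 2: collision-free tables
        size = max(1, len(b) ** 2)
        while True:
            g = _rand_hash(p, size)
            if len({g(k) for k in b}) == len(b):  # g is injective on the bucket
                slots = [None] * size
                for k in b:
                    slots[g(k)] = k
                break
        tables.append((g, slots))
    return h, tables

def contains(fks, key):
    """O(1) retrieval: two hash evaluations and one table lookup."""
    h, tables = fks
    g, slots = tables[h(key)]
    return slots[g(key)] == key
```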

k-wise independence. k-wise independent bits X ∈ {0,1}^n: every k coordinates (X_{i_1},…,X_{i_k}) are uniform over {0,1}^k. A powerful derandomization tool, usually for k poly-logarithmic in n: many randomized data structures; some general computational models (AC0, PTFs, …).
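A minimal Python sketch of the standard construction (my notation): n k-wise independent values over Z_p from a random polynomial of degree below k; the same idea over GF(2^m) gives k-wise independent bits.

```python
import random

def kwise_independent(n, k, p=101):
    """Evaluate a uniformly random polynomial of degree < k over Z_p (p prime)
    at the points 0,...,n-1 (n <= p). Any k of the outputs are uniform and
    independent, while only k*ceil(log2(p)) truly random bits are used."""
    assert n <= p
    coeffs = [random.randrange(p) for _ in range(k)]
    def poly(x):
        y = 0
        for c in reversed(coeffs):        # Horner's rule mod p
            y = (y * x + c) % p
        return y
    return [poly(x) for x in range(n)]
```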

k-wise independence. AC0: constant-depth circuits with AND/OR gates over inputs x1,…,xn. AC0 circuits cannot distinguish uniformly random bits from polylog(n)-wise independence [Braverman’09].

k-wise independence. PTFs (polynomial threshold functions): signs of real polynomials. A basic building block in machine learning: neural networks (degree 1) and support vector machines (higher degrees). PTFs cannot distinguish uniformly random bits from polylog(n)-wise independence [Kane’11].

Future research. Which models are fooled by k-wise independence? More complex notions of local independence. Designs: sequences of n bits of Hamming weight m in which every k indices “look uniform”. k-wise independent permutations.

Further reading Pairwise independence and derandomization, by Luby and Wigderson

Expander graphs

Expander graphs. Expander graphs: graphs that “look random”. Usually interested in sparse graphs. Several notions of “randomness”: combinatorial (edge/vertex expansion), algebraic (eigenvalues), geometric (isoperimetric inequalities).

Applications of expander graphs. Expander graphs have numerous applications as explicit instances of random-looking graphs: reliable networks; error reduction for randomized algorithms; codes with linear-time decoding; metric embeddings; derandomization of small-space algorithms; …

Edge expansion. G=(V,E) a d-regular graph, |V|=n. Edge boundary of S ⊆ V: ∂S = the set of edges with exactly one endpoint in S. Expansion: h(G) = min over S with 0 < |S| ≤ n/2 of |∂S|/|S|. G is an α-edge expander if h(G) ≥ α.
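A brute-force Python check of this definition on tiny graphs (exponential time, for intuition only; note that some texts normalize by d·|S| instead of |S|):

```python
from itertools import combinations

def edge_expansion(n, edges):
    """h(G) = min over nonempty S with |S| <= n/2 of |boundary(S)| / |S|,
    where `edges` is a list of pairs on the vertex set {0,...,n-1}."""
    best = float("inf")
    for size in range(1, n // 2 + 1):
        for S in combinations(range(n), size):
            S = set(S)
            boundary = sum(1 for u, v in edges if (u in S) != (v in S))
            best = min(best, boundary / size)
    return best

# The 4-cycle: the worst set is a pair of adjacent vertices, giving h = 1.
print(edge_expansion(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))
```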

Edge expansion: high connectivity. G=(V,E) a d-regular graph, |V|=n; G is an α-edge expander, i.e. h(G) ≥ α. Thm: G is highly connected: if we remove at most εαn edges (say ε < 1/4), the largest connected component has ≥ (1−ε)n vertices. Proof sketch: let C_1,…,C_k be the connected components of the remaining graph. For any union C of components with |C| ≤ n/2, every edge of ∂C was removed, so α|C| ≤ |∂C| ≤ εαn, i.e. |C| ≤ εn. If some component has more than n/2 vertices, apply this to the union of all the others. Otherwise the components can be grouped into a set C with n/4 ≤ |C| ≤ n/2, contradicting |C| ≤ εn.

Spectral expansion. G=(V,E) d-regular, |V|=n. A = adjacency matrix of G. Eigenvalues of A: d = λ_1 ≥ λ_2 ≥ … ≥ λ_n. Second eigenvalue: λ(G) = max(|λ_2|, |λ_n|). G is a spectral expander if λ(G) < d. d − λ(G): the eigenvalue gap.

Spectral expansion: rapid mixing. G=(V,E) d-regular, |V|=n; G is a spectral expander if λ(G) < d. Thm: the random walk on G mixes fast: from any initial vertex, a random walk of O(log n) steps on the edges of G ends at a nearly uniform vertex (the constant depends on d/(d−λ(G))). In particular, the diameter of G is O(log n).

Spectral expansion: rapid mixing. Thm: the random walk on G mixes in O(log n) steps. Random walk matrix of G: W = A/d. Distribution on V: a vector p ∈ ℝ^n with p_i ≥ 0 and Σ_i p_i = 1. One step of the random walk: p ↦ Wp. Second eigenvalue: ‖Wp − u‖ ≤ (λ(G)/d)·‖p − u‖, where u is the uniform distribution. Probability of a t-step random walk ending at vertex i: (W^t p)_i = 1/n ± (λ(G)/d)^t.
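A small numpy sketch of both quantities, λ(G) and the distance from uniform after t steps of the walk (names are mine):

```python
import numpy as np

def spectral_lambda(A, d):
    """lambda(G) = max(|lambda_2|, |lambda_n|) for the adjacency matrix A of a
    d-regular graph; the largest eigenvalue in absolute value is d itself."""
    eig = np.sort(np.abs(np.linalg.eigvalsh(A)))[::-1]
    return eig[1]

def distance_from_uniform(A, d, t, start=0):
    """L1 distance to the uniform distribution after a t-step random walk
    from `start`; it decays roughly like (lambda(G)/d)^t."""
    n = A.shape[0]
    W = A / d                              # random walk matrix W = A/d
    p = np.zeros(n)
    p[start] = 1.0                         # start concentrated on one vertex
    for _ in range(t):
        p = W @ p                          # one step of the walk: p -> Wp
    return np.abs(p - 1.0 / n).sum()
```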

Edge vs. Spectral expansion Edge expansion: high connectivity Spectral expansion: mixing of random walk Are they equivalent? No, but highly related [Cheeger’70,…;Dodziuk’84,Alon-Milman’85,Alon’86]

Vertex expansion: unique neighbors. G is a unique neighbor expander if for every set S (of size ≤ k) there exists a vertex with exactly one neighbor in S (a unique neighbor of S). Crucial for some applications, e.g. graph-based codes. If G is d-regular and an (α, k)-vertex expander with α > d/2 (every S with |S| ≤ k has at least α|S| distinct neighbors), then G is a unique neighbor expander.

Constructions of expanders. Most constructions give spectral expanders. Algebraic: explicit constructions [Margulis’73,…]. Example (Selberg’s 3/16 theorem): vertices Z_p (p prime); the 3-regular graph with edges (x, x+1), (x, x−1), (x, 1/x). Combinatorial: Zig-Zag [Reingold-Vadhan-Wigderson’02,…], an iterative construction of expanders from smaller ones.
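A short numpy sketch of this algebraic example (the treatment of 0, which has no inverse, is a common convention and an assumption on my part):

```python
import numpy as np

def zp_expander(p):
    """Adjacency matrix of the 3-regular graph on Z_p (p prime) with edges
    (x, x+1), (x, x-1), (x, 1/x); 0 gets a self-loop in place of 1/0."""
    A = np.zeros((p, p))
    for x in range(p):
        A[x, (x + 1) % p] += 1
        A[x, (x - 1) % p] += 1
        inv = pow(x, p - 2, p) if x else 0     # x^{-1} mod p via Fermat; 0 -> 0
        A[x, inv] += 1
    return A

# Numerically inspect the spectral gap for a small prime.
A = zp_expander(101)
eig = np.sort(np.abs(np.linalg.eigvalsh(A)))[::-1]
print("lambda(G)/d =", eig[1] / 3)
```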

Optimal spectral expanders. G: a d-regular graph. Optimal spectral expanders: Ramanujan graphs, λ(G) ≤ 2√(d−1). Alon-Boppana bound: λ(G) ≥ 2√(d−1) − o(1), so this is essentially optimal. Achieved by some algebraic constructions. Are most d-regular graphs Ramanujan? Open. Theorem [Friedman]: for every ε>0 and all large enough n, most d-regular graphs have λ(G) ≤ 2√(d−1) + ε.

Future research. Spectral expanders: well understood, with explicit optimal constructions. Combinatorial expansion (edge/vertex): follows to some degree from spectral expansion, but not optimally; requires other methods for construction (e.g. Zig-Zag); current constructions are far from optimal. Some applications require tailor-made notions of “pseudo-random graphs”.

Further reading Expander graphs and their applications, by Hoory, Linial and Wigderson

Derandomizing small space computations

Small space computation. A model for computation with small memory, typically of logarithmic size. Application: streaming algorithms. Randomized computation: access to random bits; the program reads its input from RAM and its random bits from a one-way read-only tape. Combinatorial model: branching programs.

Branching programs. A combinatorial model for small-space programs: the input is incorporated into the program, which reads the random bits as its input. Model: a layered graph; a layer = all memory configurations of the algorithm (width = 2^memory); from each vertex two edges (labeled 0/1) go to the next layer; there is a start vertex, and final vertices are labeled accept/reject.
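A tiny Python sketch of the combinatorial model itself (not a derandomization): a layered, width-w program that reads one bit per layer.

```python
def run_branching_program(program, bits, start=0, accept=(0,)):
    """program[i][s] = (t0, t1): from state s in layer i, read bit i and move
    to state t0 or t1 in layer i+1. The width is the number of states per
    layer, i.e. 2^memory for an algorithm with that much memory."""
    state = start
    for i, b in enumerate(bits):
        state = program[i][state][b]
    return state in accept

# Width-2 example: the program tracks the parity of the bits read so far.
parity = [[(0, 1), (1, 0)] for _ in range(3)]
print(run_branching_program(parity, [1, 0, 1], accept=(0,)))   # True: 1+0+1 is even
```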

Pseudo-randomness for branching programs. Branching programs “remember” only a small number of bits in each step. Intuitively, this allows randomness to be recycled. This can be made formal; the main ingredient is expander graphs!

Pseudo-randomness for branching programs. A generator for recycling randomness for branching programs [Nisan’92, Impagliazzo-Nisan-Wigderson’94]. Main idea: build the generator recursively. G_0: a generator that outputs 1 bit; G_1: a generator that outputs 2 bits; …; G_i: a generator that outputs 2^i bits; …; G_{log n}: a generator that outputs n bits. Main ingredient: an efficient composition to get G_{i+1} from G_i.

Composition. Assume G_i : {0,1}^{s_i} → {0,1}^{2^i} fools width-w branching programs. Want G_{i+1} : {0,1}^{s_{i+1}} → {0,1}^{2^{i+1}}. Trivial composition: G_{i+1}(x,y) = (G_i(x), G_i(y)) with independent seeds x, y; yields s_{i+1} = 2·s_i, i.e. seed length n in the end, which is trivial… Expander-based composition [INW]: let H be a d-regular expander with vertex set {0,1}^{s_i}; set G_{i+1}(x,y) = (G_i(x), G_i(y)) where (x,y) is a random edge in H; yields s_{i+1} = s_i + log d. How good should the expander be?

Composition. Expander-based generator: based on d-regular expanders; seed length = O(log n · log d). INW: to fool width poly(n) it is enough to take d = poly(n) (i.e. λ/d = 1/poly(n)); this gives seed length O(log² n). Main challenge: get the seed length down to O(log n), which would allow enumerating all seeds in poly(n) time.
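A recursive Python sketch of this composition (all names are placeholders: `neighbor(x, j)` should return the j-th neighbor of seed x in a d-regular expander on the level-i seed space, and `base_output` plays the role of G_0):

```python
def inw_generator(levels, seed, neighbor, base_output):
    """Sketch of the INW recursion: a level-(i+1) seed is a level-i seed x plus
    an edge label j, and the output is G_i(x) followed by G_i(y), where y is
    the j-th neighbor of x in the expander. The seed grows by log(d) bits per
    level while the output length doubles."""
    def G(i, s):
        if i == 0:
            return base_output(s)
        x, j = s                        # s = (level-(i-1) seed, edge label)
        y = neighbor(x, j)              # the other endpoint of the chosen edge
        return G(i - 1, x) + G(i - 1, y)
    return G(levels, seed)
```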

Limited models Can get optimal seed length, i.e. O(log n), for various limited models, e.g. Random walks on undirected graphs […,Reingold’05] Constant space reversible computations […,Koucky-Nimbhorkar-Pudlak’10] Symmetric functions […,Gopalan-Meka-Reingold-Zuckerman’11]

Future research Goal: seed length O(log n) for width poly(n) (i.e. log-space computations) Interesting limited models, open: Constant space computations (non-reversible) Reversible log-space Combinatorial rectangles (derandomize “independent random variables”)

Summary. Pseudo-randomness as a framework: fix a context of “random objects” and define the notion of pseudo-randomness via tests; this talk: local independence, expander graphs. Applications: derandomization / compression of randomness (this talk: small space algorithms); extremal combinatorics.

THANK YOU