How to get more mileage from randomness extractors Ronen Shaltiel University of Haifa

Outline of this talk Motivation for randomness extractors. Deterministic and seeded extractors. Our results. Something about the proof.

Randomness extractors (motivation) Daddy, how do computers get random bits? Do we have to tell that same old story again?

Randomness extractors (motivation) Randomness is essential in Computer Science: cryptography, distributed protocols, probabilistic algorithms. Algorithm designers always assume that we have access to a stream of independent unbiased coin tosses. How do computers get random bits?

Refining randomness from nature We have access to distributions in nature: particle reactions, key strokes of a user, timing of past events (really used in real life). These distributions are “somewhat random” but not “truly random”. Solution: a randomness extractor that turns the somewhat-random source into the random coins consumed by a probabilistic algorithm.

Outline of this talk Motivation for randomness extractors. Deterministic and seeded extractors. Our results. Something about the proof.

Seeded Randomness Extractors: Definition and two flavors C is a class of distributions over n-bit strings “containing” k bits of (min-)entropy. A deterministic (seedless) C-extractor is a function E such that for every X∈C, E(X) is ε-close to uniform. A seeded C-extractor gets an additional short (i.e. ≈ log n bit) independent random seed as input. A distribution X has min-entropy ≥ k if ∀x: Pr[X=x] ≤ 2^-k. Two distributions are ε-close if the probability they assign to any event differs by at most ε. Extractors turn out to have lots of applications in TCS.
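To make the two definitions above concrete, here is a minimal Python sketch (an illustration added here, not part of the talk) computing min-entropy and statistical distance for a small, explicitly given distribution:

```python
# Minimal sketch: min-entropy and eps-closeness (statistical distance)
# for a distribution given explicitly as {outcome: probability}.
import math

def min_entropy(dist):
    """H_min(X) = -log2(max_x Pr[X = x])."""
    return -math.log2(max(dist.values()))

def statistical_distance(p, q):
    """Half the L1 distance; two distributions are eps-close iff this is <= eps."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

# Example: a 0.6-biased bit has min-entropy -log2(0.6) ~ 0.74
# and is 0.1-close to a uniform bit.
biased, uniform = {0: 0.6, 1: 0.4}, {0: 0.5, 1: 0.5}
print(min_entropy(biased), statistical_distance(biased, uniform))
```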

A brief survey of randomness extractors Deterministic: von Neumann sources [vN51]; Markov chains [Blu84]; several independent sources [SV86,V86,V87,VV88,CG88,DEOR04,BIW04,BKSSW05,R05,R06,BRSW06]; bit-fixing sources [CGHFRS85,KZ03,GRS04]; samplable sources [TV00,KRVZ06]; affine sources [BKSSW05,GR05]. Seeded: C = {distributions with (min-)entropy k} [Z91,NZ93]; lower bound of log n on the seed length [NZ93,RT99]; explicit constructions coming close to matching this bound (a mass of work).
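The oldest item on this list is simple enough to show in full. Below is a minimal Python sketch of the von Neumann extractor [vN51] for a stream of independent coin flips with an unknown fixed bias (an illustration added here, not from the talk):

```python
# Von Neumann extractor: read the input in pairs; output 0 for "01", 1 for "10",
# and discard "00"/"11". For i.i.d. biased flips the two surviving patterns are
# equally likely, so the output bits are unbiased (at the cost of discarding bits).
def von_neumann(bits):
    out = []
    for b1, b2 in zip(bits[::2], bits[1::2]):
        if b1 != b2:
            out.append(b1)
    return out

# Example: a heavily biased input still yields unbiased (though fewer) output bits.
print(von_neumann([1, 1, 1, 0, 0, 1, 1, 1, 0, 0]))  # -> [1, 0]
```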

Outline of this talk Motivation for randomness extractors. Deterministic and seeded extractors. Our results. Something about the proof.

Getting more mileage from (deterministic) extractors Before: a deterministic C-extractor that extracts few bits. Our result: a general transformation (extending [GRS04]). After: a deterministic C-extractor that extracts many bits. Applies to many classes C: several independent sources, samplable sources, bit-fixing sources*, affine sources*. (*Already follows from [GRS04,GR05].)

2-source extractors [SV86]: Consider the class of distributions X=(X1,X2) s.t. X1,X2 are independent distributions over n bits and X1,X2 have (min-)entropy k. Dfn: A 2-source extractor (for threshold k) is a deterministic extractor for this class. Goals: Achieve a low entropy threshold, e.g. k=o(n); this is a major open problem (related to Ramsey graphs). Extract as many bits as possible (for a large threshold, say k=¾n); there are 2k random bits in the source.

Getting more mileage from 2-source extractors (for entropy k=¾n and ε<1/n; ¾ can be replaced by any constant > ½):
[CG88]: E(x1,x2) = ⟨x1,x2⟩ mod 2 extracts 1 bit.
[Vaz87]: Ω(n) bits.
[DEOR04]: k+Ω(n) bits (almost all the bits from one source and some from the other).
Our result: 2k−O(log(1/ε)) bits.
[RT98]: lower bound, any such extractor extracts < 2k−2log(1/ε) bits (matched by a probabilistic construction).
Our result is optimal except for the precise constant multiplying log(1/ε)! Proof: transform an existing construction [Raz05] into an extractor that extracts many bits.
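The first row of this table, the one-bit [CG88] extractor, is simple enough to sketch directly; the snippet below is an illustration added here, not code from the paper:

```python
# [CG88]-style one-bit 2-source extractor: E(x1, x2) = <x1, x2> mod 2,
# where x1 and x2 are samples from two independent weak n-bit sources.
def inner_product_extractor(x1, x2):
    assert len(x1) == len(x2)
    return sum(a & b for a, b in zip(x1, x2)) % 2

# Example on two 8-bit samples (one from each source).
print(inner_product_extractor([1, 0, 1, 1, 0, 0, 1, 0],
                              [0, 1, 1, 0, 1, 0, 1, 1]))  # -> 0
```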

Outline of this talk Motivation for randomness extractors. Deterministic and seeded extractors. Our results. Something about the proof.

Getting more mileage from extractors: naïve approach [Diagram: the source x1,...,xn is fed to a deterministic extractor; its short output is then used as the seed of a seeded extractor applied to the same source, producing the random output.] Problem: seeded extractors are only guaranteed to work when the source and seed are independent, and here they are correlated!

Getting more mileage by reusing the output [GRS04]: The naïve approach can work! For the restricted class of bit-fixing sources, and assuming some additional properties of the deterministic and seeded extractors. [GR05]: Also works for affine sources. This paper: extends the ideas of [GRS04], giving general sufficient conditions for an arbitrary class of sources.

The main theorem Let C be a class of distributions and let X be a distribution in C. Let dE be a deterministic ε-extractor for C and let sE be a seeded extractor with seed length t. Assume the following closeness condition: for every y ∈ {0,1}^t and every value a, (X | sE(X,y)=a) is a distribution in C. Then dE'(x)=sE(x,dE(x)) is a deterministic O(ε·2^t)-extractor for C. In other words, the naïve approach works if the closeness condition is satisfied and ε ≪ 2^-t.
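The transformation in the theorem is just the wiring dE'(x) = sE(x, dE(x)). The Python sketch below shows that wiring; dE and sE are hypothetical placeholders for actual constructions, which the theorem takes as given:

```python
# Compose a deterministic extractor dE (few output bits) with a seeded extractor sE,
# reusing dE's output as sE's seed. Under the closeness condition this reuse is
# sound even though the seed is correlated with the source.
def make_composed_extractor(dE, sE):
    """dE: {0,1}^n -> {0,1}^t,  sE: {0,1}^n x {0,1}^t -> {0,1}^m.
    Returns dE'(x) = sE(x, dE(x)), a deterministic extractor with m output bits."""
    def dE_prime(x):
        seed = dE(x)          # few almost-uniform bits, extracted deterministically
        return sE(x, seed)    # many bits, using the recycled output as the seed
    return dE_prime
```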

Closer look at the closeness condition Previous intuition for the naïve construction: dE extracts few bits, and therefore (X|dE(X)=y) is a high-entropy distribution ⇒ sE can extract from (X|dE(X)=y). Problem: it could be the case that ∀y: y is a bad seed for the source (X|dE(X)=y). Closeness condition: for every y ∈ {0,1}^t and every value a, (X|sE(X,y)=a) is a distribution in C. Comment: (X|sE(X,y)=a) has lower entropy than X ⇒ in order to extract from X we must use a dE that extracts from lower-entropy distributions. So the intuition and the actual proof are different.

Outline of proof of main theorem (simplifying assumption: ε=0) Goal: prove that sE(X,dE(X)) ≈ sE(X,Y), i.e. that the recycled bits dE(X) are as good a seed as independent uniform bits Y. This follows from: ∀y: (sE(X,dE(X))|dE(X)=y) ≈ (sE(X,Y)|Y=y), i.e. (sE(X,y)|dE(X)=y) ≈ sE(X,y), which will follow if ∀y: sE(X,y) is independent of dE(X). And this follows from the closeness condition: for every y ∈ {0,1}^t and every value a, (X|sE(X,y)=a) is a distribution in C. Therefore dE extracts randomness from this distribution and (dE(X)|sE(X,y)=a) ≈ uniform. As this occurs ∀a, we get that ∀y: sE(X,y) is independent of dE(X). The actual proof is more technical because ε≠0.
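Written out compactly (still under the simplifying assumption ε=0), the chain of steps on this slide reads as follows; this is a sketch of the argument, not the full proof from the paper:

```latex
% For every fixed seed y in {0,1}^t, the closeness condition gives, for every a,
% (X | sE(X,y)=a) in C; hence (dE(X) | sE(X,y)=a) is uniform for every a,
% hence sE(X,y) is independent of dE(X). Therefore
\[
  \bigl(\, sE(X, dE(X)) \;\big|\; dE(X)=y \,\bigr)
  \;=\; \bigl(\, sE(X, y) \;\big|\; dE(X)=y \,\bigr)
  \;\equiv\; sE(X, y)
  \;\equiv\; \bigl(\, sE(X, Y) \;\big|\; Y=y \,\bigr),
\]
% and averaging over y yields sE(X, dE(X)) \equiv sE(X, Y), which is uniform.
```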

Summary Before: a deterministic C-extractor that extracts few bits. Our result: a general transformation (extending [GRS04]). After: a deterministic C-extractor that extracts many bits. Applies to many classes C. We've seen: two independent sources. In the paper: distributions samplable by small circuits (defined by [TV]).

Conclusions and open problems The technique can be applied to many deterministic extraction scenarios; some additional work is needed to meet the closeness condition in various cases. At the moment we don't always have good deterministic extractors to start from (e.g. low-entropy 2-source extractors, samplable sources). Open problem: come up with new constructions of 2-source extractors and extractors for samplable distributions. Can this technique be used to reduce the seed length of seeded extractors? We provide some counterexamples.

That’s it… …having extracted many random bits, they lived happily ever after.