Simple Extractors for All Min-Entropies and a New Pseudo-Random Generator Ronen Shaltiel (Hebrew U) & Chris Umans (MSR) 2001

Motivation Good extractors exist, but are either very complex (recursive, iterated, composed), or work only for high min-entropy (TZS): either Ω(n) min-entropy, or n^{1/c} min-entropy with a log n + O(c^2 log m) seed. All previously known PRGs are based on the original NW construction; one other construction exists, but it requires stronger assumptions.

Contributions of This Paper New extractor construction: similar to TZS, but requires less min-entropy. New PRG construction: based on the above extractor. No big improvement in parameters (both match the current best), but a simpler, self-contained construction.

Overview of This Talk Introduction; TZS reminder; new extractors (new ideas, construction, proof); introduction to PRGs; new PRGs.

TZS Extractors Basic idea: view the input x as a bivariate* polynomial in F_q[y_1, y_2], and view the seed y as a pair (y_1, y_2) ∈ F_q × F_q. The extractor output is the sequence of evaluations of x at m successive points starting from y (stepping from one point to the next with a fixed successor rule). This is a q-ary extractor (the output alphabet is F_q).
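A toy sketch of this evaluation step, under illustrative assumptions that are not the paper's exact choices: q is a prime, the n input bits are used directly as the polynomial's coefficients, and the successor rule simply increments the first coordinate.

```python
# Toy q-ary extractor in the TZS style (illustrative sketch only).
# Assumptions: q prime, at most h*h input bits used as coefficients of a
# bivariate polynomial of individual degree < h, successor = "+1 on y1".

def qary_extract(x_bits, q, h, seed, m):
    """x_bits: list of at most h*h bits; seed: (y1, y2) in F_q^2; returns m symbols of F_q."""
    assert len(x_bits) <= h * h and all(0 <= c < q for c in seed)

    def poly_eval(y1, y2):
        # coefficient of y1^a * y2^b is x_bits[a*h + b] (0 if out of range)
        total = 0
        for a in range(h):
            for b in range(h):
                idx = a * h + b
                coeff = x_bits[idx] if idx < len(x_bits) else 0
                total = (total + coeff * pow(y1, a, q) * pow(y2, b, q)) % q
        return total

    y1, y2 = seed
    # m successive evaluation points along the "successor" direction
    return [poly_eval((y1 + i) % q, y2) for i in range(m)]

# Example: 16 input bits, degree-<4 bivariate polynomial over F_17, 6 output symbols
print(qary_extract([1, 0, 1, 1] * 4, q=17, h=4, seed=(3, 5), m=6))
```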

Reconstruction Paradigm Assume a next-symbol predictor f: F^i → F^c, for small c = ε^{-2}. Show there exists a function R_f(z) s.t. for a large fraction of x ∈ X, there exists a short advice string z with R_f(z) = x. If k > |z|, we get a contradiction.
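The final contradiction is a counting argument; spelled out in its standard form (not verbatim from the slides):

$$
\bigl|\{\,x : \exists z,\ R_f(z) = x\,\}\bigr| \;\le\; 2^{|z|},
\qquad
\Pr_{x \sim X}\bigl[\exists z,\ R_f(z) = x\bigr] \;\le\; 2^{|z|} \cdot 2^{-k},
$$

so if the source has min-entropy k > |z|, this probability is far below 1, contradicting the assumption that a large fraction of x drawn from X is reconstructible.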

TZS Reconstruction Let L be a random line in F^2. The restriction x|_L is a low-degree univariate polynomial: about h = deg(x|_L) points (deg + 1, to be exact) suffice to know the value of x on all of L. Get h(i-1) values from the advice string for i-1 successive parallel lines. Use the predictor f to predict the next line.
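Concretely, with the line parametrized in the usual way (the notation here is illustrative, not the slides'):

$$
L(t) = p + t \cdot v, \qquad x|_L(t) = x(p_1 + t v_1,\; p_2 + t v_2),
$$

a univariate polynomial of degree at most deg(x), so its values at deg(x|_L) + 1 distinct parameters t determine it on all of L, e.g. by Lagrange interpolation.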

Details, Details… The predictor f is often wrong. The points on a random line L are pairwise independent, so we can use Chebyshev to bound the probability that fewer than h predictions are correct. f predicts lists of ε^{-2} possible values: add to the advice string the true value of x at a random point on L; w.h.p. only the true candidate agrees with it. Requires O(m) more advice values.
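One standard way to write the Chebyshev step (a generic bound for pairwise-independent indicators, not the slides' exact constants): if Z_1, …, Z_q indicate correct predictions at the q points of L, each succeeding with probability p, then pairwise independence gives Var(∑ Z_i) ≤ pq and

$$
\Pr\Bigl[\sum_i Z_i \le \tfrac{pq}{2}\Bigr]
\;\le\; \frac{\operatorname{Var}\bigl(\sum_i Z_i\bigr)}{(pq/2)^2}
\;\le\; \frac{4}{pq},
$$

which is small once pq is large; choosing parameters with pq well above h ensures that w.h.p. more than h predictions on L are correct.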

Last Comments We described a bivariate extractor; this can be generalized to d-variate. This reduces h, which is good; however, we then need to predict h^d values, so we end up losing more than we gain. We've already seen how to convert a q-ary extractor to a binary one.

Pseudo-Random Generators The computational equivalent of extractors; many (theoretical) applications.
Extractors: short random seed; weak random source; output statistically indistinguishable from U_m.
PRGs: short random seed; no random source; output computationally indistinguishable from U_m.

PRG: Formal Definition An ε-PRG for size s is a function G: {0,1}^t → {0,1}^m such that no circuit C of size < s can distinguish G's output from uniform with advantage more than ε. Equivalent to next-bit predictors: no function f of size s can predict the i-th output bit from the previous ones noticeably better than guessing.
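The two conditions appear as formulas (images) on the original slides; in their standard form they read:

$$
\bigl|\,\Pr[C(G(U_t)) = 1] - \Pr[C(U_m) = 1]\,\bigr| \;\le\; \varepsilon,
$$

and, for the next-bit formulation, no size-s function f and index i satisfy

$$
\Pr_{y \sim U_t}\bigl[\,f\bigl(G(y)_1, \dots, G(y)_{i-1}\bigr) = G(y)_i\,\bigr] \;\ge\; \frac{1}{2} + \frac{\varepsilon}{m}.
$$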

q-ary PRGs Analogous to q-ary extractors. An ε-q-ary PRG has no next-symbol predictor f: F_q^{i-1} → F_q^c (where c = ε^{-2}) that predicts the i-th output symbol too well. Like extractors, q-ary PRGs can be converted to binary ones.
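The predictor's success condition was an image on the slide; presumably it is of the list-prediction form, roughly

$$
\Pr_{y}\bigl[\,G(y)_i \in f\bigl(G(y)_1, \dots, G(y)_{i-1}\bigr)\,\bigr] \;\ge\; \varepsilon,
$$

i.e. the list of c = ε^{-2} candidate symbols contains the true next symbol noticeably often.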

Main Idea Basically the same as the extractor, but using a hard predicate x(i) instead of a weak random source. The PRG is built from a hard predicate: a poly-time computable function that requires large circuits. Prove using the reconstruction paradigm: a predictor implies we can compute the hard function with a small circuit.

Problem… And Solution We need too many prediction steps: we must be able to compute x(i) for any i, which increases the circuit size. Solution: predict in jumps of growing sizes 1, m, m^2, …, m^{ℓ-1}. Use ℓ different PRG "candidates", each with a different step size. If none of them is really a PRG, we can predict; the XOR of all candidates is a PRG.
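Written out, the combined generator is simply the XOR of the candidates (the slide states this in words):

$$
G_x(v) \;=\; G_x^{(1)}(v) \,\oplus\, G_x^{(2)}(v) \,\oplus\, \cdots \,\oplus\, G_x^{(\ell)}(v).
$$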

Some Definitions Let x: {0,1}^{log n} → {0,1} be a hard function (no circuit smaller than s). Let F′ be a subfield of F with |F′| = h; we need h^d > n. Let A be the successor matrix of F^d, and A′ that of F′^d. Let 1 be the all-ones vector in F^d; note that 1 ∈ F′^d as well.

Construction Define, for j = 1, …, ℓ, a candidate generator G_x^{(j)} (see the formula sketched below). Each of these corresponds to one of the jump lengths. To get a PRG, we XOR all of them.
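The defining formula on the slide was an image; a plausible reconstruction, consistent with the successor matrix A and the jump lengths 1, m, …, m^{ℓ-1} (the paper's exact indexing may differ), is

$$
G_x^{(j)}(v) \;=\; \Bigl(\hat{x}(v),\; \hat{x}\bigl(A^{m^{j-1}} v\bigr),\; \hat{x}\bigl(A^{2 m^{j-1}} v\bigr),\; \dots,\; \hat{x}\bigl(A^{(m-1)\, m^{j-1}} v\bigr)\Bigr),
\qquad v \in F^d,
$$

where x̂ is the low-degree extension of x defined on the next slide.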

Proof We need h to be a prime power and q a power of h. We want a polynomial x̂ with x̂(A′^i 1) = x(i); F′^d is big enough to find one. x̂ has degree at most h in each variable, so total degree at most hd, and it only takes values in F′ at these points; elements of F′ have order at most h.
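Spelled out as a standard low-degree extension (the precise degree symbols were lost in this transcript):

$$
\hat{x} \in F[z_1, \dots, z_d], \qquad
\deg_{z_k} \hat{x} \le h \ \text{ for each } k, \qquad
\deg \hat{x} \le hd,
$$
$$
\hat{x}\bigl(A'^{\,i}\, \mathbf{1}\bigr) = x(i) \quad \text{for all } 0 \le i < n.
$$

Such an x̂ exists because the orbit of 1 under A′ stays inside F′^d, which has h^d > n points, so there are enough interpolation points and enough monomials of per-variable degree at most h.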

Proof (Cont.) Assume none of the ℓ candidates is good, and let f^{(j)} be the predictor for G_x^{(j)}. We will reconstruct x from these predictors using a small circuit (contradiction!). The advice string contains the value of x̂ on m consecutive places (actually m consecutive curves). Use the same overlapped prediction process as before (almost…).

Stepping Scheme Denote the first advice position by A^a 1, and suppose we want to get to position i = A^b 1. First, predict up to A^{c_1} 1, where c_1 has the same lowest m-ary digit as b. Then predict up to A^{c_2} 1, where c_2 has the same two lowest m-ary digits as b. Continue until we can predict position i.

Stepping Scheme: Example (sequence of animation frames, condensed) The frames walk through the scheme with m = 5, (a)_m = 134, and (b)_m = 302: the step-1 predictor f^{(0)} advances from a through a+1, …, a+m-1, a+m, …, a+m^2 and reaches c_1 with (c_1)_m = 142 (same lowest m-ary digit as b); the step-m predictor f^{(1)} advances from c_1 through c_1+m, …, c_1+m^2, …, c_1+m^3 and reaches c_2 with (c_2)_m = 202 (same two lowest digits as b); a final stage with steps of m^2 reaches b.
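A minimal sketch of the index arithmetic behind this example. It is an illustration only: it assumes plain integer indices with a ≤ b, whereas the actual construction works with exponents of the successor matrix A (so the arithmetic is modular), and the helper name below is made up.

```python
# Stepping scheme index arithmetic (sketch). At stage j we take steps of
# size m**(j-1) until the j lowest base-m digits of the current index
# agree with those of the target b; each stage needs fewer than m steps.

def stepping_targets(a, b, m):
    """Return the intermediate targets c_1, c_2, ... on the way from a to b."""
    assert a <= b, "sketch assumes a <= b; the real construction is modular"
    targets, cur, j = [], a, 1
    while cur != b:
        cur += (b - cur) % m**j   # fewer than m steps of size m**(j-1)
        targets.append(cur)
        j += 1
    return targets

# The example from the slides: m = 5, (a)_5 = 134, (b)_5 = 302
a, b = int("134", 5), int("302", 5)      # 44 and 77
print(stepping_targets(a, b, 5))         # [47, 52, 77] = (142)_5, (202)_5, (302)_5
```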

One More Snag We're predicting along curves in an interleaved fashion, and the curves need to intersect randomly. But now we are changing step sizes: for all i and all step sizes S = m^j, we need A^i p_1 and A^{i+S} p_2 to intersect at r random points. This can be done if the curve degree is ℓr.

Results Given a hard predicate on log n bits, computable in poly(n) time, with minimum circuit size s, we construct a 1/m-PRG for size m, where m = s^{Ω(1)}, with seed length t = O(log^2 n / log s) and output length m, computable in poly(n) time.