# Randomness Extractors: Motivation, Applications and Constructions
Ronen Shaltiel, University of Haifa



Outline of talk:
1. Extractors as graphs with expansion properties
2. Extractors as functions which extract randomness
3. Applications
4. Explicit constructions

Extractor graphs: definition [NZ]. An extractor is an (unbalanced) bipartite graph with N = 2^n left vertices, M = 2^m right vertices (M ≪ N), and left degree D = 2^d.

Extractor graphs: expansion properties. A (K,ε)-extractor: for every set X of size K, the distribution E(X,U) is ε-close to uniform. This gives an expansion property: every set X of size K has |Γ(X)| ≥ (1−ε)M. (A distribution P is ε-close to uniform if ||P − U||_1 ≤ 2ε; such a P has support of size at least (1−ε)M.) Throughout, we identify a set X with the uniform distribution on X. [Diagram: X of size K inside {0,1}^n (N vertices) maps to Γ(X) of size ≥ (1−ε)M inside {0,1}^m.]

Extractors and expander graphs. [Diagram: an extractor is an unbalanced bipartite graph from {0,1}^n (N vertices) to {0,1}^m with D = 2^d edges per left vertex, mapping a set X of size K to Γ(X) of size ≥ (1−ε)M; a (1+δ)-expander maps a set X of size K in {0,1}^n to Γ(X) of size ≥ (1+δ)K in {0,1}^n.]

Extractors and expander graphs, compared:

| | Extractor | (1+δ)-Expander |
|---|---|---|
| Graph | Unbalanced bipartite: {0,1}^n to {0,1}^m | Balanced: {0,1}^n to {0,1}^n |
| Expansion | Relative: K to (1−ε)M, i.e. K/N to 1−ε | Absolute: K to (1+δ)K |
| Which sets expand | Sets larger than the threshold K | Sets smaller than the threshold K |
| Degree | Requires degree ≥ log N | Allows constant degree |

Outline of talk 1. Extractors as graphs with expansion properties 2. Extractors as functions which extract randomness 3. Applications 4. Explicit Constructions

A successful paradigm in CS: probabilistic algorithms. Probabilistic algorithms/protocols use an additional input: a stream of independent coin tosses. They are helpful in solving computational problems, but where can we get random bits? The initial motivation: running probabilistic algorithms with real-life sources. We have access to distributions in nature: electric noise, key strokes of a user, timing of past events. These distributions are somewhat random, but not truly random. Paradigm [SV,V,VV,CG,V,CW,Z]: randomness extractors. Assumption for this talk: somewhat random = uniform over a subset of size K. [Diagram: a somewhat-random source feeds a randomness extractor, which supplies the random coins of a probabilistic algorithm mapping input to output.]

Extractors as functions that use few bits to extract randomness. We allow an extractor to also receive an additional input (a seed) of very few random bits: extractors use few random bits to extract many random bits from arbitrary distributions which contain sufficient randomness. Definition: a (K,ε)-extractor is a function E(x,y) such that for every set X of size K, E(X,U) is ε-close to uniform. Typical parameters (function view): source length n (= log N); seed length d ≈ O(log n); entropy threshold k ≈ n/100; output length m ≈ k; required error ε ≈ 1/100. Lower bounds [NZ,RT]: the seed length must be ≥ log n bits. Probabilistic method [S,RT]: there exists an optimal extractor which matches the lower bound and extracts all k = log K random bits in the source distribution. Explicit constructions: E(x,y) can be computed in poly-time.
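As a sanity check on the definition, the extractor property can be verified by brute force for toy parameters. This is a sketch of our own (the names `stat_dist` and `is_extractor` are not from the talk), feasible only when n, d, m are tiny:

```python
from itertools import combinations

def stat_dist(p, q):
    # Statistical (variation) distance: half the L1 distance between
    # two distributions given as {outcome: probability} dicts.
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

def is_extractor(E, n, d, m, K, eps):
    # Check the definition directly: for every set X of size K,
    # the distribution E(X, U_d) must be eps-close to uniform on {0,1}^m.
    ys = range(2 ** d)
    uniform = {z: 1.0 / 2 ** m for z in range(2 ** m)}
    for X in combinations(range(2 ** n), K):
        out = {}
        for x in X:
            for y in ys:
                z = E(x, y)
                out[z] = out.get(z, 0.0) + 1.0 / (K * 2 ** d)
        if stat_dist(out, uniform) > eps:
            return False  # this X witnesses a violation of the definition
    return True
```

For instance, the trivial function E(x,y) = y, which outputs its seed, passes with ε = 0 but extracts nothing from the source; a constant function fails.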

Simulating probabilistic algorithms using weak random sources. Goal: run a probabilistic algorithm using a somewhat-random distribution. Where can we get a seed? Idea: go over all seeds. Given a source element x, for every seed y compute z_y = E(x,y), compute Alg(input, z_y), and answer with the majority vote. Seed length O(log n) means only poly(n) seeds, so with explicit constructions the simulation runs in poly-time. [Diagram: a somewhat-random source and an enumerated seed feed the extractor, which supplies the random coins of the probabilistic algorithm.]
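A minimal sketch of this simulation (the parameter names `alg`, `E`, and the toy instances in the usage note are ours, not from the talk): enumerate all seeds, run the algorithm once per derived coin string, and answer with the majority:

```python
def simulate_with_weak_source(alg, E, inp, x, seeds):
    # x is a single sample from the somewhat-random source.
    # For every seed y, derive coins z_y = E(x, y), run the algorithm,
    # and answer with the majority vote over all seeds.
    votes = [alg(inp, E(x, y)) for y in seeds]
    return max(set(votes), key=votes.count)
```

With a seed of O(log n) bits there are only poly(n) seeds, so the enumeration costs a polynomial overhead over a single run of the algorithm.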

Outline of talk 1. Extractors as graphs with expansion properties 2. Extractors as functions which extract randomness 3. Applications 4. Explicit Constructions

Applications:
- Simulating probabilistic algorithms using weak sources of randomness [vN,SV,V,VV,CG,V,CW,Z].
- Constructing graphs (expanders, super-concentrators) [WZ].
- Oblivious sampling [S,Z].
- Constructions of various pseudorandom generators [NZ,RR,STV,GW,MV].
- Distributed algorithms [WZ,Z,RZ].
- Cryptography [CDHK,L,V,DS,MST].
- Hardness of approximation [Z,U,MU].
- Error-correcting codes [TZ].

Expanders that beat the eigenvalue bound [WZ]. Goal: construct low-degree expanders with huge expansion. Line up two low-degree extractors: every set X of size K has |Γ(X)| ≥ (1−ε)M > M/2, so any two sets X, X′ of size K have a common neighbour. Contracting the middle layer gives a low-degree (ND²/K) bipartite graph in which every set of size K sees at least N−K vertices. Better constructions for large K appear in [CRVW].

Randomness-efficient (oblivious) sampling using expanders. Take a random walk v_1, …, v_D on a constant-degree expander with vertex set {0,1}^m. The walk variables behave like i.i.d. samples: for any A of size ½M, Hitting property: Pr[∀i: v_i ∉ A] ≤ δ = 2^{−Ω(D)}. Chernoff-style property: Pr[#{i: v_i ∈ A} is far from its expectation] ≤ 2^{−Ω(D)}. Number of random bits used for the walk: m + O(D) = m + O(log(1/δ)), versus m·D = m·O(log(1/δ)) for i.i.d. samples.
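The bit accounting can be seen in a small sketch of ours (a cycle stands in for a genuine constant-degree expander here, just to show the interface; `neighbors` and the parameters are assumptions, not from the talk):

```python
import random

def expander_walk(neighbors, degree, start, D, rng):
    # A D-step random walk: the start vertex costs m random bits,
    # and each further step costs only log2(degree) = O(1) bits,
    # versus m fresh bits per sample for i.i.d. sampling.
    walk = [start]
    for _ in range(D):
        walk.append(neighbors(walk[-1], rng.randrange(degree)))
    return walk
```

For the sampling guarantees of the slide to hold, `neighbors` must describe an actual expander; a cycle has the right interface but far too weak expansion.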

Randomness-efficient (oblivious) sampling using extractors [S]. Given parameters m, δ: use an extractor E with K = M = 2^m, N = M/δ, and small D. Choose a random x (m + log(1/δ) random bits) and set v_i = E(x,i). The extractor property gives the hitting property: for A of size ½M, call x bad if all of E(x,1), …, E(x,D) land inside A. The number of bad x's is < K (otherwise the set of bad x's would violate the extractor property), so Pr[x is bad] < K/N = δ.
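A sketch of the sampler interface (our own; the toy E used in the usage note is not a real extractor and only illustrates the counting):

```python
def oblivious_sample(E, x, D):
    # All D sample points are derived from the single source string x:
    # v_i = E(x, i), so only |x| = m + log(1/delta) random bits are spent.
    return [E(x, i) for i in range(D)]

def estimate_fraction(samples, A):
    # Estimate the density |A| / M of a set A from the correlated samples.
    return sum(v in A for v in samples) / len(samples)
```

When E is a genuine extractor, the slide's argument shows the estimate is good for all but a δ fraction of source strings x.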

Every (oblivious) sampling scheme yields an extractor. An (oblivious) sampling scheme uses a random n-bit string x to generate D random variables v_1, …, v_D with a Chernoff-style property. Thm [Z]: the derived bipartite graph (connect each x ∈ {0,1}^n to v_1, …, v_D ∈ {0,1}^m, i.e. D = 2^d edges per left vertex) is an extractor. Thus extractors and oblivious sampling are equivalent.

Outline of talk 1. Extractors as graphs with expansion properties 2. Extractors as functions which extract randomness 3. Applications 4. Explicit Constructions

Constructions

Extractors from error-correcting codes. One can construct extractors from error-correcting codes [ILL,SZ,T]: short seed, but extracting only one additional bit. Extractors that extract one additional bit correspond to list-decodable error-correcting codes; extractors that extract many bits correspond to codes with strong list-recovering properties [TZ].

List-decodable error-correcting codes [S]. A code EC is 20%-decodable if for every word w there is a unique x such that EC(x) differs from w in at most 20% of the positions (unique decoding after a noisy channel). A code EC is (49%, t)-list-decodable if for every w there are at most t strings x such that EC(x) differs from w in at most 49% of the positions (list decoding after an extremely noisy channel: the decoder outputs a short list x_1, x_2, x_3, …). There are explicit constructions of such codes.

Extractors from list-decodable error-correcting codes [ILL,T]. Thm: if EC is (½−ε, εK)-list-decodable then E(x,y) = (y, EC(x)_y) is a (K, 2ε)-extractor. Note: E outputs its seed y; such an extractor is called strong. Beyond the seed, E outputs only one additional bit, EC(x)_y. There are constructions of list-decodable error-correcting codes with |y| = O(log n). Strong extractors with one additional bit correspond to list-decodable error-correcting codes; strong extractors with many additional bits translate into very strong error-correcting codes [TZ].
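A concrete instance, sketched under our own naming (treating n-bit strings as integers): the Hadamard code, whose coordinate y of the encoding of x is the inner product <x,y> mod 2, is list-decodable at radius ½−ε with a short list, so it fits the shape of the theorem:

```python
def hadamard_bit(x, y):
    # Coordinate y of the Hadamard encoding of x:
    # the inner product <x, y> over GF(2), i.e. the parity of x AND y.
    return bin(x & y).count("1") % 2

def one_bit_extractor(x, y):
    # E(x, y) = (y, EC(x)_y): the seed is output in the clear ("strong"),
    # plus a single extracted bit from the code coordinate.
    return (y, hadamard_bit(x, y))
```

Note the Hadamard code has seed length |y| = n, not O(log n); the short-seed constructions mentioned in the slide use more economical codes.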

Extractors from list-decodable error-correcting codes: proof. Thm: if EC is (½−ε, εK)-list-decodable then E(x,y) = (y, EC(x)_y) is a (K, 2ε)-extractor. Proof: by contradiction. Let X be a distribution/set of size K such that E(X,Y) = (Y, EC(X)_Y) is far from uniform. Observation: Y and EC(X)_Y are each uniform on their own, so they must be correlated: there exists a predictor P such that P(Y) = EC(X)_Y with probability > ½ + 2ε.

Extractors from list-decodable error-correcting codes: proof II. Thm: if EC is (½−ε, εK)-list-decodable then E(x,y) = (y, EC(x)_y) is a (K, 2ε)-extractor. There exists P such that Pr_{X,Y}[P(Y) = EC(X)_Y] > ½ + 2ε. By a Markov argument: for more than εK of the x's in X, Pr_Y[P(Y) = EC(x)_Y] > ½ + ε. Think of P as a string with P_y = P(y); for each such x, P and EC(x) differ in at most a ½ − ε fraction of the coordinates. Story so far: if E is bad then there is a string P such that for more than εK x's, P and EC(x) differ in few coordinates.

Extractors from list-decodable error-correcting codes: proof III. Thm: if EC is (½−ε, εK)-list-decodable then E(x,y) = (y, EC(x)_y) is a (K, 2ε)-extractor. Story so far: if E is bad then there is a string P such that for more than εK x's, P and EC(x) differ in at most a ½ − ε fraction of the coordinates. By the list-decoding property of the code, the number of such x's is at most εK. Contradiction!

Roadmap. So far: we can construct extractors from error-correcting codes with a short seed and output length = seed length + 1. Next: how to extract more bits. General paradigm: once you construct one extractor, you can try to boost its quality.

Extracting more bits [WZ]. Starting point: an extractor E that extracts only few bits. Idea: the conditional distribution (X | E(X,Y)) still contains randomness, so we can apply E again to extract randomness from (X | E(X,Y)); this needs a fresh seed. The new extractor E′(X; (Y,Y′)) = E(X,Y), E(X,Y′) extracts more randomness at the cost of a larger seed.

Trevisan's extractor: reducing the seed length. Idea: use few random bits to generate (correlated) seeds Y_1, Y_2, Y_3, … A walk on an expander? An extractor? These work, but give only small savings. Trevisan: use the Nisan-Wigderson pseudorandom generator (based on combinatorial designs). [TZS,SU]: use Y, Y+1, Y+2, … (based on the [STV] algorithm for list-decoding the Reed-Muller code).

The extractor designer's tool kit. There are many ways to compose extractors with themselves and with related objects. The arguments use entropy manipulations and depend on the function view of extractors. Impact on other graph-construction problems: expander graphs (the zig-zag product) [RVW,CRVW]; Ramsey graphs that beat the Frankl-Wilson construction [BKSSW,BRSW].

Entropy manipulations: composing two extractors [Z,NZ]. Observation: given two independent sources X_1, X_2, one can compose a small extractor (applied to X_2 to produce a seed Z) with a large extractor (applied to X_1 with seed Z), obtaining an extractor that inherits the small seed and the large output. Paradigm: if given only one source, try to convert it into two sources that are sufficiently independent.

Summary: extractors are both graphs (every set X of size K = 2^k has |Γ(X)| ≥ (1−ε)M in {0,1}^m) and functions (a short seed Y is used to extract a random output from a source distribution X).

Conclusion. Unifying role of extractors: expanders, oblivious samplers, error-correcting codes, pseudorandom generators, hash functions, … Open problems: more applications/connections; the quest for explicitly constructing the optimal extractor (current record: [LRVW]); direct and simple constructions. Things I didn't talk about: seedless extractors for special families of sources.

That's it…

