Multiplicity Codes Swastik Kopparty (Rutgers) (based on [K-Saraf-Yekhanin ’11], [K ‘12], [K ‘14])



3 This talk
Multiplicity codes
– Generalize polynomial-evaluation codes
– Based on evaluating polynomials & derivatives
Local decoding
– Gave the first locally decodable codes with rate → 1
List-decoding
– Attain list-decoding capacity
Beyond
– Local list-decoding, pseudorandom constructions, …

4 Error-correcting codes
[Figure: the space of received words, with a codeword c and a nearby received word r]

5 Local decoding/correction

6 Encoding
[Figure: raw data → codeword → corrupted codeword]

7 Decoding
What if we want to see the original data?
[Figure: the decoder maps the corrupted codeword back to the original data]

8 What if we are only interested in one bit (or a few bits) of the original data?

9 Decoding 1 bit
What if we want only message bit number i?
[Figure: the decoder maps the corrupted codeword to message bit number i]

10 Locally Decoding 1 bit
[Figure: the decoder reads only a few positions of the corrupted codeword to recover original bit number i]

11 Locally Decoding 1 bit (slow motion)
[Figure: the same local decoding, shown one query at a time]

12 A multivariate polynomial code
Large finite field F_q of size q; fix δ > 0
Interpret the original data as a polynomial P(X,Y)
– degree(P) ≤ d = (1-δ)q
Encoding:
– Evaluate P at each point of F_q^2
Rate ≈ (d^2/2)/q^2 = (1-δ)^2/2
Distance = δ
– Two low-degree polynomials cannot agree on many points of F_q^2
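
The encoding step above can be sketched in a few lines. This is a toy illustration only: it assumes q is prime (so field arithmetic is plain modular arithmetic) and uses small hypothetical parameters q = 13, d = 3; a real instantiation would use a proper finite-field library.

```python
# Toy sketch of the bivariate polynomial code: interpret the data as the
# coefficients of a polynomial P(X, Y) of total degree <= d, and evaluate
# P at every point of F_q^2. Assumes q prime so that F_q is just Z/qZ.

q = 13  # field size (hypothetical toy value, prime)
d = 3   # degree bound d = (1 - delta) * q

def encode(coeffs):
    """coeffs maps (i, j) -> coefficient of X^i Y^j, with i + j <= d.
    Returns the codeword: P evaluated at all q^2 points of F_q^2."""
    def P(x, y):
        return sum(c * pow(x, i, q) * pow(y, j, q)
                   for (i, j), c in coeffs.items()) % q
    return {(x, y): P(x, y) for x in range(q) for y in range(q)}

# Example: P(X, Y) = 1 + 2X + 3XY (total degree 2 <= d)
word = encode({(0, 0): 1, (1, 0): 2, (1, 1): 3})
```

The rate claimed on the slide is visible here: the message has about d^2/2 coefficients while the codeword has q^2 field elements.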

13 Locally correcting polynomial codes
Main idea:
– Restricting a low-degree multivariate polynomial to a line gives a low-degree univariate polynomial
So:
– The value of P(a,b) can be recovered from the values of P on any line L through (a,b), even if there are errors
Picking the line L at random:
– With high probability, fewer than a δ/2 fraction of the points on the line are in error, so univariate polynomial decoding succeeds
Decoding queries:
– # points on a line = q ≈ k^{0.5}
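
The line-restriction decoder above can be sketched as follows, in the same toy prime-field setting (q = 13, d = 3; all names hypothetical). To keep the sketch short it interpolates from d+1 points of the random line and assumes those samples are uncorrupted; the actual local corrector reads the whole line and runs a univariate error-correcting decoder on it.

```python
# Sketch of local correction at a point (a, b): restrict the received word
# r to a random line L(t) = (a + t*u, b + t*v), interpolate the degree <= d
# univariate restriction, and read off its value at t = 0, which is P(a, b).
# No error handling: this sketch assumes the sampled points are clean.
import random

q, d = 13, 3  # toy prime field and degree bound (hypothetical)

def lagrange_eval(points, t):
    """Evaluate the unique polynomial of degree <= len(points) - 1 through
    `points` (pairs (x, y) with distinct x) at t, over F_q."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (t - xj) % q
                den = den * (xi - xj) % q
        total = (total + yi * num * pow(den, q - 2, q)) % q
    return total

def locally_correct(r, a, b):
    """Recover P(a, b) from the table r[(x, y)] using one random line."""
    u, v = random.randint(1, q - 1), random.randint(1, q - 1)
    samples = [(t, r[((a + t * u) % q, (b + t * v) % q)])
               for t in range(1, d + 2)]
    return lagrange_eval(samples, 0)  # the line passes through (a, b) at t = 0
```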

14 More variables
Generalization:
– m variables
– Now m-variate polynomials with degree ≤ (1-δ)q = d
Distance = δ
Rate ≈ (d^m/m!)/q^m = (1-δ)^m/m!
Local correction via lines again
– Decoding time ≈ q ≈ O(k^{1/m})

15 Multiplicity codes [KSY ’11]

16 Derivatives
Given a polynomial P(X,Y) ∈ F[X,Y]
Define P_X = dP/dX, P_Y = dP/dY, P_XX, P_XY, …
– Everything formally
If F has small characteristic:
– then use "Hasse derivatives"
Multiplicity: P(X,Y) vanishes at (a,b) with multiplicity ≥ s if for all i,j with i+j < s, P_{X^i Y^j}(a,b) = 0
Notation: Mult(P, (a,b))
Mult(P, (a,b)) ≥ 1 ⟺ P(a,b) = 0
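
Hasse derivatives and the multiplicity of a zero can be computed directly from the coefficients. A minimal sketch over a toy prime field (q = 13 and all names are assumptions, not from the slides): the (i,j)-th Hasse derivative of Σ c_{k,l} X^k Y^l is Σ C(k,i) C(l,j) c_{k,l} X^{k-i} Y^{l-j}, which stays well-behaved in small characteristic.

```python
# Sketch of Hasse derivatives and Mult(P, (a, b)) over a small prime field.
from math import comb

q = 13  # toy prime field (hypothetical)

def hasse(coeffs, i, j):
    """(i, j)-th Hasse derivative; coeffs maps (k, l) -> coeff of X^k Y^l."""
    out = {}
    for (k, l), c in coeffs.items():
        if k >= i and l >= j:
            key = (k - i, l - j)
            out[key] = (out.get(key, 0) + comb(k, i) * comb(l, j) * c) % q
    return out

def evaluate(coeffs, x, y):
    return sum(c * pow(x, k, q) * pow(y, l, q)
               for (k, l), c in coeffs.items()) % q

def mult(coeffs, a, b, s_max=5):
    """Mult(P, (a, b)): the largest s <= s_max such that every Hasse
    derivative of total order < s vanishes at (a, b)."""
    for s in range(1, s_max + 1):
        # check all derivatives of total order exactly s - 1
        if any(evaluate(hasse(coeffs, i, s - 1 - i), a, b) != 0
               for i in range(s)):
            return s - 1
    return s_max
```

For example, P(X,Y) = (X-1)^2 vanishes with multiplicity exactly 2 at any point with x-coordinate 1.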

17 Agreement multiplicity
Defn: If all derivatives of P and Q of order < s agree at (a,b),
– then P and Q agree at (a,b) with multiplicity s.
– Equivalently: Mult(P-Q, (a,b)) ≥ s

18 Multiplicity codes
[Figure: the codeword lists the evaluations of P and its derivatives at every point of F_q^2]

19 Distance of multiplicity codes
How many (a,b) are there such that
– P(a,b) = Q(a,b), P_X(a,b) = Q_X(a,b), P_Y(a,b) = Q_Y(a,b)?
– "P and Q have a multiplicity-2 agreement at (a,b)"
Lemma (see [Dvir-K-Saraf-Sudan 09]): "Even high-degree polynomials cannot have too many high-multiplicity zeroes"
– For every P of degree at most d: E_{x ∈ F_q^m}[mult(P, x)] ≤ d/q
⇒ two degree-d polynomials P and Q cannot agree with multiplicity s on more than a d/(sq) fraction of the points of F_q^2
⇒ the previous multiplicity code has distance δ
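
The lemma can be sanity-checked in the univariate case, where it says that the multiplicities of the zeroes of a degree-d polynomial sum to at most d, so E_x[mult(P, x)] ≤ d/q. A toy check over F_13 (the field and the example polynomial are assumptions for illustration):

```python
# Univariate sanity check: sum of zero multiplicities of P over F_q is <= deg(P).
from math import comb

q = 13  # toy prime field (hypothetical)

def mult_at(coeffs, a):
    """Multiplicity of a as a zero of P(X) = sum coeffs[k] X^k over F_q,
    via Hasse derivatives: the i-th Hasse derivative of X^k is C(k,i) X^(k-i)."""
    for i in range(len(coeffs)):
        val = sum(comb(k, i) * c * pow(a, k - i, q)
                  for k, c in enumerate(coeffs) if k >= i) % q
        if val:
            return i
    return len(coeffs)

coeffs = [1, 11, 1]  # (X - 1)^2 = X^2 - 2X + 1 mod 13, degree d = 2
total = sum(mult_at(coeffs, a) for a in range(q))  # total multiplicity <= 2
```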

20 Locally correcting multiplicity codes
Main idea:
– Restricting a multivariate polynomial along with its derivatives to a line gives a univariate polynomial along with its derivative
As before:
– Pick a random line L through (a,b)
– Looking at P, P_X, P_Y restricted to L is enough to give the univariate polynomial P|_L, even if there are errors
Univariate multiplicity decoding!
– Nielsen, Rosenbloom-Tsfasman
– This gives P(a,b) and dP/dL(a,b)
We want P(a,b), P_X(a,b), P_Y(a,b)
Pick another random line L':
– dP/dL and dP/dL' are enough to give P_X and P_Y
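
The last step above is a 2×2 linear solve over F_q: by the chain rule, along the line L(t) = (a + ut, b + vt) the restricted polynomial satisfies (P|_L)'(t) = u·P_X + v·P_Y at L(t), so the directional derivatives along two independent lines determine P_X(a,b) and P_Y(a,b). A minimal sketch, assuming a prime field so that inverses come from Fermat's little theorem:

```python
# Recover P_X(a, b) and P_Y(a, b) from directional derivatives along two
# lines with directions (u1, v1) and (u2, v2): solve
#   u1 * Px + v1 * Py = d1
#   u2 * Px + v2 * Py = d2
# over F_q (q prime, toy value).

q = 13

def derivatives_from_two_lines(u1, v1, u2, v2, d1, d2):
    """Solve the 2x2 system above; requires the directions to be independent."""
    det = (u1 * v2 - u2 * v1) % q
    det_inv = pow(det, q - 2, q)          # inverse via Fermat's little theorem
    px = (v2 * d1 - v1 * d2) * det_inv % q
    py = (u1 * d2 - u2 * d1) * det_inv % q
    return px, py
```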

21 Higher multiplicities
Consider 2-variable polynomials of degree s(1-δ)q
– Evaluate all derivatives up to order s
Rate = s/(s+1) · (1-δ)^2
– This approaches 1!
Decoding:
– s random lines through (a,b)
– Decoding time ≈ sq ≈ O(k^{0.5})
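
The rate formula on this slide can be checked by counting: the message consists of the coefficients of a polynomial of degree below d = s(1-δ)q (about d²/2 of them), while each of the q² codeword positions holds all C(s+1, 2) Hasse derivatives of order below s. A small sketch (the parameter values are hypothetical):

```python
# Count-based check of the rate of 2-variable multiplicity codes:
# rate = #monomials of degree < d / (q^2 * #derivative orders below s)
#      ~ s/(s+1) * (1 - delta)^2.
from math import comb

def rate(q, s, delta):
    d = int(s * (1 - delta) * q)
    num_monomials = comb(d + 1, 2)          # (i, j) with i + j < d
    num_derivative_orders = comb(s + 1, 2)  # (i, j) with i + j < s
    return num_monomials / (q * q * num_derivative_orders)
```

As s grows (with δ fixed), s/(s+1) → 1, which is the "approaches 1" claim.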

22 More variables, many derivatives

23 [Figure: a line L through a point x of F_q^m]

24 Local decodability of multiplicity codes [KSY ‘11]

25 List-Decoding Multiplicity Codes [K’12]

26 List-Decoding Multiplicity Codes
[Figure: a received word r and the list of nearby codewords c]

27 Two theorems on list-decoding multiplicity codes

28 List-Decoding Univariate Multiplicity Codes

29 List-Decoding Univariate Polynomial Codes [Sudan, Guruswami-Sudan]
Given r: F_q → F_q,
– find all f(X) of degree at most d which are close to r.
Based on interpolation + root-finding.
Step 1: Interpolation: find a nonzero Q(X,Y) s.t. Q(x, r(x)) = 0 for each x
– Claim: any f(X) which is close to r satisfies Q(X, f(X)) = 0
Step 2: Solving a polynomial equation:
– Solve Q(X, f(X)) = 0 (as a formal polynomial equation)
– Find all such f with degree at most d (can be done by standard algorithms)
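
Step 1 is a linear-algebra problem: each evaluation point x contributes one homogeneous linear constraint Q(x, r(x)) = 0 on the unknown coefficients of Q, so a nonzero Q exists as soon as the number of monomials exceeds q. A minimal sketch using brute-force Gaussian elimination over a toy prime field (q = 13 and the rectangular degree bounds are assumptions; real implementations use weighted-degree monomial sets and fast interpolation):

```python
# Sketch of the interpolation step of the Sudan list decoder: find a nonzero
# Q(X, Y) = sum q_ij X^i Y^j with Q(x, r(x)) = 0 for every x in F_q.
q = 13

def interpolate(r, deg_x, deg_y):
    """r maps x -> r(x). Returns nonzero coeffs {(i, j): q_ij} with
    0 <= i <= deg_x, 0 <= j <= deg_y, or None if only Q = 0 fits."""
    monomials = [(i, j) for i in range(deg_x + 1) for j in range(deg_y + 1)]
    n = len(monomials)
    # One homogeneous linear constraint per evaluation point.
    rows = [[pow(x, i, q) * pow(r[x], j, q) % q for (i, j) in monomials]
            for x in r]
    # Gaussian elimination over F_q, keeping one pivot row per column.
    pivots = {}
    for row in rows:
        for col in range(n):
            if row[col] == 0:
                continue
            if col in pivots:
                piv = pivots[col]
                f = row[col] * pow(piv[col], q - 2, q) % q
                row[:] = [(a - f * b) % q for a, b in zip(row, piv)]
            else:
                pivots[col] = row
                break
    free = [c for c in range(n) if c not in pivots]
    if not free:
        return None
    # Set one free coefficient to 1 and back-substitute for the pivots.
    sol = [0] * n
    sol[free[0]] = 1
    for col in sorted(pivots, reverse=True):
        row = pivots[col]
        s = sum(row[c] * sol[c] for c in range(col + 1, n)) % q
        sol[col] = (q - s) * pow(row[col], q - 2, q) % q
    return {mon: sol[k] for k, mon in enumerate(monomials) if sol[k]}
```

With 14 monomials (deg_x = 6, deg_y = 1) against q = 13 constraints, a nonzero Q is guaranteed; the Claim on the slide then says every close f(X) is a root of Q(X, Y) viewed as a polynomial in Y.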

30 List-Decoding Univariate Multiplicity Codes

31 List-Decoding Multivariate Multiplicity Codes

32 List-decoding multivariate multiplicity codes

33 Other results

34 Summary and Questions
Codes of rate approaching 1
– Decodable in O(k^ε) time
– List-decodable beyond half the minimum distance in O(k^ε) time
– Alternative construction of codes achieving list-decoding capacity
For all we know, there exist codes with:
– Rate approaching 1
– Decodable in polylog(k) time
List-decoding up to capacity with constant list size?
Use multiplicity codes in practice?
– Already theoretically practical
Further applications/generalizations?
