
1 Secure Computation (Lecture 5) Arpita Patra

2 Recap
>> Scope of MPC
> Models of computation
> Network models
> Modelling distrust (centralized/decentralized adversary)
> Modelling the adversary
> Various parameters/questions asked in MPC
>> Defining security of MPC
> Ideal world & real world
> Indistinguishability of the view of the adversary in the real and ideal worlds (with the help of a simulator)
> Indistinguishability of the joint distribution of the outputs of the honest parties and the view of the adversary in the real and ideal worlds (with the help of a simulator)

3 Ideal World MPC
Parties P1, ..., P4 hold inputs x1, x2, x3, x4 and wish to perform any task (y1, y2, y3, y4) = f(x1, x2, x3, x4).

4 Ideal World MPC
Any task: (y1, y2, y3, y4) = f(x1, x2, x3, x4).
The Ideal World: the parties hand their inputs x1, ..., x4 to a trusted party, which computes f and returns y1, ..., y4.
The Real World: the parties compute the same outputs y1, ..., y4 by running a protocol among themselves on inputs x1, ..., x4.

5 How do you compare the Real World with the Ideal World?
>> Fix the inputs of the parties, say x1, ..., xn.
>> The real-world view of the adversary should contain no more information than its ideal-world view.
View_i^Real: the view of P_i on input (x1, ..., xn) in the real world - the leaked values; for a corrupt P3 this is {x3, y3, r3, protocol transcript}.
View_i^Ideal: the view of P_i on input (x1, ..., xn) in the ideal world - the allowed values; for a corrupt P3 this is {x3, y3}.
Our protocol is secure if the leaked values {View_i^Real}_{Pi in C} contain no more information than the allowed values {View_i^Ideal}_{Pi in C}.

6 Real World (leaked values) vs. Ideal World (allowed values)
Real world (corrupt P3): {x3, y3, r3, protocol transcript}; ideal world: {x3, y3}.
>> The protocol is secure if the leaked values can be efficiently computed from the allowed values.
>> Such an algorithm is called a SIMULATOR (it simulates the view of the adversary in the real protocol).
>> It is enough if SIM creates a view of the adversary that is "close enough" to the real view, so that the adversary cannot distinguish it from its real view.

7 Definition 1: View indistinguishability of the adversary in the Real and Ideal worlds
SIM, the ideal adversary, is given the allowed values {x3, y3} and simulates the interaction on behalf of the honest parties, producing {View_i^Ideal}_{Pi in C}.
Requirement: {View_i^Real}_{Pi in C} ≈ {View_i^Ideal}_{Pi in C}.
Both sides are random variables/distributions: the left over the random coins of the parties, the right over the random coins of SIM and the adversary.

8 Definition 2: Indistinguishability of the joint distributions of output and view
>> The joint distribution of the outputs of the honest parties and the view of the corrupted parties cannot be told apart across the two worlds.
Output_i^Real: the output of P_i on input (x1, ..., xn) when P_i is honest; View_i^Real: as defined before, when P_i is corrupted. Output_i^Ideal and View_i^Ideal are defined analogously in the ideal world.
Requirement: [ {View_i^Real}_{Pi in C}, {Output_i^Real}_{Pi in H} ] ≈ [ {View_i^Ideal}_{Pi in C}, {Output_i^Ideal}_{Pi in H} ].
>> First note that this definition subsumes the previous one (it is stronger).
>> It captures randomized functions as well.
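
The two requirements side by side, as a math block (a formalization sketch; C is the corrupted set, H the honest set, and ≈ the indistinguishability notion made precise on the later slides):

```latex
% Definition 1: only the corrupted parties' views are compared.
\[
\{\mathrm{View}^{\mathrm{Real}}_i\}_{P_i \in C}
 \;\approx\;
\{\mathrm{View}^{\mathrm{Ideal}}_i\}_{P_i \in C}
\]
% Definition 2: views and honest outputs are compared jointly.
\[
\bigl[\,\{\mathrm{View}^{\mathrm{Real}}_i\}_{P_i \in C},\;
        \{\mathrm{Output}^{\mathrm{Real}}_i\}_{P_i \in H}\,\bigr]
 \;\approx\;
\bigl[\,\{\mathrm{View}^{\mathrm{Ideal}}_i\}_{P_i \in C},\;
        \{\mathrm{Output}^{\mathrm{Ideal}}_i\}_{P_i \in H}\,\bigr]
\]
```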

9 Randomized Functions and Definition 1
f(λ, λ) = (r, λ), where r is a random bit: only P1 receives output.
The ideal world: the trusted party samples r randomly and outputs r to P1.
The real protocol: P1 samples r randomly, outputs it, and sends it to P2.
SIM (interacting on behalf of the honest party): samples and sends a random r'.
{View_i^Real}_{Pi in C} ≈ {View_i^Ideal}_{Pi in C}, since r and r' are identically distributed random bits.
>> Is this protocol secure? No - P2 learns P1's output, which f does not allow! Yet the proof under Definition 1 says the protocol is secure.

10 Randomized Functions and Definition 2
Same function and protocol: f(λ, λ) = (r, λ); P1 samples r randomly, outputs it, and sends it to P2; SIM samples and sends a random r'.
Now the joint distributions are compared:
[ {View_i^Real}_{Pi in C}, {Output_i^Real}_{Pi in H} ] = [{r, r} | r random]
[ {View_i^Ideal}_{Pi in C}, {Output_i^Ideal}_{Pi in H} ] = [{r', r} | r, r' random and independent]
[{r', r} | r, r' random and independent] ≉ [{r, r} | r random]
>> Is this protocol secure? No! And now the proof agrees: Definition 2 says the protocol is insecure.

11 Definition 1 is Enough (for deterministic functions)!
{ {View_i^Real}_{Pi in C} }_{x1,...,xn,k} ≈ { {View_i^Ideal}_{Pi in C} }_{x1,...,xn,k}
implies
{ {View_i^Real}_{Pi in C}, {Output_i^Real}_{Pi in H} }_{x1,...,xn,k} ≈ { {View_i^Ideal}_{Pi in C}, {Output_i^Ideal}_{Pi in H} }_{x1,...,xn,k}
>> For deterministic functions:
> The view of the adversary and the outputs are NOT correlated.
> We can therefore consider the two distributions separately.
> The outputs of the honest parties are fixed once the inputs are fixed.
>> For randomized functions:
> View the randomized function as a deterministic one in which the parties additionally input randomness: compute f((x1, r1), (x2, r2)) to compute g(x1, x2; r), where r1 + r2 acts as r.

12 Making Indistinguishability Precise
Notation:
o Security parameter k (a natural number).
o We wish security to hold for all inputs of all lengths, as long as k is large enough.
Definition (Function ε is negligible): for every polynomial p(·) there exists an N such that for all k > N we have ε(k) < 1/p(k).
Definition (Probability ensemble X = {X(a, k)}):
o An infinite series, indexed by a string a and a natural number k.
o Each X(a, k) is a random variable.
In our context:
o X(x1, ..., xn, k) = { {View_i^Real}_{Pi in C} }_{x1,...,xn,k} (probability space: the randomness of the parties)
o Y(x1, ..., xn, k) = { {View_i^Ideal}_{Pi in C} }_{x1,...,xn,k} (probability space: the randomness of the corrupt parties and the simulator)

13 Computational Indistinguishability
X and Y as on the previous slide.
Definition (Computational indistinguishability, X = {X(a,k)} ≈_c Y = {Y(a,k)}): for every polynomial-time distinguisher D there exists a negligible function ε such that for every a and all large enough k:
|Pr[D(X(a,k)) = 1] - Pr[D(Y(a,k)) = 1]| < ε(k)
For our case a = (x1, ..., xn), and the distinguisher D plays the role of the real adversary A.
Alternative definition: let Adv_D(X,Y) be the probability of D guessing the correct distribution; then Adv_D(X,Y) < 1/2 + ε(k).

14 Statistical Indistinguishability
X and Y as before.
Definition (Statistical indistinguishability, X = {X(a,k)} ≈_s Y = {Y(a,k)}): for every distinguisher D (which may have unbounded power) there exists a negligible function ε such that for every a and all large enough k:
|Pr[D(X(a,k)) = 1] - Pr[D(Y(a,k)) = 1]| < ε(k)
For our case a = (x1, ..., xn).
Alternative definition: Adv_D(X,Y) < 1/2 + ε(k).

15 Perfect Indistinguishability
X and Y as before.
Definition (Perfect indistinguishability, X = {X(a,k)} ≈_p Y = {Y(a,k)}): for every distinguisher D (which may have unbounded power), for every a and for all k:
|Pr[D(X(a,k)) = 1] - Pr[D(Y(a,k)) = 1]| = 0
For our case a = (x1, ..., xn).
Alternative definition: Adv_D(X,Y) = 1/2.
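
The three flavours in one math block (a summary sketch; D ranges over polynomial-time machines in the first line and over all machines, however powerful, in the other two):

```latex
\begin{align*}
X \overset{c}{\approx} Y &:\ \forall\,\text{poly-time } D\ \exists\,\text{negligible } \epsilon\
  \forall a,\ \text{large } k:\
  \bigl|\Pr[D(X(a,k))=1]-\Pr[D(Y(a,k))=1]\bigr| < \epsilon(k) \\
X \overset{s}{\approx} Y &:\ \text{the same, with } D \text{ unbounded} \\
X \overset{p}{\approx} Y &:\ \forall D\ \forall a, k:\
  \bigl|\Pr[D(X(a,k))=1]-\Pr[D(Y(a,k))=1]\bigr| = 0
\end{align*}
```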

16 The Definition Applies for
Dimension 2 (Networks): complete, synchronous.
Dimension 3 (Distrust): centralized.
Dimension 4 (Adversary): threshold/non-threshold; polynomially bounded or unbounded power; semi-honest; static.

17 What is so great about the definition paradigm?
>> One definition for all tasks:
> Sum: (x1 + x2 + ... + xn) = f(x1, x2, ..., xn)
> OT: (-, xb) = f((x1, x2), b)
> BA: (y, y, ..., y) = f(x1, x2, ..., xn), where y = majority(x1, x2, ..., xn) or a default value
>> Easy to tweak the ideal world to weaken/strengthen security: the real-world protocol achieves whatever the ideal world achieves.
>> Coming up with the right ideal world is tricky and requires skill.
> We will have fun with it in the malicious world!

18 Information-Theoretic MPC with a Semi-honest Adversary and Honest Majority [BGW88]
Dimension 1 (Models of Computation): arithmetic circuits.
Dimension 2 (Networks): complete, synchronous.
Dimension 3 (Distrust): centralized.
Dimension 4 (Adversary): threshold (t); unbounded power; semi-honest; static.
Michael Ben-Or, Shafi Goldwasser, Avi Wigderson: Completeness Theorems for Non-Cryptographic Fault-Tolerant Distributed Computation (Extended Abstract). STOC 1988.

19 (n, t)-Secret Sharing Scheme [Shamir 1979, Blakley 1979]
A dealer holds a secret s.

20 (n, t)-Secret Sharing Scheme [Shamir 1979, Blakley 1979]
Sharing Phase: the dealer computes shares v1, v2, v3, ..., vn of the secret s and hands vi to party Pi.

21 (n, t)-Secret Sharing Scheme [Shamir 1979, Blakley 1979]
Sharing Phase: the dealer distributes shares v1, v2, v3, ..., vn.
Fewer than t+1 parties have no information about the secret.

22 (n, t)-Secret Sharing Scheme [Shamir 1979, Blakley 1979]
Sharing Phase: the dealer distributes shares v1, v2, v3, ..., vn.
Reconstruction Phase: any t+1 or more parties can reconstruct the secret s.

23 Shamir Sharing: (n, t)-Secret Sharing for Semi-honest Adversaries
A secret x is Shamir-shared if there is a random polynomial f of degree at most t over F_p (p > n) with f(0) = x, and party P_i holds the share x_i = f(i), for i = 1, ..., n.
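
A minimal sketch of the sharing phase in Python (the prime P = 101, the evaluation points 1, ..., n and the function names are illustrative choices, not fixed by the lecture):

```python
import random

P = 101  # prime field F_p with p > n

def share(secret, n, t):
    """Deal an (n, t)-Shamir sharing: pick a random polynomial f of
    degree at most t with f(0) = secret; party P_i gets the share f(i)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    def f(x):
        return sum(c * pow(x, j, P) for j, c in enumerate(coeffs)) % P
    return [f(i) for i in range(1, n + 1)]

shares = share(42, n=5, t=2)
print(shares)  # any 2 of these 5 shares reveal nothing about 42
```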

24 Reconstruction of Shamir Sharing: (n, t)-Secret Sharing for Semi-honest Adversaries
The parties send their shares x_1, ..., x_n to P_i, who recovers the secret x = f(0) from any t+1 of them by Lagrange interpolation. The same is done for every P_i.
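
A matching reconstruction sketch (same illustrative setup as above; the modular inverse is taken via Fermat's little theorem, since P is prime):

```python
P = 101

def reconstruct(points, t):
    """Interpolate f(0) from any t+1 pairs (i, f(i)) over F_p."""
    pts = points[: t + 1]
    secret = 0
    for i, yi in pts:
        num, den = 1, 1
        for j, _ in pts:
            if j != i:
                num = num * (-j) % P     # factor (0 - j)
                den = den * (i - j) % P  # factor (i - j)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

# with `shares` from the sharing sketch:
# reconstruct(list(enumerate(shares, start=1)), t=2)  # -> 42
```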

25 Shamir Sharing Properties
Property 1: any t+1 parties have 'complete' information about the secret.
Property 2: any t parties have 'no' information about the secret:
Pr[secret = s before secret sharing] - Pr[secret = s after secret sharing] = 0.
>> Both proofs can be derived from Lagrange interpolation.

26 Lagrange Interpolation
>> Assume h(x) is a polynomial of degree at most t and C is a subset of F_p of size t+1; for simplicity, assume C = {1, ..., t+1}.
Theorem: h(x) can be written as h(x) = Σ_{i in C} h(i)·δ_i(x), where δ_i(x) = Π_{j in C, j ≠ i} (x − j)/(i − j).
>> δ_i(x) is a polynomial of degree t.
>> At i, it evaluates to 1.
>> At any other point of C, it evaluates to 0.
Proof: consider LHS − RHS. Both LHS and RHS evaluate to h(i) for every i in C, and both have degree at most t. So LHS − RHS evaluates to 0 for every i in C and has degree at most t: more zeros (roots) than degree, hence the zero polynomial, i.e. LHS = RHS.

27 Lagrange Interpolation
h(x) = Σ_{i in C} h(i)·δ_i(x), with C = {1, ..., t+1} and δ_i(x) = Π_{j in C, j ≠ i} (x − j)/(i − j).
>> The δ_i(x) are public polynomials, so the values r_i = δ_i(0) are public.
>> The secret h(0) can thus be written as a linear combination of the h(i)'s: h(0) = Σ_{i in C} r_i·h(i). The combiners (r_1, ..., r_{t+1}) are the recombination vector.
Property 1 follows: any t+1 parties have 'complete' information about the secret.
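
Since the r_i depend only on the public set C, they can be precomputed once; reconstruction is then a single dot product. A sketch (same illustrative P = 101):

```python
P = 101

def recombination_vector(C):
    """r_i = delta_i(0) = prod_{j in C, j != i} (0 - j)/(i - j) over F_p."""
    rs = []
    for i in C:
        num, den = 1, 1
        for j in C:
            if j != i:
                num = num * (-j) % P
                den = den * (i - j) % P
        rs.append(num * pow(den, P - 2, P) % P)
    return rs

C = [1, 2, 3]  # t + 1 = 3 parties
r = recombination_vector(C)
# secret = sum(r_i * h(i)) mod P over the shares h(i), i in C
```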

28 Lagrange Interpolation
Property 2: any t parties have 'no' information about the secret: Pr[secret = s before secret sharing] - Pr[secret = s after secret sharing] = 0.
Proof: for any secret s from F_p, sample f(x) of degree at most t uniformly at random subject to f(0) = s, and consider, for any C ⊆ F_p \ {0} of size t, the distribution of ({f(i)}_{i in C}).
Claim: this distribution is uniform on F_p^t. For a fixed s, each choice of the t non-constant coefficients (an element of F_p^t) yields a unique element of the above distribution; if not, two different sets of t+1 values (the t shares together with f(0) = s) would define the same polynomial.

29 Lagrange Interpolation
Proof (continued): for a fixed s, the map f_s : F_p^t → F_p^t taking the t non-constant coefficients to the shares ({f(i)}_{i in C}) is bijective, so it maps the uniform distribution on coefficients to the uniform distribution on shares. Hence for every s, the t shares are uniformly distributed, independently of the distribution of s.

30 Lagrange Interpolation
Property 2: any t parties have 'no' information about the secret: Pr[secret = s before secret sharing] - Pr[secret = s after secret sharing] = 0.
Proof (concluded): since the t shares seen by the adversary are uniformly distributed regardless of the secret, Pr[secret = s | t shares] = Pr[secret = s], which is exactly the claimed equality.
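
An empirical sanity check of Property 2 (illustrative only, not a proof): with t = 1 over a tiny field, the single share seen by party P_1 should be uniform whatever the secret is.

```python
import random
from collections import Counter

P = 11  # tiny prime so the histogram is readable

def one_share(secret):
    a1 = random.randrange(P)      # f(x) = secret + a1*x, degree <= t = 1
    return (secret + a1 * 1) % P  # P_1's share f(1)

for s in (0, 7):
    hist = Counter(one_share(s) for _ in range(110_000))
    print(s, sorted(hist.values()))  # ~10,000 per field element, for both secrets
```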

32 Linear (n, t)-Secret Sharing for MPC
Notation: [s] denotes an (n, t)-secret sharing of the secret s.
Linearity: from [s1] and [s2] the parties can locally compute [s1 + s2], and from [s] and a public constant c they can locally compute [c·s].

33 Linearity of (n, t) Shamir Secret Sharing
Parties P1, P2, P3 hold shares a1, a2, a3 of a (points of f_a at α1, α2, α3) and shares b1, b2, b3 of b (points of f_b at the same α1, α2, α3).
Each party locally adds its two shares: c_i = a_i + b_i.

34 Linearity of (n, t) Shamir Secret Sharing
The locally computed c_i = a_i + b_i are shares of a + b: they lie on the polynomial f_a + f_b, which has degree at most t and evaluates to a + b at 0.
Addition is absolutely free (no interaction).

35 Linearity of (n, t) Shamir Secret Sharing
For a publicly known constant c, each party locally computes d_i = c·a_i. The d_i lie on c·f_a, of degree at most t, so they form a sharing of c·a.
Multiplication by public constants is absolutely free (no interaction).
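
A sketch of both linear operations end to end (illustrative setup: P = 101, n = 3, t = 1; the helpers from the earlier sketches are repeated so the snippet is self-contained):

```python
import random

P = 101

def share(s, n, t):
    coeffs = [s] + [random.randrange(P) for _ in range(t)]
    return [sum(c * pow(i, j, P) for j, c in enumerate(coeffs)) % P
            for i in range(1, n + 1)]

def reconstruct(pts):
    out = 0
    for i, yi in pts:
        num = den = 1
        for j, _ in pts:
            if j != i:
                num, den = num * (-j) % P, den * (i - j) % P
        out = (out + yi * num * pow(den, P - 2, P)) % P
    return out

a_sh, b_sh = share(20, 3, 1), share(5, 3, 1)
sum_sh   = [(ai + bi) % P for ai, bi in zip(a_sh, b_sh)]  # local addition
const_sh = [(3 * ai) % P for ai in a_sh]                  # local mult by c = 3
print(reconstruct(list(enumerate(sum_sh, 1))))    # 25 = a + b
print(reconstruct(list(enumerate(const_sh, 1))))  # 60 = c * a
```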

36 Non-linearity of (n, t) Shamir Secret Sharing
If each party locally multiplies its shares, d_i = a_i·b_i, the d_i lie on the polynomial f_a·f_b, which does evaluate to a·b at 0 but has degree up to 2t. So (d_1, ..., d_n) is not a proper (n, t)-sharing of a·b.
Multiplication of shared secrets is not free.
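
A sketch of the degree problem (illustrative setup: t = 1, n = 2t + 1 = 3): after local multiplication, t+1 shares no longer determine the secret, but 2t+1 still do, since the product polynomial has degree at most 2t.

```python
import random

P = 101

def share(s, n, t):
    coeffs = [s] + [random.randrange(P) for _ in range(t)]
    return [sum(c * pow(i, j, P) for j, c in enumerate(coeffs)) % P
            for i in range(1, n + 1)]

def interpolate0(pts):
    out = 0
    for i, yi in pts:
        num = den = 1
        for j, _ in pts:
            if j != i:
                num, den = num * (-j) % P, den * (i - j) % P
        out = (out + yi * num * pow(den, P - 2, P)) % P
    return out

t, n = 1, 3
a_sh, b_sh = share(6, n, t), share(7, n, t)
prod_sh = [(ai * bi) % P for ai, bi in zip(a_sh, b_sh)]  # points of f_a * f_b
pts = list(enumerate(prod_sh, 1))
print(interpolate0(pts[: t + 1]))  # t+1 points: generally NOT 42
print(interpolate0(pts))           # 2t+1 points: 42 = a * b
```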

37 Secure Circuit Evaluation
The function is given as an arithmetic circuit over the inputs x1, x2, x3, x4: addition gates (+), multiplication gates (×), a public constant c, and output y.

38 Secure Circuit Evaluation: a running example with inputs x1 = 2, x2 = 1, x3 = 5, x4 = 9, public constant c = 3, and output y.

39 Secure Circuit Evaluation
1. (n, t)-secret share each input: the parties hold [2], [1], [5], [9]; the constant 3 is public.

40 Secure Circuit Evaluation
1. (n, t)-secret share each input.
2. Find an (n, t)-sharing of each intermediate value.

41 Secure Circuit Evaluation
1. (n, t)-secret share each input: [2], [1], [5], [9].
2. Find an (n, t)-sharing of each intermediate value: [3] = [2 + 1], [45] = [5 × 9], [48] = [3 + 45], and the output [144] = [48 × 3].

42 Secure Circuit Evaluation
1. (n, t)-secret share each input.
2. Find an (n, t)-sharing of each intermediate value.
Linear gates: by the linearity of Shamir sharing - non-interactive.

43 Secure Circuit Evaluation
1. (n, t)-secret share each input.
2. Find an (n, t)-sharing of each intermediate value.
Linear gates: by the linearity of Shamir sharing - non-interactive.
Non-linear gates: require a degree-reduction technique - interactive.

44 Secure Circuit Evaluation
Privacy follows (intuitively) because:
1. No inputs of the honest parties are leaked.
2. No intermediate value is leaked: the parties only ever see shares of the wire values 3, 45, 48, 144.
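
An end-to-end sketch of the running example on shares (inputs 2, 1, 5, 9, public c = 3, wire values 3, 45, 48, output 144). One caveat, stated loudly: for the single 5 × 9 gate this sketch simply lets the share-polynomial degree grow to 2t and interpolates from all n = 2t + 1 shares at the end; it stands in for, and does not implement, the interactive degree-reduction step. P = 257, n = 3, t = 1 are illustrative choices.

```python
import random

P = 257

def share(s, n, t):
    coeffs = [s] + [random.randrange(P) for _ in range(t)]
    return [sum(c * pow(i, j, P) for j, c in enumerate(coeffs)) % P
            for i in range(1, n + 1)]

def interpolate0(pts):
    out = 0
    for i, yi in pts:
        num = den = 1
        for j, _ in pts:
            if j != i:
                num, den = num * (-j) % P, den * (i - j) % P
        out = (out + yi * num * pow(den, P - 2, P)) % P
    return out

n, t, c = 3, 1, 3
x1, x2, x3, x4 = (share(v, n, t) for v in (2, 1, 5, 9))
u = [(a + b) % P for a, b in zip(x1, x2)]  # [x1 + x2] -> [3], local
v = [(a * b) % P for a, b in zip(x3, x4)]  # [x3 * x4] -> [45], degree grows to 2t
w = [(a + b) % P for a, b in zip(u, v)]    # [u + v]   -> [48], local
y = [(c * a) % P for a in w]               # [c * w]   -> [144], local (c public)
print(interpolate0(list(enumerate(y, 1)))) # 144
```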

