
Slide 1: Information Security – Theory vs. Reality (0368-4474-01, Winter 2012-2013). Lecture 10: Garbled circuits and obfuscation. Eran Tromer. Slides credit: Boaz Barak.

Slide 2: Recall our high-level goal. Ensure properties of a distributed computation when the parties are mutually untrusting, faulty, leaky and malicious.

Slide 3: Comparison of primitives. Columns: Attacks (Leakage, Tampering); Guarantees (Correctness, Secrecy); Functionality (Function class, Output form); Communication; Assumptions.
- FHE: leakage ANY, tampering none (ANY tampering: no); guarantees YES; circuits, encrypted output; minimal communication; computational assumptions.
- Arguments (CS proofs / PCD / SNARG): attacks ANY; correctness YES, secrecy no; RAM, distributed; plaintext output; minimal communication; exotic computational / oracle assumptions.
- MPC: attacks ANY; guarantees YES; any function; plaintext output; heavy interaction; mild computational assumptions.
- Garbled circuits: leakage ANY, tampering none (ANY tampering: no); guarantees YES; circuits, plaintext output; preprocessing + minimal communication; mild computational assumptions.
- Leakage resilience: leakage varies, tampering none (any tampering: no); guarantees YES; function class varies; plaintext output; minimal communication; assumptions vary.
- Tamper resilience: leakage varies, tampering varies; guarantees vary; plaintext output; minimal communication; assumptions vary.
- Obfuscation: attacks ANY; guarantees YES; plaintext output; minimal communication; assumption "0=1".
- TPM: secure hardware assumption.

Slide 4: Garbled circuits: variants of functionality (summary of whiteboard discussion).
- "Honest-but-curious" model.
- Offline-online evaluation for public circuits: the circuit U is public, Alice chooses x, Bob learns U(x) and nothing else.
- Offline-online evaluation for secret circuits: Alice chooses C and x, Bob learns C(x) and nothing else. Obtained from the previous variant by letting U be a universal circuit and plugging in the description of C (see the sketch below).
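To make the universal-circuit reduction concrete, here is a minimal sketch (mine, not from the slides): a public evaluator U that takes a gate-list description and an input, so garbling U with the description hard-wired hides the secret circuit C. The gate-list format and the name U are illustrative; a real universal circuit is itself a fixed circuit rather than a program with data-dependent control flow.

```python
from typing import List, Tuple

# A gate is (op, in1, in2); wires 0..n-1 carry the input x, each gate appends
# one new wire, and the last gate's output wire is the circuit's output.
Gate = Tuple[str, int, int]

def U(desc: List[Gate], x: List[int]) -> int:
    """Universal evaluator: public code, secret behaviour comes only from desc."""
    wires = list(x)
    for op, i, j in desc:
        a, b = wires[i], wires[j]
        if op == "AND":
            wires.append(a & b)
        elif op == "XOR":
            wires.append(a ^ b)
        elif op == "NAND":
            wires.append(1 - (a & b))
        else:
            raise ValueError(f"unknown gate {op}")
    return wires[-1]

# Usage: Alice's secret C computes (x0 AND x1) XOR x2; garbling U and fixing
# the input wires that carry desc_C means Bob only ever sees the public U.
desc_C = [("AND", 0, 1), ("XOR", 3, 2)]   # wire 3 is the AND gate's output
assert U(desc_C, [1, 1, 0]) == 1
assert U(desc_C, [1, 0, 1]) == 1
```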

Slide 5: Garbled circuits: construction (summary of whiteboard discussion).
- The garbled circuit: choose a random key for each value of each wire. Output: the gate tables (double encryptions of the output-wire keys under the corresponding input-wire keys, in permuted order) and the keys of the output wires.
- The garbled input: the keys for the chosen values of the input wires.
- Evaluation: gate by gate, using double decryption. A one-gate sketch follows.
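Below is a minimal, hedged sketch of garbling and evaluating a single gate along these lines. The hash-based "double encryption", the trial-decryption tag, and all function names are illustrative stand-ins; real constructions use a proper symmetric cipher and the point-and-permute technique rather than trying every row.

```python
import hashlib, os, random

KEYLEN = 16
TAG = b"\x00" * 4   # recognizable padding so correct decryptions are detectable

def pad(k1, k2):
    # Derive a one-time pad from both input-wire keys (stand-in for a PRF).
    h1 = hashlib.sha256(k1 + b"gate").digest()
    h2 = hashlib.sha256(k2 + b"gate").digest()
    return bytes(a ^ b for a, b in zip(h1, h2))[:KEYLEN + len(TAG)]

def enc(k1, k2, key_out):
    # "Double encryption" of an output-wire key under both input-wire keys.
    return bytes(a ^ b for a, b in zip(pad(k1, k2), key_out + TAG))

def dec(k1, k2, ct):
    m = bytes(a ^ b for a, b in zip(pad(k1, k2), ct))
    return m[:-len(TAG)] if m.endswith(TAG) else None   # None: wrong key pair

def garble_gate(truth_table):
    # One random key per (wire, value) pair.
    wa, wb, wc = ({v: os.urandom(KEYLEN) for v in (0, 1)} for _ in range(3))
    rows = [enc(wa[a], wb[b], wc[truth_table[(a, b)]])
            for a in (0, 1) for b in (0, 1)]
    random.shuffle(rows)          # permute rows so position reveals nothing
    return rows, wa, wb, wc

def eval_gate(rows, ka, kb):
    # Holding one key per input wire, exactly one row decrypts (w.h.p.).
    for ct in rows:
        m = dec(ka, kb, ct)
        if m is not None:
            return m
    raise ValueError("no row decrypted")

# Usage: garble an AND gate; with the keys for a=1, b=0 the evaluator learns
# only the output-wire key encoding AND(1,0) = 0.
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
rows, wa, wb, wc = garble_gate(AND)
assert eval_gate(rows, wa[1], wb[0]) == wc[0]
```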

Slide 6: What Is an Obfuscator?
An obfuscator is an algorithm O such that for any program P, O(P) is a program such that:
- O(P) has the same functionality as P.
- O(P) is infeasible to analyze / "reverse-engineer".
Intuition: an obfuscator should provide a "virtual black-box", in the sense that giving someone O(P) should be equivalent to giving her a black box that computes P.

Slide 7: Why might obfuscators exist?
Practical reasons:
- Understanding code is very difficult.
- Obfuscation is used (successfully?) in practice for security purposes.
Theoretical reasons:
- All canonical hard problems are problems of reverse engineering: SAT, HALTING.
- Rice's Theorem: you can't look at the code (Turing machine description) of a function and decide a non-trivial property of it.

Slide 8: Applications for obfuscators.
- "Digital rights management".
- Converting symmetric-key encryption to asymmetric-key encryption.
- Removing random oracles from specific natural protocols.
- Giving someone the ability to sign/decrypt only a restricted subset of the message space.

Slide 9: Defining obfuscators.
Definition 1. An algorithm O is an obfuscator if for any circuit C:
1. (functionality) O(C) ~ C, i.e., O(C) computes the same function as C.
2. (polynomial slowdown) |O(C)| ≤ p(|C|) for some polynomial p(·).
We say that O is efficient if it runs in polynomial time.
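As a toy illustration of the two conditions (not part of the lecture), one can spot-check a candidate obfuscator's functionality on random inputs and its size against a fixed polynomial. Circuits are represented here simply as Python functions with a separately supplied size measure; all names are mine.

```python
import random

def same_function(C, OC, n_inputs, trials=1000):
    """Spot-check condition 1: O(C) ~ C on random 0/1 inputs (evidence, not proof)."""
    for _ in range(trials):
        x = tuple(random.randint(0, 1) for _ in range(n_inputs))
        if C(*x) != OC(*x):
            return False
    return True

def polynomial_slowdown(size_C, size_OC, p=lambda s: 10 * s ** 2):
    """Spot-check condition 2: |O(C)| <= p(|C|) for a fixed polynomial p."""
    return size_OC <= p(size_C)

# Usage with a trivial "obfuscation" that only renames variables (no security!).
C  = lambda a, b, c: (a & b) ^ c
OC = lambda x, y, z: (x & y) ^ z
assert same_function(C, OC, 3) and polynomial_slowdown(3, 3)
```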

Slide 10: Defining security.
A natural formal interpretation: for any adversary A there is a simulator S such that for any circuit C, A(O(C)) is computationally indistinguishable from S^C(1^|C|).
"Anything that can be learned from the obfuscated form could have been learned by merely observing the circuit's input-output behavior (i.e., by treating the circuit as a black box)."
This definition is impossible to meet!

Slide 11: Defining security (2).
Relaxation: the simulator should only compute a specific function (even a predicate) rather than generate an indistinguishable output.
Weak obfuscators: for every p.p.t. adversary A and every (poly-time) predicate p: {0,1}* → {0,1} there exists S such that for all circuits C,
Pr[A(O(C)) = p(C)] ≤ Pr[S^C(1^|C|) = p(C)] + negl(|C|).
Note: this may be too weak for the desired applications, but we will still prove that it is impossible to meet.

Slide 12: Inherently unobfuscatable functions.
Definition 2. An (efficiently computable) function ensemble {F_t} (F_t: {0,1}^|t| → {0,1}^|t|) is an unobfuscatable function ensemble (UF) if there is a poly-time predicate p: {0,1}* → {0,1} such that:
(a) (p easy to compute with a circuit) There is a p.p.t. A such that for any circuit C with C ~ F_t: A(C) = p(t).
(b) (p hard to compute with black-box access) For any p.p.t. S and random t ∈ {0,1}^n: Pr[S^{F_t}(1^n) = p(t)] ≤ 1/2 + negl(n).
Theorem 1: if unobfuscatable functions exist, then even "very weak" obfuscators do not exist. A toy sketch of the idea follows.
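The following toy sketch (mine, not from the slides) illustrates the idea behind Definition 2 in the "programs can run programs" setting: F_t has a C-part mapping a secret alpha to a secret beta, and a D-part that rewards any piece of code which computes the C-part. Given the code of any program equivalent to F_t, the adversary recovers beta (and hence p(t)), while a black-box simulator essentially has to guess alpha. In the circuit formulation the D-part would take a circuit description; the names make_F, adversary_with_code and simulator_with_oracle are illustrative.

```python
import os

N = 16  # security parameter, in bytes

def make_F():
    """Sample t = (alpha, beta) and return it together with the keyed function F_t."""
    alpha, beta = os.urandom(N), os.urandom(N)
    def F(tag, arg):
        if tag == 0:                       # C-part: point function alpha -> beta
            return beta if arg == alpha else bytes(N)
        if tag == 1:                       # D-part: arg is code (a callable here)
            return beta if arg(alpha) == beta else bytes(N)
        raise ValueError("bad tag")
    return (alpha, beta), F

def adversary_with_code(Q):
    """Given any program Q equivalent to F_t, recover beta (hence p(t))."""
    c_part = lambda x: Q(0, x)             # the C-part of the code we were handed
    return Q(1, c_part)                    # feed that code back into the D-part

def simulator_with_oracle(F_oracle, queries=1000):
    """Black-box access only: must hit alpha by luck; fails overwhelmingly."""
    for _ in range(queries):
        y = F_oracle(0, os.urandom(N))
        if y != bytes(N):
            return y                       # found beta (prob. about queries / 2**(8*N))
    return None

(alpha, beta), F = make_F()
assert adversary_with_code(F) == beta      # code access: the secret leaks
assert simulator_with_oracle(F) is None    # oracle access: fails w.h.p.
```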

Slide 13: Results (summary of whiteboard discussion).
- There exist unobfuscatable functions (if there exist OWFs) ⇒ efficient (even weak) obfuscators do not exist.
- Moreover: there exist unobfuscatable encryption schemes (if any exist), and there exist unobfuscatable signature schemes (if any exist).
- Natural relaxations of obfuscation (e.g., approximate correctness) are still impossible.
State of the art:
- Constructions for very simple classes (e.g., point functions; a sketch follows below).
- In practice: heuristics to slow down reverse engineering.
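As an example of the "very simple classes" mentioned above, here is a minimal sketch of the classic point-function construction: publish a salted hash of the point alpha instead of alpha itself (typically analyzed in a random-oracle-style model). The function name and parameters are illustrative.

```python
import hashlib, os

def obfuscate_point(alpha: bytes):
    """Return a program accepting exactly alpha, revealing only (salt, digest)."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + alpha).digest()
    def I_alpha(x: bytes) -> bool:
        return hashlib.sha256(salt + x).digest() == digest
    return I_alpha

# Usage: the obfuscated point function can be shipped without revealing alpha.
check = obfuscate_point(b"correct horse battery staple")
assert check(b"correct horse battery staple")
assert not check(b"wrong guess")
```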

