Cryptography Lecture 2 Stefan Dziembowski

Plan
1. Information-theoretic cryptography
2. Introduction to cryptography based on computational assumptions
3. Provable security
4. Pseudorandom generators

The scenario from the previous lecture: Alice sends messages to Bob while Eve eavesdrops. Shannon's theorem: perfect secrecy is possible only if the key is as long as the plaintext. In real life this is completely impractical.

What to do? Idea: limit the power of the adversary. How? Classical (computationally-secure) cryptography: bound her computational power. Alternative options exist (but are not very practical).

Quantum cryptography (Stephen Wiesner, 1970s; Charles H. Bennett and Gilles Brassard, 1984): Alice and Bob communicate over a quantum link. Quantum indeterminacy: quantum states cannot be measured without disturbing the original state. Hence Eve cannot read the bits in an unnoticeable way.

Quantum cryptography. Advantage: security is based on the laws of quantum physics. Disadvantage: it needs dedicated equipment. Practicality? Currently: successful transmissions over distances of around 150 km. Commercial products are available. Warning: quantum cryptography should not be confused with quantum computing.

A satellite scenario: a third party (a satellite) broadcasts random bits to Alice, Bob, and Eve. Does it help? No... (Shannon's theorem of course also holds in this case.)

Ueli Maurer (1993): the noisy-channel model. Assumption: the data that the adversary receives is noisy, i.e., some bits get flipped because of the noise. (The data that Alice and Bob receive may be even more noisy.)

Bounded-Storage Model. Another idea: bound the size of the adversary's memory. The transmitted random data is too large to fit in Eve's memory.

Real (computationally-secure) cryptography starts here: Eve is computationally bounded. But what does that mean? Ideas:
1. She can use at most 1000 Intel Core 2 Extreme X6800 Dual Core Processors for at most 100 years...
2. She can buy equipment worth 1 million euro and use it for 30 years...
It is hard to reason formally about such statements.

A better idea: "The adversary has access to a Turing Machine that can make at most t steps." More generally, we could have definitions of the type: "a system X is (t,ε)-secure if every Turing Machine that operates in time t can break it with probability at most ε." This would be quite precise, but... we would need to specify exactly what we mean by a "Turing Machine": how many tapes does it have? how does it access these tapes (maybe "random access memory" is a more realistic model)?... Moreover, this approach often leads to ugly formulas.

What to do with "(t,ε)-security"? Idea: t steps of a Turing Machine = "efficient computation"; ε = a value "very close to zero". How to formalize this? Use asymptotics!

Efficiently computable? "efficiently computable" = "polynomial-time computable on a Turing Machine", that is: running in time O(n^c) (for some constant c). Here we assume that Turing Machines are the right model for real-life computation. This is not true if a quantum computer is built...

Very small? "very small" = "negligible" = approaches 0 faster than the inverse of any polynomial. Formally: a function ε is negligible if for every positive polynomial p there exists N such that ε(n) < 1/p(n) for all n > N.

Negligible or not? For example: 2^(-n) is negligible (yes), 1/n^10 is not (no), and n^(-log n) is negligible (yes).

Security parameter. The terms "negligible" and "polynomial" make sense only if X (and the adversary) take an additional input n, called the security parameter. In other words: we consider an infinite sequence X(1), X(2),... of schemes. Typically, we will say that a scheme X is secure if for every polynomial-time Turing Machine M, P(M breaks the scheme X) is negligible.

Example: consider the authentication scheme from last week.

Nice properties of these notions:
- a sum of two polynomials is a polynomial: poly + poly = poly
- a product of two polynomials is a polynomial: poly * poly = poly
- a sum of two negligible functions is negligible: negl + negl = negl
Moreover:
- a negligible function multiplied by a polynomial is negligible: negl * poly = negl
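These closure properties can be illustrated numerically; a minimal sketch (the concrete functions below are illustrative choices, not from the lecture):

```python
# Illustration: a negligible function multiplied by a polynomial
# still tends to 0 as n grows (negl * poly = negl).

def negl(n):
    return 2.0 ** (-n)   # a negligible function: 2^(-n)

def poly(n):
    return n ** 10       # a polynomial: n^10

# The product still goes to 0 as n grows:
for n in (50, 100, 200):
    print(n, negl(n) * poly(n))
```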

A new definition of an encryption scheme

Is this the right approach? Advantages: 1. All types of Turing Machines are "equivalent" up to a "polynomial reduction", so we do not need to specify the details of the model. 2. The formulas get much simpler. Disadvantage: asymptotic results don't tell us anything about the security of concrete systems. However, usually one can prove an asymptotic result formally and then argue informally that "the constants are reasonable" (and they can be calculated if one really wants).

Provable security We want to construct schemes that are provably secure. But... why do we want to do it? how to define it? and is it possible to achieve it?

Provable security – the motivation. In many areas of computer science formal proofs are not essential. For example, instead of proving that an algorithm is efficient, we can just simulate it on a "typical input". In cryptography this is not possible, because there cannot exist an experimental proof that a scheme is secure. Why? Because the notion of a "typical adversary" does not make sense.

How did we define perfect secrecy? Experiment (m – a message): 1. the key k is chosen randomly; 2. the message m is encrypted using k: c := Enc_k(m); 3. c is given to the adversary. Idea 1: the adversary should not be able to compute k. Idea 2: the adversary should not be able to compute m. Idea 3: the adversary should not be able to compute any information about m. Idea 4: the adversary should not be able to compute any additional information about m. This last idea makes the most sense.

Idea The adversary should not be able to compute any additional information about m.

Towards the definition of computational secrecy... Perfect secrecy (for every m and c: P(C = c) = P(C = c | M = m)) can be rewritten: for every m0, m1 chosen by the adversary and every c,
P(C = c | M = m0) = P(C = c | M = m1)
equivalently: P(Enc(K,M) = c | M = m0) = P(Enc(K,M) = c | M = m1)
equivalently: P(Enc(K,m0) = c | M = m0) = P(Enc(K,m1) = c | M = m1)
equivalently: P(Enc(K,m0) = c) = P(Enc(K,m1) = c)

Indistinguishability: P(Enc(K,m0) = c) = P(Enc(K,m1) = c) for every m0, m1 chosen by the adversary and every c. In other words: the distributions of Enc(K,m0) and Enc(K,m1) are identical. IDEA: change "identical" to "indistinguishable by a polynomial-time adversary".

A game between a polynomial-time adversary and an oracle, for an encryption scheme (Gen,Enc,Dec) with security parameter 1^n:
1. the adversary chooses m0, m1 such that |m0| = |m1| and sends them to the oracle
2. the oracle selects k := Gen(1^n), chooses a random bit b ∈ {0,1}, and computes c := Enc(k, m_b)
3. the adversary receives c and has to guess b.
Security definition: we say that (Gen,Enc,Dec) has indistinguishable encryptions if every polynomial-time adversary guesses b correctly with probability at most 1/2 + ε(n), where ε is negligible. Alternative name: semantically secure (sometimes we will say "computationally secure", if the context is clear).
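The game above can be sketched in code; a minimal illustration that instantiates the oracle with the one-time pad so it is concrete (the names run_game and RandomGuesser are mine, not a standard API):

```python
import os
import random

def enc(key, m):
    # One-time pad: XOR the message with the key.
    return bytes(k ^ b for k, b in zip(key, m))

def run_game(adversary, n=16):
    m0, m1 = adversary.choose_messages(n)   # |m0| = |m1| = n
    assert len(m0) == len(m1) == n
    key = os.urandom(n)                     # k := Gen(1^n)
    b = random.randrange(2)                 # random bit b
    c = enc(key, m1 if b else m0)           # c := Enc(k, m_b)
    return adversary.guess(c) == b          # did the adversary win?

class RandomGuesser:
    # Against the one-time pad no adversary beats probability 1/2;
    # this toy adversary just guesses at random.
    def choose_messages(self, n):
        return b"\x00" * n, b"\xff" * n
    def guess(self, c):
        return random.randrange(2)

wins = sum(run_game(RandomGuesser()) for _ in range(1000))
print(wins / 1000)   # close to 0.5
```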

Testing the definition. 1. Suppose the adversary can compute k from some Enc(k,m). Can he win the game? YES! 2. Suppose the adversary can compute some bit of m from Enc(k,m). Can he win the game? YES!

Is it possible to prove security? Let (Gen,Enc,Dec) be an encryption scheme. For simplicity suppose that: 1. for security parameter n the key is of length n; 2. Enc is deterministic. Consider the following language: L = { (m,c) : there exists a key k such that c = Enc(k,m) }. Q: What if L is polynomial-time decidable? A: Then the scheme is broken (exercise). On the other hand: L is in NP (k is the NP-witness). So, if P = NP, then every semantically-secure encryption is broken. Is it really true?
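That k is an NP-witness just means that membership of (m,c) in L can be verified in polynomial time once k is given: re-encrypt and compare. A toy sketch with a deterministic XOR "encryption" (all names are illustrative only):

```python
def enc(k, m):
    # Toy deterministic encryption for illustration: XOR with the key.
    return bytes(a ^ b for a, b in zip(k, m))

def verify_witness(k, m, c):
    # Polynomial-time check of the NP-witness k for "(m, c) is in L".
    return enc(k, m) == c

k = b"\x01\x02\x03"
m = b"abc"
c = enc(k, m)
print(verify_witness(k, m, c))   # True
```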

"If P = NP, then semantically-secure encryption is broken." Is it 100% true? Not really... This is because even if P = NP we do not know what the constants are. Maybe P = NP in a very "inefficient way"...

In any case, to prove the security of a cryptographic scheme we would need to show a lower bound on the computational complexity of some problem. In the "asymptotic setting" that would mean at least showing that P ≠ NP. Does the implication in the other direction hold? (That is: does P ≠ NP imply anything for cryptography?) No! (At least as far as we know.) Intuitively: because NP is a notion from "worst-case complexity", while cryptography concerns "average-case complexity". Therefore proving that an encryption scheme is secure is probably much harder than proving that P ≠ NP.

What can we prove? We can prove conditional results. That is, we can show theorems of the type: "Suppose that some scheme Y is secure; then scheme X is secure." or "Suppose that some computational assumption A holds; then scheme X is secure."

Research program in cryptography: base the security of cryptographic schemes on a small number of well-specified "computational assumptions". In "assumption A holds ⇒ scheme X is secure", the assumption is the part we have to "believe"; the rest is provable. Examples of A: the "decisional Diffie-Hellman assumption", the "strong RSA assumption".

Example. We are now going to show an example of such reasoning: suppose that G is a "cryptographic pseudorandom generator"; then we can construct a secure encryption scheme.

Pseudorandom generators: a PRG G stretches a short random seed s into a much longer string G(s) that "looks random".

If we use a "normal" PRG, this idea doesn't work (exercise). It works only with cryptographic PRGs.
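A sketch of the scheme Enc(k,m) = G(k) xor m. As a stand-in for a cryptographic PRG, the code stretches the seed with SHA-256 in counter mode; this is only an illustrative assumption, not a vetted stream cipher:

```python
import hashlib

def G(seed: bytes, outlen: int) -> bytes:
    # Stand-in PRG: stretch the seed with SHA-256 in counter mode.
    out = b""
    counter = 0
    while len(out) < outlen:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:outlen]

def enc(key, m):
    # Enc(k, m) = G(k) XOR m
    return bytes(a ^ b for a, b in zip(G(key, len(m)), m))

dec = enc   # XOR with the same pad is its own inverse

key = b"short seed"
c = enc(key, b"a message longer than the key")
print(dec(key, c))
```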

“Looks random” What does it mean? Non-cryptographic applications: should pass some statistical tests. Cryptography: should pass all polynomial-time tests.

Cryptographic PRG: a polynomial-time distinguisher D receives either a truly random string R or G(S) (where S is random) and should not be able to distinguish them. D outputs 0 if it thinks the input is R, and 1 if it thinks it is G(S).

Constructions. There exist constructions of cryptographic pseudorandom generators that are conjectured to be secure. Some of them are extremely efficient and widely used in practice. They are called "stream ciphers" (we will discuss them later).

Theorem: if G is a cryptographic PRG, then the encryption scheme constructed before is semantically secure (i.e., it has indistinguishable encryptions). In short: cryptographic PRGs give computationally-secure encryption. Proof (sketch): suppose that it is not secure. Then there exists an adversary that wins the "guessing game" with probability 1/2 + δ(n), where δ(n) is not negligible.

The distinguisher D, on input a string x (either random or equal to G(S)), simulates the guessing game:
1. the adversary chooses m0, m1 and sends them to D
2. D picks a random bit b ∈ {0,1} and computes c := x xor m_b
3. the adversary receives c and has to guess b.
If the adversary guessed b correctly, then D outputs 1: "x is pseudorandom". Otherwise D outputs 0: "x is random".

If x is a random string R, the adversary guesses b correctly with probability 1/2, so D outputs 1 with probability 1/2. If x = G(S), the adversary guesses b correctly with probability 1/2 + δ(n), so D outputs 1 with probability 1/2 + δ(n). Hence D distinguishes G(S) from random with advantage δ(n), which is not negligible. This is a contradiction. QED
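The reduction can be sketched in code. To make the effect observable, the toy "G" below is a deliberately broken generator (all zeros) against which a simple adversary wins, so the derived distinguisher D visibly separates the two cases (all names are illustrative, not from the lecture):

```python
import os
import random

def make_distinguisher(adversary):
    # Build a PRG distinguisher D from an adversary for the encryption game.
    def D(x):
        # x is either truly random or an output of "G"; use it as the pad.
        m0, m1 = adversary.choose_messages(len(x))
        b = random.randrange(2)
        c = bytes(p ^ m for p, m in zip(x, m1 if b else m0))
        # Output 1 ("pseudorandom") iff the adversary guessed b correctly.
        return 1 if adversary.guess(c) == b else 0
    return D

class A:
    # Toy adversary that wins whenever the pad is the all-zero string
    # (modelling a completely broken "PRG" G(s) = 000...0).
    def choose_messages(self, n):
        return b"\x00" * n, b"\xff" * n
    def guess(self, c):
        # With a zero pad, c equals m_b, so counting 1-bits recovers b.
        ones = sum(bin(byte).count("1") for byte in c)
        return 1 if ones > 4 * len(c) else 0

D = make_distinguisher(A())
pseudo = sum(D(b"\x00" * 16) for _ in range(500)) / 500   # "G" outputs
rand = sum(D(os.urandom(16)) for _ in range(500)) / 500   # random strings
print(pseudo, rand)   # pseudo is 1.0 here; rand is close to 0.5
```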

Moral To construct secure encryption it suffices to construct a secure PRG. cryptographic PRGs semantically-secure encryption

Outlook. Cryptography splits into two branches:
- "information-theoretic" (also called "unconditional"): the one-time pad, quantum cryptography, "the satellite scenario", ...
- "computationally-secure": based on two assumptions: 1. some problems are computationally difficult; 2. our understanding of what "computational difficulty" means is correct.