Lecture 1 Introduction to Cryptography Stefan Dziembowski University of Rome La Sapienza BiSS 2009 Bertinoro International Spring School 2-6 March 2009.


Lecture 1 Introduction to Cryptography Stefan Dziembowski University of Rome La Sapienza BiSS 2009 Bertinoro International Spring School 2-6 March 2009 Modern Cryptography

Plan 1. Introduction 2. Historical ciphers 3. Information-theoretic security 4. Computational security

Cryptography In the past: the art of encrypting messages (mostly for military applications). Now: the science of securing digital communication and transactions (encryption, authentication, digital signatures, e-cash, auctions, etc.).

Terminology Cryptology = cryptography (constructing secure systems) + cryptanalysis (breaking the systems). This convention is slightly artificial and often ignored. Common usage: “cryptanalysis of X” = “breaking X”.

Cryptography – general picture. Plan of the course:
1. private key: encryption, authentication
2. public key: encryption, signatures
3. advanced cryptographic protocols

Encryption schemes (a very general picture) Encryption scheme (cipher) = encryption & decryption. Alice encrypts a plaintext m into a ciphertext c and sends it to Bob, who decrypts c back to m. Eve, who observes c, should not learn m. In the past the plaintext was a text in natural language; now it is a string of bits.

Art vs. science In the past: lack of precise definitions, ad-hoc design, usually insecure. Nowadays: formal definitions, systematic design, very secure constructions.

Provable security We want to construct schemes that are provably secure. But... why do we want to do it? how to define it? and is it possible to achieve it?

Provable security – the motivation In many areas of computer science formal proofs are not essential. For example, instead of proving that an algorithm is efficient, we can just simulate it on a “typical input”. In cryptography this is not true: there cannot exist an experimental proof that a scheme is secure. Why? Because a notion of a “typical adversary” does not make sense. Security definitions are useful also because they allow us to construct schemes in a modular way...

Kerckhoffs' principle Auguste Kerckhoffs (1883): “The enemy knows the system.” The cipher should remain secure even if the adversary knows the specification of the cipher. The only thing that is secret is a short key k, which is usually chosen uniformly at random.

A more refined picture Alice encrypts the plaintext m under the key k and sends the ciphertext c to Bob, who decrypts it using the same k. Eve doesn’t know k and should not learn m. Let us assume that k is uniformly random. (Of course Bob can use the same method to send messages to Alice; that’s why it’s called the symmetric setting.)

Kerckhoffs' principle – the motivation 1. In commercial products it is unrealistic to assume that the design details remain secret (reverse-engineering!). 2. Short keys are easier to protect, generate and replace. 3. The design details can be discussed and analyzed in public. Not respecting this principle = “security by obscurity”.

A mathematical view K – key space, M – plaintext space, C – ciphertext space. An encryption scheme is a pair (Enc,Dec), where Enc : K × M → C is an encryption algorithm and Dec : K × C → M is a decryption algorithm. We will sometimes write Enc_k(m) and Dec_k(c) instead of Enc(k,m) and Dec(k,c). Correctness: for every k and m we should have Dec_k(Enc_k(m)) = m.

Plan 1. Introduction 2. Historical ciphers 3. Information-theoretic security 4. Computational security

Shift cipher M = words over alphabet {A,...,Z} ≈ {0,...,25} K = {0,...,25} Enc_k(m_0,...,m_n) = (m_0 + k mod 26,..., m_n + k mod 26) Dec_k(c_0,...,c_n) = (c_0 - k mod 26,..., c_n - k mod 26) Caesar: k = 3
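The scheme above can be sketched in a few lines of code (an illustrative sketch; the function names are our own, and letters are reduced modulo 26, the alphabet size):

```python
# Shift cipher over the alphabet {0,...,25} (A = 0, ..., Z = 25).

def shift_enc(k, m):
    """Enc_k(m_0,...,m_n) = (m_0 + k mod 26, ..., m_n + k mod 26)."""
    return [(x + k) % 26 for x in m]

def shift_dec(k, c):
    """Dec_k(c_0,...,c_n) = (c_0 - k mod 26, ..., c_n - k mod 26)."""
    return [(x - k) % 26 for x in c]

msg = [ord(ch) - ord('A') for ch in "ATTACK"]
ct = shift_enc(3, msg)                          # Caesar: k = 3
assert shift_dec(3, ct) == msg                  # correctness
print(''.join(chr(x + ord('A')) for x in ct))   # DWWDFN
```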

Security of the shift cipher How to break the shift cipher? Check all possible keys! Let c be a ciphertext. For every k Є {0,...,25} check if Dec k (c) “makes sense”. Most probably only one such k exists. Thus Dec k (c) is the message. This is called a brute force attack. Moral: the key space needs to be large!
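The brute-force attack can be sketched as follows. Here the “makes sense” test is a toy stand-in (membership in a small hypothetical word list); a real attack would score candidates with a dictionary or letter frequencies:

```python
# Brute-force attack on the shift cipher: try all 26 keys and keep the
# decryption that "makes sense".

WORDS = {"ATTACK", "RETREAT", "HELLO"}   # hypothetical word list

def to_text(v):
    return ''.join(chr(x + ord('A')) for x in v)

def brute_force(c):
    for k in range(26):
        candidate = to_text([(x - k) % 26 for x in c])
        if candidate in WORDS:
            return k, candidate
    return None

ct = [(ord(ch) - ord('A') + 3) % 26 for ch in "ATTACK"]  # Enc with k = 3
print(brute_force(ct))  # (3, 'ATTACK')
```

Since |K| = 26, the loop is instant; this is why the key space needs to be large.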

Substitution cipher M = words over alphabet {A,...,Z} ≈ {0,...,25} K = the set of permutations π of {0,...,25} Enc_π(m_0,...,m_n) = (π(m_0),..., π(m_n)) Dec_π(c_0,...,c_n) = (π⁻¹(c_0),..., π⁻¹(c_n))
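A minimal sketch of the substitution cipher, with the key π represented as a list of 26 values:

```python
import random

# Substitution cipher: the key is a random permutation pi of {0,...,25}.

def keygen(rng=random):
    pi = list(range(26))
    rng.shuffle(pi)
    return pi

def subst_enc(pi, m):
    return [pi[x] for x in m]

def subst_dec(pi, c):
    inv = [0] * 26
    for i, y in enumerate(pi):
        inv[y] = i            # inv = pi^{-1}
    return [inv[y] for y in c]

pi = keygen()
msg = [7, 4, 11, 11, 14]      # "HELLO"
assert subst_dec(pi, subst_enc(pi, msg)) == msg
```

Note the much larger key space: |K| = 26! ≈ 4·10²⁶, so a brute-force attack is hopeless, yet the cipher is still weak, as the next slide shows.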

How to break the substitution cipher? Use statistical patterns of the language, for example frequency tables. Texts of 50 characters can usually be broken this way.
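The idea behind frequency analysis, sketched under an assumption we add here (an approximate English letter-frequency order): a substitution cipher only relabels letters, so the ciphertext histogram can be matched against the known language histogram. This gives a starting guess, not a complete attack:

```python
from collections import Counter

# Guess a mapping by pairing the most frequent ciphertext letters with
# the most frequent English letters.

ENGLISH_ORDER = "ETAOINSHRDLCUMWFGYPBVKJXQZ"  # approximate frequency order

def guess_mapping(ciphertext):
    freq = Counter(ciphertext)
    ranked = [ch for ch, _ in freq.most_common()]
    return {c: p for c, p in zip(ranked, ENGLISH_ORDER)}

print(guess_mapping("XLIIVIRGLIWXIV"))  # most frequent letter -> 'E', etc.
```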

Other famous historical ciphers Vigenère cipher: Blaise de Vigenère; an earlier polyalphabetic idea is due to Leon Battista Alberti (1404 – 1472). Enigma: broken with contributions of Marian Rejewski and Alan Turing.

In the past ciphers were designed in an ad-hoc manner. In contemporary cryptography ciphers are designed in a systematic way. Main goals: 1. define security 2. construct schemes that are “provably secure”

Plan 1. Introduction 2. Historical ciphers 3. Information-theoretic security 4. Computational security

How to define security? Defining “security of an encryption scheme” is not trivial. Consider the following experiment (m – a message): 1. the key K is chosen uniformly at random 2. C := Enc_K(m) is given to the adversary

Idea 1 An idea: “The adversary should not be able to compute K.” A problem: the encryption scheme that “doesn’t encrypt”, Enc_K(m) = m, satisfies this definition!

Idea 2 An idea: “The adversary should not be able to compute m.” A problem: what if the adversary can compute, e.g., the first half of m?

Idea 3 An idea: “The adversary should not learn any information about m.” A problem: he may already have some a priori information about m! For example he may know that m is a sentence in English...

Idea 4 An idea: “The adversary should not learn any additional information about m.” This makes much more sense. But how to formalize it?

Notation We will use the language of probability theory. If A : Ω → A is a random variable, then P_A : A → [0,1] denotes the distribution of A: P_A(a) = P(A = a). For two distributions P_A and P_B we write P_A = P_B if they are equal as functions; this is the same as saying “for every a: P(A = a) = P(B = a)”. If X is an event, then P_{A|X} denotes the distribution of A conditioned on X: P_{A|X}(a) = P(A = a | X).

Notation Two (discrete) random variables A : Ω → A and B : Ω → B are independent if for every a and b: P(A = a and B = b) = P(A = a) · P(B = b).

Independence: equivalent formulations (technical assumptions: the conditioning events have nonzero probability, i.e. P(B = b) ≠ 0, P(B = b_0) ≠ 0, P(B = b_1) ≠ 0)
1. for every a and b: P(A = a and B = b) = P(A = a) · P(B = b)
2. for every a and b: P(A = a) = P(A = a | B = b), where P(A = a | B = b) equals P(A = a and B = b) / P(B = b)
3. for every a and b_0, b_1: P(A = a | B = b_0) = P(A = a | B = b_1), i.e. for every b_0, b_1: P_{A|B=b_0} = P_{A|B=b_1}

More notation If A : Ω → A is a random variable and f : A → B is a function, then f(A) denotes the random variable Ω → B defined as f(A)(ω) = f(A(ω)). For example, if A has a uniform distribution over {1,2,3,4,5}, then A² has a uniform distribution over {1,4,9,16,25}.

Notation If A is a set then Y ← A means that Y is chosen uniformly at random from the set A.

How to formalize “Idea 4” (“The adversary should not learn any additional information about m”)? An encryption scheme is perfectly secret (also called: information-theoretically secret) if for every random variable M and every m Є M and c Є C such that P(C = c) > 0: P(M = m) = P(M = m | Enc(K,M) = c). Equivalently: M and Enc(K,M) are independent.

Equivalently, the following conditions are the same: (1) for every M: M and Enc(K,M) are independent (the intuitive formulation); (2) for every m_0 and m_1: P_{Enc(K,M) | M = m_0} = P_{Enc(K,M) | M = m_1}; (3) for every m_0 and m_1: P_{Enc(K,m_0)} = P_{Enc(K,m_1)}.

A perfectly secret scheme: one-time pad (Vernam’s cipher; Gilbert Vernam, 1890–1960) t – a parameter, K = M = {0,1}^t Enc_k(m) = k xor m Dec_k(c) = k xor c (component-wise xor) Correctness is trivial: Dec_k(Enc_k(m)) = k xor (k xor m) = m
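A sketch of the one-time pad, representing bit strings of length t as Python integers (xor of bit strings = integer xor):

```python
import secrets

t = 128
k = secrets.randbits(t)        # uniformly random key, used ONCE

def otp_enc(k, m):
    return k ^ m               # Enc_k(m) = k xor m

def otp_dec(k, c):
    return k ^ c               # Dec_k(c) = k xor c

m = secrets.randbits(t)
c = otp_enc(k, m)
assert otp_dec(k, c) == m      # k xor (k xor m) = m
```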

Perfect secrecy of the one-time pad Perfect secrecy of the one-time pad is also trivial. This is because for every m the distribution P_{Enc(K,m)} is uniform (and hence does not depend on m): for every c, P(Enc(K,m) = c) = P(K = m xor c) = 2^(-t).
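For a small parameter this can be checked exhaustively. The sketch below verifies, for t = 3, that for every fixed m the map k ↦ k xor m hits every ciphertext exactly once as k ranges over {0,1}^t, so P(Enc(K,m) = c) = 2^(-t) independently of m:

```python
t = 3
keys = list(range(2 ** t))          # all of {0,1}^3 as integers 0..7

for m in range(2 ** t):
    ciphertexts = sorted(k ^ m for k in keys)
    assert ciphertexts == keys      # each c appears exactly once: uniform

print("P(Enc(K,m) = c) = 1/8 for every m and c")
```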

Observation The one-time pad can be generalized as follows. Let (G,+) be a group and let K = M = C = G. The following is a perfectly secret encryption scheme: Enc(k,m) = m + k Dec(k,c) = c - k
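A sketch of this generalization with (G,+) = (Z_26, + mod 26) applied letter-wise: for n = 2 it reduces to the xor-based one-time pad, and for n = 26 it is a “one-time” shift cipher with a fresh key element per letter:

```python
import secrets

n = 26  # the group Z_n

def group_enc(k, m):
    return [(mi + ki) % n for mi, ki in zip(m, k)]   # Enc(k,m) = m + k

def group_dec(k, c):
    return [(ci - ki) % n for ci, ki in zip(c, k)]   # Dec(k,c) = c - k

msg = [7, 4, 11, 11, 14]                     # "HELLO"
key = [secrets.randbelow(n) for _ in msg]    # fresh uniform key per letter
assert group_dec(key, group_enc(key, msg)) == msg
```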

Why is the one-time pad not practical? 1. The key has to be as long as the message. 2. The key cannot be reused. This is because: Enc_k(m_0) xor Enc_k(m_1) = (k xor m_0) xor (k xor m_1) = m_0 xor m_1
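The key-reuse problem from point 2 can be demonstrated directly: xoring two ciphertexts produced under the same pad cancels the key and leaks the xor of the plaintexts, without Eve ever learning k:

```python
import secrets

t = 64
k = secrets.randbits(t)                       # the SAME pad used twice
m0, m1 = secrets.randbits(t), secrets.randbits(t)

c0 = k ^ m0                                   # Enc_k(m0)
c1 = k ^ m1                                   # Enc_k(m1)

# (k xor m0) xor (k xor m1) = m0 xor m1
assert c0 ^ c1 == m0 ^ m1
```

For natural-language plaintexts, m0 xor m1 typically reveals large parts of both messages (this is exactly what made the Venona decryptions possible).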

One-time pad is optimal in the class of perfectly secret schemes Theorem (Shannon 1949) In every perfectly secret encryption scheme Enc : K × M → C, Dec : K × C → M we have |K| ≥ |M|. Proof We always have |C| ≥ |M|, because for every k the map Enc_k : M → C is an injection (otherwise we wouldn’t be able to decrypt). Perfect secrecy implies that the distribution of Enc(K,m) does not depend on m; hence, without loss of generality, we can assume that (for each m) C = {Enc(k,m)}_{k Є K}, and therefore |K| ≥ |C|. Combining the two inequalities: |K| ≥ |C| ≥ |M|.

Practicality? Generally, the one-time pad is not very practical, since: the key has to be as long as the total length of the encrypted messages, and it is hard to generate truly random strings. However, it is sometimes used (e.g. in military applications) because of the following advantages: perfect secrecy, and short messages can be encrypted using pencil and paper. In the 1960s the Americans and the Soviets established a hotline that was encrypted using the one-time pad (an additional advantage: they didn’t need to share their secret encryption methods). [Picture: a KGB one-time pad hidden in a walnut shell]

Venona project (1946 – 1980) The American National Security Agency decrypted Soviet messages that were transmitted in the 1940s. That was possible because the Soviets reused the keys in the one-time pad scheme. [Picture: Ethel and Julius Rosenberg]

Outlook We constructed a perfectly secret encryption scheme. Our scheme has certain drawbacks (|K| ≥ |M|), but by Shannon’s theorem this is unavoidable. Can we go home and relax? Maybe the secrecy definition is too strong?

What to do? Idea: use a model where the power of the adversary is limited. How? Classical (computationally-secure) cryptography: bound his computational power. Alternative options: quantum cryptography, the bounded-storage model, ... (not too practical).

Quantum cryptography Stephen Wiesner (1970s), Charles H. Bennett and Gilles Brassard (1984). Alice and Bob communicate over a quantum link. Quantum indeterminacy: quantum states cannot be measured without disturbing the original state. Hence Eve cannot read the bits in an unnoticeable way.

Quantum cryptography Advantage: security is based on the laws of quantum physics. Disadvantage: it needs dedicated equipment. Practicality? Currently: successful transmissions over distances of around 150 km. Commercial products are available. Warning: quantum cryptography should not be confused with quantum computing.

A satellite scenario A third party (a satellite) is broadcasting random bits. Does it help? No... (Shannon’s theorem of course also holds in this case.)

Ueli Maurer (1993): noisy channel Assumption: the data that the adversary receives is noisy (some bits get flipped because of the noise). The data that Alice and Bob receive may be even more noisy.

Bounded-storage model Another idea: bound the size of the adversary’s memory. The broadcast data is too large to fit in Eve’s memory.

Plan 1. Introduction 2. Historical ciphers 3. Information-theoretic security 4. Computational security

Computing power of the adversary In practice, the adversary always has limited computing power. Therefore, for real-life applications, it is enough if the schemes are secure against computationally-limited adversaries. How to model this?

How can this be formalized? We will use complexity theory! We required that M and Enc_K(M) are independent. How to reason about bounded computing power? It is enough to require that M and Enc_K(M) are independent “from the point of view of a computationally-limited adversary”.

Practical cryptography starts here: Eve is computationally-bounded. We will construct schemes that in principle can be broken if the adversary has huge computing power. For example, the adversary will be able to break the scheme by enumerating all possible secret keys (this is called a “brute-force attack”).

Computationally-bounded adversary But what does it mean that Eve is computationally-bounded? Ideas: 1. “She can use at most 1000 Intel Core 2 Extreme X6800 Dual Core Processors for at most 100 years...” 2. “She can buy equipment worth 1 million euro and use it for 30 years.” It’s hard to reason formally about such statements.

A better idea “The adversary has access to a Turing Machine that can make at most t steps.” More generally, we could have definitions of the type: “a system X is (t,ε)-secure if every Turing Machine that operates in time t can break it with probability at most ε.” This would be quite precise, but we would need to specify exactly what we mean by a “Turing Machine”: how many tapes does it have? how does it access these tapes (maybe “random access memory” is a more realistic model)?... Moreover, this approach often leads to ugly formulas.

What to do? Idea: t steps of a Turing Machine → “efficient computation”; ε → a value “very close to zero”. How to formalize it? Use asymptotics!

Efficiently computable? “efficiently computable” = “polynomial-time computable on a Probabilistic Turing Machine”, that is: running in time O(n^c) (for some c). Here we assume that poly-time Turing Machines are the right model for real-life computation. This is not true if a quantum computer is built...

Probabilistic Turing Machines A standard Turing Machine has some number of tapes: A probabilistic Turing Machine has an additional tape with random bits.

Some notation If M is a Turing Machine then M(X) is a random variable denoting the output of M, assuming that the contents of the random tape were chosen uniformly at random.

More notation Y ← M(X) means that the variable Y takes the value that M outputs on input X (assuming the random input is chosen uniformly).

Interactive Turing Machines Two machines A and B interact: A has read-only access to B’s output tape, and B has read-only access to A’s output tape.

Interactive Turing Machines Of course, we can generalize this to a group of n machines interacting with each other. (We can also model: private channels, broadcast channels, partial broadcast channels, etc.) Usually, we consider only poly-time, probabilistic machines. A group of interactive Turing Machines is sometimes called a protocol.

Very small? Formally: a function µ : N → R is negligible if for every positive integer c there exists N such that for all n > N we have |µ(n)| < n^(-c). In other words: “very small” = “negligible” = approaches 0 faster than the inverse of any polynomial.

Nice properties of these notions A sum of two polynomials is a polynomial: poly + poly = poly A product of two polynomials is a polynomial: poly * poly = poly A sum of two negligible functions is a negligible function: negl + negl = negl Moreover: A negligible function multiplied by a polynomial is negligible negl * poly = negl

Security parameter The terms “negligible” and “polynomial” make sense only if X (and the adversary) take an additional input 1^n called a security parameter. In other words: we consider an infinite sequence X(1), X(2),... of schemes. Typically, we will say that a scheme X is secure if for every polynomial-time Turing Machine M: P(M breaks the scheme X) is negligible (in n).

Example Security parameter n = the length of the secret key k; in other words, k is always a random element of {0,1}^n. The adversary can always guess k with probability 2^(-n); this probability is negligible. He can also enumerate all possible keys k in time 2^n (the “brute-force” attack); this time is exponential.

Is this the right approach? Advantages 1. All types of Turing Machines are “equivalent” up to a “polynomial reduction”; therefore we do not need to specify the details of the model. 2. The formulas get much simpler. Disadvantage Asymptotic results don’t tell us anything about the security of concrete systems. However Usually one can prove formally an asymptotic result and then argue informally that “the constants are reasonable” (and they can be calculated if one really wants).

How to change the security definition? Recall: an encryption scheme is perfectly secret if for every m_0, m_1 Є M: P_{Enc(K,m_0)} = P_{Enc(K,m_1)}. We will now require that: (1) m_0, m_1 are chosen by a poly-time adversary, and (2) no poly-time adversary can distinguish Enc(K,m_0) from Enc(K,m_1).

A game (Enc,Dec) – an encryption scheme; security parameter 1^n. 1. The adversary (a polynomial-time probabilistic Turing Machine) chooses m_0, m_1 such that |m_0| = |m_1| and sends them to the oracle. 2. The oracle selects k randomly from {0,1}^n, chooses a random bit b Є {0,1}, calculates c := Enc(k, m_b) and sends c to the adversary. 3. The adversary has to guess b. Security definition: We say that (Enc,Dec) is semantically secure if every polynomial-time adversary guesses b correctly with probability at most 1/2 + ε(n), where ε is negligible. Alternative name: (Enc,Dec) has indistinguishable encryptions. (Sometimes we will say: “is computationally secure”, if the context is clear.)
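A toy version of this game can be played exhaustively, taking the one-bit one-time pad as (Enc,Dec). Since Enc(k, m_b) = k xor m_b is uniform regardless of b, any fixed adversary strategy (here: output the ciphertext itself) guesses b correctly exactly half the time, matching the definition with ε = 0:

```python
from itertools import product

def enc(k, m):
    return k ^ m                    # one-bit one-time pad

m0, m1 = 0, 1                       # the adversary's chosen messages

def adversary_guess(c):
    return c                        # an arbitrary fixed strategy

wins, trials = 0, 0
for k, b in product([0, 1], [0, 1]):   # exhaust all keys and challenge bits
    c = enc(k, (m0, m1)[b])
    wins += (adversary_guess(c) == b)
    trials += 1

print(wins / trials)  # 0.5
```

Of course for real schemes the game is probabilistic rather than exhaustive, and the adversary is an arbitrary poly-time machine; this sketch only illustrates the structure of the experiment.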

Testing the definition Suppose the adversary can compute k from Enc(k,m). Can he win the game? YES! Suppose the adversary can compute some bit of m from Enc(k,m). Can he win the game? YES!

Multiple messages In real-life applications we need to encrypt multiple messages with one key. The adversary may learn something about the key by looking at ciphertexts c_1,...,c_t of some messages m_1,...,m_t. How are these messages chosen? Let’s say: the adversary can choose them! (Good tradition: be as pessimistic as possible.)

A chosen-plaintext attack (CPA) Security parameter 1^n. The oracle selects a random k Є {0,1}^n and a random bit b Є {0,1}. 1. The adversary chooses m’_1 and receives c_1 = Enc(k, m’_1); ... ; chooses m’_t and receives c_t = Enc(k, m’_t). 2. Challenge phase: the adversary chooses m_0, m_1 and receives c = Enc(k, m_b). 3. The interaction continues, and finally the adversary has to guess b.

CPA-security Security definition: We say that (Enc,Dec) has indistinguishable encryptions under a chosen-plaintext attack (is CPA-secure) if every randomized polynomial-time adversary guesses b correctly with probability at most 1/2 + ε(n), where ε is negligible. Observation: Every CPA-secure encryption scheme has to be randomized, or “have a state”.

CPA in real life Q: Aren’t we too pessimistic? A: No! CPA can be implemented in practice. Example: routing: the adversary can cause a known message m to be sent over a link encrypted with key k and observe the ciphertext Enc_k(m).

Is it possible to prove security? Bad news: Theorem If semantically-secure encryption exists (with |k| < |m|), then P ≠ NP. Intuition: if P = NP then the adversary can guess the key...

Proof [1/3] Let (Enc,Dec) be an encryption scheme; for simplicity suppose that Enc is deterministic. Consider the following language: L = { (c,m) : c = Enc_k(m) for some key k }. Clearly L is in NP (k is the NP-witness). Suppose P = NP; then L is poly-time decidable. The adversary plays the game as follows: he chooses random m_0, m_1 such that |m_i| = n+1; the oracle selects k randomly, chooses a random b Є {0,1} and calculates c := Enc(k, m_b); the adversary then outputs 0 if (c, m_0) Є L and 1 otherwise.

Proof [2/3] The adversary guesses b incorrectly only if b = 1 and (c, m_0) Є L, in other words: only if “there exists k’ such that Enc_k(m_1) = Enc_k’(m_0)”. What is the probability that this happens?

Proof [3/3] Consider the matrix whose rows are indexed by the messages and whose columns are indexed by the keys, with entries Enc_k(m). From the correctness of encryption, a fixed ciphertext c can appear in each column at most once. Hence the probability that c appears in a randomly chosen row is at most |K| / |M| ≤ 2^n / 2^(n+1) = 1/2. So the adversary wins with probability at least 3/4.

Moral: “If P = NP, then semantically-secure encryption is broken.” Is it 100% true? Not really... This is because even if P = NP, we do not know what the constants are. Maybe P = NP in a very “inefficient way”...

To prove security of a cryptographic scheme we need to show a lower bound on the computational complexity of some problem. In the “asymptotic setting” that would mean that we at least show that P ≠ NP. Does the implication in the other direction hold? (That is: does P ≠ NP imply anything for cryptography?) No! (at least as far as we know). Therefore proving that an encryption scheme is secure is probably much harder than proving that P ≠ NP.

What can we prove? We can prove conditional results. That is, we can show theorems of the type: “Suppose that some scheme Y is secure (or that some ‘computational assumption A’ holds); then scheme X is secure.”

Research program in cryptography Base the security of cryptographic schemes on a small number of well-specified “computational assumptions”. In “some computational assumption A holds” we have to “believe”; the rest (“then scheme X is secure”) is provable (and interesting only if it is “far from trivial”). Examples of A: the “decisional Diffie-Hellman assumption”, the “strong RSA assumption”.

© 2009 by Stefan Dziembowski. Permission to make digital or hard copies of part or all of this material is currently granted without fee provided that copies are made only for personal or classroom use, are not distributed for profit or commercial advantage, and that new copies bear this notice and the full citation.