Cryptography and Authentication A.J. Han Vinck Essen, 2008



Cryptographic model (diagram): a sender and a receiver communicate in the presence of an attacker. For secrecy, the sender encrypts M and the receiver decrypts M, while the attacker tries to read M or find the key. For authentication, the sender signs and the receiver tests validity, while the attacker tries to modify or generate messages.

General (classical) communication model (diagram): the source emits M; the encrypter produces the cipher C using the key K; the decrypter recovers M for the destination using the same key K, delivered over a secure key channel. The analyst observes C and produces an estimate M'.

Ciphers providing no information: Shannon (1949) perfect secrecy condition. The probability distribution of M equals the distribution of M given C, Prob(M) = Prob(M|C), and thus H(M|C) = H(M): observing the cipher gives no help in guessing the message.
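Shannon's condition can be checked numerically. The snippet below is a small illustration (not from the slides): a one-time pad on 2-bit blocks with a uniform key, for which P(M|C) = P(M) holds for every cipher value. The message distribution `p_m` is an arbitrary choice for the demo.

```python
from itertools import product
from collections import defaultdict

msgs = list(range(4))                    # 2-bit messages
keys = list(range(4))                    # 2-bit keys, uniform: P(K) = 1/4
p_m = {0: 0.5, 1: 0.2, 2: 0.2, 3: 0.1}   # arbitrary message distribution

joint = defaultdict(float)               # P(M, C)
for m, k in product(msgs, keys):
    c = m ^ k                            # one-time pad: C = M xor K
    joint[(m, c)] += p_m[m] * 0.25

p_c = defaultdict(float)                 # marginal P(C)
for (m, c), p in joint.items():
    p_c[c] += p

for (m, c), p in joint.items():
    # Shannon's perfect secrecy condition: P(M|C) = P(M)
    assert abs(p / p_c[c] - p_m[m]) < 1e-12
print("P(M|C) = P(M) for all message/cipher pairs")
```

With a non-uniform or reused key the assertion fails, which is exactly the loss of perfect secrecy the following slides quantify.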

Perfect secrecy condition. Furthermore, perfect secrecy requires H(M) <= H(K):

H(M|C) <= H(M,K|C) = H(K|C) + H(M|C,K) = H(K|C) <= H(K),

since C and K together determine M, so H(M|C,K) = 0. With perfect secrecy, H(M) = H(M|C) <= H(K).

Imperfect secrecy. How much ciphertext do we need before the key-message pair that fits the observed cipher becomes unique? The minimum amount is called the unicity distance.

Imperfect secrecy. Suppose we observe a piece of ciphertext C^L of length L. The key K (entropy H(K)) and the message M^L (entropy H(M^L)) determine C^L, and H(C^L) <= L log2|C|. Key equivocation: H(K|C^L) = H(K, C^L) - H(C^L).

Question: when is H(K|C^L) = 0, i.e., when does (K, M^L) follow uniquely from C^L?

H(K|C^L) = H(K) + H(C^L|K) - H(C^L) = H(K) + H(M^L) - H(C^L),

since, given K, the cipher C^L and the message M^L determine each other. Using H(C^L) <= L log2|C| and H(M^L) ~ L H_S(M), where H_S(M) is the normalized entropy per output symbol:

H(K|C^L) >= H(K) + L [H_S(M) - log2|C|], which equals 0 for L = H(K) / [log2|C| - H_S(M)].

Define U as the least value of L such that H(K|C^L) = 0. Hence: U >= H(K) / [log2|C| - H_S(M)].
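The unicity distance estimate above is easy to evaluate. The helper below is a minimal sketch (function name is mine), applied to the two examples on the later slides: a substitution cipher over English and DES over ASCII.

```python
import math

# U ~ H(K) / (log2|C| - H_S(M)): key entropy divided by the
# per-symbol redundancy of the source in the cipher alphabet.
def unicity_distance(h_key_bits, cipher_alphabet_size, h_source_per_symbol):
    redundancy = math.log2(cipher_alphabet_size) - h_source_per_symbol
    return h_key_bits / redundancy

# Substitution cipher over English: H(K) = log2(26!), H_S(M) ~ 2 bits/letter
u_sub = unicity_distance(math.log2(math.factorial(26)), 26, 2.0)
# DES: 56-bit key, 8-bit ASCII symbols, H_S(M) ~ 2 bits/symbol
u_des = unicity_distance(56, 256, 2.0)
print(round(u_sub), round(u_des))   # about 33 and 9 symbols
```

The first value matches the "~32 symbols" on the examples slide up to rounding of H_S(M).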

Conclusion: make H_S(M) as large as possible: USE DATA REDUCTION!! (Graph: the key equivocation H(K|C^L) starts at H(K) and decreases with L, reaching 0 at the unicity point U ~ H(K) / [log2|C| - H_S(M)].)

Examples: U >= H(K) / [log2|C| - H_S(M)]. Substitution cipher: H(K) = log2(26!) ~ 88 bits; English: H_S(M) ~ 2 bits/symbol, |M| = |C| = 26, so U ~ 32 symbols. DES: U ~ 56 / (8 - 2) ~ 9 ASCII symbols.

Examples: U >= H(K) / [log2|C| - H_S(M)]. Permutation cipher with period 26: H(K) = log2(26!); English: H_S(M) ~ 2, |M| = |C| = 26, so again U ~ 32 symbols. Vigenere: key length 80, U ~ 13 symbols (check!).

Plaintext-ciphertext attack. Expanding H(K, M^L, C^L) in two ways:

H(K, M^L, C^L) = H(K|M^L, C^L) + H(C^L|M^L) + H(M^L) = H(C^L|K, M^L) + H(K|M^L) + H(M^L).

Since C^L follows from (K, M^L), H(C^L|K, M^L) = 0, and since K is independent of M^L, H(K|M^L) = H(K). Thus H(K|M^L, C^L) = H(K) - H(C^L|M^L). With H(C^L|M^L) <= L log2|C| we get U >= H(K) / log2|C|.

Wiretapping model (diagram): the sender transmits X^n to the receiver over a noiseless channel; the wiretapper observes Z^n through a noisy channel (BSC with crossover probability p). To send one secret bit S in n binary digits: S = 0 --> X^n of even weight; S = 1 --> X^n of odd weight. The wiretapper errs whenever Z^n contains an odd number of errors: Pe = P(Z^n has an odd # of errors) = 1 - (1/2)(1 + (1 - 2p)^n).

Wiretapping. Pe = 1 - (1/2)(1 + (1 - 2p)^n). Result: for p --> 1/2, Pe --> 1/2 and H(S|Z^n) --> 1; for p --> 0, Pe ~ np and H(S|Z^n) ~ h(np).
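The closed form for Pe can be cross-checked by summing the probability of every odd-weight error pattern directly. A small verification sketch (function names are mine):

```python
from itertools import product

# Closed form from the slide: P(odd # of flips in n uses of a BSC(p))
def pe_closed_form(p, n):
    return 0.5 * (1.0 - (1.0 - 2.0 * p) ** n)

# Brute force: sum P(pattern) over all error patterns of odd weight
def pe_brute_force(p, n):
    total = 0.0
    for pattern in product([0, 1], repeat=n):
        w = sum(pattern)
        if w % 2 == 1:
            total += (p ** w) * ((1 - p) ** (n - w))
    return total

p, n = 0.1, 8
assert abs(pe_closed_form(p, n) - pe_brute_force(p, n)) < 1e-12
print(round(pe_closed_form(p, n), 4))   # 0.4161
```

At p = 1/2 the closed form gives Pe = 1/2 for any n, matching the slide's limiting case.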

Wiretapping, general strategy. Encoder: use a rate R = k/n error-correcting code C. Carrier: c from the 2^k codewords; message: m from the 2^(nh(p)) vectors that are correctable noise patterns. Select c at random and transmit c + m (XOR). Note: 2^k * 2^(nh(p)) <= 2^n, so k/n <= 1 - h(p).

Communication sender-receiver: transmitted = received = c + m. First decode the codeword c, then calculate m = (c + m) + c.
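The carrier/message strategy of this and the previous slide can be sketched in code. This is my own illustration (not from the slides), using a [7,4] Hamming code as the error-correcting code C: the 3-bit secret message selects a correctable (weight <= 1) noise pattern via its syndrome, a random codeword is the carrier, and the receiver strips the carrier by computing the syndrome of the received word. Helper names are mine.

```python
import random

n = 7  # [7,4] Hamming code: columns of H are the labels 1..7 in binary

def syndrome(x):
    # XOR of the column labels 1..7 at the positions where x has a 1
    s = 0
    for i in range(n):
        if (x >> i) & 1:
            s ^= i + 1
    return s

# the 2^k = 16 codewords are exactly the vectors with zero syndrome
codewords = [x for x in range(2 ** n) if syndrome(x) == 0]

def encode(m):
    # m in 0..7 --> correctable pattern e_m (weight <= 1) with syndrome m
    e = 0 if m == 0 else 1 << (m - 1)
    c = random.choice(codewords)      # carrier codeword, chosen at random
    return c ^ e                      # transmit c + m as on the slide

def decode(x):
    # syndrome(c ^ e) = syndrome(e) = m: the carrier drops out
    return syndrome(x)

assert all(decode(encode(m)) == m for m in range(8))
print("all 8 messages recovered through random carriers")
```

Here k/n = 4/7 and the message carries n - k = 3 bits per block, matching the rate trade-off 2^k * (message space) <= 2^n noted above.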

Wiretapper: receives z = c + m + n', where n' is the channel noise. First decode c, which is possible when m + n' is a decodable noise pattern; then calculate (c + m + n') + c = m + n'. The estimate m' = m + n' is one of 2^(nh(p')) candidates, since the number of noise sequences n' is |n'| ~ 2^(nh(p')).

Wiretapping, general strategy. Result: information rate R = h(p). For small p', c is decodable and H(S^k|Z^n) = nh(p'); for p' --> p, H(S^k|Z^n) --> nh(p). (Graph: H(S^k|Z^n) grows with p' up to the value nh(p).)

Wiretapping, general strategy (picture): within the space of 2^n vectors there are 2^k codewords, each surrounded by a decoding sphere of volume 2^(nh(p)); the wiretapper's residual uncertainty corresponds to a sphere of volume 2^(nh(p')).

Authentication. Encryption table: message X and key K determine a unique cipher Y: (X, K) --> Y.

Authentication: impersonation. Keys and messages are uniform: P(key = i) = 1/4, P(message = i) = 1/2. Cipher table (rows = keys, columns = messages):

           message 0   message 1
  key 00:     00          10
  key 01:     01          00
  key 10:     11          01
  key 11:     10          11

An attacker who selects a cipher y at random has Pi(y = correct) = 1/2, where Pi is the probability that an injected cipher is valid.
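The acceptance probability in this table can be verified by brute force. A small sketch (names are mine): for each injected cipher y, count the keys under which y is a valid cipher.

```python
# Cipher table from the slide: table[key] = [cipher for X=0, cipher for X=1]
table = {
    "00": ["00", "10"],
    "01": ["01", "00"],
    "10": ["11", "01"],
    "11": ["10", "11"],
}

# An injected cipher y is accepted iff y is valid under the (unknown,
# uniformly distributed) key, i.e. iff y appears in that key's row.
def p_accept(y):
    return sum(y in row for row in table.values()) / len(table)

probs = {y: p_accept(y) for y in ["00", "01", "10", "11"]}
assert all(p == 0.5 for p in probs.values())
print(probs)
```

Every cipher appears in exactly two of the four rows, so Pi = 1/2 no matter which cipher the impersonator injects.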

Authentication: bound. Let |X| = # messages, |K| = # keys, |Y| = # ciphers. Then Pi = prob(random cipher = valid) >= |X|/|Y|: the probability of hitting one of the |X| valid entries in the specific row selected by the key.

Cont'd. Since (Y, K) determine X, we have H(X) = H(Y|K). An improved bound (Simmons) gives: Pi >= 2^H(X) / 2^H(Y) = 2^(H(Y|K) - H(Y)) = 2^(-I(Y;K)).

Cont'd. Pi >= 2^(-I(Y;K)) = 2^(H(K|Y) - H(K)). For a low probability of success we need H(K|Y) = 0; for perfect secrecy we need H(K|Y) = H(K). Contradiction!

Cont'd. P(key = 0) = P(key = 1) = 1/2; P(X = 0) = P(X = 1) = 1/2. Two cipher tables (rows = keys, columns = messages X = 0, 1):

  Table A          Table B
  key 0: 00 01     key 0: 00 01
  key 1: 10 11     key 1: 01 00

Table A: prob. of success = 1/2, H(K|Y) = 0: no secrecy. Table B: prob. of success = 1, H(K|Y) = 1: perfect secrecy.

Authentication: impersonation example. X in {0, 1} with P(X=0) = P(X=1) = 1/2; K in {0, 1} with P(K=0) = P(K=1) = 1/2. Cipher table (rows = keys, columns = messages):

  key 0: 0 1
  key 1: 1 2

H(K) = 1; H(K|Y) = 1/2. Pi = 1 (always send cipher 1, which is valid under both keys), consistent with the bound Pi >= 2^(H(Y|K) - H(Y)) = 2^(1 - 1.5) = 2^(-0.5) ~ 0.7.
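The Simmons bound for this example can be checked numerically. A minimal sketch (names are mine): compute H(Y), H(Y|K), the best impersonation probability, and compare.

```python
import math

# Cipher table from the slide: table[key] = [cipher for X=0, cipher for X=1]
table = {0: [0, 1], 1: [1, 2]}

# joint distribution of (K, Y); keys and messages uniform, so each 1/4
p_ky = {(k, y): 0.25 for k, row in table.items() for y in row}
p_y = {}
for (k, y), p in p_ky.items():
    p_y[y] = p_y.get(y, 0.0) + p

h_y = -sum(p * math.log2(p) for p in p_y.values())   # = 1.5 bits
h_y_given_k = 1.0   # each key row has two equally likely ciphers

# best impersonation: inject the cipher valid under the most keys
pi = max(sum(y in row for row in table.values()) / 2 for y in p_y)

bound = 2 ** (h_y_given_k - h_y)                     # Simmons bound
assert pi >= bound
print(pi, round(bound, 3))   # 1.0 0.707
```

Cipher 1 is valid under both keys, so injecting it always succeeds (Pi = 1), comfortably above the bound 2^(-0.5) ~ 0.707.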

Authentication: substitution. Cipher table (rows = keys 0-3, columns = messages 0, 1):

  key 0: 0 2
  key 1: 1 3
  key 2: 0 3
  key 3: 1 2

Active wiretapping: replace an observed cipher by another cipher. Example: observe cipher 0 and replace it by 3. Probability of success = 1/2 (the observed 0 means the key is 0 or 2; the forgery 3 is accepted only if the key = 2).
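The substitution success probability follows from the key posterior after observing a cipher. A brute-force sketch of the example above (names are mine):

```python
# Cipher table from the slide: table[key] = [cipher for X=0, cipher for X=1]
table = {0: [0, 2], 1: [1, 3], 2: [0, 3], 3: [1, 2]}

def p_substitution_success(observed, injected):
    # keys consistent with the observed cipher (uniform posterior over them)
    candidates = [k for k, row in table.items() if observed in row]
    # the forged cipher is accepted iff it is valid under the true key
    good = [k for k in candidates if injected in table[k]]
    return len(good) / len(candidates)

# observe 0 (key is 0 or 2), inject 3 (valid only under key 2) -> 1/2
print(p_substitution_success(0, 3))
```

The same function reproduces the Ps values of the three tables on the next slide.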

Authentication: substitution examples. H(K) = 2; H(K|Y) = 1; Pi >= 1/2. Three cipher tables (rows = keys 0-3, columns = messages 0, 1), with Ps = probability that a substitution is successful:

  Table 1        Table 2        Table 3
  key 0: 0 2     key 0: 0 3     key 0: 0 2
  key 1: 1 3     key 1: 1 2     key 1: 1 0
  key 2: 0 3     key 2: 2 1     key 2: 3 1
  key 3: 1 2     key 3: 3 0     key 3: 2 3

  Ps = 1/2       Ps = 1         Ps = 1/2
  H(X|Y) = 0     H(X|Y) = 1     H(X|Y) = 1