
Slide 1: Reasonably Secure: An Analysis of the Expected Cost of Crypto-analytic Attacks through 2020
Jesse Walker, Intel Corporation
doc.: IEEE /374, Submission, June 2001

Slide 2
“Making predictions is foolish” – Bruce Schneier, Applied Cryptography, on estimating the cost to break cryptographic primitives

Slide 3: Goals
Estimate the cost of crypto-analytic attacks against the primitives on which authentication is based
Use the results to suggest requirements or guidelines for “reasonably secure” algorithms and key sizes:
– Identify possible “reasonably secure” compliance classes
– Identify a timetable for transitioning key lengths within each compliance class

Slide 4: Agenda (Background, The Lenstra-Verheul model, Results, Discussion, Call to Action, Summary)

Slide 5: Background (1)
802.11 wants “reasonable security” but has not quantified what this means
The TGi authentication discussion bogged down arguing over what this means
We need estimates of the actual cost of attacking various authentication algorithms, to help:
– Quantify the challenge
– Provide a more concrete frame of reference for the requirements discussion
– Lead us to a decision the market will accept

Slide 6: Background (2)
TGi split into 3 camps on authentication:
– Legacy RADIUS-based methods camp
– Kerberos over EAP camp
– TLS over EAP camp
Recent discussions add SRP to the mix, too
Question: How secure is each of these? Which are “reasonably secure”?

Slide 7: Background (3)
We can dismiss legacy authentication as insecure:
– RADIUS PAP, CHAP, WEP authentication
If a legacy authentication exchange is visible, it can be broken by a single machine:
– Cost to break an observed PAP exchange: 0 cycles
– Cost to break an observed WEP authentication exchange: 48 cycles
– Cost to break an observed CHAP exchange: O(2^32) cycles (3.33 seconds on a 1.2 GHz Pentium IV)
Legacy authentication doesn’t meet the functional requirements anyway:
– No mutual authentication
– No key agreement/distribution
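The O(2^32) CHAP figure assumes an offline dictionary search against a recorded exchange: one MD5 per candidate password. A minimal sketch of that search; the identifier, challenge, and wordlist below are hypothetical illustrations, not values from this submission:

```python
import hashlib

def chap_response(identifier: int, secret: bytes, challenge: bytes) -> bytes:
    # RFC 1994 CHAP: Response = MD5(Identifier || Secret || Challenge)
    return hashlib.md5(bytes([identifier]) + secret + challenge).digest()

def dictionary_attack(identifier, challenge, response, wordlist):
    # Replay the recorded exchange against every candidate password.
    for candidate in wordlist:
        if chap_response(identifier, candidate, challenge) == response:
            return candidate
    return None

# Hypothetical captured exchange and toy wordlist, for illustration only.
challenge = bytes(range(16))
observed = chap_response(0x1C, b"hunter2", challenge)
recovered = dictionary_attack(0x1C, challenge, observed, [b"password", b"hunter2"])
```

At one MD5 computation per candidate, a 3,000,000-entry dictionary costs about 3,000,000 MD5 evaluations, which is where the cycle estimate in the Backup section comes from.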

Slide 8: Background (4)
We can dismiss legacy Kerberos as insecure, too:
– Existing Kerberos implementations are based on passwords
If a legacy Kerberos exchange is visible, it can be broken by a single machine:
– Cost to break an observed legacy password-based Kerberos AS_REP packet: O(2^34) cycles (14.5 seconds on a 1.2 GHz Pentium IV)
Legacy Kerberos is not “reasonably secure” even if future Kerberos will be:
– Kerberos needs PKInit to advance to Proposed Standard and be deployed before it becomes secure
– Or it needs some other unstandardizable, out-of-band channel to distribute real keys, not passwords

Slide 9: Background (5)
Techniques that can be broken in seconds or minutes by brute-force search on a single stock CPU are not “reasonably secure”
Stock CPUs will be faster still by the time the first TGi hardware finally ships
Random observation: a single 1.2 GHz Pentium IV delivers about 3 MIP-years of instructions every day
– These machines can be networked together
– It is easy to harvest spare MIPS from un-firewalled networked machines
– And practical: a successful attack on 512-bit RSA was demonstrated in 1999 using this technique with much less powerful Pentium IIs and IIIs
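The “about 3 MIP-years per day” observation is plain unit arithmetic, assuming the CPU retires roughly one instruction per cycle:

```python
# A MIP-year is the output of a 1-MIPS machine running for one year, so a
# machine delivering M MIPS produces M / 365 MIP-years per day.
mips = 1200                      # 1.2 GHz at ~1 instruction per cycle (assumption)
mip_years_per_day = mips / 365   # ≈ 3.3
```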

Slide 10: Background (6)
PKInit, TLS, and SRP rely on public key schemes
Crypto-analytic attack cost estimates for public key schemes are notoriously difficult
But we can’t reach consensus without more tangible security estimates:
– Can’t over-provision too much, or the market will rebel
– Can’t under-provision too much, or security will be crucified in the press
How much is safe?

Slide 11: Agenda (Background, The Lenstra-Verheul model, Results, Discussion, Call to Action, Summary)

Slide 12: Approach
This submission uses the Lenstra-Verheul model to estimate the crypto-analytic costs of public key algorithms
The Lenstra-Verheul model can be found in:
– The Crypto 2000 proceedings
– http://www.cryptosavvy.com/
– A sketch in the Backup section of these slides

Slide 13: Lenstra-Verheul Model (1)
Developed for the financial industry to estimate its e-business risk and plan its investments
Provides a model to estimate the key sizes needed in any future year y
Thorough model:
– Takes into account Moore’s law, the rate of crypto-analytic progress, economic growth, etc.
Estimates the cost of breaking DES and extrapolates this result to RSA, Discrete Log algorithms, and ECC, based on the instruction counts of the fastest published attacks

Slide 14: Lenstra-Verheul Model (2)
The model’s most interesting parameter is the security margin s: in what year do you no longer trust 56-bit DES?
– This is different from assuming DES is broken in year s; rather, it is merely the year when you are no longer willing to assume the risk of using it
The budget required to build a one-day DES cracker for that year can be calculated
The model extrapolates this budget value to any future year using normal compound interest
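The compound-interest extrapolation can be sketched directly. The $1M 1997 baseline below is a made-up illustration, not a figure from this submission:

```python
def cracker_budget(year: int, base_budget: float, base_year: int, b: int = 10) -> float:
    # The attacker's budget doubles every b years ("normal compound
    # interest"), so it grows by a factor of 2**((year - base_year) / b).
    return base_budget * 2 ** ((year - base_year) / b)

# Hypothetical: a $1M one-day DES cracker budget in 1997, extrapolated to 2020.
budget_2020 = cracker_budget(2020, 1_000_000, 1997)  # ≈ $4.9M
```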

Slide 15: Lenstra-Verheul Model (3)
The model uses Moore’s law and an assumption about the rate of crypto-analytic progress to predict the number of symmetric key bits the same adversary can break in a one-day attack in any future year
The model then translates this into the approximate RSA and DH key sizes the same adversary could break with comparable hardware
– Based on the instruction count of the fastest known published attack

Slide 16: Agenda (Background, The Lenstra-Verheul model, Results, Discussion, Call to Action, Summary)

Slide 17: Some Milestone Dates
1997: Kocher-Gilmore DES cracker. The public realizes 56-bit DES is not secure
– Call attackers capable of only these attacks Consumer Grade Adversaries
1992: Last year 56-bit DES is certified as safe. Industry at large admits it needs a stronger cipher to protect burgeoning e-commerce
– Call attackers capable of these attacks Commercial Grade Adversaries
1985: 3DES ratified as an ANSI standard. The financial community confirms it cannot meet its legal obligations using 56-bit DES
– Call attackers capable of these attacks Enterprise Grade Adversaries

Slide 18: Sample Results (1)
The model predicts the future budgets of various adversaries, based on when they could first mount a one-day attack on 56-bit DES (table not preserved)

Slide 19: Sample Results (2)
The model predicts the maximum Elliptic Curve field size an adversary can break in a one-day attack on this budget (table not preserved)

Slide 20: Sample Results (3)
The model predicts the maximum RSA key size or Diffie-Hellman group size an adversary can break in a one-day attack on this budget (table not preserved)

Slide 21: Agenda (Background, The Lenstra-Verheul model, Results, Discussion, Call to Action, Summary)

Slide 22: Applications to Authentication Algorithms
Each of the discussed public key-based authentication algorithms relies on RSA or Discrete Log techniques:
– PKInit uses RSA or Diffie-Hellman; could use ECC
– TLS uses RSA, Diffie-Hellman, or ECC methods
– SRP uses Diffie-Hellman
Therefore the model can be used to quantify the security needs of each
And we can use this information to arrive at a definition of “reasonably secure”

Slide 23: Lessons
It is not plausible to avoid public key operations entirely and still provide “reasonable security”
– Radically unpleasant implications for the cost of stand-alone APs and hand-helds unless the number of public key operations can be minimized
It is feasible to make plausible estimates of the public key sizes needed to provide “reasonable security”
One key size will not work for the entire market:
– The minimum security requirements for one market segment are drastic overkill for other market segments
One key size will not work for all time:
– The required key size grows every year

Slide 24: Agenda (Background, The Lenstra-Verheul model, Results, Discussion, Call to Action, Summary)

Slide 25: How do we proceed? (1)
Act now: we have to support public key methods somewhere; let’s admit this and move on
– Symmetric key schemes by themselves cannot provide any notion of “reasonable security”
– The right issue is where and how to use these algorithms, not if

Slide 26: Some Problems
Customers can’t deal with different algorithms and key sizes:
– Non-cryptographers don’t know which to use and when
A single protection level is unlikely to be accepted:
– Each vendor addresses a different market niche
– The minimally acceptable security for the enterprise is overkill for other markets, e.g., public access; consumers and public access won’t pay this price
– The maximally acceptable security level (because of cost) for, e.g., consumers is unacceptable to enterprises; enterprises don’t deploy schemes that don’t protect their IP

Slide 27: How do we proceed? (2)
Adopt a model to estimate public key costs based on agreed-upon assumptions
Define a range of conformance classes and the key sizes needed for each conformance class
Define a review cycle in which the key size estimates and the standard are updated
– Implies a new definition of conformance: a product can claim to provide a level of security only through year N
– There are no valid unqualified claims of security in this model
Precedent: bank vaults are rated in the hours required for penetration

Slide 28: Example (1)
Define, e.g., 3 conformance classes:
– “Public” or “Consumer”: protect against attacks by individuals, such as script kiddies and grandmothers. Take s = 1997
– “Commercial”: defend against attacks by small organizations, such as private investigators and small-time organized crime. Take s = 1992
– “Enterprise”: attempt to deter professional-grade industrial espionage. Take s = 1985

Slide 29: Example (2)
Products certified through year N for the Public Conformance Class must support:
– N = 2010: 768-bit asymmetric key, or 768-bit group and 128-bit discrete log key, or 131-bit Elliptic Curve
– N = 2015: 1024-bit asymmetric key, or 1024-bit group and 144-bit discrete log key, or 163-bit Elliptic Curve
– N = 2020: 1236-bit asymmetric key, or 1236-bit group and 160-bit discrete log key, or 163-bit Elliptic Curve
Define similar scales for the other conformance classes

Slide 30: Example (3)
Review the required algorithms, key lengths, and conformance classes every 5 years
– A cryptographic breakthrough may render the estimates wildly optimistic
– Moore’s law may fail, rendering further key size increases less necessary
The review would not “revoke” certification of already-shipped equipment
– It only addresses what kind of security claims can be made for new equipment

Slide 31: Agenda (Background, The Lenstra-Verheul model, Results, Discussion, Call to Action, Summary)

Slide 32: Summary
We cannot provide secure authentication and key distribution without public key operations somewhere
We can estimate the cost of attacking these operations
– But the cost changes over time
– And different costs are acceptable to different markets
We should specify the minimum key sizes conformant implementations must support for particular markets

Slide 33: Feedback?

Slide 34: Backup

Slide 35: CHAP Assumptions
The O(2^32) cycle estimate on Slide 7 for breaking CHAP is based on the following assumptions:
– An MD5 of at most 64 bytes takes 1250 cycles (cost of OpenSSL MD5)
– A password is used for authentication (the legacy configuration)
– A dictionary of 3,000,000 entries can recover most passwords (dictionary available at …)
– Reproducing a recorded CHAP exchange using brute-force dictionary search costs at most 3,000,000 × 1250 ≈ 2^32 cycles

Slide 36: Legacy Kerberos Assumptions
– A 3DES operation requires 145 cycles/byte (cost of OpenSSL DES)
– A password is used to 3DES-encrypt the AS_REP data (the legacy configuration)
– Typical encrypted AS_REP data is 40 bytes
– A dictionary of 3,000,000 entries can recover most passwords (dictionary available at …)
– Reproducing a recorded AS_REP reply using brute-force dictionary search costs at most 3,000,000 × 145 × 40 = 17,400,000,000 ≈ 2^34 cycles
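The arithmetic behind the O(2^34) estimate, worked through from the assumptions above:

```python
# Trial-decrypt the 40-byte AS_REP once per dictionary entry, at the
# quoted 3DES cost of 145 cycles/byte.
cycles_per_byte = 145
asrep_bytes = 40
dictionary_entries = 3_000_000

total_cycles = dictionary_entries * cycles_per_byte * asrep_bytes  # 17,400,000,000
seconds_at_1_2ghz = total_cycles / 1.2e9                           # 14.5 s
```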

Slide 37: Lenstra-Verheul Sketch (1)
Major Parameters:
– Security margin s: the last year the user was willing to trust 56-bit DES. Default: s = 1982 (arbitrary)
– Number of months m to double processor speed and memory size. Default: m = 18 (empirical observation)
– Number of years b for the attacker’s budget to double. Default: b = 10 (approximate empirical observation, based on general economic growth)
– Number of months r for crypto-analytic techniques to become twice as effective. Default: r = 18 (empirical observation)

Slide 38: Lenstra-Verheul Sketch (2)
Major Parameters (continued):
– Wholesale price p of a stripped-down 450 MHz Pentium II with 64 MB and a NIC. Default: p = $100 (empirical observation: approximate cost of an SBC)
– Number of CPU cycles v to perform one encryption. Default: v = 1 (valid for DES chips, ludicrously low for software, but this is inconsequential)
If you don’t agree with a parameter value, change it!

Slide 39: Lenstra-Verheul Sketch (3)
Major Formulas:
– Infeasible number of MIP years in year y (i.e., the number of MIP years an attack must cost to be considered infeasible in year y): IMY(y) = 5×10^5 × 2^(12(y−s)/m) × 2^((y−s)/b)
– Symmetric key size d adequate until year y: d = 56 + (12/m + 1/b)(y − s) + log2(v)
– Formulas to translate d into the appropriate size for the following: RSA key size and Discrete Log group size; Discrete Log key size; ECC key size
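A sketch of the model’s two main formulas in code, using the default parameters from the preceding slides (s = 1982, m = 18, b = 10, v = 1); the exact constants are reconstructed from this transcript and should be treated as approximate:

```python
from math import log2

def infeasible_mip_years(y: int, s: int = 1982, m: int = 18, b: int = 10) -> float:
    # Computing power doubles every m months, attacker budgets every b years.
    t = y - s
    return 5e5 * 2 ** (12 * t / m) * 2 ** (t / b)

def symmetric_key_bits(y: int, s: int = 1982, m: int = 18,
                       b: int = 10, v: int = 1) -> float:
    # Each doubling of attacker resources buys one extra key bit over
    # 56-bit DES; log2(v) corrects for the cost of one encryption.
    t = y - s
    return 56 + (12 / m + 1 / b) * t + log2(v)

bits_2020 = symmetric_key_bits(2020)   # ≈ 85 bits
```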

Slide 40: Lenstra-Verheul Sketch (4)
Note: other parameters also affect the computed key sizes and lead to different (but same order of magnitude) results
– They are ignored here
This presentation:
– Uses the defaults (but varies the security margin parameter s) to suggest order-of-magnitude costs, to help focus the decision process
– Focuses only on the expected lower-bound cost of attacks, not on the safety margin needed as a hedge, or any of the other useful values Lenstra-Verheul computes

Slide 41: Minimal Safety in 2005 (table not preserved)

Slide 42: Minimal Safety in 2010 (table not preserved)

Slide 43: Minimal Safety in 2015 (table not preserved)

Slide 44: Minimal Safety in 2020 (table not preserved)
