
1 Random Number Generators for Cryptographic Applications (Part 2)
Werner Schindler, Federal Office for Information Security (BSI), Bonn
Bonn, January 24, 2008

2 Outline (Part 2)
 Repetition of crucial facts
 Design and evaluation criteria for physical RNGs
   general advice
   stochastic model
   entropy
   online tests, tot test, self test
 AIS 31 and ISO 18031
 Conclusion

3 Classification
RNG
 deterministic: pure / hybrid
 non-deterministic (true)
   physical: pure / hybrid
   non-physical: pure / hybrid

4 General Requirements (I)
R1: Random numbers should not show statistical weaknesses (i.e., they should pass statistical “standard tests”).
R2: The knowledge of subsequences of random numbers shall not make it practically feasible to compute predecessors or successors, or to guess them with non-negligibly larger probability than without knowledge of these subsequences.

5 General Requirements (II)
R3: It shall not be practically feasible to compute preceding random numbers from the internal state, or to guess them with non-negligibly larger probability than without knowledge of the internal state.
R4: It shall not be practically feasible to compute future random numbers from the internal state, or to guess them with non-negligibly larger probability than without knowledge of the internal state.

6 Ideal RNGs
 Even with maximum know-how, the most powerful technical equipment and unlimited computational power, an attacker has no better strategy than “blind guessing” (brute-force attack).
 Guessing n random bits costs 2^(n-1) trials on average.
 The guess work remains invariant over time.
 An ideal RNG clearly meets Requirements R1 - R4.
 An ideal RNG is a mathematical construct.

7 PTRNG (schematic design)
noise source (analog) → digitised analog signal (das random numbers) → algorithmic postprocessing (optional; with or without memory) → internal random numbers → buffer (optional) → external random numbers → external interface

8 Noise source
 The noise source is given by dedicated hardware.
 The noise source exploits, for example:
   noisy diodes
   free-running oscillators
   radioactive decay
   quantum photon effects
   ...
Physical RNGs (should) aim at information-theoretic security.

9 Development and Security Evaluation of PTRNGs
The challenge is threefold:
 development and implementation of a strong RNG design
 verification that design and implementation are indeed strong
 effective detection mechanisms for possible non-tolerable weaknesses of random numbers while the PTRNG is in operation

10
 The central part of a PTRNG security evaluation is to verify Requirement R2.
 R1 is easy to fulfil and to check. Apart from very unusual designs, R3 and R4 are “automatically” fulfilled.

11 Optimal guessing strategy
 Let X denote a random variable that assumes values in a finite set S = {s_1, ..., s_t}.
 The optimal guessing strategy begins with those values that are assumed with the largest probability.
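The optimal strategy can be sketched in a few lines of Python (an illustrative toy, not part of the slides; the example distribution is an assumption):

```python
# Toy sketch of the optimal guessing strategy: guess the values of S
# in order of descending probability.
def guessing_order(dist):
    """Return the values of S sorted by descending probability."""
    return sorted(dist, key=dist.get, reverse=True)

def expected_guesses(dist):
    """Expected number of guesses under the optimal strategy."""
    order = guessing_order(dist)
    return sum((i + 1) * dist[s] for i, s in enumerate(order))

# Illustrative distribution (an assumption, not from the slides):
dist = {"s1": 0.5, "s2": 0.3, "s3": 0.2}
print(guessing_order(dist))    # most probable value first
print(expected_guesses(dist))  # 1*0.5 + 2*0.3 + 3*0.2 = 1.7
```

Any other guessing order yields an expected number of guesses at least as large, which is why the evaluation below works with this strategy.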

12 Guess work and entropy (I)
 Goal of a PTRNG security evaluation: try to estimate the expected number of guesses that is necessary to find a random number with a certain (reasonable) probability.
 For real-world PTRNGs it is usually not feasible to determine this value directly.
 Instead, entropy shall provide a reliable estimator.
 Goal: determine (at least) a lower bound for the entropy per bit (general-purpose RNG).

13 Evaluation
Note: Entropy is a property of random variables, not of the values that are assumed by these random variables (here: random numbers).
 In particular, entropy cannot be measured like temperature, voltage, etc.
 General entropy estimators for random numbers do not exist.

14 Warning
 The test values of Maurer's “universal entropy test” and of Coron's refinement are closely related to the entropy per random bit if the respective random variables fulfil several conditions.
 If these conditions are not fulfilled (e.g. for pure DRNGs!) the test value need not be related to the entropy.
 The adjective “universal” has caused a lot of confusion in the past.

15 Fundamental model assumptions
 We interpret random numbers as realizations of random variables.
 Although entropy is a property of random variables, in the following we loosely speak of the “(average) entropy per random number” instead of the “(average) gain of entropy per corresponding random variable”.
 A reliable security evaluation of a PTRNG should be based on a stochastic model.

16 Guess work and entropy (II)
 The min entropy is the most conservative entropy measure. For an arbitrary distribution of X a lower bound for the guess work can be expressed in terms of the min entropy, while the Shannon entropy may pretend larger guess work.
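A small numerical sketch illustrates the point (the biased distribution is an illustrative assumption): for a heavily biased bit the Shannon entropy suggests almost half a bit of guess work, while the min entropy reflects that a single guess already succeeds 90% of the time.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy of a probability vector, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """Min entropy: -log2 of the largest probability."""
    return -math.log2(max(probs))

# A heavily biased bit (illustrative assumption):
probs = [0.9, 0.1]
print(shannon_entropy(probs))  # ≈ 0.469 bit
print(min_entropy(probs))      # ≈ 0.152 bit -- the conservative measure
```

The min entropy is always at most the Shannon entropy, which is why it gives the safe lower bound on guess work.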

17 Conditional entropy (I)
Let X_1, X_2, ... denote random variables that assume values in a finite set S = {s_1, ..., s_t}. The conditional entropy quantifies the increase of the overall entropy when the sequence X_1, ..., X_n is augmented by X_{n+1}:
H(X_{n+1} | X_1, ..., X_n) = Σ_{x_1,...,x_n ∈ S} Prob(X_1 = x_1, ..., X_n = x_n) · H(X_{n+1} | X_1 = x_1, ..., X_n = x_n)

18 Conditional entropy (II)
This formula is very useful to determine the entropy of dependent sequences. There is no counterpart for the min entropy.
H(X_1, ..., X_n, X_{n+1}) = H(X_1, ..., X_n) + H(X_{n+1} | X_1, ..., X_n) = ... = H(X_1) + H(X_2 | X_1) + ... + H(X_{n+1} | X_1, ..., X_n)
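The chain rule can be checked numerically for a small joint distribution (the numbers are an illustrative assumption, not from the slides):

```python
import math

def H(probs):
    """Shannon entropy of a probability vector, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution of (X1, X2) over {0,1}^2 -- illustrative numbers.
joint = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}

# Left-hand side: H(X1, X2)
lhs = H(list(joint.values()))

# Right-hand side: H(X1) + H(X2 | X1), the conditional entropy being
# the probability-weighted average of the conditional distributions' entropies.
px1 = {x1: joint[(x1, 0)] + joint[(x1, 1)] for x1 in (0, 1)}
h_cond = sum(px1[x1] * H([joint[(x1, x2)] / px1[x1] for x2 in (0, 1)])
             for x1 in (0, 1))
rhs = H(list(px1.values())) + h_cond

print(abs(lhs - rhs) < 1e-9)  # True: the chain rule holds
```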

19 Guess work and entropy (III)
 Assume that X_1, X_2, ... denotes a sequence of binary-valued iid (independent and identically distributed) random variables. Unless n is too small,
H(X_1, X_2, ..., X_n) / n ≈ log_2(E(number of guesses per bit))
 The assumption “iid” may be relaxed, e.g. to “stationary with finite memory”.
 If the random variables X_1, X_2, ..., X_n are ‘close’ to the uniform distribution, all parameters α give similar Rényi entropy values.

20 Guess work and entropy (IV)
 (At least) the internal random numbers usually fulfil both the second and the third condition.
 Hence we use the Shannon entropy in the following, since it is easier to handle than the min entropy (→ conditional entropy).

21 The stochastic model (I)
 Goal: estimate the increase of entropy per internal random number.
 Ideally, a stochastic model should specify a family of probability distributions that contains the true distribution of the internal random numbers.
 It should at least specify a family of distributions that contains the distribution of
   the das random numbers, or
   ‘auxiliary’ random variables that allow one to estimate the increase of entropy per internal random number.

22 Example 5: Coin Tossing
 PTRNG: A single coin is tossed repeatedly. “Head” (H) is interpreted as 1, “tail” (T) as 0.
 Stochastic model:
   The observed sequence of random numbers (here: heads and tails) is interpreted as values that are assumed by random variables X_1, X_2, ...
   The random variables X_1, X_2, ... are assumed to be independent and identically distributed. (Justification: coins have no memory.)
   p := Prob(X_j = H) ∈ [0,1] with unknown parameter p

23 Example 5: Coin Tossing (II)
 Note: A physical model of this experiment would consider the impact of the mass distribution of the coin on the trajectories. The formulation and verification of the physical model is much more complicated than for the stochastic model.

24 The stochastic model (II)
 A stochastic model is not equivalent to a physical model. In particular, it does not provide the exact distribution of the das random numbers or the internal random numbers as a function of the characteristics of the components of the analog part.
 Instead, the stochastic model only specifies a class of probability distributions which shall contain the true distribution (see also Example 5).
 The class of probability distributions usually depends on one or several parameters.

25 The stochastic model (III)
 The stochastic model shall be verified by measurements / experiments.
 The parameter(s) of the true distribution are estimated on the basis of measurements.
 The stochastic model allows the design of effective online tests that are tailored to the specific RNG design.

26 Example 5: Coin Tossing (III)
Entropy estimation (based on the stochastic model): Observe a sample x_1, x_2, ..., x_N and set
p̃ := #{ j ≤ N | x_j = H } / N
To obtain an estimator H̃(X_1) for H(X_1), substitute p̃ into the entropy formula:
H̃(X_1) = -( p̃ · log_2(p̃) + (1-p̃) · log_2(1-p̃) )
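This plug-in estimator is easy to implement; a minimal sketch (the simulated fair coin is an illustrative assumption; a real evaluation would use measured data):

```python
import math
import random

def entropy_estimate(sample):
    """Plug-in estimator: substitute the relative frequency of heads
    into the binary entropy formula."""
    p = sample.count("H") / len(sample)
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

random.seed(1)
# Simulated fair coin (illustrative assumption):
sample = [random.choice("HT") for _ in range(10000)]
print(entropy_estimate(sample))  # close to 1.0 for a fair coin
```

Note that a constant sample ("total breakdown") correctly yields an estimate of 0.0.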

27 The stochastic model (IV)
 For PTRNGs the justification of the stochastic model is usually more complicated and requires more sophisticated arguments.
 To estimate entropy, the parameter(s) are estimated first, and from these an entropy estimate is computed (cf. Example 5).
 If the random numbers are not independent, the conditional entropy per random bit has to be considered.

28 Example 6 (I)
PTRNG design proposed in [Tk]: ring oscillator 1 clocks a 43-bit LFSR, ring oscillator 2 clocks a 37-bit CASR (Cellular Automaton Shift Register; GF(2)-linear). k bits selected from each register via fixed permutations are combined into the k-bit output (internal random number).

29 Example 6 (II)
Two free-running ring oscillators clock the LFSR and the CASR, respectively. The intermediate time between two outputs of random numbers shall exceed a minimum number of LFSR and CASR cycles.
 noise source: ring oscillators
 das random numbers: number of cycles of the LFSR and CASR between subsequent calls of the RNG
 internal state: current states of the LFSR and CASR
 internal random number: k-bit output string

30 Example 6 (III): Dichtl's attack
 Original parameter: k = 32.
 Assume that the attacker knows
   three successive 32-bit random numbers,
   the number of intermediate cycles of the LFSR and CASR,
   all implementation details (permutation etc.).
 Notation: state of the LFSR at time t=0: (a_1, ..., a_43); state of the CASR at time t=0: (a'_1, ..., a'_37).
 This gives an (over-determined!) system of 96 GF(2)-linear equations in 80 variables with solution (a_1, ..., a_43, a'_1, ..., a'_37).

31 Example 6 (IV)
 Original parameters: k = 32; minimum waiting time between two outputs: 86 LFSR cycles, 74 CASR cycles.
Goal: Determine the conditional entropy H(Y_{n+1} | Y_1, ..., Y_n) (internal random numbers) as a function of k and the time between two outputs of random numbers.

32 Example 6 (V)
Upper entropy bound: at least on average,
H(Y_{n+1} | Y_1, ..., Y_n) ≤ H(V_{1,n+1}) + H(V_{2,n+1})
where the random variables V_{1,n+1} and V_{2,n+1} describe the number of cycles of the two ring oscillators between two calls of the RNG.
Lower entropy bound: estimate from below the entropy that the output function extracts from the internal state.

33 Example 6 (VI)
 In [Sch3] a thorough analysis of the PTRNG design is given. Both a lower and an upper entropy bound per internal random bit are given that depend on
   the intermediate time between subsequent outputs in multiples of the average cycle time,
   the jitter of the ring oscillators,
   the bit length k of the internal random numbers.

34 Example 6 (VII): Numerical values
 τ = average cycle length
 σ = 0.01 τ (optimistic assumption)

Time     k   internal state entropy   entropy per random number (increase of entropy)
10000 τ  1   4.209                    0.943
10000 τ  2   4.209                    0.914
10000 τ  3   4.209                    0.866
60000 τ  1   6.698                    0.991

35 Example 7
[figure: schematic of the PTRNG generating the internal random numbers]

36
[figure: comparator signal over time 0, s, 2s, 3s, 4s, 5s, with clock signal ticks and (0-1)-comparator switches; example counts sw(1)=5, sw(2)=5, sw(3)=7, sw(4)=6, sw(5)=8]
 sw_n = # (0-1) crossings in the time interval ((n-1)s, ns]
 das random number: r_n = sw_n
 internal random number: y_{n+1} = y_n + r_{n+1} (mod 2)

37 Strategy
 Goal: determine (at least a lower bound for) H(Y_{n+1} | Y_0, ..., Y_n).
 Remark: Example 7 contains joint research work with Wolfgang Killmann.

38 Notation
 t_1, t_2, ...: time periods between subsequent switches of the comparator
 z_n: smallest index m for which t_0 + t_1 + t_2 + ... + t_m > sn
 w_n := t_0 + t_1 + t_2 + ... + t_{z_n} - sn (or, equivalently, w_n ≡ t_0 + t_1 + t_2 + ... + t_{z_n} (mod s));
w_n denotes the time between sn and the next comparator switch

39 Stochastic model (II)
 It is reasonable to assume that the noise source is in its equilibrium state shortly after start-up.
 ⇒ The stochastic process T_1, T_2, ... is assumed to be stationary (not necessarily independent!).

40 Stochastic Model (III)
The das random numbers in Example 6 and Example 7 (and the das random numbers of other RNG designs) can be modelled as follows:
 T_1, T_2, ... are stationary
 R_n := Z_n - Z_{n-1} with Z_n := min { m ∈ N | T_0 + ... + T_m > sn }
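The model can be simulated directly; a minimal sketch (the exponential inter-switch times are an illustrative assumption, and Z_0 is taken as the index of the first switch):

```python
import random

def das_random_numbers(ts, s, n):
    """Simulate R_k = Z_k - Z_{k-1} with Z_k = min{m | t_0+...+t_m > s*k}
    from a list ts = [t_0, t_1, ...] of inter-switch times."""
    r, z_prev, total, m = [], 0, 0.0, -1
    for k in range(1, n + 1):
        while total <= s * k:
            m += 1
            total += ts[m]
        # m is now the smallest index with t_0 + ... + t_m > s*k
        r.append(m - z_prev)
        z_prev = m
    return r

random.seed(0)
# Illustrative stationary process: iid exponential inter-switch times.
ts = [random.expovariate(1.0) for _ in range(100000)]
print(das_random_numbers(ts, s=6.0, n=5))
```

With deterministic inter-switch times of length 1 and s = 3, every R_k equals 3, which is a quick sanity check for the implementation.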

41 General remark
 Even if two different RNG designs fit this model, the distribution of the random variables T_1, T_2, ... and thus of R_1, R_2, ... may be very different.

42 Stationarity
 The stochastic process T_1, T_2, ... is assumed to be stationary.
 Under weak assumptions (essentially, the partial sums T_1 + T_2 + ... + T_m (mod s) should tend to the uniform distribution on [0,s)), it can be shown that the sequences W_1, W_2, ... and R_1, R_2, ... are stationary, too.
 Strategy: study the stochastic process R_1, R_2, ... first.

43 Stochastic model (IV)
Definition:
 V(u) := # 0-1 switchings in [0,u]
 μ := E(T_j)
 σ² := generalized variance of T_1, T_2, ...
 Φ(·) := cumulative distribution function of the standard normal distribution

44 Auxiliary variables
Lemma: For u = v ∙ μ ... [formula not reproduced in the transcript]

45 Entropy (II)
Under mild regularity assumptions on the random variables T_j, a suitable term with moderate a > 0 (formula not reproduced in the transcript) should provide a conservative estimate for H(R_{n+1} | R_0, ..., R_n). The size of a depends on the distribution of the random variables T_j.

46 Autocovariance
Theorem 1: Let G_W(·) denote the cumulative distribution function of W_n. Then
(i) E((R_1 + ... + R_j)^k) = ∫_0^s E((V(js-u) + 1)^k | W_0 = u) G_W(du) ≈ ∫_0^s E((V(js-u) + 1)^k) G_W(du)
(ii) For k = 2, j = 1, 2, ... this provides the autocorrelation function of the stationary process R_1, R_2, ...
(iii) If the T_j are iid, the above formula is exact. If these T_j have a continuous cumulative distribution function F(·), then G_W(u) = (1 - F(u)) / μ.

47 Entropy (III)
Theorem 2 (special case): Assume that the random variables T_1, T_2, ... are iid with continuous cumulative distribution function F(∙). Then
H(R_{n+1} | R_0, ..., R_n) = ∫_0^s H(V(s-u) + 1) · (1 - F(u))/μ du

48 Schindler24.01.2008Slide 48 Periods between successive 0-1-switchings

49 Example 7 (II)
 Experimental results: 10^5 random bits per second with entropy per bit > 1 - 10^-5.
 Numerical values for an Erlang(2, λ)-distribution:
   a) s = 6·E(T_1): Var(R_1) = 3.4, corr(R_1, R_2) = -0.06, corr(R_1, R_3) = -0.004, ...
   b) s = 9·E(T_1): Var(R_1) = 4.95, corr(R_1, R_2) = -0.046, corr(R_1, R_3) = -0.00005, ...

50 PTRNGs in operation: Potential risks
 Worst case: total breakdown of the noise source.
 Ageing effects, tolerances of components of the noise source, and external influences might cause the generation of random numbers with unacceptably low quality.
Such events must be detected reliably so that appropriate responses can be initiated.

51 Security measures
 tot test: shall detect a total breakdown of the noise source very soon
 startup test: shall ensure the functionality of the PTRNG when it is started
 online test: shall detect non-tolerable weaknesses or deterioration of the quality of random numbers sufficiently soon

52 Example 8: LFSR (linear feedback shift register)
das random numbers → LFSR (algorithmic postprocessing) → internal random numbers
Worst-case scenario: total breakdown of the noise source, inducing constant das random numbers (entropy / bit = 0), ... but ... the internal random numbers have good statistical properties!!!
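This effect can be demonstrated with a small toy LFSR (a 5-bit register with the primitive feedback polynomial x^5 + x^3 + 1 — an illustrative choice, not the register from the slides): even when the das input is stuck at zero, the output looks statistically balanced.

```python
def lfsr_postprocess(das_bits, state):
    """5-bit Fibonacci LFSR (feedback polynomial x^5 + x^3 + 1, primitive,
    so the autonomous period is 31). Each das bit is XORed into the
    feedback; the output is the bit shifted out of the register."""
    out = []
    for d in das_bits:
        fb = state[4] ^ state[1] ^ d
        out.append(state[4])
        state = [fb] + state[:4]
    return out

# Total breakdown: constant all-zero das bits, entropy/bit = 0 ...
broken = lfsr_postprocess([0] * 62, [1, 0, 0, 1, 1])
# ... yet the output is perfectly balanced over two periods: 32 ones of 62 bits.
print(broken.count(1), "ones out of", len(broken))
```

This is exactly why blackbox statistics on the internal random numbers are misleading here: the balance comes from the linear recurrence, not from entropy.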

53 Example 8 (II)
 Statistical blackbox tests that are applied to the internal random numbers will not even detect a total breakdown of the noise source (unless the linear complexity profile is tested).
 Instead, the online test should be applied to the das random numbers (typical situation).

54 General Remark
 The online test should be tailored to the particular RNG (→ stochastic model).
 In Example 5 (coin tossing) a monobit test would be appropriate.
 In [Sch1] a generic two-step procedure for the tot test, the startup test and the online test is introduced.
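A monobit online test is easy to sketch (the 4.9-sigma rejection threshold, roughly significance level 10^-6 under the normal approximation, is an illustrative choice, not a figure from the slides):

```python
import math

def monobit_test(bits, sigma_bound=4.9):
    """Reject a block of bits if the number of ones deviates too far
    from n/2. Under H0 (ideal RNG) ones ~ Binomial(n, 1/2):
    mean n/2, standard deviation sqrt(n)/2."""
    n = len(bits)
    ones = sum(bits)
    return abs(ones - n / 2) <= sigma_bound * math.sqrt(n) / 2

print(monobit_test([0, 1] * 10000))  # True: balanced sequence passes
print(monobit_test([1] * 20000))     # False: constant sequence -> noise alarm
```

The significance level trades off detection speed against the rate of false noise alarms discussed on the next slide.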

55 Noise alarms
 A failure of the online test, the tot test or the startup test causes a noise alarm.
 (At least) the online test is usually realized by statistical tests. Consequently, “false noise alarms” may occur.
 The probability of a false noise alarm depends on the significance level of the statistical tests and on the evaluation rules.

56 Consequences of a noise alarm
The consequences of a noise alarm may depend on the conditions of use. Possible reactions:
 The PTRNG is shut down.
 A human operator or the PTRNG software initiates an “emergency test”. Depending on the outcome of this emergency test, the PTRNG is either shut down or put into operation again.
 audit of noise alarms
 ...

57 ITSEC and CC
ITSEC (Information Technology Security Evaluation Criteria) and CC (Common Criteria)
 provide evaluation criteria for IT products which shall permit comparability between independent security evaluations.
 A product or system that has successfully been evaluated is awarded an internationally recognized IT security certificate.

58 CC: Evaluation of Random Number Generators
ITSEC, CC and the corresponding evaluation manuals do not specify evaluation criteria for random number generators. In the German evaluation and certification scheme, the evaluation guidance document AIS 31 (Functionality Classes and Evaluation Methodology for Physical Random Number Generators) has been effective since September 2001.

59
AIS 31 distinguishes between two functionality classes with increasing requirements:
 P1 (for less sensitive applications, for instance IVs that are transmitted in clear)
 P2 (for sensitive applications, for instance the generation of session keys, signature parameters, ephemeral keys)

60 AIS 31: Reference Implementation
 The AIS 31 is technically neutral. The applicant for a certificate has to give evidence that the PTRNG meets all requirements.
 The AIS 31 has proved itself in a number of product evaluations.
 A reference implementation of the applied statistical tests can be found at www.bsi.bund.de/zertifiz/zert/interpr/ais_cc.htm

61 Alternative security paradigm
 The crucial points of an AIS 31 evaluation are the understanding of the design and the effectiveness of the online test.
 Alternative approach (e.g., ANSI X9.82, Part 2 (draft)): complex algorithmic postprocessing algorithm with memory that meets Requirements R1 - R3.

62 Alternative security paradigm: Advantages and disadvantages
 (+) lower requirements on the understanding of the RNG design
 (+) lower requirements on the effectiveness of the online tests
 (-) requires a time-consuming postprocessing algorithm
 (-) possibly (without being noticed!) only practical security
 (-) requires protection of the internal state

63 ISO / IEC 18031 “Random Bit Generation”
 covers all classes of RNGs
 PTRNGs: allows design principles that follow either the AIS 31 or the ANSI X9.82 (draft) approach
 also considers the correctness of the implementation

64 Final remark
Combining
 a strong noise source
 with effective online tests
 and a strong algorithmic postprocessing algorithm
provides two security anchors which ensure theoretical security and computational security, respectively.

65 Contact
Federal Office for Information Security (BSI)
Prof. Dr. Werner Schindler
Godesberger Allee 185-189
53175 Bonn
Tel: +49 (0)3018-9582-5652
Fax: +49 (0)3018-10-9582-5652
Werner.Schindler@bsi.bund.de
www.bsi.bund.de
www.bsi-fuer-buerger.de

66 Backup slides (Ersatzfolien)

67 Entropy (Shannon Entropy)
Definition: Let X denote a random variable that assumes values in a finite set S = {s_1, ..., s_t}. The (Shannon) entropy of X is given by
H(X) = -Σ_{j=1}^{t} Prob(X = s_j) · log_2(Prob(X = s_j))
Remarks:
(i) 0 ≤ H(X) ≤ log_2 |S|
(ii) The Shannon entropy is (maybe the most) important representative of a family of entropy definitions.

68 Rényi Entropy
For 0 ≤ α ≤ ∞ the term
H_α(X) = (1/(1-α)) · log_2( Σ_{j=1}^{t} Prob(X = s_j)^α )
denotes the Rényi entropy of X with parameter α. As a function of α, the Rényi entropy is monotonically decreasing. The most important parameters are α = 1 (Shannon entropy) and α = ∞ (or, more precisely, α → ∞; min entropy):
H_∞(X) = min { -log_2(Prob(X = s_j)) | j ≤ t }
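The definition and the monotonicity in α can be checked numerically (the example distribution is an illustrative assumption; α = 1 and α = ∞ are handled as the respective limits):

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy H_alpha in bits; alpha = 1 gives the Shannon
    entropy (limit), alpha = inf gives the min entropy (limit)."""
    if alpha == 1:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    if alpha == math.inf:
        return -math.log2(max(probs))
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

# Illustrative distribution (an assumption, not from the slides):
probs = [0.5, 0.25, 0.125, 0.125]
hs = [renyi_entropy(probs, a) for a in (0.5, 1, 2, math.inf)]
print(hs)  # monotonically decreasing in alpha; ends at the min entropy 1.0
```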

69 Evaluation aspects: PTRNGs vs. NPTRNGs
 The noise source of a physical TRNG
   is dedicated hardware,
   behaves essentially identically for all copies,
   allows accurate modelling and analysis.
 The entropy source of a non-physical TRNG
   exploits system data or human interaction,
   may show very different behaviour for different instances of the RNG,
   usually does not allow an accurate, universally valid model.

