1
Hidden Markov Model Cryptanalysis
Chris Karlof and David Wagner
2
The Context: Side Channels and Countermeasures
The "side channel": data gathered from the physical operation of a crypto scheme's implementation
Example: measuring the power fluctuations of a Pentium III processor while it performs RSA decryption (SPA, DPA)
Many processors draw different amounts of power for adds, multiplies, and other operations
Countermeasures: obscure the signature of key-related operations
3
Randomized Countermeasures
Introduce random computations
Example: randomized projective coordinates in elliptic curve computations
The projective coordinates (X, Y, Z) of P = (x, y) satisfy x = X/Z and y = Y/Z
Before each execution of the scalar multiplication that computes Q = dP, (X, Y, Z) are randomized to (λX, λY, λZ) for a random λ ≠ 0 in the finite field
Coron, J.-S. "Resistance Against Differential Power Analysis for Elliptic Curve Cryptosystems" (1999)
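To make the blinding concrete, here is a minimal sketch of the randomization step, assuming points are handled in simple (X, Y, Z) projective form over a prime field; the field size and point values below are toy placeholders, not real curve parameters.

```python
import secrets

def randomize_projective(x, y, p):
    """Lift the affine point (x, y) to projective coordinates and blind it:
    (X, Y, Z) = (lam*x, lam*y, lam) for a random lam != 0.  Every
    representative still satisfies x = X/Z and y = Y/Z, so Q = dP is
    unchanged, but the intermediate values (and hence the power trace)
    differ on every run."""
    lam = 0
    while lam == 0:
        lam = secrets.randbelow(p)          # uniform in [0, p-1]; reject 0
    return (x * lam % p, y * lam % p, lam)  # (X, Y, Z)

# Toy values for illustration only -- not a real curve point.
print(randomize_projective(3, 6, 97))
```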
4
Attacks on Randomized Countermeasures Existing attacks are specific to each countermeasure No general framework or model exists for all randomized side channel countermeasures
5
Modeling Side-Channel Countermeasures
To attack a randomized countermeasure, it helps to model it first
One model for simple countermeasures: the Probabilistic Finite State Machine (PFSM)
From Oswald, E. and Aigner, M. "Randomized Addition-Subtraction Chains as a Countermeasure against Power Attacks" (2001)
[Figure: the Oswald-Aigner PFSM; red lines indicate the optional (randomized) state transitions]
6
Key Recovery/Inference Problem for PFSMs
The PFSM must be assumed "faithful," i.e. there is no ambiguity in its state transitions:
For all s_i, s_j ∈ S (the set of states in the PFSM) and the transition probability function δ: S × S × I → [0, 1] (I = the input bit): if δ(s_i, s_j, 0) > 0 then δ(s_i, s_j, 1) = 0
In other words, observing a transition uniquely determines the input bit that drove it
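A faithfulness check is straightforward to express in code. The sketch below assumes a hypothetical encoding of the transition function as a dictionary mapping (s_i, s_j, input bit) to a probability:

```python
def is_faithful(delta, states):
    """Faithfulness condition for a PFSM: for every ordered pair of states
    (si, sj), at most one input bit may make that transition possible.
    delta maps (si, sj, bit) -> transition probability (0 if absent)."""
    for si in states:
        for sj in states:
            p0 = delta.get((si, sj, 0), 0.0)
            p1 = delta.get((si, sj, 1), 0.0)
            if p0 > 0 and p1 > 0:
                return False   # ambiguous: (si, sj) doesn't determine the bit
    return True
```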
7
Key Recovery/Inference Problem for PFSMs
We want to infer the sequence of states traversed in a given execution of state machine M, given
M itself, and
a trace of the side channel, y = (y_1, y_2, ..., y_N), where N = the number of key bits, i.e. the number of state transitions
8
Solution to the PFSM Inference Problem
Maximum-likelihood decoding:
Input: trace y, PFSM M with state set S; Q = random variable for an execution of M
1. Compute Pr[Q = s | y] for each candidate state sequence s ∈ S^(N+1)
2. Output q = argmax_s Pr[Q = s | y]
Running time: exponential in N
This paper presents how to transform a PFSM into an HMM, whose inference problem has a polynomial-time solution (the Viterbi algorithm)
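The exponential blow-up is easy to see in a direct implementation. This sketch enumerates all |S|^(N+1) state sequences; the probability tables (init, trans, emit) are hypothetical HMM-style placeholders, and it maximizes the joint Pr[s, y], which is proportional to Pr[s | y]:

```python
from itertools import product

def ml_decode(y, states, trans, emit, init):
    """Brute-force maximum-likelihood decoding: score every candidate
    state sequence s in S^(N+1) and return the one maximizing Pr[Q = s | y]
    (equivalently, the joint probability of s and y).  Running time is
    |S|^(N+1) -- exponential in the trace length N."""
    N = len(y)
    best_seq, best_p = None, -1.0
    for s in product(states, repeat=N + 1):
        p = init[s[0]]
        for n in range(N):
            p *= trans[s[n]][s[n + 1]] * emit[s[n + 1]][y[n]]
        if p > best_p:
            best_seq, best_p = s, p
    return best_seq
```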
9
Hidden Markov Models (HMMs)
A sequence of hidden, probabilistic states (S) with corresponding observable outputs (O)
Each state depends only on the immediately preceding state (the "memoryless" Markov property)
[Figure: HMM chain with hidden states S_1, S_2, S_3, probabilities P(S_1 = x_1), P(S_2 = x_2), P(S_3 = x_3), and observable outputs O_1, O_2, O_3]
10
HMMs: The Inference Problem
Definition: infer the values of the hidden states given only the observable outputs
The Viterbi algorithm solves the inference problem efficiently: O(|S|² · N)
Are we done, then?
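For contrast with the brute-force decoder above, here is a standard Viterbi sketch; the probability tables have the same hypothetical shape:

```python
def viterbi(y, states, trans, emit, init):
    """Most likely hidden state sequence for observations y, computed by
    dynamic programming in O(|S|^2 * N) time instead of |S|^(N+1)."""
    V = {s: init[s] for s in states}   # best-path probability ending in s
    back = []                          # one dict of back-pointers per step
    for obs in y:
        nxt, ptr = {}, {}
        for sj in states:
            # Best predecessor of sj, before accounting for the emission.
            si = max(states, key=lambda s: V[s] * trans[s][sj])
            nxt[sj] = V[si] * trans[si][sj] * emit[sj][obs]
            ptr[sj] = si
        V = nxt
        back.append(ptr)
    # Walk the back-pointers from the most likely final state.
    path = [max(states, key=lambda s: V[s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```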
11
Input-Driven Hidden Markov Models
HMMs do not model inputs, but inputs are present in crypto systems, namely secret keys
The Viterbi algorithm on HMMs also does not benefit from analyzing multiple traces of the side channel
The paper presents IDHMMs, and an algorithm on IDHMMs that benefits from multiple traces (useful in a noisy environment)
12
Input-Driven Hidden Markov Models
IDHMMs extend HMMs by
treating the input as a random variable K_n at each step n, and
adding random variables to capture multiple execution/trace pairs: Y_n^r (the outputs of the R traces) and Q_n^r (the R sequences of state transitions)
The solution to an IDHMM is a sequence of random variables, not definite values in {0, 1}
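As a rough picture of the extra structure, a parameter container might look like the hypothetical sketch below (the field names are illustrative, not the paper's notation):

```python
from dataclasses import dataclass

@dataclass
class IDHMM:
    """Hypothetical parameter container for an IDHMM.  Unlike a plain HMM,
    the transition distribution is conditioned on the unknown input (key)
    bit, and R independent traces share the same key."""
    states: list   # S
    init: dict     # init[s]            = Pr[Q_0 = s]
    trans: dict    # trans[(si, sj, k)] = Pr[Q_{n+1} = sj | Q_n = si, K_n = k]
    emit: dict     # emit[(s, y)]       = Pr[Y_n = y | Q_n = s]
```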
13
Solution to Input-Driven Hidden Markov Models
Can't use maximum-likelihood decoding: exponential
Can't use the Viterbi algorithm: (1) inputs are present, and (2) it can't leverage multiple-trace data
14
Solution to IDHMMs (cont.)
Tried a variation on Viterbi -> also exponential in R, the number of traces
Belief propagation, a new technique:
Compute a separate inference of the key for each trace: K^r for trace r
For trace r + 1, use the posterior key distribution Pr[K^r | y^r] as the input distribution
We "propagate" biases derived in prior trace analyses to the following trace analyses
15
Solution to IDHMMs (cont.)
Algorithm progression:
Compute each single-trace inference r using the key probability distribution from trace r − 1 as input (the initial distribution, for r = 0, is uniform)
Best estimate of the key from the final distribution K^R: if Pr[K_i^R = 1 | Y = y] > 0.5 then k_i = 1, else k_i = 0
[Figure: chain of single-trace inferences INFER(K^1) -> INFER(K^2) -> ... -> INFER(K^r), each passing its key estimate K^r to the next, ending in the decision k_1 = 1 or k_1 = 0]
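The outer loop of this progression is short; the sketch below assumes a hypothetical single-trace inference routine infer_single(prior, trace) that returns per-bit posteriors Pr[k_i = 1]:

```python
def infer_key(traces, infer_single, n_bits):
    """Chained multi-trace inference: the key posterior from trace r is the
    prior for trace r + 1, propagating biases forward; the final marginals
    are thresholded at 0.5 to produce the key guess."""
    prior = [0.5] * n_bits                   # uniform initial distribution
    for trace in traces:
        prior = infer_single(prior, trace)   # posterior feeds the next trace
    return [1 if p > 0.5 else 0 for p in prior]
```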
16
An Attack Experiment
The authors use two randomized countermeasures as targets
The countermeasures must be modeled in a specific way to be attacked with the authors' method: their models are transformed into compatible ones (PFSMs)
They run their attack with errors introduced into the traces; Pr[error] is assumed to be known to the attacker
17
Attack Experiment
A PFSM for randomized exponentiation, e.g. 15P = 16P − P = 2(2(2(2P))) − P
The transformation is applied at any step of the algorithm with probability 0.5
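The flavor of such a countermeasure can be sketched with a randomized signed-binary recoding; plain integers stand in for curve points so the result is checkable. This is a simplified stand-in for the paper's targets, not the exact Oswald-Aigner state machine:

```python
import random

def randomized_recode(d):
    """Randomized signed-binary recoding (LSB first).  When the current bit
    is 1, a coin flip chooses between an 'add' digit (+1, plain
    double-and-add) and a 'subtract' digit (-1, borrowing from the next
    power of two, as in 15P = 16P - P).  Either branch preserves the value,
    so d == sum(digit_i * 2^i)."""
    digits = []
    while d > 0:
        if d & 1:
            if random.random() < 0.5:
                digits.append(-1); d += 1    # subtract branch
            else:
                digits.append(1); d -= 1     # add branch
        else:
            digits.append(0)
        d >>= 1
    return digits

def scalar_mult(d, P):
    """Left-to-right double-and-add/subtract over the signed digits."""
    Q = 0
    for digit in reversed(randomized_recode(d)):
        Q = 2 * Q                 # 'point doubling'
        if digit:
            Q = Q + digit * P     # 'point addition' or 'subtraction'
    return Q

assert scalar_mult(15, 7) == 105  # same result via a randomized chain
```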
18
Attacking Randomized Countermeasures
At least 182 key bits must be recovered for the attack to count as "successful"; a meet-in-the-middle search for the last 10 bits takes 2^38 work
Error-free observations lead to key recovery with fewer than 10 traces
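As an illustration of the final search step, here is a deliberately simple exhaustive variant; the slide's meet-in-the-middle search achieves the 2^38 figure, while this sketch trades speed for brevity. is_correct is a hypothetical verifier, e.g. a check against a known public key:

```python
from itertools import combinations

def fix_residual_errors(key_guess, is_correct, max_errors=10):
    """Flip every set of up to max_errors bit positions in the recovered
    key until a candidate passes is_correct.  Exhaustive, so far slower
    than the meet-in-the-middle version, but the same idea."""
    n = len(key_guess)
    for e in range(max_errors + 1):
        for positions in combinations(range(n), e):
            cand = list(key_guess)
            for i in positions:
                cand[i] ^= 1
            if is_correct(cand):
                return cand
    return None
```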
19
Conclusion
The authors introduced HMM attacks for randomized side-channel countermeasures modeled by PFSMs
Presented IDHMMs and an efficient approximate inference algorithm that handles inputs (keys)
Demonstrated the input-inference algorithm on two randomized countermeasures, in which keys could be recovered with fewer than 10 traces