
1 Evolving Boolean Functions Satisfying Multiple Criteria. John A Clark, Jeremy L Jacob and Susan Stepney (University of York, UK), Subhamoy Maitra (Indian Statistical Institute, Kolkata, India), William Millan (ISRC, Queensland University of Technology, Brisbane, Australia)

2 Overview. Optimisation. Boolean function design. Underpinning approach. Correlation immunity. Linear change of basis. Higher-order immunity via change of basis. Propagation criteria. Conclusions and future work.

3 Optimisation. A subject of huge practical importance. An optimisation problem may be stated as follows: given a domain D and a function z: D → R, find x in D such that z(x) = sup{z(y): y in D}. Example: maximise z(x) = -x^2 + 8x - 12 over x = 0…100. Calculus gives us x = 4 as the answer, with z(4) = 4.
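The toy example can also be checked by brute force; a minimal sketch (the function z and the integer domain 0…100 are taken from the slide):

```python
def z(x):
    return -x**2 + 8*x - 12

# Exhaustively maximise z over the integer domain D = {0, ..., 100}.
best_x = max(range(101), key=z)
print(best_x, z(best_x))  # x = 4 achieves the maximum value z(4) = 4
```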

4 Local Optimisation - Hill Climbing. Let the current solution be x. Define the neighbourhood N(x) to be the set of solutions that are 'close to x'. If possible, move to a neighbouring solution that improves the value of z, otherwise stop. Choose any y in N(x) as the next solution provided z(y) >= z(x): loose hill-climbing. Choose y in N(x) as the next solution such that z(y) = sup{z(v): v in N(x)}: steepest-gradient ascent.
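The steepest-gradient variant can be sketched as follows; this is an illustration only (the helper names and the integer domain are ours), reusing the z from slide 3:

```python
def steepest_ascent(z, x, neighbours):
    """Move to the best neighbour while it improves z; stop at a local optimum."""
    while True:
        y = max(neighbours(x), key=z)
        if z(y) <= z(x):       # no neighbour improves on z(x): stop
            return x
        x = y

def z(x):
    return -x**2 + 8*x - 12

# Neighbourhood N(x) = {x - 1, x + 1}, clipped to the domain 0..100.
neighbours = lambda x: [v for v in (x - 1, x + 1) if 0 <= v <= 100]
print(steepest_ascent(z, 50, neighbours))  # climbs 50 -> 49 -> ... -> 4
```

Since this z is concave, the climb from any start reaches the global maximum; slide 5 shows why that is not true in general.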

5 Local Optimisation - Hill Climbing. [Figure: plot of z(x) with points x0, x1, x2, x3 and the global optimum x_opt.] The neighbourhood of a point x might be N(x) = {x-1, x+1}. The hill-climb goes x0 → x1 → x2, since z(x0) < z(x1) < z(x2) > z(x3), and gets stuck at x2 (a local optimum). Really we want to obtain x_opt.

6 Simulated Annealing. [Figure: plot of z(x) with a trajectory x0, x1, …, x13 descending into a valley and rising again to the global optimum.] Annealing allows non-improving moves, so that it is possible to go down in order to rise again and reach the global optimum. Details of annealing are not that important for this talk – other global optimisation techniques could be used – but annealing has proved very effective.
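A generic annealing loop of the kind alluded to might look like this (a sketch only: the geometric cooling schedule, starting temperature and step count are illustrative choices of ours, not the paper's parameters):

```python
import math
import random

def anneal(cost, x, neighbour, t0=10.0, alpha=0.95, steps=2000):
    """Minimise cost(x), accepting worsening moves with probability exp(-delta/T)."""
    t, best = t0, x
    for _ in range(steps):
        y = neighbour(x)
        delta = cost(y) - cost(x)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = y                       # accept (always, if improving)
        if cost(x) < cost(best):
            best = x
        t *= alpha                      # geometric cooling
    return best

random.seed(0)
cost = lambda x: x**2 - 8*x + 12        # i.e. minimise -z(x) from slide 3
step = lambda x: min(100, max(0, x + random.choice((-1, 1))))
best = anneal(cost, 90, step)
print(best)
```

Early on, high temperature lets the walk escape bad regions; as T shrinks, the loop degenerates into hill-climbing on the cost.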

7 What's the paper about? There are many desirable properties for a Boolean function in cryptography: balance, high non-linearity, low autocorrelation, high algebraic degree, correlation immunity of reasonable order, propagation immunity, etc. The paper seeks to convince you of the following: optimisation is a flexible tool for the design of Boolean functions with multiple desirable properties. We will consider two types of search domain: D = the balanced Boolean functions; and D = sets of vectors that are Walsh (or autocorrelation) zeroes.

8 Boolean Function Design. A Boolean function f: {0,1}^n → {0,1}. For present purposes we shall use the polar representation f̂(x) = (-1)^f(x), taking values in {1, -1}. Example (n = 3):

x: 000 001 010 011 100 101 110 111
f̂(x): 1 -1 -1 -1 1 -1 1 1

We will talk only about balanced functions, where there are equal numbers of 1s and -1s.
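In code, the polar form is just f̂(x) = (-1)^f(x); a small check (the binary truth table below is our reading of the slide's garbled example, so treat it as an assumption):

```python
# Binary truth table of the example f (our reading of the slide's table).
f = [0, 1, 1, 1, 0, 1, 0, 0]
f_polar = [(-1) ** v for v in f]       # polar representation, values in {1, -1}
print(f_polar)                         # [1, -1, -1, -1, 1, -1, 1, 1]
assert sum(f_polar) == 0               # balanced: equal numbers of 1s and -1s
```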

9 Preliminary Definitions. Definitions relating to a Boolean function f of n variables. Walsh-Hadamard transform: F(ω) = Σ_x f̂(x) (-1)^(ω·x). Linear function: L_ω(x) = ω1x1 ⊕ … ⊕ ωnxn, with polar form L̂_ω(x) = (-1)^(L_ω(x)); thus F(ω) = Σ_x f̂(x) L̂_ω(x).
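The Walsh-Hadamard transform of a polar truth table can be sketched directly from the definition (a naive O(2^2n) version for clarity; a fast transform would be used in practice):

```python
def walsh(f_polar):
    """F(w) = sum over x of f_polar(x) * (-1)^(w . x), where w . x is the
    inner product of the bit vectors, i.e. the parity of popcount(w & x)."""
    N = len(f_polar)
    return [sum(fx * (-1) ** bin(w & x).count("1") for x, fx in enumerate(f_polar))
            for w in range(N)]

# The constant-0 function (polar all-ones) has all its weight at w = 0.
print(walsh([1] * 8))  # [8, 0, 0, 0, 0, 0, 0, 0]
```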

10 Preliminary Definitions. Non-linearity: N_f = 2^(n-1) - (1/2) max_ω |F(ω)|. Autocorrelation: AC_f = max over s ≠ 0 of |Σ_x f̂(x) f̂(x ⊕ s)|. For present purposes we need simply note that these can be easily evaluated given a function f. They can therefore be used as the functions to be optimised, and traditionally they are.
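Both quantities are indeed easy to evaluate; a minimal sketch of the two definitions (helper names are ours):

```python
def walsh(f_polar):
    return [sum(fx * (-1) ** bin(w & x).count("1") for x, fx in enumerate(f_polar))
            for w in range(len(f_polar))]

def nonlinearity(f_polar):
    """N_f = 2^(n-1) - (1/2) * max over w of |F(w)|."""
    return (len(f_polar) - max(abs(v) for v in walsh(f_polar))) // 2

def autocorrelation(f_polar):
    """AC_f = max over nonzero shifts s of |sum_x f(x) f(x xor s)| (polar form)."""
    N = len(f_polar)
    return max(abs(sum(f_polar[x] * f_polar[x ^ s] for x in range(N)))
               for s in range(1, N))

f_and = [1, 1, 1, -1]    # polar form of AND on 2 variables (a bent function)
print(nonlinearity(f_and), autocorrelation(f_and))  # 1 0
```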

11 Basic Functions Using Parseval's Theorem. Parseval's Theorem: Σ_ω F(ω)^2 = 2^(2n). Loosely, push down on F(ω)^2 for some particular ω and it appears elsewhere (a Pythagorean constraint: the squared spectrum values sum to a constant, just as a^2 + b^2 = c^2). This suggests that arranging for uniform values of F(ω)^2 will lead to good non-linearity. (Bent functions achieve this, but we are concerned with balanced functions.) This is the initial motivation for our new cost function family.
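The cost family can be evaluated straightforwardly; the exact form below, Σ_ω ||F(ω)| - X|^R, is our reconstruction of the family described on this and later slides (X and R are the free parameters mentioned on slide 14):

```python
def cost(f_polar, X, R):
    """Parseval-motivated cost: drive every |F(w)| towards the common value X."""
    F = [sum(fx * (-1) ** bin(w & x).count("1") for x, fx in enumerate(f_polar))
         for w in range(len(f_polar))]
    return sum(abs(abs(v) - X) ** R for v in F)

# A bent function on 2 variables has a perfectly flat spectrum |F(w)| = 2,
# so its cost is zero when X = 2.
print(cost([1, 1, 1, -1], X=2, R=3))  # 0
```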

12 Moves Preserving Balance. Start with a balanced (but otherwise random) solution. The move strategy preserves balance (Millan et al): the neighbourhood of a particular function f is the set of all functions obtained by exchanging (flipping) any two dissimilar values. Here we have swapped f(2) and f(4):

x: 000 001 010 011 100 101 110 111
f̂(x): 1 -1 -1 -1 1 -1 1 1
ĝ(x): 1 -1 1 -1 -1 -1 1 1

Note that neighbouring functions have close non-linearity and autocorrelation – some degree of continuity.
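The balance-preserving move is easy to sketch (helper name ours): pick one +1 position and one -1 position, and swap their values.

```python
import random

def balanced_move(f_polar):
    """Return a neighbour of f: flip one +1 and one -1, preserving balance."""
    g = list(f_polar)
    i = random.choice([k for k, v in enumerate(g) if v == 1])
    j = random.choice([k for k, v in enumerate(g) if v == -1])
    g[i], g[j] = -1, 1
    return g

random.seed(1)
g = balanced_move([1, -1, -1, -1, 1, -1, 1, 1])
assert sum(g) == 0        # the neighbour is still balanced
```

Exactly two positions change, which is why neighbours have close non-linearity and autocorrelation.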

13 Simple Hill Climbing Result. Even simple hill-climbing can be used to good effect. By perturbing a 15-variable balanced Boolean function of non-linearity 16262 (obtained by modifying Patterson–Wiedemann functions) and hill-climbing, we were able to obtain a non-linearity of 16264 (the best known non-linearity so far for 15-variable balanced functions).

14 Getting in the Right Area. Actually, minimising this cost function family doesn't give good results directly! But it is very good at getting in the right area. The method is: using simulated annealing, minimise the given cost function (for given parameter values of X and R); let the resulting function be f_sa. Now hill-climb with respect to non-linearity (Nonlinearity-Targeted technique, NLT); or hill-climb with respect to autocorrelation (Autocorrelation-Targeted technique, ACT).

15 Best Profiles. [Table: best (n, degree, nonlinearity, autocorrelation) profiles achieved by NLT and ACT.]

16 Autocorrelation-related Results. In 1995 Zheng and Zhang introduced the two global avalanche criteria (autocorrelation and sum-of-squares). Autocorrelation bounds are now receiving more attention; the best construction results are due to Maitra. For n = 8 both techniques (NLT and ACT) achieve lower autocorrelation than any previous construction or conjecture.

17 Sum-of-Squares Conjectures. Zheng and Zhang introduced the sum-of-squares indicator σ_f = Σ_s (Σ_x f̂(x) f̂(x ⊕ s))^2. Use σ_f as the cost function. Oddly, the earlier cost functions actually gave better results!

18 Correlation Immunity - Direct Method. Seek a cost function that punishes lack of correlation immunity and low non-linearity. Results were sub-optimal.

19 Linear Transformation for CI(1). Let WZ_f be the set of Walsh zeroes of the function f. If Rank(WZ_f) = n, then form the matrix B_f whose rows are n linearly independent vectors from WZ_f. Let C_f = B_f^(-1) and let f'(x) = f(C_f x). The resulting function f' has the same nonlinearity and algebraic degree, and is also CI(1). We can apply this method to the basic functions generated earlier. (Method used earlier by Maitra and Pasalic.)
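The precondition Rank(WZ_f) = n can be checked with Gaussian elimination over GF(2); a sketch, with vectors represented as integer bitmasks (the Walsh routine repeats the earlier definition; helper names are ours):

```python
def walsh(f_polar):
    return [sum(fx * (-1) ** bin(w & x).count("1") for x, fx in enumerate(f_polar))
            for w in range(len(f_polar))]

def gf2_rank(vectors):
    """Rank over GF(2) of a set of bit vectors given as integer bitmasks."""
    pivots = {}                        # leading-bit position -> pivot row
    for v in vectors:
        while v:
            lead = v.bit_length() - 1
            if lead not in pivots:     # new pivot: v is independent
                pivots[lead] = v
                break
            v ^= pivots[lead]          # eliminate the leading bit and continue
    return len(pivots)

# f(x) = x0 on 3 variables: its Walsh zeroes are every w except w = 1,
# and they span the whole space, so the CI(1) transform applies.
f_polar = [1, -1, 1, -1, 1, -1, 1, -1]
F = walsh(f_polar)
wz = [w for w in range(8) if F[w] == 0]
print(gf2_rank(wz))  # 3
```

If the rank test passes, B_f is assembled from the pivot rows, inverted over GF(2), and f' is read off as f'(x) = f(C_f x).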

20 Best Profiles Overall (direct, and direct plus change of basis). Optimal non-linearity, typically with very low autocorrelation values. Some previous bests: (6,1,2,24,64) and (7,1,5,56,64) [Sarkar and Maitra, 2000]; (8,1,6,116,80) [Maitra and Pasalic, 2002]; (7,2,4,56) [Pasalic, Maitra, Johansson and Sarkar, 2000]. Our (8,1,6,116,24) seems very good; no (8,0,*,116,16) has yet been discovered.

21 Generalising to Higher-Order Immunity. Basis transformation can achieve higher-order immunity functions too. We need to find a subset {ω_i1, ω_i2, …, ω_in} of the Walsh zeroes such that any k of its elements (1 <= k <= m) sum (XOR) to a Walsh zero.

22 Generalising to Higher-Order Immunity. Consider an initial permutation pwz of the Walsh zeroes. We will view the first n elements of the permutation as a candidate basis. How should we punish deviation from the requirements?

23 Generalising to Higher-Order Immunity. By punishing lack of suitable rank, and punishing relevant sums not being Walsh zeroes. For example, for m = 2 we can define the number of misses as the number of two-fold sums of candidate basis elements that are not Walsh zeroes. The cost function punishes both the rank deficiency and the misses.
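The miss count can be sketched as follows (helper names ours): count every k-fold XOR sum of candidate-basis elements, for 1 <= k <= m, that fails to be a Walsh zero.

```python
from functools import reduce
from itertools import combinations

def misses(candidate_basis, walsh_zeroes, m=2):
    """Number of k-fold XOR sums (1 <= k <= m) of candidate basis
    elements that are not Walsh zeroes."""
    wz = set(walsh_zeroes)
    return sum(1
               for k in range(1, m + 1)
               for combo in combinations(candidate_basis, k)
               if reduce(lambda a, b: a ^ b, combo) not in wz)

# If every pairwise sum of {1, 2, 4} is also a Walsh zero there are no misses;
# remove 3 (= 1 xor 2) from the zero set and exactly one miss appears.
print(misses([1, 2, 4], set(range(1, 8))))   # 0
print(misses([1, 2, 4], {1, 2, 4, 5, 6}))    # 1
```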

24 Generalising to Higher-Order Immunity. This approach has allowed basis sets to be evolved with second-order correlation immunity. For example, some direct attempts to achieve (7,2,4,56) failed: they had the required degree and non-linearity but were not CI(2); basis transformations allowed (7,2,4,56) to be attained. It seems difficult to attain bases which give CI(3), but attempts are currently under way.

25 Transforming for Propagation Criteria. Change of basis approaches can also be applied to attain PC(k)-ness; essentially we now work with autocorrelation zeroes. Only a small amount of work has been done on this, but results are encouraging: we can use a linear transform on the (8,0,6,116,24) function derived earlier to attain (8,0,6,116,24) with PC(1). It is also possible to transform for higher-order PC(k) in much the same fashion as before (but now we have autocorrelation misses).

26 Transforming for Propagation Criteria. We have tried this on earlier functions, seeking out bases of autocorrelation zeroes to give PC(2) functions. Prior to 1997 the highest algebraic degree achieved for a PC(2) function was n/2 (for bent functions). Satoh et al [1998] gave constructions on n = L + 2^L - 1 input bits with algebraic degree n - L - 1 (and similar for balanced functions); they note that deg(f) <= n - 1 gives a trivial upper bound on degree. A search for a 2nd-order change of basis reveals an earlier function on 6 variables which is PC(2) with degree 5. Support = c65b4d405ceb91f1.

27 PC(k) and CI(m) Together. We can use a cost function that punishes lack of PC(k)-ness, lack of CI(m)-ness, and low non-linearity.

28 Conclusions. Optimisation is a very useful tool for Boolean function design and exploration. We have generated functions with excellent profiles over several criteria, and the method would seem extensible. The basic functions have very special properties – theory helps! Change of basis is very useful.

29 Further Work. Spectrum-based approaches – some work already completed. Planting trapdoors! Who says you have to be honest about the cost function used? We said the method is extensible – there is nothing to stop it being maliciously extended! Some work on S-box generalisations completed. More on PC(k) + CI(m) – very little attempted so far. Extend work on bases for higher-order immunities. Other work on metaheuristic search and protocols, block cipher and public key cryptanalysis.

