Two-Stage Optimisation in the Design of Boolean Functions John A Clark and Jeremy L Jacob Dept. of Computer Science University of York, UK

Overview
Optimisation: general introduction, hill-climbing, simulated annealing.
Boolean function design (reprise).
Experimental approach and results.
Conclusions and future work.

Optimisation
A subject of huge practical importance. An optimisation problem may be stated as follows: given a domain D and a function z: D → ℝ, find x in D such that z(x) = sup{z(y): y in D}, i.e. the value x that maximises z over D.

Optimisation
Traditional optimisation techniques include:
calculus (e.g. solve differential equations for extrema): for f(x) = -3x² + 6x, solve f'(x) = -6x + 6 = 0 to obtain x = 1 with maximum f(1) = 3
hill-climbing: gradient ascent etc., inspired by the notion of calculus
(quasi-)enumerative: brute force (a crypto favourite), linear programming, branch and bound, dynamic programming
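As a quick numeric check of the worked calculus example (the grid search below is purely illustrative):

```python
# f(x) = -3x^2 + 6x has f'(x) = -6x + 6 = 0 at x = 1, where f(1) = 3.
f = lambda x: -3 * x * x + 6 * x

# Coarse grid search over [-2, 4] confirming the calculus result.
xs = [i / 1000 for i in range(-2000, 4001)]
x_best = max(xs, key=f)
print(x_best, f(x_best))   # 1.0 3.0
```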

Optimisation Problems
Traditional techniques are not without their problems:
assumptions may simply not hold, e.g. non-differentiable or discontinuous functions, non-linear functions
the problem may suffer from the 'curse (joy?) of dimensionality': it is simply too big to handle exactly (e.g. by brute force or dynamic programming), as with NP-hard problems
some techniques tend to get stuck in local optima for non-linear problems (see later)
These difficulties have led researchers to investigate heuristic techniques, typically inspired by natural processes, that generally give good solutions to optimisation problems (but forgo guarantees).

Heuristic Optimisation
A variety of techniques have been developed to deal with non-linear and discontinuous problems. The highest-profile is probably genetic algorithms: it works with a population of solutions and breeds new solutions by aping the processes of natural reproduction and Darwinian survival of the fittest; it has proven very robust across a huge range of problems and can be very efficient. Simulated annealing is a local search technique based on the cooling processes of molten metals (used in this paper). We will illustrate the problems caused by non-linearity and then describe simulated annealing.

Local Optimisation - Hill Climbing
Let the current solution be x. Define the neighbourhood N(x) to be the set of solutions that are 'close' to x. If possible, move to a neighbouring solution that improves the value of z, otherwise stop:
choose any y in N(x) as the next solution provided z(y) ≥ z(x) (loose hill-climbing), or
choose y in N(x) such that z(y) = sup{z(v): v in N(x)} (steepest gradient ascent).
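A minimal sketch of the steepest-gradient-ascent variant; the objective and the neighbourhood N(x) = {x-1, x+1} are illustrative choices, not from the paper:

```python
def hill_climb(x, z, neighbours):
    # Steepest gradient ascent: repeatedly move to the best neighbour,
    # stopping when no neighbour improves z (a local optimum).
    while True:
        y = max(neighbours(x), key=z)
        if z(y) <= z(x):
            return x
        x = y

# Example: maximise z(x) = -3x^2 + 6x over the integers.
z = lambda x: -3 * x * x + 6 * x
best = hill_climb(0, z, lambda x: [x - 1, x + 1])
print(best, z(best))   # 1 3
```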

Local Optimisation - Hill Climbing
(Figure: a multimodal function z(x).) The neighbourhood of a point x might be N(x) = {x+1, x-1}. The hill-climb goes x0 → x1 → x2, since z(x0) < z(x1) < z(x2) > z(x3), and gets stuck at x2 (a local optimum). Really we want to obtain x_opt, the global optimum.

Simulated Annealing
(Figure: the same function z(x).) Allows non-improving moves, so that it is possible to go down in order to rise again and reach the global optimum.

Simulated Annealing Allows non-improving moves to be taken in the hope of escaping from local optimum. Previous slide gives idea. In practice the size of the neighbourhood may be very large and a candidate neighbour is typically selected at random. Quite possible to accept a worsening move when an improving move exists.

Simulated Annealing
Improving moves are always accepted. Non-improving moves may be accepted probabilistically, in a manner depending on the temperature parameter Temp: loosely, the worse the move, the less likely it is to be accepted, and a worsening move is less likely to be accepted the cooler the temperature. The temperature starts high and is gradually cooled as the search progresses. Initially virtually anything is accepted; at the end only improving moves are allowed (and the search effectively reduces to hill-climbing).
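The acceptance rule described above is the classic Metropolis criterion. A sketch for maximisation; the starting temperature, geometric cooling factor and number of temperature steps are illustrative assumptions (the slides' 400 moves per temperature is used):

```python
import math
import random

def anneal(x0, z, neighbour, temp=10.0, cooling=0.95,
           moves_per_temp=400, n_temps=50):
    # Simulated annealing for maximising z.  Improving moves are always
    # accepted; a worsening move (delta < 0) is accepted with probability
    # exp(delta / temp), which shrinks as the move worsens and as temp cools.
    x, best = x0, x0
    for _ in range(n_temps):
        for _ in range(moves_per_temp):
            y = neighbour(x)
            delta = z(y) - z(x)
            if delta > 0 or random.random() < math.exp(delta / temp):
                x = y
            if z(x) > z(best):
                best = x
        temp *= cooling
    return best

random.seed(0)
z = lambda x: -3 * x * x + 6 * x
best = anneal(0, z, lambda x: x + random.choice([-1, 1]))
print(best)   # 1, the global optimum
```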

Simulated Annealing
Keep a current candidate x. At each temperature in the cooling cycle, consider 400 moves: always accept improving moves; accept worsening moves probabilistically, where acceptance gets harder the worse the move, and harder as Temp decreases.

Crypto and Heuristic Optimisation
Most work has been on cryptanalysis: attacking a variety of simple ciphers, from simple substitution and transposition through poly-alphabetic ciphers etc., with more recent work attacking NP-hard problems. But perhaps the most successful work has been in the design of cryptographic elements. Most work is rather direct in its application.

Boolean Function Design
A Boolean function maps {0,1}ⁿ to {0,1}. For present purposes we shall use the polar representation f̂(x) = (-1)^f(x), taking values in {1, -1}. We will talk only about balanced functions, where there are equal numbers of 1s and -1s.
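A sketch of the polar representation and the balance condition; the 3-variable truth table below is just an illustrative balanced example:

```python
def polar(f):
    # Polar representation: bit 0 -> +1, bit 1 -> -1, i.e. (-1)**f(x).
    return [(-1) ** b for b in f]

def is_balanced(fp):
    # Balanced: equal numbers of +1s and -1s, so the entries sum to zero.
    return sum(fp) == 0

f = [0, 1, 1, 0, 1, 0, 0, 1]   # truth table, indexed by x = 0..7
fp = polar(f)
print(fp)               # [1, -1, -1, 1, -1, 1, 1, -1]
print(is_balanced(fp))  # True
```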

Preliminary Definitions
Definitions relating to a Boolean function f of n variables.
Linear function: Lω(x) = ω1x1 ⊕ … ⊕ ωnxn, with polar form L̂ω(x) = (-1)^Lω(x).
Walsh-Hadamard transform: F(ω) = Σx f̂(x)·L̂ω(x).
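A direct-summation sketch of the Walsh-Hadamard transform; this is O(2^2n) (a fast transform exists), but it keeps the definition visible:

```python
def walsh_hadamard(fp):
    # F(w) = sum over x of f_hat(x) * (-1)^(w.x), where w.x is the GF(2)
    # inner product, i.e. the parity of the bitwise AND of w and x.
    size = len(fp)
    return [sum(fp[x] * (-1) ** bin(w & x).count("1") for x in range(size))
            for w in range(size)]

# For the linear function f(x) = x1 (low-order input bit, n = 2),
# F is 2^n at w = 1 and 0 elsewhere.
fp = [1, -1, 1, -1]          # polar truth table of f(x) = x1
print(walsh_hadamard(fp))    # [0, 4, 0, 0]
```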

Preliminary Definitions
Non-linearity: Nf = (2ⁿ - maxω |F(ω)|) / 2.
Auto-correlation: ACf = max over s ≠ 0 of |Σx f̂(x)·f̂(x⊕s)|.
For present purposes we need simply note that these can be easily evaluated given a function f. They can therefore be used as the functions to be optimised, and traditionally they are.
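Both measures really are easy to evaluate from the polar truth table; a sketch (the transform is repeated so the snippet runs standalone, and the example functions are illustrative):

```python
def walsh_hadamard(fp):
    # Direct-summation Walsh-Hadamard transform, as on the previous slide.
    size = len(fp)
    return [sum(fp[x] * (-1) ** bin(w & x).count("1") for x in range(size))
            for w in range(size)]

def nonlinearity(fp):
    # N_f = (2^n - max_w |F(w)|) / 2
    return (len(fp) - max(abs(v) for v in walsh_hadamard(fp))) // 2

def autocorrelation(fp):
    # AC_f = max over nonzero shifts s of | sum_x f_hat(x) * f_hat(x xor s) |
    size = len(fp)
    return max(abs(sum(fp[x] * fp[x ^ s] for x in range(size)))
               for s in range(1, size))

print(nonlinearity([1, 1, 1, -1]))      # 1: f = x1.x2 is bent for n = 2
print(autocorrelation([1, -1, 1, -1]))  # 4: a linear function has maximal AC
```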

Using Parseval's Theorem
Parseval's Theorem: Σω F(ω)² = 2²ⁿ.
Loosely, push down on F(ω)² for some particular ω and it must pop up elsewhere. This suggests that arranging for uniform values of F(ω)² will lead to good non-linearity, and is the initial motivation for our NEW cost FUNCTION!
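A quick numeric check of Parseval's theorem on the earlier balanced 3-variable example (transform repeated so the snippet runs standalone):

```python
def walsh_hadamard(fp):
    # Direct-summation Walsh-Hadamard transform.
    size = len(fp)
    return [sum(fp[x] * (-1) ** bin(w & x).count("1") for x in range(size))
            for w in range(size)]

# The sum of squared Walsh values is 2^(2n) regardless of the function.
fp = [1, -1, -1, 1, -1, 1, 1, -1]    # n = 3
print(sum(v * v for v in walsh_hadamard(fp)))   # 64 == 2^(2*3)
```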

Moves Preserving Balance
Start with a balanced (but otherwise random) solution, and use a move strategy that preserves balance: define the neighbourhood of a particular function f to be the set of all functions obtained by exchanging (flipping) any two dissimilar values. In the slide's example, g is obtained from f by swapping f(2) and f(4).
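A sketch of the balance-preserving move, with the two positions chosen at random (function names are illustrative):

```python
import random

def balanced_move(fp):
    # Exchange (flip) one +1 entry and one -1 entry, chosen at random.
    # The counts of +1s and -1s are unchanged, so balance is preserved.
    g = list(fp)
    i = random.choice([k for k, v in enumerate(g) if v == 1])
    j = random.choice([k for k, v in enumerate(g) if v == -1])
    g[i], g[j] = -1, 1
    return g

fp = [1, -1, -1, 1, -1, 1, 1, -1]
g = balanced_move(fp)
print(sum(g))                              # 0: still balanced
print(sum(a != b for a, b in zip(fp, g)))  # 2: exactly two positions flipped
```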

Getting in the Right Area
Previous work (QUT) has shown strongly that heuristic techniques can be very effective for cryptographic design synthesis (Boolean function design, S-box design etc.), that hill-climbing works far better than random search, and that combining heuristic search and hill-climbing generally gives the best results. Aside: this notion applies more generally too, and has led to the development of memetic algorithms in GA work, since GAs are known to be robust but not suited to 'fine tuning'. We adopt this strategy too: use simulated annealing to get in the 'right area', then hill-climb. But we will adopt the new cost function for the first stage.

Hill-climbing With Traditional CF (n=8)

Varying the Technique (n=8)
(Results table: non-linearity and autocorrelation achieved by simulated annealing with the traditional CF, simulated annealing with the new CF, and simulated annealing with the new CF followed by hill-climbing with the traditional CF.)

Tuning the Technique
Experience has shown that experimentation is par for the course with optimisation. The initial cost function was motivated by theory, but the real issue is how the cost function and search technique interact. We have generalised the initial cost function to give a parametrised family of new cost functions:
Cost(f) = Σω | |F(ω)| - (2^(n/2) + K) |^R
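A sketch of the parametrised family, with K and R as in the formula; the defaults K = 0 and R = 3 below are illustrative assumptions, and the transform is repeated so the snippet runs standalone:

```python
def walsh_hadamard(fp):
    # Direct-summation Walsh-Hadamard transform.
    size = len(fp)
    return [sum(fp[x] * (-1) ** bin(w & x).count("1") for x in range(size))
            for w in range(size)]

def cost(fp, n, K=0.0, R=3.0):
    # Cost(f) = sum over w of | |F(w)| - (2^(n/2) + K) | ^ R
    target = 2 ** (n / 2) + K
    return sum(abs(abs(v) - target) ** R for v in walsh_hadamard(fp))

# For the linear f(x) = x1 with n = 2: |F| = (0, 4, 0, 0) and target = 2,
# so with K = 0, R = 3 the cost is 3 * 2^3 + 2^3 = 32.
print(cost([1, -1, 1, -1], 2))   # 32.0
```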

Tuning the Technique (n=8)
(Plot of non-linearity against autocorrelation: illustration of how results change as K is varied; 400 runs.)

Tuning the Technique (n=8)
(Plot of non-linearity against autocorrelation: further illustration of how results change as K is varied; 100 runs.)

Comparison of Results

Summary and Conclusions
We have shown that local search can be used effectively for a cryptographic non-linear optimisation problem: Boolean function design. 'Direct' cost functions are not necessarily best; the cost function is a means to an end, and whatever works will do. Cost function efficacy depends on the problem, the problem parameters, and the search technique used. You can take short cuts with annealing parameters (and computationally there may be little choice). Experimentation is highly beneficial. Should we look to engage theory more?

Future Work
Opportunities for expansion:
detailed variation of parameters
use of more efficient annealing processes (e.g. thermostatistical annealing)
evolution of artefacts with hidden properties (you do not need to be honest, e.g. develop S-boxes with hidden trapdoors)
experimenting with different cost function families, multiple criteria etc.
evolving sets of Boolean functions
other local techniques (e.g. tabu search, TS); more generally, when do GAs, SA and TS work best?
investigating non-balanced functions