
1 AUTOMATED REAL-WORLD PROBLEM SOLVING EMPLOYING META-HEURISTICAL BLACK-BOX OPTIMIZATION
November 2013
Daniel R. Tauritz, Ph.D.
Guest Scientist, Los Alamos National Laboratory
Director, Natural Computation Laboratory
Department of Computer Science, Missouri University of Science and Technology

2 Black-Box Search Algorithms
- Many complex real-world problems can be formulated as generate-and-test problems
- Black-Box Search Algorithms (BBSAs) iteratively generate trial solutions employing solely the information gained from previous trial solutions, but no explicit problem knowledge
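To make the generate-and-test loop concrete, here is a minimal sketch (not from the slides) of one of the simplest BBSAs, a (1+1) hill climber: it proposes trial solutions using only the fitness of previous trials and no problem-specific knowledge.

```python
import random

def hill_climber(fitness, dim, evaluations, flip_prob=0.05):
    """Minimal black-box search: only previously generated trial solutions
    and their fitness values are used; the problem itself stays a black box."""
    best = [random.randint(0, 1) for _ in range(dim)]
    best_fit = fitness(best)
    for _ in range(evaluations - 1):
        trial = [1 - g if random.random() < flip_prob else g for g in best]
        trial_fit = fitness(trial)
        if trial_fit >= best_fit:            # keep the better trial solution
            best, best_fit = trial, trial_fit
    return best, best_fit

# Usage: any black-box fitness callable works, e.g. OneMax (count of 1 bits).
print(hill_climber(sum, dim=50, evaluations=2000))
```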

3

4 Practitioner’s Dilemma
1. How to decide, for a given real-world problem, whether it is beneficial to formulate it as a black-box search problem?
2. How to formulate the real-world problem as a black-box search problem?
3. How to select/create a BBSA?
4. How to configure the BBSA?
5. How to interpret the result?
6. All of the above are interdependent!

5 Theory-Practice Gap
- While BBSAs, including EAs, are steadily improving in scope and performance, their impact on routine real-world problem solving remains underwhelming
- A scalable solution enabling domain-expert practitioners to routinely solve real-world problems with BBSAs is needed

6 Two typical real-world problem categories
- Solving a single-instance problem: automated BBSA selection
- Repeatedly solving instances of a problem class: evolve custom BBSA

7 Part I: Solving Single-Instance Problems Employing Automated BBSA Selection

8 Requirements
1. Need diverse set of high-performance BBSAs
2. Need automated approach to select most appropriate BBSA from set for a given problem
3. Need automated approach to configure selected BBSA

9 Automated BBSA Selection
1. Given a set of BBSAs, a priori evolve a set of benchmark functions which cluster the BBSAs by performance
2. Given a real-world problem, create a surrogate fitness function
3. Find the benchmark function most similar to the surrogate
4. Execute the corresponding BBSA on the real-world problem
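A rough sketch of this four-step pipeline; the sampling, surrogate construction, and similarity measure are caller-supplied placeholders, not the concrete mechanisms used in this work.

```python
def select_and_run_bbsa(bbsas, benchmarks, problem,
                        sample, build_surrogate, similarity):
    """Sketch: `benchmarks` maps each BBSA to the benchmark problem evolved
    for it a priori; pick the BBSA whose benchmark is most similar to the
    problem's surrogate, then run that BBSA on the real problem."""
    surrogate = build_surrogate(sample(problem))                           # step 2
    best = max(bbsas, key=lambda b: similarity(surrogate, benchmarks[b]))  # step 3
    return best(problem)                                                   # step 4
```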

10 A Priori, Once Per BBSA Set
Benchmark Generator: BBSA 1, BBSA 2, ..., BBSA n → benchmark problems BP 1, BP 2, ..., BP n (one benchmark problem per BBSA)

11 A Priori, Once Per Problem Class
Real-World Problem → Sampling Mechanism → Surrogate Objective Function → match with most "similar" BP k → identifies most appropriate BBSA k

12 Per Problem Instance
Real-World Problem → Sampling Mechanism → Surrogate Objective Function → apply a priori established most appropriate BBSA k

13 Requirements
1. Need diverse set of high-performance BBSAs
2. Need automated approach to select most appropriate BBSA from set for a given problem
3. Need automated approach to configure selected BBSA

14 Static vs. dynamic parameters
- Static parameters remain constant during evolution; dynamic parameters can change
- Dynamic parameters require parameter control
- The optimal values of the strategy parameters can change during evolution [1]
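Parameter control is easiest to see with a classic example (not the mechanism of [1]): Rechenberg's 1/5 success rule, which adapts a mutation step size while the run is in progress.

```python
def one_fifth_rule(sigma, successes, trials, factor=0.85):
    """Classic 1/5 success rule: widen the mutation step size when more than
    one fifth of recent mutations improved fitness, shrink it otherwise."""
    rate = successes / trials
    if rate > 0.2:
        return sigma / factor   # search is too cautious: take bigger steps
    if rate < 0.2:
        return sigma * factor   # search is too erratic: take smaller steps
    return sigma

# Example: 3 improvements in the last 20 mutations -> sigma shrinks to 0.425.
print(one_fifth_rule(sigma=0.5, successes=3, trials=20))
```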

15 Solution
Create self-configuring BBSAs employing dynamic strategy parameters

16 Supportive Coevolution [3]

17 Self Adaptation
- Dynamical Evolution of Parameters
- Individuals Encode Parameter Values
- Indirect Fitness
- Poor Scaling

18 Coevolution Basics
- Def.: In Coevolution, the fitness of an individual is dependent on other individuals (i.e., individuals are part of the environment)
- Def.: In Competitive Coevolution, the fitness of an individual is inversely proportional to the fitness of some or all other individuals
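A tiny sketch of the competitive case, just to show fitness depending on the rest of the population; the pairwise comparison is a caller-supplied assumption.

```python
def competitive_fitness(individual, population, beats):
    """Fitness = fraction of the other individuals this one beats, so it is
    defined entirely relative to the rest of the population."""
    opponents = [other for other in population if other is not individual]
    return sum(beats(individual, other) for other in opponents) / len(opponents)
```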

19 Supportive Coevolution
- Primary Population – encodes problem solution (primary individuals)
- Support Population – encodes configuration information (support individuals)
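A minimal sketch of one supportive-coevolution generation: each primary individual is varied using a mutation rate drawn from a support individual, and that support individual's indirect fitness is the best primary fitness achieved with it. The pairing and credit scheme here are illustrative assumptions, not the exact algorithm of [3].

```python
import random

def suco_generation(primary, support, fitness, mutate):
    """One generation sketch: support individuals encode mutation rates used
    to vary the primary population; a support individual's indirect fitness
    is the best primary fitness produced while using its rate."""
    credit = [float("-inf")] * len(support)
    offspring = []
    for genome in primary:
        i = random.randrange(len(support))        # pair with a support individual
        child = mutate(genome, rate=support[i])   # configuration comes from support
        f = fitness(child)
        credit[i] = max(credit[i], f)             # indirect fitness credit
        offspring.append((f, child))
    # Illustrative survivor selection: fittest offspring form the new primary
    # population; support individuals are ranked by the credit they earned.
    primary = [c for _, c in sorted(offspring, key=lambda fc: fc[0], reverse=True)]
    support = [s for _, s in sorted(zip(credit, support), reverse=True)]
    return primary, support
```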

20 Multiple Support Populations
- Each Parameter Evolved Separately
- Propagation Rate Separated

21 Multiple Primary Populations

22 Benchmark Functions
- Rastrigin
- Shifted Rastrigin
  - Randomly generated offset vector
  - Individuals offset before evaluation

23 Fitness Results
All results statistically significant according to a t-test with α = 0.001.

Problem          | SA                | SuCo
Rastrigin N = 10 | -0.2390 (-0.4249) | -0.0199 (-0.1393)
Rastrigin N = 20 | -0.3494 (-0.4905) | -1.4132 (-0.7481)
Shifted N = 10   | -4.4215 (-1.8744) | -2.2518 (-0.9811)
Shifted N = 20   | -10.397 (-4.3567) | -5.7718 (-2.2848)

24 Conclusions
- Proof of Concept Successful
  - SuCo outperformed SA on all but one test problem
- SuCo Mutation more successful
  - Adapts better to change in population fitness

25 Self-Configuring Crossover [2]

26 Motivation
- Performance Sensitive to Crossover Selection
- Identifying & Configuring Best Traditional Crossover is Time Consuming
- Existing Operators May Be Suboptimal
- Optimal Operator May Change During Evolution

27 Some Possible Solutions
- Meta-EA
  - Exceptionally time consuming
- Self-Adaptive Algorithm Selection
  - Limited by algorithms it can choose from

28 Self-Configuring Crossover (SCX)
- Each Individual Encodes a Crossover Operator
- Crossovers Encoded as a List of Primitives
  - Swap
  - Merge
- Each Primitive has three parameters
  - Number, Random, or Inline
- Example offspring crossover: Swap(3, 5, 2); Swap(r, i, r); Merge(1, r, 0.7)

29 Applying an SCX
- Parent 1 genes: 1.0, 2.0, 3.0, 4.0; Parent 2 genes: 5.0, 6.0, 7.0, 8.0
- Concatenate genes: 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0

30 The Swap Primitive – Swap(3, 5, 2)
- Each Primitive has a type
  - Swap represents crossovers that move genetic material
- First Two Parameters
  - Start Position
  - End Position
- Third Parameter is Primitive Dependent
  - Swaps use "Width"

31 Applying an SCX
- Concatenated genes: 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0
- Swap(3, 5, 2) exchanges the width-2 segments starting at positions 3 and 5 (3.0, 4.0 ↔ 5.0, 6.0)

32 The Merge Primitive – Merge(1, r, 0.7)
- Third Parameter is Primitive Dependent
  - Merges use "Weight"
- Random Construct
  - All past primitive parameters used the Number construct
  - "r" marks a primitive using the Random Construct
  - Allows primitives to act stochastically

33 Applying an SCX
- Genes after the swap: 1.0, 2.0, 5.0, 6.0, 3.0, 4.0, 7.0, 8.0
- Merge(1, r, 0.7) applies g(i) = α*g(i) + (1-α)*g(j) with weight α = 0.7, merging gene 1 with a randomly chosen partner gene (here the gene holding 6.0):
  - g(1) = 1.0*(0.7) + 6.0*(1-0.7) = 2.5
  - partner gene = 6.0*(0.7) + 1.0*(1-0.7) = 4.5

34 The Inline Construct – Swap(r, i, r)
- Only Usable by First Two Parameters
- Denoted as "i"
- Forces Primitive to Act on the Same Loci in Both Parents

35 Applying an SCX
- Genes after the merge: 2.5, 2.0, 5.0, 4.5, 3.0, 4.0, 7.0, 8.0
- Swap(r, i, r) draws a random start (here locus 2); the inline construct pairs it with the same locus in the other parent, and a random width (here 1) swaps 2.0 ↔ 4.0

36 Applying an SCX
- Genes after the final swap: 2.5, 4.0, 5.0, 4.5, 3.0, 2.0, 7.0, 8.0
- Remove excess genes; offspring genes: 2.5, 4.0, 5.0, 4.5
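The whole worked example above can be written down in a few lines. This is only a sketch of the two primitives; the concrete random choices (merge partner at position 4, inline swap at locus 2 with width 1) are assumptions fixed so the numbers match the slides.

```python
def swap(genes, i, j, width):
    """Swap two width-long segments (1-indexed start positions)."""
    g = list(genes)
    a, b = i - 1, j - 1
    g[a:a+width], g[b:b+width] = g[b:b+width], g[a:a+width]
    return g

def merge(genes, i, j, alpha):
    """Weighted merge of the genes at 1-indexed positions i and j."""
    g = list(genes)
    gi, gj = g[i-1], g[j-1]
    g[i-1] = alpha * gi + (1 - alpha) * gj
    g[j-1] = alpha * gj + (1 - alpha) * gi
    return g

parent1, parent2 = [1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0]
genes = parent1 + parent2                      # concatenate genes
genes = swap(genes, 3, 5, 2)                   # Swap(3, 5, 2)
genes = merge(genes, 1, 4, 0.7)                # Merge(1, r, 0.7); r resolved to 4 here
genes = swap(genes, 2, 2 + len(parent1), 1)    # Swap(r, i, r); inline pairs locus 2 with 6
offspring = genes[:len(parent1)]               # remove excess genes
print(offspring)                               # [2.5, 4.0, 5.0, 4.5]
```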

37 Evolving Crossovers
- Parent 1 Crossover: Merge(1, r, 0.7); Merge(i, 8, r); Swap(r, i, r)
- Parent 2 Crossover: Swap(4, 2, r); Swap(r, 7, 3)
- Offspring Crossover: Merge(r, r, r); Swap(3, 5, 2)

38 Empirical Quality Assessment
- Compared Against: Arithmetic Crossover, N-Point Crossover, Uniform Crossover
- On Problems: Rosenbrock, Rastrigin, Offset Rastrigin, NK-Landscapes, DTrap

Problem          | Comparison       | SCX
Rosenbrock       | -86.94 (54.54)   | -26.47 (23.33)
Rastrigin        | -59.2 (6.998)    | -0.0088 (0.021)
Offset Rastrigin | -0.1175 (0.116)  | -0.03 (0.028)
NK               | 0.771 (0.011)    | 0.8016 (0.013)
DTrap            | 0.9782 (0.005)   | 0.9925 (0.021)

39 SCX Overhead
- Requires No Additional Evaluation
- Adds No Significant Increase in Run Time
  - All linear operations
- Adds Initial Crossover Length Parameter
  - Testing showed results fairly insensitive to this parameter
  - Even worst settings tested achieved better results than comparison operators

40 Conclusions
- Removes Need to Select Crossover Algorithm
- Better Fitness Without Significant Overhead
- Benefits From Dynamically Changing Operator

41 Current Work
Combining Supportive Coevolution and Self-Configuring Crossover by employing the latter as a support population in the former [4]

42 Part II: Repeatedly Solving Problem Class Instances by Evolving Custom BBSAs Employing Meta-GP [5]

43 Problem
- Difficult Repeated Problems
  - Cell Tower Placement
  - Flight Routing
- Which BBSA, such as Evolutionary Algorithms, Particle Swarm Optimization, or Simulated Annealing, to use?

44 Possible Solutions
- Automated Algorithm Selection
- Self-Adaptive Algorithms
- Meta-Algorithms
  - Evolving Parameters / Selecting Operators
  - Evolve the Algorithm

45 Our Solution
- Evolving BBSAs employing meta-Genetic Programming (GP)
- Post-ordered parse tree
- Evolve a repeated iteration

46 BBSA skeleton (flowchart): Initialization → Check for Termination → Iteration (repeated until termination) → Terminate
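Seen as code, the evolved BBSA is just this skeleton; a sketch with the initializer, the evolved iteration, and the termination check supplied as callables.

```python
def run_bbsa(initialize, iteration, terminated):
    """Sketch of the BBSA skeleton above: initialize once, then repeatedly
    apply the evolved iteration until the termination check fires."""
    state = initialize()              # e.g. a random initial set of solutions
    while not terminated(state):
        state = iteration(state)      # the evolved, repeated iteration
    return state
```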

47 Our Solution
- Genetic Programming
- Post-ordered parse tree
- Evolve a repeated iteration
- High-level operations

48 Parse Tree
- Iteration
- Sets of Solutions
- Root returns 'Last' set
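A sketch of how such a parse tree can be executed: children are evaluated first (post-order), each node maps its children's solution sets to a new set, and the root's result becomes the 'Last' set for the next iteration. The two example node operations are illustrative only.

```python
class Node:
    """Parse-tree node operating on sets (here, lists) of candidate solutions."""
    def __init__(self, op, children=()):
        self.op, self.children = op, list(children)

    def evaluate(self, last):
        inputs = [child.evaluate(last) for child in self.children]  # post-order
        return self.op(last, inputs)

# Illustrative nodes: a leaf returning the 'Last' set, and a node merging two sets.
last_node = Node(lambda last, _: last)
union     = Node(lambda _, ins: ins[0] + ins[1], [last_node, last_node])
print(union.evaluate([[0, 1], [1, 1]]))   # [[0, 1], [1, 1], [0, 1], [1, 1]]
```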

49 Nodes: Variation Nodes
- Mutate
- Mutate w/o creating new solution (Tweak)
- Uniform Recombination
- Diagonal Recombination

50 Nodes: Selection Nodes
- k-Tournament
- Truncation
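Sketches of the two selection behaviors these nodes provide, assuming maximization and a population of (fitness, solution) pairs.

```python
import random

def truncation(population, n):
    """Keep the n fittest (fitness, solution) pairs."""
    return sorted(population, key=lambda fs: fs[0], reverse=True)[:n]

def k_tournament(population, n, k):
    """Select n individuals; each is the fittest of k randomly drawn entrants."""
    return [max(random.sample(population, k), key=lambda fs: fs[0])
            for _ in range(n)]
```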

51 Nodes: Set Nodes
- makeSet
- Persistent Sets

52 Nodes: Set Nodes
- makeSet
- Persistent Sets
- addSet
- 'Last' Set

53 Nodes: Evaluation Node
- Evaluates the nodes passed in
- Allows multiple operations and accurate selections within an iteration

54 Meta-Genetic Program (flowchart): Create Valid Population → Generate Children → Evaluate Children → Select Survivors → Check Termination

55 BBSA Evaluation (flowchart): Create Valid Population → Generate Children → Evaluate Children → Select Survivors → Generate Children

56 Termination Conditions
- Evaluations
- Iterations
- Operations
- Convergence

57 Testing
- Deceptive Trap Problem
- Example 5-bit trap blocks: 0|0|1|1|0, 0|1|0|1|0, 1|1|1|1|0
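A sketch of a deceptive trap fitness function, assuming the usual formulation: a block of all ones scores trap_size, and any other block scores higher the more zeros it has, which deceives local search away from the global optimum.

```python
def deceptive_trap(bits, trap_size=5):
    """Sum of per-block trap scores over non-overlapping blocks of trap_size bits."""
    total = 0
    for i in range(0, len(bits), trap_size):
        ones = sum(bits[i:i + trap_size])
        total += trap_size if ones == trap_size else trap_size - 1 - ones
    return total

# Example with two 5-bit blocks: all-ones block scores 5, all-zeros block scores 4.
print(deceptive_trap([1, 1, 1, 1, 1] + [0, 0, 0, 0, 0]))   # 9
```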

58 Testing (cont.)
- Problem Configuration
  - Bit-length = 100
  - Trap Size = 5
- External Verification Problem Configurations
  - Bit-length = 100, Trap Size = 5
  - Bit-length = 200, Trap Size = 5
  - Bit-length = 105, Trap Size = 7
  - Bit-length = 210, Trap Size = 7

59 Results: 60% Success Rate

60 Results: Bit-Length = 100, Trap Size = 5

61 Results: Bit-Length = 200, Trap Size = 5

62 Results: Bit-Length = 105, Trap Size = 7

63 Results: Bit-Length = 210, Trap Size = 7

64 BBSA1, BBSA2, BBSA3

65 BBSA1

66 BBSA2

67 BBSA3

68 Over-Specialization
- Trained Problem Configuration
- Alternate Problem Configuration

69 BBSA2

70 Summary
- Created novel meta-GP approach for evolving BBSAs tuned to specific problem classes
- Ideal for solving repeated problems
- Evolved custom BBSA which outperformed standard EA and hill-climber on all tested problem instances
- Future work includes adding additional primitives and testing against state-of-the-art BBSAs on more challenging problems

71 Take Home Message
- Practitioners need automated algorithm selection & configuration
- The number of BBSAs is increasing rapidly, making the selection of the best one to employ for a given problem increasingly difficult
- Some recent BBSAs facilitate automated real-world problem solving

72 References
[1] Brian W. Goldman and Daniel R. Tauritz. Meta-Evolved Empirical Evidence of the Effectiveness of Dynamic Parameters. In Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '11), pages 155-156, Dublin, Ireland, July 12-16, 2011.
[2] Brian W. Goldman and Daniel R. Tauritz. Self-Configuring Crossover. In Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '11), pages 575-582, Dublin, Ireland, July 12-16, 2011.
[3] Brian W. Goldman and Daniel R. Tauritz. Supportive Coevolution. In Proceedings of the 14th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '12), pages 59-66, Philadelphia, U.S.A., July 7-11, 2012.
[4] Nathaniel R. Kamrath, Brian W. Goldman and Daniel R. Tauritz. Using Supportive Coevolution to Evolve Self-Configuring Crossover. In Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '13), pages 1489-1496, Amsterdam, The Netherlands, July 6-10, 2013.
[5] Matthew A. Martin and Daniel R. Tauritz. Evolving Black-Box Search Algorithms Employing Genetic Programming. In Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '13), pages 1497-1504, Amsterdam, The Netherlands, July 6-10, 2013.

