
1 New Directions in Parameterless Evolutionary Algorithms Symposium “New Directions in Evolutionary Computation” Dr. Daniel Tauritz, Director, Natural Computation Laboratory; Associate Professor, Department of Computer Science; Research Investigator, Intelligent Systems Center; Collaborator, Energy Research & Development Center

2 Vision NOW: EA (fitness function, representation, EA operators, EA parameters) + problem instance → solution (a good solution if operators and parameters are suitably configured). GOAL: Parameterless EA (fitness function, representation) + problem instance → good solution.

3 EA Operators Parent selection, mate pairing; Recombination; Mutation; Survival selection

4 EA Parameters Population size; Initialization-related parameters; Parent selection parameters; Number of offspring; Recombination parameters; Mutation parameters; Survivor selection parameters; Termination-related parameters

5 Motivation for Parameterless EAs Parameterless EAs do not require parameters to be specified a priori. A priori parameter tuning is computationally expensive. Facilitate use by non-experts.

6 Static vs. dynamic parameters Static parameters remain constant during evolution; dynamic parameters can change. The optimal value of a parameter can change during evolution. Parameterless EAs w/ static parameters need a fully automated tuning mechanism (still computationally expensive & suboptimal). Therefore desired: Parameterless EA w/ dynamic parameters.

7 Parameter Control Parameter control adjusts dynamic parameters during evolution. While dynamic parameters can still benefit from tuning of their initial values, they can be much less sensitive to those values than static parameters. Three main parameter control classes: –Blind –Adaptive –Self-Adaptive

8 Prior (Semi-)Parameterless EAs 1994 Genetic Algorithm with Varying Population Size (GAVaPS) 2000 Genetic Algorithm with Adaptive Population Size (APGA) – dynamic population size as emergent behavior of individual survival tied to age – both introduce two new parameters: MinLT and MaxLT; furthermore, population size converges to 0.5 * offspring size * (MinLT + MaxLT)
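For example (illustrative numbers, not from the talk): with an offspring size of 10 and lifetime bounds MinLT = 1 and MaxLT = 11, the population size converges to 0.5 · 10 · (1 + 11) = 60 individuals.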

9 Prior (Semi-)Parameterless EAs 1995 (1,λ)-ES with dynamic offspring size employing adaptive control – adjusts λ based on the second best individual created – goal is to maximize local serial progress-rate, i.e., expected fitness gain per fitness evaluation – maximizes convergence rate, which often leads to premature convergence on complex fitness landscapes

10 Prior (Semi-)Parameterless EAs 1999 Parameter-less GA – runs multiple fixed size populations in parallel – the sizes are powers of 2, starting with 4 and doubling the size of the largest population to produce the next largest population – smaller populations are preferred by allotting them more generations – a population is deleted if a) its average fitness is exceeded by the average fitness of a larger population, or b) the population has converged – no limit on number of parallel populations

11 Prior (Semi-)Parameterless EAs 2003 self-adaptive selection of reproduction operators – each individual contains a vector of probabilities of using each reproduction operator defined for the problem – probability vectors updated every generation – in the case of a multi-ary reproduction operator, another individual is selected which prefers the same reproduction operator
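A minimal sketch of the per-individual operator choice described on the slide above; the operator names are illustrative assumptions, and the per-generation update of the probability vectors is not shown.

import random

def choose_operator(operator_probs):
    # operator_probs: this individual's own probability vector over the
    # reproduction operators defined for the problem
    operators = ["one_point_crossover", "uniform_crossover", "bit_flip_mutation"]  # assumed operator set
    return random.choices(operators, weights=operator_probs, k=1)[0]

# e.g. choose_operator([0.5, 0.3, 0.2]) -> name of the operator to apply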

12 Prior (Semi-)Parameterless EAs 2004 Population Resizing on Fitness Improvement GA (PRoFIGA) – dynamically balances exploration versus exploitation by tying population size to magnitude of fitness increases with a special mechanism to escape local optima – introduces several new parameters

13 Prior (Semi-)Parameterless EAs 2005 (1+λ)-ES with dynamic offspring size employing adaptive control – adjusts λ based on the number of offspring fitter than their parent: if none are fitter, then double λ; otherwise divide λ by the number that are fitter – idea is to quickly increase λ when it appears to be too small, otherwise to decrease it based on the current success rate – has problems with complex fitness landscapes that require a large λ to ensure that successful offspring lie on the path to the global optimum
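A minimal sketch of the λ-adjustment rule just described; rounding down and keeping λ at least 1 are assumptions.

def adapt_lambda(lam, num_fitter):
    # num_fitter: offspring strictly fitter than the parent this generation
    if num_fitter == 0:
        return lam * 2                 # lambda appears too small: double it
    return max(1, lam // num_fitter)   # otherwise shrink by the success count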

14 Prior (Semi-)Parameterless EAs 2006 self-adaptation of population size and selective pressure – employs “voting system” by encoding individual’s contribution to population size in its genotype – population size is determined by summing up all the individual “votes” – adds new parameters p_min and p_max that determine an individual’s vote value range

15 NC-LAB Vision for a New Direction in Parameterless EAs: Autonomous EAs (AutoEAs)

16 Motivation Selection operators are not commonly used in an adaptive manner. Most selection pressure mechanisms are based on Boltzmann selection. A framework for creating Parameterless EAs is needed. Centralized population size control, parent selection, mate pairing, offspring size control, and survival selection are highly unnatural!

17 Approach Remove unnatural centralized control by: letting individuals select their own mates; letting couples decide how many offspring to have; giving each individual its own survival chance

18 Autonomous EAs (AutoEAs) An AutoEA is an EA where all the operators work at the individual level (as opposed to traditional EAs where parent selection and survival selection work at the population level in a decidedly unnatural centralized manner) Population & offspring size become dynamic derived variables determined by the emergent behavior of the system

19 Self-Adaptive Semi-Autonomous Parent Selection (SASAPAS) Each individual has an evolving mate selection function Two ways to pair individuals: –Democratic approach –Dictatorial approach

20 Democratic Approach

21

22 Dictatorial Approach

23 Self-Adaptive Semi-Autonomous Dictatorial Parent Selection (SASADIPS) Each individual has an evolving mate selection function. The first parent is selected in a traditional manner; the second parent is selected by the first parent – the dictator – using its mate selection function.

24 Mate selection function representation Expression tree as in GP; set of primitives – pre-built selection methods (see the sketch below).
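A minimal sketch of one way such an expression tree over pre-built selection primitives could look; the specific primitives (uniform random choice, k-tournament), the "fitter of two sub-selections" inner node, and the assumption that individuals expose a .fitness attribute are illustrative, not the primitive set used in SASADIPS.

import random

def random_mate(pop):
    # primitive: uniform random choice from the population
    return random.choice(pop)

def tournament_mate(pop, k=3):
    # primitive: k-tournament on fitness (k value assumed)
    return max(random.sample(pop, min(k, len(pop))), key=lambda ind: ind.fitness)

class Leaf:
    # leaf node wrapping one pre-built selection primitive
    def __init__(self, primitive):
        self.primitive = primitive
    def select(self, pop):
        return self.primitive(pop)

class FitterOf:
    # inner node: evaluate both subtrees and keep the fitter pick
    def __init__(self, left, right):
        self.left, self.right = left, right
    def select(self, pop):
        a, b = self.left.select(pop), self.right.select(pop)
        return a if a.fitness >= b.fitness else b

# Example mate selection function: "the fitter of a random pick and a tournament pick"
mate_selector = FitterOf(Leaf(random_mate), Leaf(tournament_mate))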

25 Mate selection function evolution Let F be a fitness function defined on a candidate solution, and let improvement(x) = F(x) – max{F(p1), F(p2)}, where p1 and p2 are the parents of x. On the max-fitness plot, the slope at generation i is s(g_i).

26 Mate selection function evolution IF improvement(offspring) > s(g_{i-1}) –Copy the first parent’s mate selection function (single-parent inheritance) Otherwise –Recombine the two parents’ mate selection functions using standard GP crossover (multi-parent inheritance) –Apply a mutation chance to the offspring’s mate selection function
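A minimal sketch of this inheritance rule; gp_crossover and gp_mutate stand in for standard GP subtree crossover and mutation, and the mutation_chance value is an assumption.

import copy, random

def inherit_mate_selector(improvement, prev_slope, parent1_selector, parent2_selector,
                          gp_crossover, gp_mutate, mutation_chance=0.1):
    if improvement > prev_slope:
        # offspring improved faster than recent progress: single-parent inheritance
        return copy.deepcopy(parent1_selector)
    # otherwise: multi-parent inheritance via standard GP crossover
    child = gp_crossover(parent1_selector, parent2_selector)
    if random.random() < mutation_chance:
        child = gp_mutate(child)
    return child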

27 Experiments Counting ones 4-bit deceptive trap –If 4 ones => fitness = 8 –If 3 ones => fitness = 0 –If 2 ones => fitness = 1 –If 1 one => fitness = 2 –If 0 ones => fitness = 3 SAT
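A minimal sketch of the 4-bit deceptive trap fitness listed above; applying it block-wise over a longer bit string is an assumption.

TRAP4 = {4: 8, 3: 0, 2: 1, 1: 2, 0: 3}  # number of ones in a 4-bit block -> block fitness

def deceptive_trap_fitness(bits):
    # bits: sequence of 0/1 values whose length is a multiple of 4
    return sum(TRAP4[sum(bits[i:i + 4])] for i in range(0, len(bits), 4))

# e.g. deceptive_trap_fitness([1, 1, 1, 1, 0, 0, 0, 0]) == 8 + 3 == 11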

28 Counting ones results

29 Highly evolved mate selection function

30 SAT results

31 4-bit deceptive trap results

32 SASADIPS shortcomings A steep fitness increase in the early generations may lead to premature convergence to suboptimal solutions. Good mate selection functions are hard to find. The provided mate selection primitives may be insufficient to build a good mate selection function. New parameters were introduced. Only semi-autonomous.

33 Greedy Population Sizing (GPS)

34 The parameter-less GA Evolve an unbounded number of populations P_0, P_1, P_2, … in parallel, with |P_1| = 2|P_0|, …, |P_{i+1}| = 2|P_i|. Smaller populations are given more fitness evaluations. Terminate a smaller population whose average fitness is exceeded by a larger population.

35 Greedy Population Sizing Evolve exactly two populations in parallel, with an equal number of fitness evaluations per population. (figure: populations P_0 … P_5 created over successive fitness-evaluation blocks F_1 … F_4)
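A minimal sketch of the Greedy Population Sizing idea under stated assumptions: exactly two populations evolve side by side with equal evaluation budgets, and when the larger (newer) population's average fitness overtakes the smaller one, the smaller expires and a new population of double size is started. The helpers random_population, evolve_for, and avg_fitness are assumptions, not code from the talk.

def gps_ea(random_population, evolve_for, avg_fitness,
           initial_size=4, evals_per_round=1000, max_evals=10**6):
    small = random_population(initial_size)        # older, smaller population
    large = random_population(initial_size * 2)    # newer population, twice the size
    used = 0
    while used < max_evals:
        evolve_for(small, evals_per_round)         # equal number of fitness
        evolve_for(large, evals_per_round)         # evaluations per population
        used += 2 * evals_per_round
        if avg_fitness(large) > avg_fitness(small):
            # larger population has overtaken the smaller one: greedily move on
            small, large = large, random_population(2 * len(large))
        # otherwise a limiting case may be developing (see slide 38)
    return small, large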

36 GPS-EA vs. parameter-less GA Total fitness evaluations used: parameter-less GA: 2F_1 + 2F_2 + … + 2F_k + 3N; GPS-EA: F_1 + F_2 + … + F_k + 2N.

37 GPS-EA vs. the parameter-less GA, OPS-EA and TGA (Deceptive Problem) GPS-EA < parameter-less GA; TGA < GPS-EA < OPS-EA. GPS-EA finds overall better solutions than the parameter-less GA.

38 Limiting Cases If F_avg(P_{i+1}) < F_avg(P_i), no larger populations are created and there are no fitness improvements until termination. Approx. 30% of runs were limiting cases. Large std. dev., but lower MBF. Automatic detection of the limiting cases is needed.

39 GPS-EA Summary Advantages –Automated population size control –Finds high quality solutions Problems –Limiting cases –Restart of evolution each time

40 Estimated Learning Offspring Optimizing Mate Selection (ELOOMS)

41 Traditional Mate Selection t-tournament selection; t is user-specified. (figure: a tournament among candidate individuals deciding who mates)

42 ELOOMS (figure: individuals accept or reject potential mates with yes/no decisions)

43 Mate Acceptance Chance (MAC) Individual j asks of candidate mate k: “How much do I like k?” – comparing k’s bits b_1 b_2 b_3 … b_L against j’s desired features d_1 d_2 d_3 … d_L.

44 Desired Features Each individual builds a model of its desired potential mate: for each position i, d_i = (# times past mates’ b_i = 1 was used to produce fit offspring) / (# times past mates’ b_i was used to produce offspring). The model is updated for each encountered mate. Similar to Estimation of Distribution Algorithms.
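A minimal sketch of this desired-features model; the exact acceptance score combining the learned d_i with a candidate's bits is an assumption, not a formula from the talk.

class DesiredFeatures:
    def __init__(self, length):
        self.fit_ones = [0] * length   # times a past mate's bit i was 1 and the offspring was fit
        self.seen = [0] * length       # times a past mate's bit i was used to produce offspring

    def update(self, mate_bits, offspring_was_fit):
        # update the model for each encountered mate
        for i, b in enumerate(mate_bits):
            self.seen[i] += 1
            if b == 1 and offspring_was_fit:
                self.fit_ones[i] += 1

    def acceptance_chance(self, candidate_bits):
        # score a candidate mate by agreement with the learned per-bit preferences
        score = 0.0
        for i, b in enumerate(candidate_bits):
            d = self.fit_ones[i] / self.seen[i] if self.seen[i] else 0.5
            score += d if b == 1 else 1.0 - d
        return score / len(candidate_bits)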

45 ELOOMS vs. TGA (Easy Problem, with mutation; L = 500 and L = 1000)

46 ELOOMS vs. TGA (Deceptive Problem, L = 100; without mutation and with mutation)

47 Why ELOOMS works on Deceptive Problem More likely to preserve optimal structure 1111 0000 will equally like: –1111 1000 –1111 1100 –1111 1110 But will dislike individuals not of the form: –1111 xxxx

48 Why ELOOMS does not work as well on Easy Problem High fitness means a short distance to the optimum, so mating with high-fitness individuals yields offspring closer to the optimum. Fitness is a good measure of a good mate; ELOOMS is only an approximate measure of a good mate.

49 ELOOMS computational overhead L – solution length; μ – population size; T – avg. # mates evaluated per individual. Update stage: 6L additions. Mate selection stage: 2·L·T·μ additions.
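For example (illustrative numbers, not from the talk): with L = 100, μ = 50, and T = 10, the update stage costs 6·100 = 600 additions and the mate selection stage costs 2·100·10·50 = 100,000 additions.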

50 ELOOMS Summary Advantages –Autonomous mate pairing –Improved performance (some cases) –Natural termination condition Disadvantages –Relies on competition selection pressure –Computational overhead can be significant

51 GPS-EA + ELOOMS Hybrid

52 Expiration of population P_i If F_avg(P_{i+1}) > F_avg(P_i) – limiting cases possible. If no mate pairs form in P_i (ELOOMS) – detection of the limiting cases.
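A minimal sketch of the two expiration criteria listed above; avg_fitness and mate_pairs are assumed helpers, and how the hybrid combines the two tests is not spelled out here.

def expiration_checks(p_i, p_next, avg_fitness, mate_pairs):
    # GPS-EA style criterion: the next (larger) population has overtaken P_i
    overtaken = avg_fitness(p_next) > avg_fitness(p_i)
    # ELOOMS criterion: no individuals in P_i accept each other as mates anymore
    no_mate_pairs = len(mate_pairs(p_i)) == 0
    return overtaken, no_mate_pairs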

53 Comparing the Algorithms (Deceptive Problem, L = 100; without mutation and with mutation)

54 GPS-EA + ELOOMS vs. parameter-less GA and TGA (Deceptive Problem, L = 100; without mutation and with mutation)

55 GPS-EA + ELOOMS vs. parameter-less GA and TGA (Easy Problem, L = 500; without mutation and with mutation)

56 GPS-EA + ELOOMS Summary Advantages –No population size tuning –No parent selection pressure tuning –No limiting cases –Superior performance on deceptive problem Disadvantages –Reduced performance on easy problem –Relies on competition selection pressure

57 NC-LAB’s current AutoEA research Make λ a dynamic derived variable by self-adapting each individual’s desired offspring size. Promote “birth control” by penalizing fitness based on “child support” and use fitness-based survival selection. Make μ a dynamic derived variable by giving each individual its own survival chance. Make individuals mortal by having them age and making an individual’s survival chance dependent on its age as well as its fitness.

