
1 Evolutionary Computing Chapter 11

2 / 7 Chapter 11: Non-stationary and Noisy Function Optimisation
– What is a non-stationary problem?
– Effect of uncertainty
– Algorithmic approaches
– Example: Time-varying knapsack problem

3 / 7 What is a non-stationary problem?
Real-world environments contain sources of uncertainty:
– Measuring the fitness of a solution more than once will not always give the same result
– The goal is not to find a single optimum but to track a sequence of good solutions over time
Focus on measuring the quality of a solution x, f(x), under different sources of uncertainty:
– The genotype-to-phenotype mapping is not exact and one-to-one: f_observed(x) = f(x + δx)
– The act of measurement itself is prone to error or uncertainty: f_observed(x) = f_mean(x) + f_noise(x), where the noise is drawn from N(0, σ)
– The environment changes over time: f_observed(x) = f(x, t)
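A minimal Python sketch of the three cases above; the quadratic test function, the noise levels, and the drift speed are illustrative assumptions, not values from the chapter:

```python
import random

def f(x):
    # Illustrative stationary fitness function: a simple quadratic bowl (assumption).
    return -(x ** 2)

def observed_genotype_noise(x, delta=0.1):
    # Genotype-to-phenotype mapping is not exact: f_observed(x) = f(x + dx).
    dx = random.uniform(-delta, delta)
    return f(x + dx)

def observed_measurement_noise(x, sigma=0.1):
    # The measurement itself is noisy: f_observed(x) = f_mean(x) + noise, noise ~ N(0, sigma).
    return f(x) + random.gauss(0.0, sigma)

def observed_time_varying(x, t, speed=0.01):
    # The environment changes over time: f_observed(x) = f(x, t); here the
    # optimum drifts linearly with t (assumption).
    return -((x - speed * t) ** 2)

if __name__ == "__main__":
    x = 0.5
    print([round(observed_genotype_noise(x), 3) for _ in range(3)])
    print([round(observed_measurement_noise(x), 3) for _ in range(3)])
    print([round(observed_time_varying(x, t), 3) for t in (0, 50, 100)])
```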

4 / 7 Effect of uncertainty (1/2)
The performance of algorithms dealing with uncertainty is measured by:
– Running them for a fixed period of time T
– Calculating two time-averaged metrics:
  – On-line measure: M_online = (1/T) Σ_{t=1..T} f(t)
  – Off-line measure: M_offline = (1/T) Σ_{t=1..T} f*(t)
where f(t) is the value of the time-dependent fitness function at time t, and f*(t) is the fitness of the best individual in the population at time t
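Given that reconstruction, both measures can be computed from a run log; a sketch, assuming the log is simply a list of the fitness values evaluated at each time step:

```python
def online_measure(evaluations_per_step):
    # On-line measure: average over all fitness evaluations made during the run.
    all_evals = [v for step in evaluations_per_step for v in step]
    return sum(all_evals) / len(all_evals)

def offline_measure(evaluations_per_step):
    # Off-line measure: time average of the best fitness in the population at each step.
    best_per_step = [max(step) for step in evaluations_per_step]
    return sum(best_per_step) / len(best_per_step)

if __name__ == "__main__":
    # Hypothetical log: one list of fitness values per time step t = 1..T.
    log = [[0.2, 0.5, 0.3], [0.4, 0.6, 0.1], [0.7, 0.2, 0.5]]
    print(online_measure(log), offline_measure(log))
```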

5 / 7 Effect of uncertainty (2/2)
[Figure: f(x) = 1/(0.1 + x^2) and the values estimated after 5 samples with two different sorts of uncertainty (uniform between +/- 0.4)]
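A sketch reproducing the idea behind the figure: estimate f(x) = 1/(0.1 + x^2) from 5 samples under uniform +/- 0.4 noise. Whether the noise acts on the input or on the measured value is an assumption here, so both variants are shown:

```python
import random

def f(x):
    return 1.0 / (0.1 + x ** 2)

def estimate_input_noise(x, samples=5, width=0.4):
    # Average of f(x + dx) with dx drawn uniformly from [-width, +width].
    return sum(f(x + random.uniform(-width, width)) for _ in range(samples)) / samples

def estimate_output_noise(x, samples=5, width=0.4):
    # Average of f(x) + e with e drawn uniformly from [-width, +width].
    return sum(f(x) + random.uniform(-width, width) for _ in range(samples)) / samples

if __name__ == "__main__":
    for x in (0.0, 0.5, 1.0, 2.0):
        print(x, round(f(x), 3),
              round(estimate_input_noise(x), 3),
              round(estimate_output_noise(x), 3))
```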

6 / 7 Algorithmic Approaches (1/2)
Increasing robustness / reducing noise by repeatedly re-evaluating solutions and taking an average:
– Implicitly, via population management
– Explicitly: resample when the degree of variation present is greater than the range of estimated fitnesses in the population (sketched below)
  – Law of diminishing returns
  – Resampling decisions can be made independently for each solution
Dynamic environments:
– Make sure that there is enough diversity in the population
Memory-based approaches for switching or cyclic environments:
– Expanding the memory of the EA
– Examples: GA with diploid representation, structured GA
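A sketch of the explicit resampling rule from the list above: keep re-evaluating a solution, and average, while the variation it shows exceeds the range of estimated fitnesses in the population; the minimum and maximum sample counts are assumptions:

```python
def resampled_fitness(evaluate, solution, population_fitnesses,
                      min_samples=2, max_samples=10):
    # Resample while the variation seen for this solution is greater than the
    # range of estimated fitnesses in the population; the cap on samples
    # reflects the law of diminishing returns.
    fitness_range = max(population_fitnesses) - min(population_fitnesses)
    samples = [evaluate(solution) for _ in range(min_samples)]
    while len(samples) < max_samples and (max(samples) - min(samples)) > fitness_range:
        samples.append(evaluate(solution))
    return sum(samples) / len(samples)

if __name__ == "__main__":
    import random
    noisy = lambda x: -(x ** 2) + random.gauss(0.0, 0.5)   # hypothetical noisy fitness
    population_fitnesses = [-1.2, -0.8, -0.9, -1.0]        # hypothetical current estimates
    print(resampled_fitness(noisy, 0.3, population_fitnesses))
```

The decision is made independently for each solution, as in the bullet above: each call looks only at that solution's own samples against the current population spread.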

7 / 7 Algorithmic Approaches (2/2)
Explicitly increasing diversity in a dynamic environment:
– Examples: GA with a hypermutation operator, random-immigrants GA (sketched below)
Preserving diversity and resampling: modifying selection and replacement policies
– Steady-state GAs with a "delete-oldest" replacement strategy
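Simplified sketches of two of the diversity mechanisms named above, random immigrants and triggered hypermutation; the replacement fraction, mutation rates, window length, and the assumption that the population is sorted best-first are all illustrative:

```python
import random

def random_immigrants(sorted_population, fraction=0.2, genome_length=20):
    # Replace a fraction of the (worst) individuals with new random ones each
    # generation; assumes the population is sorted best-first.
    n_new = int(len(sorted_population) * fraction)
    keep = sorted_population[:len(sorted_population) - n_new]
    immigrants = [[random.randint(0, 1) for _ in range(genome_length)]
                  for _ in range(n_new)]
    return keep + immigrants

def hypermutation_rate(best_history, base_rate=0.01, hyper_rate=0.2, window=10):
    # Raise the mutation rate when the running average of the best fitness
    # drops (a sign the environment has changed), otherwise use the base rate.
    if len(best_history) < 2 * window:
        return base_rate
    recent = sum(best_history[-window:]) / window
    earlier = sum(best_history[-2 * window:-window]) / window
    return hyper_rate if recent < earlier else base_rate
```

In a full GA loop these would be applied once per generation: random_immigrants to the fitness-sorted population, and hypermutation_rate to set the per-bit mutation probability for the next generation.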

8 / 7 Example: Time-varying knapsack problem
A number of items, each with value v_i^t and weight (cost) c_i^t
Select a subset maximising the total value while meeting the time-varying capacity constraint C(t)
Smith and Vavak ran multiple experiments:
– Binary-coded SSGA with 100 members
– Parent selection by binary tournament
– Uniform crossover
– Hypermutation operator (triggered if the running average drops)
– Parameter values decided after initial experiments
Best performance was obtained by combining conservative (binary) tournament selection with delete-oldest replacement. Using this policy with hypermutation, the algorithm found the global optimum both in a switching environment and with a continuously moving optimum.
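A heavily simplified sketch of the kind of algorithm described on this slide: a binary-coded steady-state GA with binary tournament parent selection, uniform crossover, delete-oldest replacement, and a hypermutation trigger, applied to a knapsack whose capacity switches over time. The item data, penalty for overweight solutions, capacity schedule, rates, and trigger window are illustrative assumptions, not the settings used by Smith and Vavak:

```python
import random

N_ITEMS, POP_SIZE, WINDOW = 20, 100, 50
BASE_MUT, HYPER_MUT = 0.01, 0.2
values  = [random.randint(1, 10) for _ in range(N_ITEMS)]
weights = [random.randint(1, 10) for _ in range(N_ITEMS)]

def capacity(t):
    # Time-varying capacity constraint C(t): switches between two levels (assumption).
    return 60 if (t // 1000) % 2 == 0 else 30

def fitness(x, t):
    v = sum(vi for vi, xi in zip(values, x) if xi)
    w = sum(wi for wi, xi in zip(weights, x) if xi)
    # Overweight solutions are penalised in proportion to the excess (assumption).
    return v if w <= capacity(t) else v - 10 * (w - capacity(t))

def binary_tournament(pop, t):
    a, b = random.sample(pop, 2)
    return a if fitness(a, t) >= fitness(b, t) else b

def uniform_crossover(p1, p2):
    return [random.choice(pair) for pair in zip(p1, p2)]

def mutate(x, rate):
    return [1 - g if random.random() < rate else g for g in x]

pop = [[random.randint(0, 1) for _ in range(N_ITEMS)] for _ in range(POP_SIZE)]
best_history, mut_rate = [], BASE_MUT

for t in range(4000):
    # Steady-state step: two tournament parents, uniform crossover, mutation.
    child = mutate(uniform_crossover(binary_tournament(pop, t),
                                     binary_tournament(pop, t)), mut_rate)
    # Delete-oldest replacement: the oldest member (front of the list) is removed.
    pop.pop(0)
    pop.append(child)

    best_history.append(max(fitness(x, t) for x in pop))
    # Hypermutation trigger: switch to the high rate when the running average
    # of the best fitness drops, then fall back to the base rate.
    if len(best_history) >= 2 * WINDOW:
        recent = sum(best_history[-WINDOW:]) / WINDOW
        earlier = sum(best_history[-2 * WINDOW:-WINDOW]) / WINDOW
        mut_rate = HYPER_MUT if recent < earlier else BASE_MUT

    if t % 1000 == 0:
        print(t, capacity(t), best_history[-1])
```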
