
Slide 1: Evolutionary Computation
Edward Tsang, University of Essex
1. Overview of EA
2. Analysis of GA: Fundamental Theorem, the Royal Road Function
3. Epistasis, Constraints Handling: Penalties, Repair, GGA
4. EA for Machine Learning: Classifier Systems, Bucket Brigade
5. Genetic Programming
6. Estimation of Distribution Algorithms

Slide 2: Evolutionary Computation: Model-based Generate & Test
Cycle: the Model (to evolve) generates a Candidate Solution; the Test step yields Observed Performance, which provides Feedback to update the model.
– A candidate solution could be a vector of variables, or a tree
– A model could be a population of solutions, or a probability model
– The fitness of a solution is application-dependent, e.g. drug testing
– Generate: select, then create/mutate vectors or trees
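
A minimal sketch of this model-based generate-and-test cycle, where the model is a population of candidate solutions; the helper names (make_random, mutate) and parameter values are illustrative assumptions, and the fitness function is supplied by the application:

```python
import random

def generate_and_test(fitness, make_random, mutate, pop_size=20, generations=50):
    """Model-based generate & test: here the model is a population of solutions."""
    population = [make_random() for _ in range(pop_size)]          # model to evolve
    for _ in range(generations):
        scored = [(fitness(c), c) for c in population]             # test: observe performance
        scored.sort(reverse=True, key=lambda t: t[0])
        parents = [c for _, c in scored[:pop_size // 2]]           # feedback: keep the fitter half
        population = parents + [mutate(random.choice(parents))     # generate: mutate selected parents
                                for _ in range(pop_size - len(parents))]
    return max(population, key=fitness)
```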

Slide 3: Motivation
– Idea from natural selection
– To contain combinatorial explosion, e.g. the travelling salesman problem
– To evolve quality solutions, e.g. to find hardware configurations

Slide 4: Terminology in GA
– Example of a candidate solution in binary representation (vs. real coding)
– A population is maintained
– A chromosome contains genes, each with an allele; equivalently, a string contains building blocks with values
– Each candidate solution is given a fitness evaluation

Slide 5: Example Problem for GA
– Maximize f(x) = 100 + 28x – x²
  – Optimal solution: x = 14 (f(x) = 296)
– Use a 5-bit representation
  – e.g. binary 01101 = decimal 13, f(x) = 295
– Note: the representation issue can be tricky
– The choice of representation is crucial
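
A small sketch of how this example could be coded, assuming the objective reconstructed above (f(x) = 100 + 28x – x², which matches the quoted values f(13) = 295 and f(14) = 296):

```python
def decode(bits):
    """Interpret a 5-bit string, e.g. '01101', as an unsigned integer."""
    return int(bits, 2)

def fitness(bits):
    x = decode(bits)
    return 100 + 28 * x - x * x

assert decode('01101') == 13
assert fitness('01101') == 295   # value quoted on the slide
assert fitness('01110') == 296   # optimum at x = 14
```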

Slide 6: Example Initial Population
To maximize f(x) = 100 + 28x – x²

Slide 7: Selection in Evolutionary Computation
– Roulette wheel method
  – The fitter a string, the more chance it has to be selected
(Pie chart of selection shares for the example strings; the legible labels are 24% and the string 11000.)
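
A minimal sketch of roulette-wheel (fitness-proportionate) selection as described above; the population and fitness values used at the end are illustrative:

```python
import random

def roulette_select(population, fitnesses):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, f in zip(population, fitnesses):
        running += f
        if running >= pick:
            return individual
    return population[-1]   # guard against floating-point rounding

# Illustrative usage with the 5-bit example problem
pop = ['01101', '11000', '01000', '10011']
fit = [100 + 28 * int(s, 2) - int(s, 2) ** 2 for s in pop]
parent = roulette_select(pop, fit)
```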

Slide 8: Crossover
(Worked table: the selected strings are paired and crossed over; f(x), the sum of fitness and the average fitness of the new population are tabulated.)

Slide 9: A Control Strategy for GA
Initialize a population
Repeat
– Reproduction: select strings randomly but reflecting fitness
– Crossover
– Mutation
– Replace the old population with the offspring
Until termination condition
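
A compact sketch of this control strategy applied to the 5-bit example problem; the parameter values (population size, crossover and mutation rates, generation count) are illustrative:

```python
import random

def simple_ga(fitness, length=5, pop_size=20, p_c=0.7, p_m=0.01, generations=50):
    pop = [''.join(random.choice('01') for _ in range(length)) for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(s) for s in pop]
        total = sum(fits)

        def select():                                  # reproduction: roulette wheel
            pick, running = random.uniform(0, total), 0.0
            for s, f in zip(pop, fits):
                running += f
                if running >= pick:
                    return s
            return pop[-1]

        offspring = []
        while len(offspring) < pop_size:
            a, b = select(), select()
            if random.random() < p_c:                  # single-point crossover
                point = random.randint(1, length - 1)
                a, b = a[:point] + b[point:], b[:point] + a[point:]
            for child in (a, b):                       # mutation
                offspring.append(''.join(random.choice('01') if random.random() < p_m else bit
                                         for bit in child))
        pop = offspring[:pop_size]                     # replace the old population
    return max(pop, key=fitness)

# Illustrative usage on the 5-bit example problem
best = simple_ga(lambda s: 100 + 28 * int(s, 2) - int(s, 2) ** 2)
```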

Slide 10: Discussion
– The new population has a higher average fitness
  – Is this just luck? (Fundamental theorem)
– Is an offspring always legal? (Epistasis)
– A fundamental question: can candidate solutions be represented by strings?

Slide 11: Claimed Advantages of GA
– Simple to program
– General: wide range of applicability
– Efficient: finds solutions fast
– Effective: finds good solutions
– Robust: finds solutions consistently
Are these claims justified? Different people have different opinions.

Slide 12: Analysis of GA
– Schema
– Fundamental Theorem
– Royal Road Function

Slide 13: Similarity Templates (Schemas)
– A schema is a string of alleles plus "don't care" symbols (*)
  – 10**0 covers 4 strings: 10110, 10010, …
  – 1*1** covers 8 strings: 11111, 10111, …
– Order of a schema, o(H): the number of fixed positions
  – The order of 1*1** is 2
– Defining length of a schema, δ(H): the distance between its first and last fixed positions
  – The defining length of 1**10 is 4
  – The defining length of 110** is 2
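
A small sketch of these two schema measures in code (the function names are illustrative):

```python
def order(schema):
    """o(H): number of fixed (non-*) positions."""
    return sum(1 for c in schema if c != '*')

def defining_length(schema):
    """delta(H): distance between the first and last fixed positions."""
    fixed = [i for i, c in enumerate(schema) if c != '*']
    return fixed[-1] - fixed[0] if fixed else 0

assert order('1*1**') == 2
assert defining_length('1**10') == 4
assert defining_length('110**') == 2
```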

Slide 14: Schemas: Observations
– Low-order schemas represent more strings
– Low-order schemas have more chance to survive
– Short schemas are more stable
– Schemas with above-average fitness have more chance to survive

Slide 15: Schemas in Reproduction
– m(H, t) = number of examples of schema H at time t
– Expected number of examples of H in the mating pool:
  m(H, t+1) = m(H, t) · f(H, t) / avg(f)
  – f(H, t) = average fitness of the examples of H at time t
  – avg(f) = average fitness of the population

Slide 16: Schemas in Crossover
– p_s(H) = probability of schema H surviving crossover
– p_s(H) ≥ 1 – δ(H) / (l – 1)
  – l = length of a chromosome
  – δ(H) = defining length of H
– Let p_c = crossover rate
– Hence p_s(H) ≥ 1 – p_c · δ(H) / (l – 1)

Slide 17: Schemas in Mutation
– Let p_m = mutation rate
– Survival probability of a gene under mutation = 1 – p_m
– Survival probability of a schema H = (1 – p_m)^o(H)
  – which is roughly 1 – o(H) · p_m, since p_m is normally very small

Slide 18: Fundamental Theorem (Schema Theorem)
m(H, t+1) ≥ m(H, t) · [f(H, t) / avg(f)] · [1 – p_c · δ(H) / (l – 1) – o(H) · p_m]
– (ignoring small terms)
– Short, low-order, above-average-fitness schemas have more chance to survive
– f(H, t) / avg(f): chance of entering the mating pool
– p_c · δ(H) / (l – 1): chance of being disturbed by crossover
– o(H) · p_m: chance of being destroyed by mutation
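
A small sketch that evaluates this lower bound for a given schema; the parameter values in the usage line are illustrative:

```python
def schema_bound(m, f_H, f_avg, p_c, p_m, schema, length):
    """Expected-count lower bound from the schema theorem (small terms ignored)."""
    fixed = [i for i, c in enumerate(schema) if c != '*']
    o = len(fixed)                                   # order o(H)
    delta = fixed[-1] - fixed[0] if fixed else 0     # defining length delta(H)
    survive = 1 - p_c * delta / (length - 1) - o * p_m
    return m * (f_H / f_avg) * survive

# e.g. 3 copies of schema 1*1** in a population of 5-bit strings
print(schema_bound(m=3, f_H=250.0, f_avg=200.0, p_c=0.7, p_m=0.01,
                   schema='1*1**', length=5))
```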

Slide 19: Building Block Hypothesis
– Building blocks are schemas that:
  – are short,
  – have low order, and
  – have high fitness
– Such schemas have a better chance to survive
– Do GAs search for near-optimal building blocks?
  – Much debate on this

Slide 20: Deceptive Problems
– Problems where the optimal solution is surrounded by poor ones
– Used for testing the effectiveness of GAs, as well as the building block hypothesis
(Figure: fitness landscape with an isolated optimum.)

Slide 21: The Royal Road Function
– A step function to maximize: each block of four 1's gets one point
– Optimal solution's fitness = 4
  – Variations: bonus points for 1's in 2 / 4 blocks
– Difficult for hill climbing
– GAs could do well
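
A minimal sketch of the basic Royal Road function described above, assuming a 16-bit string split into four blocks of four bits (block size and string length are illustrative):

```python
def royal_road(bits, block=4):
    """One point for every complete block of 1's."""
    return sum(1 for i in range(0, len(bits), block)
               if bits[i:i + block] == '1' * block)

assert royal_road('1111' * 4) == 4               # optimum
assert royal_road('1111011111101111') == 2       # only the first and last blocks are complete
```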

Slide 22: Epistasis (Interaction Between Genes)
– Constraints
– Penalties, Repair
– Guided Genetic Algorithm

Slide 23: Constraints
– Task: find an optimal solution under constraint satisfaction
– Difficulties:
  – Before one can attempt optimization, one needs to find legal solutions
  – Crossover may generate illegal solutions
  – Sometimes satisfying the constraints alone is hard, when the problem is tightly constrained

Slide 24: Epistasis, Example in TSP
– Travelling Salesman Problem
– After crossover, offspring may not be legal tours
  – Some cities may be visited twice, others missed
(Figure: two parent tours crossed over to produce an illegal offspring.)

Slide 25: Penalty Method
– If a string violates certain constraints, then a penalty is deducted from its fitness
  – E.g. in TSP: if penalties are high, the GA may attempt to satisfy the constraints before finding short tours
– Problem with tightly constrained problems: most strings are illegal

Slide 26: Repair Method
– If a string violates certain constraints, then attempt to repair it
  – E.g. in TSP, replace duplicated cities with missing cities
  – possibly with complete search or heuristics
– Make sure that a population only contains legal candidate solutions
  – One can then focus on optimization
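
A minimal sketch of the TSP repair idea described above: duplicated cities in an offspring are replaced by the missing ones (a simple heuristic, not necessarily the repair procedure used in the lecture):

```python
def repair_tour(tour, cities):
    """Replace repeated cities so the tour visits every city exactly once."""
    missing = [c for c in cities if c not in tour]
    seen, repaired = set(), []
    for city in tour:
        if city in seen:
            repaired.append(missing.pop())   # swap a duplicate for a missing city
        else:
            repaired.append(city)
            seen.add(city)
    return repaired

# Illustrative usage: city 2 appears twice, city 4 is missing
print(repair_tour([0, 2, 3, 2, 1], cities=range(5)))   # -> [0, 2, 3, 4, 1]
```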

Slide 27: Guided Genetic Algorithm (GGA)
– A hybrid of GLS and GA for constrained optimization
  – Developed by Lau at the University of Essex
– Aims:
  – To extend the domain of GLS (Voudouris, Lau)
  – To improve the efficiency and effectiveness of GAs, especially for handling constraints
  – To improve the robustness of GLS

Slide 28: Guided Genetic Algorithm
Initialize population
Repeat
– Run the GA until the best fitness remains unchanged for n generations (n is a parameter)
– Pick the best chromosome X
– Penalize features of X according to GLS
– Augment the cost function
Until termination condition

Slide 29: Using GLS Penalties in GGA
– A high value in the fitness template → instability
– When the penalty of feature F is increased to k, add k to the relevant loci of the fitness template
– The fitness template affects crossover and mutation
(Figure: a chromosome aligned with its fitness template, with +k added at the loci of feature F.)

Slide 30: GA for Machine Learning
– Classifiers
– Bucket Brigade

Slide 31: Production System Architecture
(Diagram: a Scheduler sits between Working Memory and the Production Rules. Rules have the form Conditions → Actions; firing a rule retrieves data from and changes data in working memory, which holds facts, sensor inputs, internal states, etc.)

Slide 32: Classifier System Components
– Classifiers: condition-action rules
  – A special type of production system
– A credit system
  – Allocates rewards to fired rules
– A genetic algorithm
  – For evolving the classifiers

Slide 33: Classifier System Example
– Detectors post information to the message list; classifiers (e.g. 01##:…, …#0:1100, …) bid to fire; effectors carry out actions; the environment returns a payoff
– Classifiers have fixed-length conditions and actions

Slide 34: Apportionment of Credit
– The set of classifiers works as one system
  – They react to the environment
  – The system as a whole gets feedback
– How much credit should each classifier get?
  – E.g. the Bucket Brigade method
– Each classifier i has a strength S_i
  – which forms the basis for credit apportionment
  – as well as the fitness for evolution

Slide 35: Bucket Brigade, Basics
– A classifier may make a bid to fire: B_i = S_i · C_bid
  – where C_bid is the bid coefficient
– Effective bid: EB_i = S_i · C_bid + N(σ_bid)
  – where N(σ_bid) is a noise function with standard deviation σ_bid
– The winner of an auction pays its bid value to the source of the activating message

Slide 36: Classifiers in Action
(Worked trace with C_bid = 0.1 and C_tax = 0: the messages, bids and strengths of four classifiers are followed as an environmental payoff of 50 is passed back along the chain of bids.)

Slide 37: More on Bucket Brigade
– Each classifier is "taxed":
  S_i(t+1) = S_i(t) – C_bid · S_i(t) – C_tax · S_i(t) + R_i(t)
  – where S_i(t) is the strength of classifier i at time t
  – C_bid · S_i(t) is the accepted bid
  – C_tax is the tax rate
  – R_i(t) is the payoff
– For stability, the bid value should be comparable to the receipts from the environment
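
A minimal sketch of this strength update; the parameter values and the usage numbers are illustrative:

```python
def update_strength(s, reward, c_bid=0.1, c_tax=0.0):
    """S_i(t+1) = S_i(t) - C_bid*S_i(t) - C_tax*S_i(t) + R_i(t)."""
    return s - c_bid * s - c_tax * s + reward

# A winning classifier pays its bid (C_bid * strength) and later receives a payoff
s = 200.0
s = update_strength(s, reward=50.0)   # 200 - 20 - 0 + 50 = 230
print(s)
```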

Slide 38: Classifiers: Genetic Algorithm
– T_ga = number of steps between GA calls
  – The GA is called once every T_ga cycles; or
  – The GA is called with a probability reflecting T_ga
– A proportion of the population is replaced
– Selection: roulette wheel, weighted by strength S_i
– Mutation: 0 → {1, #}, 1 → {0, #}, # → {0, 1}
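
A small sketch of this ternary mutation rule (the mutation rate and the example rule string are illustrative):

```python
import random

def mutate_classifier(rule, p_m=0.02):
    """Each {0, 1, #} symbol mutates to one of the two other symbols; other characters are kept."""
    alternatives = {'0': '1#', '1': '0#', '#': '01'}
    return ''.join(random.choice(alternatives[c])
                   if c in alternatives and random.random() < p_m else c
                   for c in rule)

print(mutate_classifier('01##:1100', p_m=0.5))
```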

Slide 39: Genetic Programming
– Building decision trees
– GP for machine learning

Slide 40: Genetic Programming, Overview
– A young field
  – Koza: Genetic Programming, 1992
  – Langdon & Poli: Foundations of GP, 2001
– Diverse definitions
  – Must use trees? May use lists? Must one evolve programs?
  – Suitable for LISP
– Machine learning: evolving trees
  – a dynamic data structure

Slide 41: Terminals and Functions
– Terminal set:
  – Inputs to the program
  – Constants
– Function set:
  – Statements, e.g. IF-THEN-ELSE, WHILE-DO
  – Functions, e.g. AND, OR, +, :=
  – Arity-sensitive

Slide 42: GP: Example Tree (1)
– Functions: statements (e.g. IF-THEN-ELSE) or functions (e.g. AND, OR, +, …)
– Terminals: inputs to the program or constants
(Tree diagram: nested if-then-else nodes test 'won last time' and whether the last race time is under 5 min, with 'win' / 'not-win' at the leaves.)
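
A minimal sketch of how such a GP tree could be represented and evaluated in code; the node names mirror the horse-racing example on the slide but the structure and values are otherwise illustrative:

```python
# Each tree node is a tuple: ('if', condition, then_branch, else_branch),
# a comparison such as ('<', terminal_name, constant), or a terminal string.
example_tree = ('if', 'won_last_time',
                      ('if', ('<', 'last_race_time', 5.0), 'win', 'not-win'),
                      'not-win')

def evaluate(node, inputs):
    """Recursively evaluate a GP tree against a dict of terminal values."""
    if isinstance(node, str):
        return inputs.get(node, node)          # terminal: input variable or constant label
    op = node[0]
    if op == 'if':
        _, cond, then_branch, else_branch = node
        return evaluate(then_branch if evaluate(cond, inputs) else else_branch, inputs)
    if op == '<':
        _, name, constant = node
        return inputs[name] < constant
    raise ValueError(f'unknown function {op}')

print(evaluate(example_tree, {'won_last_time': True, 'last_race_time': 4.5}))  # -> 'win'
```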

Slide 43: GP Application
– A tree can be anything, e.g.:
  – a program
  – a decision tree
– The choice of terminals and functions is crucial
  – Domain knowledge helps
  – Larger grammar → larger search space → harder to search

Slide 44: GP: Example Tree (2)
– Boolean decisions only (limited functions)
– Terminals: inputs to the program or constants
– Functions: statements (e.g. IF-THEN-ELSE) or functions (e.g. AND, OR, +, …)
(Tree diagram: a decision tree over the tests 'won last time', 'same jockey', 'same stable' and 'last raced 3 months ago', with 'win' / 'not-win' at the leaves.)

Slide 45: GP Operators
– Crossover: swap subtrees between two parent trees
– Mutation: change a branch
(Diagram: two trees with nodes a-i exchanging subtrees under crossover.)
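
A minimal sketch of subtree crossover on trees stored as nested lists; everything here is illustrative rather than the lecture's implementation:

```python
import copy
import random

def subtree_paths(tree, path=()):
    """Yield a path (sequence of child indices) to every node of a nested-list tree."""
    yield path
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):   # element 0 is the node label
            yield from subtree_paths(child, path + (i,))

def get_node(tree, path):
    for i in path:
        tree = tree[i]
    return tree

def set_node(tree, path, new):
    if not path:
        return new
    get_node(tree, path[:-1])[path[-1]] = new
    return tree

def crossover(parent_a, parent_b):
    """Swap one randomly chosen subtree of each parent."""
    a, b = copy.deepcopy(parent_a), copy.deepcopy(parent_b)
    pa = random.choice(list(subtree_paths(a)))
    pb = random.choice(list(subtree_paths(b)))
    sub_a, sub_b = copy.deepcopy(get_node(a, pa)), copy.deepcopy(get_node(b, pb))
    return set_node(a, pa, sub_b), set_node(b, pb, sub_a)

t1 = ['a', ['b', 'd', 'e'], 'c']
t2 = ['f', 'g', ['h', 'i']]
child1, child2 = crossover(t1, t2)
```

Mutation ("change a branch") can reuse the same machinery: pick a random path and use set_node to replace the subtree there with a freshly generated one.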

Slide 46: Fitness in GP
– Generating programs: how well does the program meet the specification?
– Machine learning: how well can the tree predict the outcome?
– Function fitting: how large or small the error is

Slide 47: Generational GP Algorithm
Initialize population
Evaluate individuals
Repeat
– Repeat
  – Select parents, crossover, mutation
– Until enough offspring have been generated
Until termination condition fulfilled

Slide 48: Steady-state GP Algorithm
Initialize population P
Repeat
– Pick a random subset of P for a tournament
– Select the winners of the tournament
– Apply crossover to the winners, then mutation
– Replace the loser(s) with the new offspring in P
Until termination condition fulfilled
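
A compact sketch of the steady-state loop with tournament selection; the operators (random_tree, crossover, mutate) and the fitness function are assumed to be supplied by the caller, and all parameter values are illustrative:

```python
import random

def steady_state_gp(fitness, random_tree, crossover, mutate,
                    pop_size=50, tournament_size=4, steps=1000):
    population = [random_tree() for _ in range(pop_size)]
    for _ in range(steps):
        contestants = random.sample(range(pop_size), tournament_size)
        contestants.sort(key=lambda i: fitness(population[i]), reverse=True)
        w1, w2 = contestants[0], contestants[1]              # tournament winners
        l1, l2 = contestants[-1], contestants[-2]            # tournament losers
        child1, child2 = crossover(population[w1], population[w2])
        population[l1], population[l2] = mutate(child1), mutate(child2)
    return max(population, key=fitness)
```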

Slide 49: Why GP for Computational Finance?
– Expressive power
  – Trees represent functions, rules, strategies
– Efficiency
  – Inspired by nature
  – Based on statistics plus randomness
– GP for machine learning
  – Classifier systems are evolving rule-based systems
  – Feedback / reinforcement

Slide 50: Estimation of Distribution Algorithms (EDAs)
– Population-Based Incremental Learning (PBIL)
– Building Bayesian networks

Slide 51: Population-Based Incremental Learning (PBIL)
– A statistical approach, related to ant colonies and GAs
– Model M: a probability for each variable-value pair, e.g. x = v1 (0.5), x = v2 (0.5), y = v3 (0.5), y = v4 (0.5)
– Sample a solution X from M
– Evaluate X
– Modify the probabilities in M accordingly
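
A minimal sketch of PBIL on binary strings, in its common textbook form; the learning rate, string length and fitness function are illustrative and not taken from the lecture:

```python
import random

def pbil(fitness, length=8, pop_size=20, learning_rate=0.1, generations=100):
    prob = [0.5] * length                               # model M: one probability per bit
    for _ in range(generations):
        samples = [[1 if random.random() < p else 0 for p in prob]
                   for _ in range(pop_size)]            # sample solutions from M
        best = max(samples, key=fitness)                # evaluate
        prob = [(1 - learning_rate) * p + learning_rate * bit
                for p, bit in zip(prob, best)]          # shift probabilities toward the best sample
    return prob

# Illustrative usage: maximise the number of 1-bits (OneMax)
model = pbil(fitness=sum)
print([round(p, 2) for p in model])
```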

