1 CS 484 – Artificial Intelligence
Announcements
Lab 3 due Tuesday, November 6
Homework 6 due Tuesday, November 6
Lab 4 due Thursday, November 8
Current Event: Chelsea - today; Beth - Tuesday, November 6

2 Genetic Algorithms (Lecture 13)

3 Evolutionary Computation
Computational procedures patterned after biological evolution
A search procedure that probabilistically applies search operators to a set of points in the search space
Generates successor hypotheses by repeatedly mutating and recombining parts of the best currently known hypotheses

4 General Procedure
At each stage the algorithm holds a collection of hypotheses known as the current population
The population is updated by replacing some fraction of it with offspring of the most fit current hypotheses
The process forms a generate-and-test search

5 Reasons for Popularity
Evolution is known to be a successful, robust method for adaptation within biological systems
GAs can search spaces of hypotheses containing complex interacting parts
Genetic algorithms are easily parallelized and can take advantage of the decreasing cost of powerful computer hardware

6 The Parameters
GA(Fitness, Fitness_threshold, p, r, m)
Fitness - function that evaluates how good a hypothesis is
Fitness_threshold - minimum acceptable fitness for a hypothesis
p - size of the population
r - fraction of the population to be replaced
m - mutation rate

7 The Algorithm
GA(Fitness, Fitness_threshold, p, r, m)
Initialize: P ← p random hypotheses
Evaluate: for each h in P, compute Fitness(h)
While [max_h Fitness(h)] < Fitness_threshold:
1. Select: probabilistically select (1 - r)·p members of P, based on fitness, to add to P_S
2. Crossover: probabilistically select pairs of hypotheses from P. For each pair, produce two offspring by applying the Crossover operator. Add all offspring to P_S
3. Mutate: invert a randomly selected bit in m·p random members of P_S
4. Update: P ← P_S
5. Evaluate: for each h in P, compute Fitness(h)
Return the hypothesis from P that has the highest fitness
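The loop above can be sketched in Python. This is an illustrative sketch only: the bit-string representation, the fitness-proportionate selection via `random.choices`, and the OneMax toy fitness are my own assumptions, not part of the slides.

```python
import random

def ga(fitness, fitness_threshold, p, r, m, length, generations=1000):
    """Sketch of GA(Fitness, Fitness_threshold, p, r, m) over bit strings."""
    population = [[random.randint(0, 1) for _ in range(length)] for _ in range(p)]
    for _ in range(generations):
        scores = [fitness(h) for h in population]
        if max(scores) >= fitness_threshold:
            break
        # Select: keep (1 - r) * p members, chosen in proportion to fitness
        next_gen = [h[:] for h in random.choices(population, weights=scores,
                                                 k=int((1 - r) * p))]
        # Crossover: fill the remaining slots with offspring of selected pairs
        while len(next_gen) < p:
            h1, h2 = random.choices(population, weights=scores, k=2)
            point = random.randrange(1, length)
            next_gen += [h1[:point] + h2[point:], h2[:point] + h1[point:]]
        population = next_gen[:p]
        # Mutate: invert one randomly chosen bit in m * p random members
        for h in random.sample(population, int(m * p)):
            h[random.randrange(length)] ^= 1
    return max(population, key=fitness)

# Toy usage: maximize the number of 1 bits in an 8-bit string ("OneMax")
best = ga(sum, fitness_threshold=8, p=20, r=0.6, m=0.1, length=8)
```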

8 Fitness
Fitness is an important concept in genetic algorithms: the fitness of a chromosome determines how likely it is to reproduce.
Fitness is usually measured by how well the chromosome solves some goal problem. E.g., if the genetic algorithm is used to sort numbers, then the fitness of a chromosome is determined by how close to a correct sorting it produces.
Fitness can also be subjective (aesthetic).

9 Representing Hypotheses
From the EnjoySport (simplified) example:
Represent (Sky = Sunny v Cloudy) Λ (AirTemp = Warm) by
  Sky: 011, AirTemp: 10
Represent IF AirTemp = Warm THEN EnjoySport = yes by
  Sky: 111, AirTemp: 10, EnjoySport: 1

10 Single-Point Crossover
Crossover is applied as follows:
1) Select a random crossover point.
2) Break each chromosome into two parts, splitting at the crossover point.
3) Recombine the broken chromosomes by combining the front of one with the back of the other, and vice versa, to produce two new chromosomes.
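The three steps above can be sketched in Python (the function name and the example bit strings are my own):

```python
import random

def single_point_crossover(parent1, parent2, point=None):
    """Split two bit strings at one crossover point and swap their tails."""
    if point is None:
        point = random.randrange(1, len(parent1))  # 1) random crossover point
    # 2) break each chromosome at the point; 3) recombine front + opposite back
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

a, b = single_point_crossover("110110", "001001", point=3)
# a == "110001", b == "001110"
```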

11 Other Types of Crossover
Usually crossover is applied with one crossover point, but it can be applied with more, e.g. with two crossover points.
Uniform crossover uses a probability to select which genes to take from chromosome 1 and which from chromosome 2.
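Both variants can be sketched as follows (function names, parameter names, and the convention that the second child receives the complementary genes are my own assumptions):

```python
import random

def two_point_crossover(p1, p2, i, j):
    """Swap the middle segment that lies between crossover points i and j."""
    return (p1[:i] + p2[i:j] + p1[j:],
            p2[:i] + p1[i:j] + p2[j:])

def uniform_crossover(p1, p2, prob=0.5):
    """For each gene, take chromosome 1's value with probability prob,
    otherwise chromosome 2's; the second child gets the complement."""
    picks = [random.random() < prob for _ in p1]
    child1 = "".join(a if t else b for (a, b), t in zip(zip(p1, p2), picks))
    child2 = "".join(b if t else a for (a, b), t in zip(zip(p1, p2), picks))
    return child1, child2

x, y = two_point_crossover("111111", "000000", 2, 4)
# x == "110011", y == "001100"
```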

12 Mutation
A unary operator – applies to one chromosome.
Randomly selects some bits (genes) to be "flipped": 1 => 0 and 0 => 1.
Mutation is usually applied with a low probability, such as 1 in 1000.
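A minimal sketch of this operator over bit strings (the per-bit independence is an assumption; the slide only says mutation is applied with low probability):

```python
import random

def mutate(chromosome, p_m=0.001):
    """Flip each bit independently with a low probability p_m (e.g. 1 in 1000)."""
    flip = {"0": "1", "1": "0"}
    return "".join(flip[b] if random.random() < p_m else b for b in chromosome)
```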

13 Termination Criteria
A genetic algorithm is run over a number of generations until the termination criteria are reached. Typical termination criteria:
Stop after a fixed number of generations.
Stop when a chromosome reaches a specified fitness level.
Stop when a chromosome succeeds in solving the problem, within a specified tolerance.
Human judgment can also be used in more subjective cases.

14 Optimizing a Mathematical Function
A genetic algorithm can be used to find the highest value of f(x) = sin(x).
Each chromosome consists of 4 bits, representing the values of x from 0 to 15.
Fitness ranges from 0 (f(x) = -1) to 100 (f(x) = 1): fitness(x) = 50 · (f(x) + 1).
By applying the genetic algorithm it takes just a few generations to find that x = 14 (with x in radians, sin(14) ≈ 0.99) gives the optimal value of f(x) over this range.
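Since 4 bits only encode 16 values, the whole search space can be checked directly, which confirms the optimum the GA converges to (assuming x is in radians):

```python
import math

def fitness(x):
    """Scale f(x) = sin(x) from the range [-1, 1] into [0, 100]."""
    return 50 * (math.sin(x) + 1)

# 4 bits encode x in 0..15, so brute force over the whole search space works
best_x = max(range(16), key=fitness)
# best_x == 14, where sin(14) ≈ 0.99 and fitness(14) ≈ 99.5
```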

15 What is a Schema?
How can we characterize the evolution of the population in a GA?
As with the rules used in classifier systems, a schema is a string consisting of 1's, 0's and *'s. E.g. 1011*001*0 matches the following four strings:
1011000100
1011000110
1011100100
1011100110
A schema with n *'s will match a total of 2^n chromosomes.
Each chromosome of r bits matches 2^r different schemata. Example: 0010 is a representative of 2^4 = 16 distinct schemata: 00**, 0*10, ****, etc.
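Both counting facts above can be checked with a short sketch (function names are my own):

```python
def matches(schema, chromosome):
    """A chromosome matches a schema if they agree on every non-* position."""
    return all(s in ("*", c) for s, c in zip(schema, chromosome))

def schemata_of(chromosome):
    """Enumerate all 2^r schemata that an r-bit chromosome represents."""
    schemata = [""]
    for bit in chromosome:
        # each position contributes either its own bit or a wildcard
        schemata = [s + choice for s in schemata for choice in (bit, "*")]
    return schemata
```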

16 Schema Theorem
Characterizes evolution in the population in terms of the number of instances representing each schema.
The fitness of a schema is the average fitness of the chromosomes that match it; the fitness of schema S in generation t is written f(S, t).
The number of occurrences of S in the population at time t is m(S, t).
We are looking for the expected value of m(S, t+1), defined in terms of m(S, t) and other properties.

17 Selection Step
The evolution of a GA depends on the selection step, the recombination step, and the mutation step.
Selection step: cloned individuals become part of the next generation. How many strings in schema S will be present in the next generation?
Let a(t) be the average fitness of all p individuals. A particular string h is selected with probability
Pr(h) = f(h) / Σ_i f(h_i) = f(h) / (p · a(t))
where f(h) is the fitness of h.

18 Fitness of the Next Generation
We are interested in the probability that a single hypothesis selected by the GA will be an instance of schema S.
Since f(S, t) is the average fitness of the instances of S, the probability that one selection picks a representative of S is
Pr(h ∈ S) = m(S, t) · f(S, t) / (p · a(t))
The expected number of instances of S resulting from n independent selection steps is n times this; with n = p selections,
E[m(S, t+1)] = m(S, t) · f(S, t) / a(t)

19 Schemata Properties
The defining length d_L(S) of a schema S is the distance between the first and last defined bits. For example, the defining length of each of the following schemata is 4:
**10111*
1*0*1**
The order O(S) is the number of defined bits in S. The following schemata both have order 4:
**10*11*
1*0*1**1
A schema with a high order is more specific than one with a lower order.
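The two properties are easy to compute; a sketch that reproduces the examples above (function names are my own):

```python
def defining_length(schema):
    """d_L(S): distance between the first and last defined (non-*) positions."""
    defined = [i for i, c in enumerate(schema) if c != "*"]
    return defined[-1] - defined[0]

def order(schema):
    """O(S): the number of defined (non-*) bits in S."""
    return sum(c != "*" for c in schema)
```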

20 The Effect of Single-Point Crossover
For a schema S to survive crossover, the crossover point must fall outside the defining length of S. Hence the probability that S will survive crossover is at least
1 - p_c · d_L(S) / (L(S) - 1)
where L(S) is the length of the bit strings and p_c is the probability that single-point crossover will be applied to an individual.
This tells us that a short schema is more likely to survive crossover than a longer one.

21 The Effect of Mutation
The probability that mutation will be applied to a bit in a string is p_m. A schema S survives mutation only if none of its O(S) defined bits is flipped, so the probability that S survives mutation is
(1 - p_m)^O(S)
We can combine this with the effects of selection and crossover to give:
E[m(S, t+1)] ≥ m(S, t) · (f(S, t) / a(t)) · [1 - p_c · d_L(S) / (L(S) - 1)] · (1 - p_m)^O(S)

22 The Schema Theorem
Holland's Schema Theorem, summarized by this bound, tells us that more fit schemas will tend to grow in influence, especially schemas containing a small number of defined bits that lie near one another within the bit string.
This helps to explain why genetic algorithms work, but it does not provide a complete answer.
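A numeric sketch of the bound, using the document's symbols as parameter names (the example values are my own, chosen to show a short, low-order, above-average-fitness schema growing):

```python
def schema_bound(m_S, f_S, a, p_c, p_m, d_L, O, L):
    """Lower bound on E[m(S, t+1)] given by the Schema Theorem."""
    selection = m_S * f_S / a                      # fitness-proportionate growth
    crossover_survival = 1 - p_c * d_L / (L - 1)   # short schemata survive
    mutation_survival = (1 - p_m) ** O             # low-order schemata survive
    return selection * crossover_survival * mutation_survival

# A short, low-order schema with above-average fitness is expected to grow:
bound = schema_bound(m_S=10, f_S=1.5, a=1.0, p_c=0.6, p_m=0.001, d_L=2, O=3, L=20)
# bound ≈ 14.0 > 10, so instances of this schema increase in expectation
```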

23 The Building Block Hypothesis
Genetic algorithms manipulate short, low-order, high-fitness schemata in order to find optimal solutions to problems. These short, low-order, high-fitness schemata are known as building blocks.
Hence genetic algorithms work well when small groups of genes represent useful features in the chromosomes. This tells us that it is important to choose a suitable representation.

24 Messy Genetic Algorithms (1)
An alternative to standard genetic algorithms that avoids deception.
Each bit in the chromosome is represented as a (position, value) pair. For example: ((1,0), (2,1), (4,0)). In this case, the third bit is undefined (underspecified), which is allowed with MGAs.
A bit can also be overspecified: ((1,0), (2,1), (3,1), (3,0), (4,0))

25 Messy Genetic Algorithms (2)
Underspecified bits are filled in with bits taken from a template chromosome, which is usually the best-performing chromosome from the previous generation.
Overspecified bits are usually resolved by working from left to right and using the first value specified for each bit.
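Both rules can be sketched together (the function name and the template value are my own; the chromosomes are the examples from the previous slide):

```python
def express(chromosome, template):
    """Fill underspecified bits from the template; resolve overspecified bits
    left to right, keeping the first value given for each position."""
    bits = list(template)
    seen = set()
    for pos, val in chromosome:        # positions are 1-based, as on the slides
        if pos not in seen:
            bits[pos - 1] = str(val)
            seen.add(pos)
    return "".join(bits)

# Underspecified bit 3 comes from the template; the overspecified chromosome
# below keeps the first value for bit 3 (1) and ignores the later (3,0) gene.
u = express([(1, 0), (2, 1), (4, 0)], "1111")                   # u == "0110"
o = express([(1, 0), (2, 1), (3, 1), (3, 0), (4, 0)], "1111")   # o == "0110"
```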

26 Messy Genetic Algorithms (3)
MGAs use splice and cut instead of crossover.
Splicing involves simply joining two chromosomes together:
((1,0), (3,0), (4,1), (6,1)) + ((2,1), (3,1), (5,0), (7,0), (8,0))
=> ((1,0), (3,0), (4,1), (6,1), (2,1), (3,1), (5,0), (7,0), (8,0))
Cutting involves splitting a chromosome into two:
((1,0), (3,0), (4,1), (6,1), (2,1), (3,1), (5,0), (7,0), (8,0))
=> ((1,0), (3,0), (4,1)) and ((6,1), (2,1), (3,1), (5,0), (7,0), (8,0))
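With chromosomes as lists of (position, value) pairs, the two operators reduce to list concatenation and slicing; a sketch reproducing the example above (function names and the cut index are my own):

```python
def splice(c1, c2):
    """Splice: join two messy chromosomes end to end."""
    return c1 + c2

def cut(c, point):
    """Cut: split a messy chromosome into two at the given gene index."""
    return c[:point], c[point:]

joined = splice([(1, 0), (3, 0), (4, 1), (6, 1)],
                [(2, 1), (3, 1), (5, 0), (7, 0), (8, 0)])
left, right = cut(joined, 3)
# left == [(1, 0), (3, 0), (4, 1)]
# right == [(6, 1), (2, 1), (3, 1), (5, 0), (7, 0), (8, 0)]
```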

27 Co-Evolution
In the real world, the presence of predators drives many evolutionary developments. Similarly, in many artificial life systems, introducing "predators" produces better results. This process is known as co-evolution.
For example, with Ramps, which were evolved to sort numbers: "parasites" were introduced that produced sets of numbers that were harder to sort, and the Ramps then produced better results.

