
1 Today’s Topics
Read
–For exam: Chapter 13 of textbook
–Not on exam: Sections 14.1 - 14.3 & 14.4.1
Genetic Algorithms (GAs)
–Mutation
–Crossover
–Fitness-Proportional Reproduction
–Premature Convergence
–Building-Block Hypothesis
End of Coverage of SEARCH

2 Genetic Algorithms (GAs)
Use ideas of
–Survival of the fittest (death)
–Combination of ‘genetic material’ (sex)
–(‘Taxes’ play a role in some algorithms)
–Mutation (randomness)
Mixing of genes from parents is more important than mutation (contrary to the popular press)
–About 25,000 human genes
–For simplicity, assume two variants of each
–So 2^25,000 possible combinations to explore!

3 Basic FRAMEWORK for GAs (many possible ALGORITHMS)
1. Create initial population of entities
2. Evaluate each entity using a fitness function
3. Discard worst N% of entities
4. K times, stochastically grab ‘best’ parents (fitness-proportional reproduction)
   i. Combine them (crossover) to create new entities
   ii. Make some random changes (mutation)
5. Go to 2
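This framework can be sketched compactly in Python. The sketch below is a minimal illustration under stated assumptions, not the lecture’s exact algorithm: the population size stays fixed, the discard fraction defaults to half (as in the “Typical Design” slide), and fitness, crossover, and mutate are assumed to be supplied by the caller.

```python
import random

def run_ga(init_population, fitness, crossover, mutate,
           generations=100, discard_frac=0.5):
    """A minimal GA loop: evaluate, discard the worst, then refill the
    population with mutated crossovers of fitness-proportional parents."""
    population = list(init_population)
    for _ in range(generations):
        # 2. Evaluate each entity using the fitness function
        scored = sorted(population, key=fitness, reverse=True)
        # 3. Discard the worst N% of entities
        survivors = scored[:max(2, int(len(scored) * (1 - discard_frac)))]
        # 4. Stochastically grab 'best' parents (fitness-proportional reproduction)
        weights = [fitness(e) for e in survivors]
        children = []
        while len(survivors) + len(children) < len(population):
            mom, dad = random.choices(survivors, weights=weights, k=2)
            # i. crossover to create a new entity, ii. random mutation
            children.append(mutate(crossover(mom, dad)))
        population = survivors + children          # 5. Go to 2
    return max(population, key=fitness)
```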

4 Representing Entities as Bit Strings
Assume we represent our problem as a bit string (though any data structure is OK for GAs)
Crossover (example on next slide)
–Pick two entities, A and B
–Choose a crossover location
–Copy the first part of A and the last part of B
–Copy the first part of B and the last part of A
Mutation
–Randomly flip 1 or more bits

5 Crossover Example
Entity A: 1011001 | 011001110
Entity B: 0010011 | 101010100
(the ‘|’ marks the randomly chosen crossover point)
Child C:  1011001 101010100   (first part of A, last part of B)
Child D:  0010011 011001110   (first part of B, last part of A)
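For concreteness, here is a short Python sketch of one-point crossover (plus the mutation operator from the previous slide) on bit strings. The crossover point of 7 is hard-coded so the output reproduces the example above; in a real GA the point would be chosen at random.

```python
import random

def crossover(a: str, b: str, point: int) -> tuple:
    """One-point crossover: swap the tails of two equal-length bit strings."""
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(bits: str, rate: float = 0.001) -> str:
    """Randomly flip each bit, independently, with a small probability."""
    return ''.join(b if random.random() > rate else str(1 - int(b)) for b in bits)

a = "1011001011001110"           # Entity A
b = "0010011101010100"           # Entity B
c, d = crossover(a, b, point=7)  # split after bit 7, as in the slide
print(c)                         # 1011001101010100  (Child C)
print(d)                         # 0010011011001110  (Child D)
```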

6 Aside: My Family’s Phones
My cell phone (numbers changed for anonymity): 406-0917
My wife’s cell phone: 328-3729
Our daughter’s cell phone: 328-0917 (a ‘crossover’ of her parents’ numbers)

7 Typical Design
Discard Worst HALF of Population
Generate Children to Refill Population
Keep Parents and Generated Children
‘Flip’ a Small Fraction of Bits (e.g., 0.1%)
–Flip bits in all members of the population

8 Fitness-Proportional Reproduction
Let F_i be the fitness of entity i
Assume the F_i are non-negative (if not, use e^{F_i} as the fitness for the GA)
Let F_total = ∑ F_i   // sum the fitness of all the entities
Prob(entity i chosen) = F_i / F_total

9 Roulette-Wheel View
Spin the arrow and see where it stops (pie-wedge size proportional to fitness)
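A sketch of the roulette-wheel spin, implementing Prob(entity i chosen) = F_i / F_total from the previous slide. The three entities and their fitness values are made up for illustration.

```python
import random

def spin_roulette(entities, fitnesses):
    """Pick one entity with probability F_i / F_total (fitness-proportional)."""
    total = sum(fitnesses)                 # F_total
    arrow = random.uniform(0, total)       # where the spun arrow stops
    running = 0.0
    for entity, f in zip(entities, fitnesses):
        running += f                       # walk around the pie wedges
        if arrow <= running:
            return entity
    return entities[-1]                    # guard against floating-point round-off

# Hypothetical fitness values: 'b' owns half the wheel, so it wins ~50% of spins
picks = [spin_roulette(['a', 'b', 'c'], [1.0, 3.0, 2.0]) for _ in range(10_000)]
print(picks.count('b') / len(picks))       # ≈ 0.5
```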

10 A GA Approach to Supervised ML
Assume we want to learn a model of the form (and all of our N features are numeric)
   if [ ∑ weight_i * feature_i ] > threshold then return POS else return NEG
Representation of entities?
–See next slide
Fitness?
–Accuracy on the TRAIN set, plus maybe some points for being different from the rest of the population
Role of the Tuning Set?
–Could choose the best member of the population when done
–If we use ALL of the population (an ‘ensemble’), could weight each member’s predictions

11 Possible Representation of Entities
[figure: the entity is one long bit string laid out as consecutive fields:
16 bits for weight 1, …, 16 bits for weight N, then 16 bits for the threshold]
Notes
1) we might only use 16 bits so weights stay small (Occam’s Razor)
2) the first bit could be the SIGN (or use “2’s complement”)
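One plausible reading of this representation, sketched in Python: each entity is a single bit string made of consecutive 16-bit sign-plus-magnitude fields, one per weight and a final one for the threshold. The field encoding, the [-1, 1) scaling, and the fitness function (plain TRAIN-set accuracy) are assumptions for illustration, not the lecture’s specification.

```python
def decode_field(bits: str) -> float:
    """Decode a 16-bit field: first bit is the sign, the remaining 15 bits the
    magnitude, scaled so every decoded value lies in [-1, 1) (small weights)."""
    sign = -1.0 if bits[0] == '1' else 1.0
    return sign * int(bits[1:], 2) / 2**15

def classify(entity_bits: str, features) -> str:
    """if [ sum_i weight_i * feature_i ] > threshold then POS else NEG."""
    fields = [entity_bits[i:i + 16] for i in range(0, len(entity_bits), 16)]
    *weight_fields, threshold_field = fields   # N weight fields, then the threshold
    weights = [decode_field(f) for f in weight_fields]
    threshold = decode_field(threshold_field)
    score = sum(w * x for w, x in zip(weights, features))
    return "POS" if score > threshold else "NEG"

def fitness(entity_bits: str, train_set) -> float:
    """Fitness = accuracy on the TRAIN set (the 'be different' bonus is omitted)."""
    return sum(classify(entity_bits, x) == y for x, y in train_set) / len(train_set)
```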

12 Design Tip
Design your space of entities so that most are viable (i.e., get a non-zero fitness)
Otherwise you will waste a lot of CPU cycles generating useless entities

13 Premature Convergence (‘Inbreeding’)
If not careful, the entire population can become minor variations of a small number of ‘bit vectors’
E.g., consider crossing over A and child_of_A
–The result will be ≈ ¾ a copy of A
Solutions
–Don’t cross over with a ‘recent’ descendant
–Mutate more (but this might destroy good traits)
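A quick Monte Carlo sketch (not from the lecture) that tags every gene with its source and confirms the ≈ ¾ figure. The gene-tag lists, one-point crossover, and the random choice between the two crossover children are illustrative assumptions.

```python
import random

def cross(x, y):
    """One-point crossover returning both children."""
    p = random.randrange(1, len(x))
    return x[:p] + y[p:], y[:p] + x[p:]

random.seed(0)
L, trials, frac_from_A = 64, 10_000, 0.0
for _ in range(trials):
    A, B = ['A'] * L, ['B'] * L                 # tag each gene with its source
    child = random.choice(cross(A, B))          # a child of A (and of B)
    grandchild = random.choice(cross(A, child)) # cross A with its own child
    frac_from_A += grandchild.count('A') / L
print(frac_from_A / trials)                     # ≈ 0.75: about ¾ a copy of A
```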

14 GAs as Searching a Space
Consider the space defined by single-bit mutations
[figure: a graph of bit strings (101…01, 001…01, 101…00, 011…10, etc), with edges between strings that differ in a single bit]
What is a CROSSOVER?
–Grab any two nodes (they might not be adjacent)
–‘Hyper-jump’ to a possibly distant third node

15 Building-Block Hypothesis
GAs work well when the overall task has subtasks
The fitness function gives credit for being able to solve subtasks
Crossover ‘mixes and matches’ solutions to subtasks
E.g., consider building cars
–Need an engine, wheels, windows, brakes, etc

16 Which Fitness Function is Better for GAs?
[figure: two plots of Fitness vs. State Space, one per candidate fitness function]

17 Genetic Programming
Entities need not be bit strings
‘Genetic programming’ is often used for richer representations of entities
–Decision trees
–Neural networks
–Code snippets
–Etc

18 In-Class HW
Design a Genetic Programming Approach for Creating Good Decision Trees
Think for 2-3 Minutes before Raising Your Hand

19 GA Wrapup
Can come up with quite creative solutions, since many possibilities are considered
Might be too undirected?
Designing good fitness functions can be a challenge
Makes more sense as computing power increases

20 End of Search
We’re done with search in discrete spaces
SEARCH is a powerful, general-purpose way to look at problem solving
Next: probabilistic reasoning (but we’ll return to viewing AI tasks from the perspective of search periodically)

