
1 Evolving Levels for Super Mario Bros Using Grammatical Evolution & Evolving a Ms. Pac-Man Controller Using Grammatical Evolution
Presentation by Alex van Poppelen, Boris Jakovljevic, and Nadia Boudewijn

2 The papers
Written by several people (one author worked on both papers) from: the Natural Computing Research & Applications Group, University College Dublin, Ireland, and the Center for Computer Games Research, IT University of Copenhagen, Denmark.
Fairly recent: 2010 (Ms. Pac-Man) / 2012 (SMB)

3 Contents
Grammatical Evolution (Nadia)
Using grammatical evolution to evolve a Ms. Pac-Man controller (Alex)
Using grammatical evolution to evolve levels for Super Mario Bros (Boris)

4 Grammatical Evolution (GE)
Relatively new concept (1998). Related to the idea of genetic programming (GP): find an executable program that achieves a good fitness value for the given objective function.
Main difference:
GP: uses tree-structured expressions that are directly manipulated.
GE: manipulates integer strings that are subsequently mapped to a program through the use of a grammar.

5 Grammatical Evolution
Integer string -> grammar -> program; the fitness of the program is assigned back to the integer string.
Inspired by nature: separate genotype from phenotype.
Genotype: integer string.
Phenotype: tree-like structure that is evaluated recursively (same as GP).
Benefit of GE's modular approach: no specific algorithm or method is required to perform the search operations.
It is possible to construct a GE grammar that, for a given function/terminal set, is equivalent to genetic programming.

6 What is a grammar? "Grammar" can apply to:
Natural language (linguistics): a set of structural rules governing the composition of clauses, phrases, and words in any given natural language.
Formal language (mathematics, logic, and theoretical computer science): a set of production rules for strings in a formal language.
A grammar does not describe the meaning of the strings or what can be done with them in any context.

7 Only syntax – NO semantics
1. The ate bear the fish
2. The bear ate the fish
3. The fish ate the bear
The grammar for the natural language English will accept sentences 2 & 3, but will reject sentence 1. What does it mean to say that a grammar accepts some string (or sentence)?

8 Example CFG and parse tree
Language: aⁿbⁿ
Context-free grammar: S :- a S b | S :- ε
Parse tree for "aabb": S derives a S b, the inner S derives a S b again, and the innermost S derives ε, yielding aabb.
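To make concrete what it means for a grammar to accept a string, here is a minimal illustrative sketch (not from the papers) of a recursive recognizer for the language aⁿbⁿ generated by the grammar above:

```python
# Minimal illustrative sketch: recognizer for a^n b^n, the language generated
# by S :- a S b | ε. A string is "accepted" iff it can be derived from S.
def in_anbn(s: str) -> bool:
    if s == "":                                   # S -> ε
        return True
    if len(s) >= 2 and s[0] == "a" and s[-1] == "b":
        return in_anbn(s[1:-1])                   # S -> a S b
    return False

print(in_anbn("aabb"))   # True: S -> aSb -> aaSbb -> aabb
print(in_anbn("abab"))   # False: not derivable from S
```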

9 Context-Free grammar? Equivalent to Backus-Naur Form
CF grammar: lexicon of words and symbols + production rules.
Two classes of symbols: terminal + non-terminal.
Formal language defined by a CFG: the set of strings derivable from the start symbol.
CF grammar uses:
A device for generating sentences
A device for assigning a structure to a given sentence

10 Formal CFG definition
Formally, a CFG is a 4-tuple G = (N, Σ, P, S): a finite set N of non-terminal symbols, a finite set Σ of terminal symbols disjoint from N, a finite set P of production rules of the form A → α with A ∈ N and α ∈ (N ∪ Σ)*, and a start symbol S ∈ N.

11 Expressive Power
Formal mechanisms (CFGs, Markov models, transducers, etc.) can be described in terms of their power, i.e., the complexity of the phenomena they can describe.
One grammar has greater generative power or complexity than another if it can define a language the other cannot define.
Chomsky hierarchy: a hierarchy of grammars where the set of languages describable by grammars of greater power subsumes the set of languages describable by grammars of less power.

12 Chomsky hierarchy
Type 3 (regular) ⊂ Type 2 (context-free) ⊂ Type 1 (context-sensitive) ⊂ Type 0 (recursively enumerable).

13 Back to GE Population: a set of integer strings
Applying a mapping rule, these integer strings are converted into candidate solutions (phenotypes) following the rules of the context-free grammar involved.

14 Criticism and Variants
Because GE uses a mapping operation, GE's genetic operators do not achieve high locality: small changes in the genotype do not necessarily result in small changes in the phenotype.
High locality is a highly regarded property of genetic operators in evolutionary algorithms.
One possible variant is to use particle swarm optimization to carry out the search instead of genetic algorithms.

15 Evolving a Ms. Pac-Man Controller using GE
Deterministic? (Unlike the original Pac-Man, ghost movement in Ms. Pac-Man is not deterministic, so a controller cannot rely on fixed move patterns.)

16 Ms. Pac-Man Competition
Aims to provide the best software controller for the game of Ms. Pac-Man.
Best human player score: 921,360
Best computer score: 30,010 (hand-coded agent, developed by Matsumoto et al. from Kyoto, Japan, 2009)

17 Using an Evolutionary Approach
Previous approach by Koza: used Genetic Programming to combine pre-defined actions and conditional statements to evolve a simple Ms. Pac-Man player.
Goal: achieve the highest score. Fitness function: points earned per game.
Reinforcement learning and the cross-entropy method have also been used to assist the agent in learning appropriate decisions.
This paper: attempts to evolve rules of the form "if <condition> then perform <action>", using Grammatical Evolution.

18 Representation
Grammatical evolution represents programs as a variable-length linear genome.
The genome is an integer array of elements called codons.
The genotype is mapped to the phenotype using a grammar in Backus-Naur Form.
Mapping function: rule = c mod r, where c is the codon integer value and r is the number of choices (productions) for the current symbol.
Codons may remain unused, or there may not be enough; in the latter case, the mapping may wrap back to the beginning of the genome up to a maximum number of times.
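As an illustration of the codon mapping just described, here is a minimal Python sketch of the rule = c mod r mapping with wrapping. The toy grammar only mimics the "if <condition> then perform <action>" rule shape from the previous slide; its condition and action names are hypothetical, not the paper's grammar.

```python
# Minimal sketch of GE genotype-to-phenotype mapping (rule = c mod r) with
# codon wrapping. The grammar and its condition/action names are hypothetical.
GRAMMAR = {
    "<prog>": [["<rule>"], ["<rule>", "<prog>"]],
    "<rule>": [["if", "<cond>", "then", "<action>"]],
    "<cond>": [["inedible_ghost_close"], ["edible_ghost_close"], ["power_pill_close"]],
    "<action>": [["go_to_nearest_pill"], ["go_to_power_pill"],
                 ["chase_edible_ghost"], ["avoid_ghost"]],
}

def ge_map(codons, start="<prog>", max_wraps=3):
    """Expand the leftmost non-terminal, consuming one codon per choice made."""
    symbols, output = [start], []
    i, wraps = 0, 0
    while symbols:
        sym = symbols.pop(0)
        if sym not in GRAMMAR:               # terminal symbol: emit it
            output.append(sym)
            continue
        choices = GRAMMAR[sym]
        if len(choices) > 1:                 # only spend a codon when there is a choice
            if i >= len(codons):             # ran out of codons: wrap around
                wraps += 1
                if wraps > max_wraps:
                    return None              # individual remains invalid after max wraps
                i = 0
            rule = codons[i] % len(choices)  # rule = c mod r
            i += 1
        else:
            rule = 0
        symbols = choices[rule] + symbols    # expand leftmost non-terminal
    return " ".join(output)

print(ge_map([7, 3, 12, 4, 9, 2]))
# if inedible_ghost_close then go_to_nearest_pill if inedible_ghost_close then chase_edible_ghost
```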

19 Simplified Grammar

20 High-Level Functions

21 Experimental Setup One level, one life
Fitness function: sum of the scores for each pill, power pill, and ghost eaten
Generational approach
Population size: 100
Ramped half-and-half initialization method (max tree depth 10)
Tournament selection, size 2
Int-flip mutation (probability 0.1)
One-point crossover (probability 0.7)
Maximum of 3 wraps allowed to "fix" invalid individuals
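A minimal sketch of the two variation operators listed above, acting on integer-codon genomes. The codon value range is an assumption (the slides do not state it), and since GE genomes are variable-length the crossover picks an independent cut point in each parent.

```python
import random

CODON_MAX = 256  # assumed codon range 0..255; not stated in the slides

def int_flip_mutation(genome, p=0.1):
    """Replace each codon with a new random integer with probability p."""
    return [random.randrange(CODON_MAX) if random.random() < p else c for c in genome]

def one_point_crossover(a, b, p=0.7):
    """With probability p, swap the tails of the two parents at one cut point each."""
    if random.random() >= p:
        return a[:], b[:]
    cut_a = random.randrange(1, len(a))
    cut_b = random.randrange(1, len(b))
    return a[:cut_a] + b[cut_b:], b[:cut_b] + a[cut_a:]

parent1 = [12, 7, 201, 44, 9]
parent2 = [3, 88, 150, 6]
child1, child2 = one_point_crossover(parent1, parent2)
mutant = int_flip_mutation(child1)
```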

22 Best Evolved Controller
Very aggressive Heads for power pills and then tries to eat all edible ghosts without looking to see if there are inedible ghosts in the way

23 Benchmarking Performance
Compared the evolved agent to 4 other agents:
Hand-coded agent
Random agent (chooses up, down, left, right, or neutral at every time step)
Random Non-Reverse agent (same as random, but no back-tracking)
Simple Pill Eater (heads for the nearest pill, ignores all else)

24 Different Ghost Teams
Three different ghost teams were used to test the agents:
Random team (each ghost chooses a random direction each time step, no back-tracking)
Legacy team (three ghosts use different distance metrics: Manhattan, Euclidean, and shortest-path distance; the last ghost makes random moves)
Pincer team (each ghost attempts to pick the junction closest to Ms. Pac-Man within a certain distance, in order to trap her)
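For reference, the two closed-form distance metrics used by the Legacy team can be computed as below; shortest-path distance additionally needs the maze graph (e.g., a breadth-first search over corridors) and is omitted from this sketch.

```python
import math

# Distances between two grid positions (x, y); maze layout is ignored here.
def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def euclidean(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

ghost, pacman = (3, 4), (7, 1)
print(manhattan(ghost, pacman))   # 7
print(euclidean(ghost, pacman))   # 5.0
```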

25 Results

26 Conclusions
The evolved controller beat their own hand-coded controller against all ghost teams.
The evolved controller did not match or exceed the score of Matsumoto's hand-coded agent.
But: Matsumoto's agent was given three lives, could earn more lives, and had more than one level to play.
Our question: Why didn't they evolve their controller under the same circumstances?

27 Evolving Levels for Super Mario Bros Using Grammatical Evolution
Boris Jakovljevic

28 The paper
Authors: Noor Shaker, Miguel Nicolau, Georgios N. Yannakakis (Member, IEEE), Julian Togelius (Member, IEEE), and Michael O'Neill

29 Table of Contents
Introduction
Background
Testbed Platform Game
Level Representation
GE-based Level Generator
Other Generators
Expressivity Analysis
Conclusions and Future Work

30 Introduction
Game development → time and money → automatic game content generation
Compare different techniques of content generation
Large amount of content → automatic evaluation
Genetic Programming → Grammatical Evolution: greater control of output; generalization to different types of games
Framework → analysis and comparison of the generators' expressivity range

31 Background Grammatical Evolution
Framework for analysis and comparison of generators' expressivity range, suggested by G. Smith and J. Whitehead, "Analyzing the expressive range of a level generator," Proceedings of the 2010 Workshop on Procedural Content Generation in Games, ACM, 2010, p. 4.
Defined description metrics
Visualize the generative space
Analyze how the parameters of level generators impact their expressivity

32 Background Framework extended through:
more informative aesthetic measures of generators' expressivity
applying these measures to analyze and compare the expressivity ranges of 3 level generators

33 Testbed Platform Game A modified version of Markus “Notch” Persson’s Infinite Mario Bros (IMB)*. Super Mario Bros – a very rich Environment Representation. J. Togelius, S. Karakovskiy, J. Koutnik, and J. Schmidhuber, “Super Mario Evolution” in Proceedings of the 5th international conference on Computational Intelligence and Games, ser. CIG’09. Piscataway, NJ, USA: IEEE Press, 2009, pp

34 Level Representation IMB:
2D array of objects (brick blocks, coins, enemies, …)
Short levels: 100 "blocks" wide, approx. 30 seconds to finish
A set of "chunks": platforms, gaps, tubes, cannons, boxes, coins, enemies

35 Level Representation More “terrain”: Obstruction Platforms Hills

36 GE-Based Level Generator Design Grammar
Ease of interaction for designers
Chunk positioning independent of other chunks. Problems?
Chunk properties:
x and y → starting-point coordinates in a 2D level array → x ∈ [5..95], y ∈ [3..5]
w_g → gap width
w_b → number of boxes
w_e → number of enemies
w_c → number of coins
h → height of flower tubes and cannons
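As a small illustration only, the chunk properties above could be held in a record like the following; the field names mirror the slide's symbols and are otherwise hypothetical, not the paper's data structures.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    kind: str   # e.g. "gap", "platform", "hill", "tube", "cannon", "boxes", "coin", "enemy"
    x: int      # starting x coordinate in the 2D level array, in [5..95]
    y: int      # starting y coordinate, in [3..5]
    w: int = 0  # width: gap width (w_g), # of boxes (w_b), enemies (w_e), or coins (w_c)
    h: int = 0  # height of flower tubes and cannons

hill = Chunk("hill", x=10, y=4, w=4)
```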

37 GE-Based Level Generator Design Grammar
Example phenotype:
hill 10, 4, 4
platform 74, 3, 4
tube 62, 4, 3
Genotype → phenotype mapping: a deterministic process guided by the specified grammar; parameters → chunks

38 GE-Based Level Generator Design Grammar
First version of the grammar. Limitations:
Game experience: w_before → width of the platform before the chunk; w_after → width of the platform after the chunk
Placement of enemies: in the first version of the grammar → in groups (always on platforms)

<chunks> ::= <chunk> | <chunk> <chunks>
<chunk> ::= gap(<x>,<y>,<wg>) | platform(<x>,<y>,<w>) | hill(<x>,<y>,<w>) | cannon_hill(<x>,<y>,<h>) | tube_hill(<x>,<y>,<h>) | coin(<x>,<y>,<wc>) | cannon(<x>,<y>,<h>) | tube(<x>,<y>,<h>) | boxes(<x>,<y>,<wb>) | enemy(<x>,<y>,<we>)
<x> ::= [5..95]
<y> ::= [3..5]
<wg> ::= [2..5]
<w> ::= [3..15]
<h> ::= [2..3]
<wb> ::= [2..7]
<we> ::= [1..7]

39 GE-Based Level Generator Design Grammar
First version of the grammar. Limitations:
More variability: more enemy types / any platform, by:
constructing the physical structure of the level
calculating the possible spawn positions
spawning enemies on one of the possible positions
Enemy's position → parameter in the grammar (to maintain a deterministic genotype → phenotype mapping)
(Grammar as on the previous slide.)

40 GE-Based Level Generator Design Grammar
Second version of the grammar: a simplified version of the final grammar for level design specification; superscripts denote repetitions.

<level> ::= <chunks> <enemy>
<chunks> ::= <chunk> | <chunk> <chunks>
<chunk> ::= gap(<x>,<y>,<wg>,<wbefore>,<wafter>) | platform(<x>,<y>,<w>) | hill(<x>,<y>,<w>) | cannon_hill(<x>,<y>,<h>,<wbefore>,<wafter>) | tube_hill(<x>,<y>,<h>,<wbefore>,<wafter>) | coin(<x>,<y>,<wc>) | cannon(<x>,<y>,<h>,<wbefore>,<wafter>) | tube(<x>,<y>,<h>,<wbefore>,<wafter>) | <boxes>
<boxes> ::= <box_type>(<x>,<y>)^2 | ... | <box_type>(<x>,<y>)^6
<box_type> ::= blockcoin | blockpowerup | rockcoin | rockempty
<enemy> ::= (koopa | goompa)(<x>)^2 | ... | (koopa | goompa)(<x>)^10
<x> ::= [5..95]
<y> ::= [3..5]

41 GE-Based Level Generator Conflict Resolution
x and y → any value from [5..95], [3..5]
High % of overlapping chunks
Example (hill 65, 4, 5 and cannon_hill 67, 4, 4, 4, 3 were highlighted as conflicting on the original slide):
hill 65, 4, 5
hill 25, 4, 4
cannon_hill 67, 4, 4, 4, 3
coin 22, 4, 6
platform 61, 4, 5
Resolution: a priority value is assigned to each chunk
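The slides do not give the exact priority scheme; the following sketch shows one way priority values could resolve overlapping chunks. The priority ordering, chunk widths, and the x-extent-only overlap test are illustrative assumptions (a real check would also consider chunk type and y).

```python
# Hypothetical priorities per chunk type; higher wins in a conflict.
PRIORITY = {"gap": 5, "hill": 4, "cannon_hill": 3, "platform": 2, "coin": 1}

def overlaps(a, b):
    """Simplified conflict test: x-extents [x, x + w) intersect."""
    return a["x"] < b["x"] + b["w"] and b["x"] < a["x"] + a["w"]

def resolve(chunks):
    """Keep higher-priority chunks; drop lower-priority chunks that overlap them."""
    kept = []
    for c in sorted(chunks, key=lambda c: PRIORITY[c["type"]], reverse=True):
        if not any(overlaps(c, k) for k in kept):
            kept.append(c)
    return kept

level = [
    {"type": "hill", "x": 65, "w": 5},
    {"type": "hill", "x": 25, "w": 4},
    {"type": "cannon_hill", "x": 67, "w": 4},
    {"type": "coin", "x": 22, "w": 6},
    {"type": "platform", "x": 61, "w": 5},
]
print([c["type"] for c in resolve(level)])  # ['hill', 'hill'] – overlapping chunks dropped
```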

42 GE-Based Level Generator Sample Level
A sample generated level showing some of the grammar’s limitations

43 Implementation and Experimental Setup
GEVA software used to implement the needed functionalities: M. O'Neill, E. Hemberg, C. Gilligan, E. Bartley, J. McDermott, and A. Brabazon, "GEVA: grammatical evolution in Java," ACM SIGEVOlution, vol. 3, no. 2, 2008.
Experimental parameters:
Runs: 1000
Generations: 10
Population size (individuals): 100
Maximum derivation tree depth: 100
Tournament selection size: 2
Int-flip mutation probability: 0.1 (10%)
One-point crossover probability: 0.7 (70%)

44 Implementation and Experimental Setup
Fitness function's objective: levels with an acceptable number of chunks.
Fitness function → weighted sum of 2 normalized measures:
f_p → maximum # of chunks − current # of chunks
f_c → # of conflicting chunks found
Conflict between the two: packing in more chunks (lowering f_p) tends to increase the percentage of conflicting chunks (f_c).
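A minimal sketch of such a weighted-sum fitness. The weights, the maximum chunk count, and the normalization are assumptions; the slides only name the two measures.

```python
MAX_CHUNKS = 50  # assumed upper bound used for normalization

def fitness(num_chunks, num_conflicts, w_p=0.5, w_c=0.5):
    f_p = (MAX_CHUNKS - num_chunks) / MAX_CHUNKS   # fewer chunks -> higher f_p
    f_c = num_conflicts / max(num_chunks, 1)       # share of conflicting chunks
    return w_p * f_p + w_c * f_c                   # lower is better in this sketch

print(fitness(num_chunks=40, num_conflicts=6))     # 0.175
```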

45 Notch Level Generator Incrementally places different chunks
Difficulty: number of generated gaps, enemies and enemy types

46 Parametrized Level Generator
Based on Notch's level generator. 6 features:
G → # of gaps in the level
G_w → average gap width
E → # of enemies
E_p → placement of enemies: P_x → above or under horizontal blocks, P_g → close to a gap edge, P_r → random placement on ground (probabilities summing to 100%)
N_w → # of powerups
B → # of boxes
# of cannons and flower tubes → random

47 Expressivity Analysis
1000 levels, 8 features, 4 metrics: Linearity, Density, Leniency, Compression Distance

48 Expressivity Analysis Linearity
Captures the variety of hills and platforms (how closely the level profile follows a straight line). Normalized to [0, 1].
Example levels (images not reproduced): Linearity = 0.99; Linearity = 0.
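A sketch of one way the linearity metric could be computed, assuming it is the goodness of fit (R²) of a least-squares line to the level's platform-height profile; the paper's exact formulation may differ.

```python
# Linearity as R^2 of a least-squares line fit to the platform-height profile.
def linearity(heights):
    n = len(heights)
    xs = range(n)
    mean_x, mean_y = (n - 1) / 2, sum(heights) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, heights))
    syy = sum((y - mean_y) ** 2 for y in heights)
    if syy == 0:                        # perfectly flat profile: treat as fully linear
        return 1.0
    return (sxy * sxy) / (sxx * syy)    # R^2, already in [0, 1]

print(linearity([3, 3, 4, 4, 5, 5, 6]))   # ~0.94: nearly linear profile
print(linearity([3, 6, 3, 6, 3, 6, 3]))   # 0.0: highly non-linear profile
```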

49 Expressivity Analysis Density
Measures how hills of different sizes are stacked on top of each other; a density value is computed for each point.
Example levels (images not reproduced): Density = 0; Density = 0.85 (Linearity = 0.4); Density = 1 (Linearity = 0.9).

50 Expressivity Analysis Leniency
Leniency reflects how forgiving (tolerant) the level is. Chunks have different leniency values:
Gaps → −0.5
Average gap width → −1
Enemies → −1
Cannons, flower tubes → −0.5
Powerups → +1
Example levels (images not reproduced): Leniency = 1; Leniency = 0.
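A sketch of a leniency score built from the per-chunk values on this slide. How the values are aggregated and normalized to [0, 1] is an assumption; the average-gap-width term (weighted −1 on the slide) is omitted here for brevity.

```python
LENIENCY = {"gap": -0.5, "enemy": -1.0, "cannon": -0.5,
            "flower_tube": -0.5, "powerup": +1.0}

def leniency(chunk_counts, max_chunks=50):
    raw = sum(LENIENCY[kind] * n for kind, n in chunk_counts.items())
    # Map from [-max_chunks, +max_chunks] onto [0, 1]: harsher levels score lower.
    return (raw + max_chunks) / (2 * max_chunks)

print(leniency({"gap": 4, "enemy": 6, "cannon": 2, "powerup": 3}))  # 0.44
```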

51 Expressivity Analysis Compression Distance
Measures the overall structural similarity of the output between generators.
Levels → number sequences. Content events considered:
Increase / decrease in platform height
Existence / non-existence of enemies and items
Beginning / ending of a gap
Diversity → Normalized Compression Distance (NCD) between two content sequences; a high NCD means high dissimilarity.
NCD > 0.6: GE → 93%, Parametrized → 91%, "Notch" → 89%
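The NCD itself has a standard definition; the sketch below computes it with zlib as the compressor (the slides do not say which compressor the authors used), on short hypothetical content-event sequences.

```python
import zlib

def C(data: bytes) -> int:
    """Compressed length, used as an approximation of Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: str, y: str) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    bx, by = x.encode(), y.encode()
    cx, cy, cxy = C(bx), C(by), C(bx + by)
    return (cxy - min(cx, cy)) / max(cx, cy)

seq_a = "0 0 1 1 2 2 1 0 3 3 0 0"   # hypothetical content-event sequence, level A
seq_b = "0 5 0 5 4 4 0 1 1 0 2 2"   # hypothetical content-event sequence, level B
print(ncd(seq_a, seq_b))            # closer to 1 -> more dissimilar levels
```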

52 Expressivity Analysis

53 Expressivity Analysis Histogram Comparison (GE)
Metrics histogram for 1000 generated levels

54 Expressivity Analysis Histogram Comparison (Parametrized)
Metrics histogram for 1000 generated levels

55 Expressivity Analysis Histogram Comparison (“Notch”)
Metrics histogram for 1000 generated levels

56 Expressivity Analysis Statistical Analysis
All levels across all generators linearity + leniency linearity − density leniency − density

57 Conclusion and Future Work
Potential use for game designers
Generator comparison within the same genre
Future work: incorporate player experience; design-grammar personalization; more detailed expressivity measures
Limitation: GE was unable to generate high-density levels → constraint-free grammar + play-testing

58 Questions?

