Evolving Levels for Super Mario Bros Using Grammatical Evolution & Evolving a Ms. Pac-Man Controller Using Grammatical Evolution. Presentation by Alex van Poppelen, Boris Jakovljevic, and Nadia Boudewijn

The papers: written by several people (one person worked on both papers) from: Natural Computing Research & Applications Group, University College Dublin, Ireland & Center for Computer Games Research, IT University of Copenhagen, Denmark. Fairly recent: 2010 (Ms. Pac-Man) / 2012 (SMB)

Contents Grammatical Evolution (Nadia) Using grammatical evolution to evolve a Ms. Pac-Man controller (Alex) Using grammatical evolution to evolve levels for Super Mario Bros (Boris)

Grammatical Evolution (GE) Relatively new concept (1998). Related to the idea of genetic programming (GP): find an executable program that will achieve a good fitness value for the given objective function. Main difference: GP uses tree-structured expressions that are manipulated directly; GE manipulates integer strings that are subsequently mapped to a program through the use of a grammar.

Grammatical Evolution Pipeline: integer string -> grammar -> program; fitness(program) feeds back to guide selection over the integer strings. Inspired by nature: separate genotype from phenotype. Genotype: integer string. Phenotype: tree-like structure that is evaluated recursively (as in GP). Benefit of GE's modular approach: no specific algorithm or method is required to perform the search operations (see the sketch below). It is possible to construct a GE grammar that, for a given function/terminal set, is equivalent to genetic programming.
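To make the modularity concrete, here is a minimal sketch, assuming a plain generational search over integer strings; all names are illustrative and not taken from any particular GE implementation. The search layer never inspects the grammar or the generated program: it only needs a mapping function and a fitness function passed in.

```python
import random

def ge_search(fitness, genotype_to_phenotype, pop_size=100, genome_len=20,
              generations=30, codon_max=255, mut_p=0.1):
    """The search works purely on integer strings; the grammar mapping and the
    fitness evaluation are supplied as black-box functions."""
    pop = [[random.randint(0, codon_max) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=lambda g: fitness(genotype_to_phenotype(g)),
                        reverse=True)
        parents = ranked[:pop_size // 2]
        children = [[random.randint(0, codon_max) if random.random() < mut_p else c
                     for c in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=lambda g: fitness(genotype_to_phenotype(g)))

# Toy usage: the "program" is just the codons rendered modulo 2; fitness counts zeros.
best = ge_search(fitness=lambda prog: prog.count("0"),
                 genotype_to_phenotype=lambda g: " ".join(str(c % 2) for c in g))
```

Swapping in a different search method (for example the particle swarm variant mentioned later) only requires replacing ge_search; the mapping and the fitness function stay untouched.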

What is a grammar? "Grammar" can apply to: Natural language (linguistics): a set of structural rules governing the composition of clauses, phrases, and words in any given natural language. Formal language (mathematics, logic, and theoretical computer science): a set of production rules for strings in a formal language. A grammar does not describe the meaning of the strings or what can be done with them in whatever context.

Only syntax – NO semantics 1. The ate bear the fish 2. The bear ate the fish 3. The fish ate the bear The grammar for the natural language English will accept sentences 2 & 3, but will reject sentence 1. What does it mean to say that a grammar accepts some string (or sentence)?

Example CFG and parse tree. Language: a^n b^n. Context-free grammar: S → a S b | ε. Parse tree for "aabb" (shown on the slide).
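For the string "aabb", the derivation encoded by that parse tree applies S → a S b twice and then S → ε:

$$S \Rightarrow aSb \Rightarrow aaSbb \Rightarrow aabb$$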

Context-free grammar? Equivalent to Backus-Naur Form. A CF grammar: a lexicon of words and symbols + production rules. Two classes of symbols: terminal + non-terminal. The formal language defined by a CFG: the set of strings derivable from the start symbol. Uses of a CF grammar: a device for generating sentences; a device for assigning a structure to a given sentence.

Formal CFG definition
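The definition on this slide is an image in the original deck; it is presumably the standard 4-tuple definition:

$$G = (N, \Sigma, P, S)$$

where $N$ is a finite set of non-terminal symbols, $\Sigma$ a finite set of terminal symbols disjoint from $N$, $P$ a finite set of productions of the form $A \rightarrow \alpha$ with $A \in N$ and $\alpha \in (N \cup \Sigma)^{*}$, and $S \in N$ the start symbol.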

Expressive Power Formal mechanisms (CFGs, Markov models, transducers, etc.) can be described in terms of their power, that is, in terms of the complexity of the phenomena they can describe. One grammar has greater generative power or complexity than another if it can define a language the other cannot define. Chomsky hierarchy: a hierarchy of grammars where the set of languages describable by grammars of greater power subsumes the set of languages describable by grammars of less power.

Chomsky hierarchy
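The hierarchy diagram is an image in the original deck; its content is the standard containment chain of language classes:

$$\text{regular (Type 3)} \subset \text{context-free (Type 2)} \subset \text{context-sensitive (Type 1)} \subset \text{recursively enumerable (Type 0)}$$

GE grammars sit at the context-free level (written in BNF), which is what the rest of the presentation uses.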

Back to GE Population: a set of integer strings Applying a mapping rule, these integer strings are converted into problem instances following the rules of the Context-Free Grammar involved

Criticism and Variants Because GE uses a mapping operation, GE's genetic operators do not achieve high locality: small changes in the genotype do not always result in small changes in the phenotype. High locality is a highly regarded property of genetic operators in evolutionary algorithms. One possible variant is to use particle swarm optimization to carry out the search instead of genetic algorithms.

Evolving a Ms. Pac-Man Controller using GE Deterministic?

Ms. Pac-Man Competition Aims to provide the best software controller for the game of Ms. Pac-Man. Best human player score: 921,360. Best computer score: 30,010, by a hand-coded agent developed by Matsumoto et al. from Kyoto, Japan (2009).

Using an Evolutionary Approach Previous approaches: Koza used Genetic Programming to combine pre-defined actions and conditional statements to evolve a simple Ms. Pac-Man player (goal: achieve the highest score; fitness function: points earned per game). Reinforcement learning and the cross-entropy method have also been used to help an agent learn appropriate decisions. This paper: attempts to evolve rules of the form "if <condition> then perform <action>" using Grammatical Evolution.

Representation Grammatical evolution represents programs as a variable-length linear genome. The genome is an integer array of elements called codons. The genotype is mapped to the phenotype using a grammar in Backus-Naur Form. Mapping function: Rule = c mod r, where c is the codon integer value and r is the number of choices for the current symbol. Codons may remain unused, or there may not be enough; in the latter case the mapping may wrap back to the beginning of the genome, up to a maximum number of times.
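A minimal sketch of this mapping, using a small hypothetical grammar in the "if <condition> then perform <action>" shape described above; the grammar, function names, and codon values are illustrative and are not taken from the paper.

```python
# Hypothetical grammar in the "if <condition> then <action>" shape; not the paper's grammar.
GRAMMAR = {
    "<prog>":   [["if (", "<cond>", ") { ", "<action>", " }"]],
    "<cond>":   [["inedible_ghost_close()"], ["edible_ghost_close()"], ["power_pill_close()"]],
    "<action>": [["go_to_nearest_pill()"], ["go_to_nearest_power_pill()"], ["go_away_from_ghost()"]],
}

def map_genome(genome, start="<prog>", max_wraps=3):
    """GE genotype-to-phenotype mapping: rule = codon mod (number of choices)."""
    symbols, out, i, wraps = [start], [], 0, 0
    while symbols:
        sym = symbols.pop(0)
        if sym not in GRAMMAR:           # terminal symbol: emit it directly
            out.append(sym)
            continue
        choices = GRAMMAR[sym]
        if i >= len(genome):             # ran out of codons: wrap or give up
            if wraps >= max_wraps:
                return None              # invalid individual
            i, wraps = 0, wraps + 1
        rule = genome[i] % len(choices)  # the mapping function from the slide
        i += 1
        symbols = list(choices[rule]) + symbols
    return "".join(out)

print(map_genome([7, 4, 11, 2]))  # "if (edible_ghost_close()) { go_away_from_ghost() }"
```

Trailing codons that are never reached are simply ignored; if the genome is exhausted before all non-terminals are expanded, the index wraps, and after max_wraps the individual is marked invalid.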

Simplified Grammar

High-Level Functions

Experimental Setup One level, one life. Fitness function: sum of the scores for each pill, power pill, and ghost eaten. Generational approach. Population size 100. Ramped half-and-half initialization method (max tree depth 10). Tournament selection size 2. Int-flip mutation (probability 0.1). One-point crossover (probability 0.7). Maximum of 3 wraps allowed to "fix" invalid individuals.
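For reference, a sketch of the three operators named above (tournament selection of size 2, int-flip mutation at 0.1, one-point crossover at 0.7), assuming plain integer genomes; the helper names are hypothetical.

```python
import random

def tournament_select(pop, fitnesses, size=2):
    # Pick `size` random individuals and return a copy of the fittest one.
    contenders = random.sample(range(len(pop)), size)
    return list(pop[max(contenders, key=lambda i: fitnesses[i])])

def int_flip_mutation(genome, p=0.1, codon_max=255):
    # Each codon is independently replaced by a fresh random value with probability p.
    return [random.randint(0, codon_max) if random.random() < p else c for c in genome]

def one_point_crossover(a, b, p=0.7):
    # With probability p, exchange the tails of the two parents at a random cut point.
    if random.random() < p and min(len(a), len(b)) > 1:
        cut = random.randint(1, min(len(a), len(b)) - 1)
        return a[:cut] + b[cut:], b[:cut] + a[cut:]
    return list(a), list(b)

# Toy usage with a stand-in fitness (sum of codons).
pop = [[random.randint(0, 255) for _ in range(50)] for _ in range(100)]
fits = [sum(g) for g in pop]
p1, p2 = tournament_select(pop, fits), tournament_select(pop, fits)
c1, c2 = one_point_crossover(p1, p2)
child = int_flip_mutation(c1)
```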

Best Evolved Controller Very aggressive Heads for power pills and then tries to eat all edible ghosts without looking to see if there are inedible ghosts in the way

Benchmarking Performance Compared evolved agent to 4 other agents Hand-coded agent Random agent (chooses up, down, left, right, or neutral at every time step) Random Non-Reverse agent (same as random, but no back-tracking) Simple Pill Eater (heads for nearest pill, ignores all else)

Different Ghost Teams Three different ghost teams were used to test the agents Random team (Each ghost chooses a random direction each time step, no back-tracking) Legacy team (Three ghosts use different distance metrics: Manhattan, Euclidean, and shortest path distance. Last ghost makes random moves) Pincer team (Each ghost attempts to pick the closest junction to Ms. Pac-Man within a certain distance in order to trap her)

Results

Conclusions The evolved controller beat their own hand-coded controller against all ghost teams. The evolved controller did not match or exceed the score of Matsumoto's hand-coded agent. But: Matsumoto's agent was given three lives, could earn more lives, and had more than one level to play. Our question: why didn't they evolve their controller under the same circumstances?

Evolving Levels for Super Mario Bros Using Grammatical Evolution Boris Jakovljevic

The paper Authors: Noor Shaker, Miguel Nicolau, Georgios N. Yannakakis (Member, IEEE), Julian Togelius (Member, IEEE), and Michael O'Neill

Table of Contents Introduction Background Testbed Platform Game Level Representation GE-based Level Generator Other Generators Expressivity Analysis Conclusions and Future Work

Introduction Game Development → Time and Money → Automatic Game Content Generation Compare different techniques of content generation Large amount of content → Automatic Evaluation Genetic Programming → Grammatical Evolution Greater control of output Generalization to different types of games Framework → generators’ Expressivity Range analysis and comparison

Background Grammatical Evolution. Framework → generators' Expressivity Range analysis and comparison, suggested by G. Smith and J. Whitehead in "Analyzing the expressive range of a level generator," Proceedings of the 2010 Workshop on Procedural Content Generation in Games, ACM, 2010, p. 4. They defined description metrics, visualized the generative space, and analyzed how a level generator's parameters impact its expressivity.

Background Framework extended through: more informative aesthetic measures of generators’ expressivity applying the above measures to analyze and compare expressivity ranges of 3 level generators

Testbed Platform Game A modified version of Markus “Notch” Persson’s Infinite Mario Bros (IMB)*. Super Mario Bros – a very rich Environment Representation. J. Togelius, S. Karakovskiy, J. Koutnik, and J. Schmidhuber, “Super Mario Evolution” in Proceedings of the 5th international conference on Computational Intelligence and Games, ser. CIG’09. Piscataway, NJ, USA: IEEE Press, 2009, pp. 156-161

Level Representation IMB: 2D array of objects (brick blocks, coins, enemies…). Short levels: 100 "blocks" wide, approx. 30 seconds to finish. A set of "chunks": platforms, gaps, tubes, cannons, boxes, coins, enemies.

Level Representation More “terrain”: Obstruction Platforms Hills

GE-Based Level Generator: Design Grammar. Ease of interaction for designers. Chunk positioning regardless of other chunks. Problems? Chunk properties:
- x and y → starting-point coordinates in a 2D level array, drawn from [5..95] and [3..5]
- wg → gap width
- wb → number of boxes
- we → number of enemies
- wc → number of coins
- h → height of flower tubes and cannons

GE-Based Level Generator: Design Grammar. Example phenotype: hill(10, 4, 4) platform(74, 3, 4) tube(62, 4, 3). Genotype → phenotype mapping: a deterministic process guided by the grammar; the specified parameters become chunks.

GE-Based Level Generator: Design Grammar. First version of the grammar. Limitations: game experience (wbefore → width of the platform before the chunk, and wafter → width of the platform after the chunk, are added later to address this); placement of enemies: in the first version of the grammar, enemies are placed in groups (always on platforms).
<chunks> ::= <chunk> | <chunk> <chunks>
<chunk> ::= gap(<x>,<y>,<wg>) | platform(<x>,<y>,<w>) | hill(<x>,<y>,<w>) | cannon_hill(<x>,<y>,<h>) | tube_hill(<x>,<y>,<h>) | coin(<x>,<y>,<wc>) | cannon(<x>,<y>,<h>) | tube(<x>,<y>,<h>) | boxes(<x>,<y>,<wb>) | enemy(<x>,<y>,<we>)
<x> ::= [5..95]
<y> ::= [3..5]
<wg> ::= [2..5]
<w> ::= [3..15]
<h> ::= [2..3]
<wb> ::= [2..7]
<we> ::= [1..7]

GE-Based Level Generator: Design Grammar. First version of the grammar (shown above). Limitations: more variability wanted; more enemy types / enemies on any platform, achieved by: constructing the physical structure of the level, calculating the possible spawn positions, and spawning enemies on one of the possible positions. The enemy's position becomes a parameter in the grammar (to maintain a deterministic genotype → phenotype mapping).

GE-Based Level Generator: Design Grammar. Second version of the grammar: a simplified version of the final grammar for level design specification (superscripts denote repetitions).
<level> ::= <chunks> <enemy>
<chunks> ::= <chunk> | <chunk> <chunks>
<chunk> ::= gap(<x>,<y>,<wg>,<wbefore>,<wafter>) | platform(<x>,<y>,<w>) | hill(<x>,<y>,<w>) | cannon_hill(<x>,<y>,<h>,<wbefore>,<wafter>) | tube_hill(<x>,<y>,<h>,<wbefore>,<wafter>) | coin(<x>,<y>,<wc>) | cannon(<x>,<y>,<h>,<wbefore>,<wafter>) | tube(<x>,<y>,<h>,<wbefore>,<wafter>) | <boxes>
<boxes> ::= <box_type>(<x>,<y>)^2 | ... | <box_type>(<x>,<y>)^6
<box_type> ::= blockcoin | blockpowerup | rockcoin | rockempty
<enemy> ::= (koopa | goompa)(<x>)^2 | ... | (koopa | goompa)(<x>)^10
<x> ::= [5..95]
<y> ::= [3..5]

GE-Based Level Generator: Conflict Resolution. x and y can take any value from [5..95] and [3..5], so a high percentage of chunks overlap. Example: hill(65, 4, 5), hill(25, 4, 4), cannon_hill(67, 4, 4, 4, 3), coin(22, 4, 6), platform(61, 4, 5); the highlighted hill(65, 4, 5) and cannon_hill(67, 4, 4, 4, 3) overlap. Resolution: assign a priority value to each chunk.
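A sketch of how such priority-based resolution could work, assuming each chunk type carries a fixed priority and a lower-priority chunk that overlaps an already placed chunk is simply discarded; the priority values and the x-only overlap test are illustrative, not the paper's actual rules.

```python
# Hypothetical priorities; the paper assigns its own values per chunk type.
PRIORITY = {"cannon_hill": 4, "hill": 3, "platform": 2, "coin": 1}

def x_span(chunk):
    # Each chunk is reduced to (type, x, width) here; real chunks carry more parameters.
    kind, x, w = chunk
    return x, x + w

def resolve(chunks):
    """Place chunks in descending priority; drop any chunk whose x-span overlaps
    an already-placed chunk. The real generator resolves conflicts in the 2D array."""
    placed = []
    for chunk in sorted(chunks, key=lambda c: PRIORITY.get(c[0], 0), reverse=True):
        lo, hi = x_span(chunk)
        if all(hi <= plo or lo >= phi for plo, phi in map(x_span, placed)):
            placed.append(chunk)
    return placed

# The slide's example: the hill at x=65 loses to the overlapping, higher-priority
# cannon_hill at x=67 (and, with this crude x-only test, the coin at x=22 also
# loses to the hill at x=25).
print(resolve([("hill", 65, 5), ("hill", 25, 4), ("cannon_hill", 67, 4),
               ("coin", 22, 6), ("platform", 61, 5)]))
```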

GE-Based Level Generator Sample Level A sample generated level showing some of the grammar’s limitations

Implementation and Experimental Setup The GEVA software was used to implement the needed functionalities: M. O'Neill, E. Hemberg, C. Gilligan, E. Bartley, J. McDermott, and A. Brabazon, "GEVA: grammatical evolution in Java," ACM SIGEVOlution, vol. 3, no. 2, pp. 17-22, 2008. Experimental parameters:
- Runs: 1000
- Generations per run: 10
- Population size: 100 individuals
- Maximum derivation tree depth: 100
- Tournament selection size: 2
- Int-flip mutation probability: 0.1 (10%)
- One-point crossover probability: 0.7 (70%)

Implementation and Experimental Setup Fitness function objective: levels with an acceptable number of chunks. The fitness function is a weighted sum of 2 normalized measures: fp → maximum number of chunks minus the current number of chunks; fc → number of conflicting chunks found. Conflict between the two measures: ↑ fp ⇒ ↑ fc (%).
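One way to write that weighted sum (the transcript does not give the weights or the exact normalization, so this is only a plausible formalization):

$$f = w_p \, \hat{f}_p + w_c \, \hat{f}_c, \qquad \hat{f}_p = \frac{n_{\max} - n}{n_{\max}}, \qquad \hat{f}_c = \frac{n_{\text{conflict}}}{n}$$

where $n$ is the number of chunks in the candidate level, $n_{\max}$ the maximum allowed, and $n_{\text{conflict}}$ the number of conflicting chunks.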

Notch Level Generator Incrementally places different chunks Difficulty: number of generated gaps, enemies and enemy types

Parametrized Level Generator Based on Notch's level generator. 6 features:
- G → number of gaps in the level
- Gw → average gap width
- E → number of enemies
- Ep → placement of enemies, distributed (summing to 100%) among: Px → above or under horizontal blocks; Pg → close to a gap edge; Pr → random placement on the ground
- Nw → number of powerups
- B → number of boxes
The numbers of cannons and flower tubes are random.

Expressivity Analysis 1000 levels, 8 features, 4 metrics: Linearity, Density, Leniency, Compression Distance.

Expressivity Analysis: Linearity. Variety of hills and platforms. Normalized to [0, 1]. (Example levels shown with Linearity = 0.99 and Linearity = 0.)
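The transcript does not give the formula. A common way to compute a linearity-style metric in the spirit of Smith and Whitehead is to fit a line to the level's height profile and score how well it fits; the sketch below follows that idea and is not necessarily the paper's exact definition.

```python
def linearity(heights):
    """Fit a least-squares line to a level's height profile and return a value in
    [0, 1]: 1 means the profile is (almost) perfectly linear, lower means bumpier.
    Illustrative only; not necessarily the paper's exact formula."""
    n = len(heights)
    xs = list(range(n))
    mean_x, mean_y = (n - 1) / 2, sum(heights) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, heights))
    slope = sxy / sxx if sxx else 0.0
    intercept = mean_y - slope * mean_x
    mean_abs_err = sum(abs(y - (slope * x + intercept)) for x, y in zip(xs, heights)) / n
    height_range = (max(heights) - min(heights)) or 1  # crude normalization bound
    return 1 - min(mean_abs_err / height_range, 1.0)

print(linearity([3, 3, 4, 4, 5, 5, 6, 6]))  # ~0.92: nearly a straight ramp
print(linearity([3, 7, 2, 8, 1, 9, 2, 8]))  # ~0.64: jagged profile
```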

Expressivity Analysis: Density. Different-sized hills stacked; a density value is computed for each point. (Example levels shown with Density = 0; Density = 0.85 / Linearity = 0.4; and Density = 1 / Linearity = 0.9.)

Expressivity Analysis: Leniency. Level tolerance. Chunks have different leniency values: Gaps → −0.5; Average gap width → −1; Enemies → −1; Cannons, flower tubes → −0.5; Powerups → +1. (Example levels shown with Leniency = 1 and Leniency = 0.)
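The per-chunk values above suggest a simple aggregation; a plausible (unconfirmed) formalization is a normalized weighted sum over the chunks in a level:

$$\text{leniency} = \operatorname{norm}\Big(\sum_i l_i \, n_i\Big)$$

where $l_i$ is the leniency value of chunk type $i$ (for example $-1$ per enemy, $+1$ per power-up), $n_i$ its count in the level, and $\operatorname{norm}$ rescales the result to $[0, 1]$ across the generated set.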

Expressivity Analysis: Compression Distance. Overall structural similarity of output between generators. Levels → number sequences. Content events considered: increase/decrease in platform height; existence/non-existence of enemies and items; beginning/ending of a gap. Diversity measured with the Normalized Compression Distance (NCD) between pairs of content sequences; a high NCD means high dissimilarity. Share of level pairs with NCD > 0.6: GE → 93%, Parametrized → 91%, "Notch" → 89%.
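The Normalized Compression Distance used for this comparison is the standard one:

$$\mathrm{NCD}(x, y) = \frac{C(xy) - \min\{C(x), C(y)\}}{\max\{C(x), C(y)\}}$$

where $C(s)$ is the compressed length of a content sequence $s$ and $xy$ is the concatenation of the two sequences; values close to 1 mean the two levels are highly dissimilar.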

Expressivity Analysis

Expressivity Analysis Histogram Comparison (GE) Metrics histogram for 1000 generated levels

Expressivity Analysis Histogram Comparison (Parametrized) Metrics histogram for 1000 generated levels

Expressivity Analysis Histogram Comparison (“Notch”) Metrics histogram for 1000 generated levels

Expressivity Analysis Statistical Analysis All levels across all generators linearity + leniency linearity − density leniency − density

Conclusion and Future Work Potential use for game designers Generator comparison within the same genre Future work: incorporate player experience design grammar personalization more detailed expressivity measure Limitation: GE unable to generate high density levels → constraint-free grammar + play-test

Questions?