1 Developing and Testing Math Games for Learning: Best Practices, Lessons Learned, and Results From a Large-Scale Randomized Controlled Trial
Global 3D Technology Forum, Seoul – 15 October 2014
Gregory Chung, Assistant Director for Research Innovation
National Center for Research on Evaluation, Standards, and Student Testing (CRESST), University of California, Los Angeles (UCLA)

2 Structure of Talk
Brief introduction
Why develop math games?
Large-scale randomized controlled trial of the effectiveness of our games
Impact, best practices, and lessons learned

3 Introduction to CRESST
National Center for Research on Evaluation, Standards, and Student Testing
Co-Directors: Eva Baker, Li Cai
15+ PhD and 10+ MA/MS/BS research staff; graduate students, programmers, graphic artists, business and administrative support
Supported by federal (DoE, NSF, DARPA, ONR), state, and local agencies, foundations, non-profits, and business

4 CRESST Games for Learning R&D
Research: In what ways and under what conditions can video games be used to effectively teach, assess, and engage students?
Engineering: Develop or adopt design processes that lead to effective and engaging games
Analytics: Develop or adopt methods of collecting and analyzing large-scale game data
[Slide table: projects (IES/CATS, DARPA/Engage, ONR/CSUSB, PBS, Mobilize) by grade (6, K-3, College, K) and domain (math, physics, math, c/s), crossed against engineering activities (develop ontology and knowledge specifications; develop testbed; instrument games; develop games) and research activities (instructional/game designs; modeling user understanding; mining gameplay data; in-game adaptivity; rapid assessment generation; novel assessment formats; executing RCT; crowdsourcing data collection); checkmarks indicate which projects performed each activity]

5 Background: Why Develop Math Games?

6 Focus on Pre-algebra
Pre-algebra provides students with the fundamental skills and knowledge that underlie algebra
Gateway to more advanced mathematics and STEM majors
3rd–4th grade concepts remain unmastered well into college

7 What’s Wrong With This Picture?

8 8th Grade National Sample
[Two released items shown; 25% and 33% of the national sample answered incorrectly]

9 Remedial Math in College: Intermediate Algebra
[Three items shown; 27%, 26%, and 11% of students answered incorrectly]

10 Tensions: Games for Learning
A math game must balance competing pulls:
learning vs. fun
math vs. play
time vs. efficiency
choose to play vs. have to play
commercial grade vs. research grade
implicit instruction vs. explicit instruction
unobtrusive measurement (embedded) vs. obtrusive measurement (external)
simple tasks vs. complex tasks
abstract math vs. concrete math
conceptual vs. fluency

11 Project Goal
Develop and evaluate video games aimed at improving students’ knowledge and skills in pre-algebra topics
Focus on rational numbers and solving equations (Grade 6)
Use an engineering approach to systematically develop and test games
Conduct a randomized controlled trial to test the effectiveness of our games

12 Randomized Controlled Trial of Game Effectiveness

13 Research Question
Does playing CRESST-developed rational number games result in more learning of rational number concepts compared to playing a comparison set of games?

14 Sample
62 classrooms randomly assigned to condition
3 classrooms dropped because of schools’ technology
Final sample: 59 classrooms (30 treatment, 29 comparison); ~1,500 students
9 districts, 23 schools in California and Nevada; ethnically diverse
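The unit of assignment here is the classroom, not the student. A minimal sketch of that kind of cluster randomization (illustrative only, not the study's actual procedure; the function name and seed are invented):

```python
import random

def assign_classrooms(classroom_ids, seed=0):
    """Shuffle classrooms (the unit of assignment) and split them
    into treatment and comparison arms."""
    rng = random.Random(seed)
    ids = list(classroom_ids)
    rng.shuffle(ids)
    half = (len(ids) + 1) // 2  # treatment gets the extra classroom if odd
    return ids[:half], ids[half:]

# 62 classrooms were assigned in the study; 3 later dropped for technology reasons.
treatment, comparison = assign_classrooms(range(62))
print(len(treatment), len(comparison))  # 31 31
```

Assigning whole classrooms avoids contamination between conditions within a class, at the cost of requiring a multi-level analysis later.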

15 Procedure
Teachers integrated math games into their curriculum (rational number or control topic)
3 hours of professional development
Implementation window: January to April (pre-state testing)
 Pretest (30 min)
 10 gameplay days within window (40 min per occasion)
 Delayed posttest 1 week after last game (30 min)
$600 honorarium to each teacher

16 Measures

17 Results
Treatment condition (n = 30) played 4 games related to fractions
Comparison condition (n = 29) played 4 alternative games (solving equations)
No differences between conditions on the pretest
The treatment condition performed significantly higher on the posttest than the comparison condition

18 Teachers’ Perceptions; Students’ Perceptions

19 Efficacy Trial Results
Multi-level model shows a significant treatment effect, effect size = 0.6

Type of Intervention | Mean Effect Size
Instructional format | .21 a
Instructional component/skill training | .36 a
Focused on individual students | .40 a
Simulation and games | .34 b
Computer-assisted instruction | .31 b

a Lipsey, M. W., et al. (2012). Translating the Statistical Representation of the Effects of Education Interventions Into More Readily Interpretable Forms (NCSER ).
b Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. Routledge.
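The effect sizes in the table are standardized mean differences. A minimal sketch of that calculation (Cohen's d with a pooled standard deviation) on simulated posttest scores; the data, sample sizes, and seed are invented, and the study's actual estimate came from a multi-level model that also accounts for classroom clustering:

```python
import numpy as np

def cohens_d(treat, comp):
    """Standardized mean difference using the pooled standard deviation."""
    nt, nc = len(treat), len(comp)
    pooled_var = ((nt - 1) * np.var(treat, ddof=1) +
                  (nc - 1) * np.var(comp, ddof=1)) / (nt + nc - 2)
    return (np.mean(treat) - np.mean(comp)) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)
# Simulated posttest scores: treatment mean shifted by 0.6 SD (illustrative only).
comp = rng.normal(0.0, 1.0, 800)
treat = rng.normal(0.6, 1.0, 800)
print(round(float(cohens_d(treat, comp)), 2))
```

An effect size of 0.6 means the average treatment student scored 0.6 standard deviations above the average comparison student, which is why it compares favorably with the meta-analytic means in the table.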

20 Game Development Methodology

21 Hope is Not a Strategy

22 Launch Schedule
[Timeline figure, Y1–Y4: game development studies; procedure test; classroom dry run; study lock; launch]

23 Game Development Studies: 2 Years Before the RCT
Total of 24 small-scale studies, Grades 4–12
Settings: schools, after-school program, private school
 18 sites; 8 school districts; 4 states
Types of studies:
Small-scale tests, pre-post, think-alouds
 7 studies; total N = 238
Experimental studies of instructional variations
 10 studies; total N = 1,588
 Typically minutes
Assessment and methodology studies
 7 studies; experimental data plus 711 new subjects

24 Game Development and Testing Process
DEVELOPMENT: establish instructional sequence + assessments
PLAY/PAPER TESTING: small-scale (1–5 students), classroom-scale (30 students)
GAME PROTOTYPE + TESTING: small-scale (1–5 students), classroom-scale (30 students)
GAME EFFECTIVENESS TRIALS: 60+ kids per condition
LARGE-SCALE EFFICACY TEST: 59 classrooms (1,700 kids)

25 Fractions Games
Save Patch (unit, fractional pieces, adding fractions)
Tlaloc’s Book (inverse operations)
Wiki Jones (number line)
Rosie’s Rates (functions)

26 Solving Equations Games
Zooples (solving equations)
Expresso (transforming expressions)
AlgebRock (solving equations)
Monster Line (operations on positive and negative integers)

27 Game Development Studies Results
Empirical: instruction/feedback, less is more; narrative and avatar choice matter for engagement; collaboration helps low performers; repeated play improves learning; math measure validation; engagement scale validation; misconceptions and play strategies detectable via data mining; ...
Student and teacher self-reports, classroom observations
Game features: music, graphics, achievements (“cheevos”), avatar selection
Math not really an issue; high engagement; surprises
Technology issues

28 Game Development Studies Results
Developing games for learning: hard but doable
Big difference between teaching and practice
Games don’t have to be AAA quality
 Classroom context sets a low bar
Stickiness is possible: the more challenging, the more sticky
Game mechanics that require use of knowledge are a key design feature
Anecdotal collateral benefits of gameplay: more student engagement, more individual attention, better feedback to the teacher, increased student confidence, productive student-student interactions

29 Impact, Best Practices, and Lessons Learned

30 Impact: Developed Learning-Effective Games
6th-grade students learned from our fractions games, as measured by an external transfer test

31 Impact: New Methodology, Dissemination
New statistical methodology developed at CRESST (Li Cai, Kil-Chan Choi)
Technical reports craziness: CATS technical reports downloaded 69,456 times (4 yrs)
Journals (15), proceedings (7), chapters (9), technical reports (22), conference presentations (96)
Transfer of development approaches and methodologies to other game-focused projects at CRESST

32 Best Practices: Coherent Design Process
Ontologies and knowledge specifications guide what to teach in the game, what to measure, and what to focus on in teacher training

33 Common Core State Standards

34 Best Practices: Game Testbed to Accelerate R&D
[Slide shows an excerpt of a testbed configuration file (RNLogins.xml) with boolean settings and a Level entry]

35 Best Practices: Gameplay as a Data Source
Key design properties:
Game telemetry reflects descriptions (not inferences) of events connected to learning
Game telemetry is packaged in a structured format (i.e., usable)
Game mechanic requires use of math knowledge
Game allows the player to fail (i.e., commit errors)
Game requires the player to make decisions (do I do this or that?)
Player cannot progress without the requisite knowledge
Stage/level design is useful (stages = 1 concept; levels = variations of that concept); sawtooth curve
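The first two properties above, descriptive events in a structured format, can be illustrated with a minimal telemetry record. All field names and values here are invented for the sketch; the record describes what happened (which piece was placed where) rather than an inference about the learner:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class GameEvent:
    """One telemetry record: a description of an event, not a judgment."""
    timestamp_ms: int
    player_id: str
    stage: int     # one concept per stage
    level: int     # a variation of that concept
    action: str    # e.g. "place_piece"
    payload: dict  # raw details of the action

event = GameEvent(1713200000000, "p042", stage=3, level=2,
                  action="place_piece",
                  payload={"piece": "1/4", "target": "2/4", "correct": False})
line = json.dumps(asdict(event))  # one JSON line per event
restored = json.loads(line)
print(restored["payload"]["correct"])  # False
```

Keeping records descriptive means the inference step (e.g., flagging a partitioning misconception) can be run and re-run later over the raw event stream.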

36 Best Practice: Assume the Worst
1994: Is there a network? 2014: Can I access the network? Same problem 20 years later: networked vs. stand-alone
Option 1: Put everything on a USB stick (EXE, data)
Option 2: Bring 40 laptops to the classroom
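In code, "assume the worst" about classroom networks amounts to a local fallback path. A minimal sketch under invented names (the source only describes the USB-stick strategy, not an implementation):

```python
import json
from pathlib import Path

def load_config(fetch_remote, local_path):
    """Try the network first; on any failure, fall back to a bundled file
    (e.g., data shipped on the USB stick alongside the game executable)."""
    try:
        return fetch_remote()
    except Exception:
        return json.loads(Path(local_path).read_text())

# Simulate a classroom with no network access:
def no_network():
    raise ConnectionError("no route to host")

Path("local_config.json").write_text('{"levels": 12, "offline": true}')
cfg = load_config(no_network, "local_config.json")
print(cfg["offline"])  # True
```

The same pattern applies to telemetry: buffer events locally and upload them later, so a dead network never blocks gameplay or loses data.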

37 Lessons Learned
Assumption by game designers that making the game too “mathy” would destroy it
Assumption by game designers that, to be learning effective, the player must have a fluid and seamless experience
Assumption by researchers that, to be learning effective, every content detail had to be covered
Assumption by all that using games for learning purposes is a good idea

38 Lessons Learned
Developing games for learning: hard but doable
Learning issues should drive game design
Common design specifications focus the development effort
In-house design and development facilitate rapid iteration
Testbed capability is an R&D accelerant
Gameplay interaction can be a rich data source
Technology infrastructure is a source of uncertainty
Logistics is the last meter


40 Acknowledgements


42 1/3

43 Design: Game Mechanics
Design game mechanics to require the targeted knowledge
Leverage interaction to exercise players’ knowledge
Allow correct and incorrect player actions
Create decision points as part of gameplay
Use misconceptions to create game “traps” (puzzles)
Examples from a fractions game follow

44 Design: Game Mechanics Around Misconceptions
Partitioning misconception: students believe the denominator is the number of dividing lines rather than the number of spaces between them
[Figure: a unit partitioned into four pieces; each piece is 1/4, and together they sum to 4/4]

45 Design: Game Mechanics Around Misconceptions
Partitioning misconception: students believe the denominator is the number of dividing lines rather than the number of spaces between them
[Figure: the same unit mislabeled; counting the dividing lines labels each piece 1/3, so the four pieces “sum” to 4/3]
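The misconception on these two slides has a precise form: n interior dividing lines partition a unit into n + 1 equal pieces, so a unit cut by 3 lines yields fourths (4/4 in total), while counting the lines themselves labels each piece 1/3 and the whole "sums" to 4/3. A tiny sketch of the correct rule (function name invented):

```python
def denominator_from_lines(interior_lines: int) -> int:
    """n interior dividing lines split the unit into n + 1 pieces,
    so each piece is 1/(n + 1) -- not 1/n, the common misconception."""
    return interior_lines + 1

# A unit cut by 3 dividing lines: the misconception says thirds,
# but each piece is actually 1/4.
print(denominator_from_lines(3))  # 4
```

A game trap built on this rule lets players who count lines instead of spaces commit the error visibly, which is exactly the decision point the design guidelines above call for.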

46 Design: Game Mechanics Around Misconceptions
