030 – 13th ICCRTS. Coarse-grained After Action Review in Computer Aided Exercises. Geoff Hone PhD, Ian Whitworth & Andy Farmilo, Department of Informatics and Sensors.


030 – 13th ICCRTS
Coarse-grained After Action Review in Computer Aided Exercises
Geoff Hone PhD, Ian Whitworth & Andy Farmilo
Department of Informatics and Sensors, Defence College of Management and Technology, Cranfield University at the Defence Academy of the United Kingdom, Shrivenham SN6 8LA, UK
i.r.whitworth; cranfield.ac.uk
& David Swift PhD, Training Division, HQ Land Forces, UK

Why call it “coarse”?
After-Action Reviews (AAR) look at what has happened, in order to improve the learning and training experience. They are thus a form of performance assessment.
Many experts believe that learning is facilitated by using “formative” assessments:
- early assessments used to guide the learning process
- they can (and should) be very “coarse-grained”

In contrast, most After-Action Review (AAR) tools (e.g. UPAS, Stripes, Exact, TAARUS, etc.) look at the simulator log-files. They tell WHAT happened, but can miss the WHY. An Instructor, Observer-Controller, etc., who gives coarse-grained, accurate WHY data first can enhance the learning experience from the WHAT data.

The assessment continuum
From Subjective Unstructured, through Against Norms and Against Criterion, to Numerical Analysis:
- Subjective Unstructured: traffic-light systems, and “That was OK, but …”
- Against Norms: “Other trainees did it in two hours; you took three …”
- Against Criterion: “We expect that exercise to be completed with a loss of effectives below 5% …”
- Numerical Analysis: “Your figures are: 3.2 rounds per kill from main armament; only 2 first-round kills; mean advance rate of 20 metres per minute; etc. …”

Take a real-time example: a junior officer leads a troop of four tanks, advancing cross-country, in a SIMNET exercise. His force is engaged by enemy fire, and one tank is lost. A second (and then a third) tank is sent forward to look, and is also lost. Detailed AAR will tell him the exact location of each tank and the exact time when it was hit. Coarse AAR could tell him WHY they were hit.

[Figure: terrain view illustrating Poor Terrain Awareness: ground partly hides house; clear signs of dead ground; base of tree not visible]

Coarse (early or formative) assessment
On the continuum (Subjective Unstructured … Against Norms … Against Criterion … Numerical Analysis), coarse assessment asks three quick questions:
- How Hard?
- How Successful?
- How Good (Planning)?
How can we make these early assessments, and make them quickly?

How Hard?
Just how difficult was that exercise? If you don’t know HOW HARD, how can you rate performance? A tool to help with this is being developed, based on SME judgements, which can give a difficulty rating within seconds: CADI, the Combined Arms Difficulty Index.

CADI

The CADI approach is a “neural-net-in-a-spreadsheet”. It breaks the problem down into:
- Factors
- Levels within each factor
The levels are weighted for relative difficulty, and the weighted levels are summed for each factor. Each factor is then reweighted by reference to all other factors, all factors are summed, and the total value is corrected to fall within the desired scale.
The spreadsheet is easy; the trick is getting data from an SME. The result is a consistent replication of SME judgement.
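The chain of weighted sums described above can be sketched in a few lines. This is a minimal illustration only, not the actual CADI: the factor names, level weights, and 0–100 output scale below are invented, whereas the real values come from SME judgement.

```python
# Illustrative sketch of the CADI-style weighted-sum logic: weighted levels
# are summed per factor, factors are reweighted and summed, and the total is
# corrected onto the desired scale. All names and numbers here are invented.

def cadi_score(selections, factors, factor_weights, scale_max=100.0):
    """selections maps each factor to the list of level names chosen."""
    # Sum the difficulty weights of the chosen levels within each factor
    factor_totals = {
        f: sum(levels[lvl] for lvl in selections.get(f, []))
        for f, levels in factors.items()
    }
    # Reweight each factor by reference to the others, then sum all factors
    raw = sum(factor_totals[f] * factor_weights[f] for f in factors)
    # Correct the total to fall within the desired scale (here 0..scale_max)
    max_raw = sum(sum(levels.values()) * factor_weights[f]
                  for f, levels in factors.items())
    return scale_max * raw / max_raw

# Hypothetical SME-derived data for two factors
factors = {
    "terrain": {"open": 1.0, "mixed": 2.0, "close": 3.0},
    "enemy":   {"static": 1.0, "mobile": 2.5},
}
factor_weights = {"terrain": 1.0, "enemy": 1.5}

score = cadi_score({"terrain": ["close"], "enemy": ["mobile"]},
                   factors, factor_weights)
```

In the real tool this arithmetic lives in spreadsheet cells; as the slide notes, the computation is the easy part, and the work is in eliciting consistent level and factor weights from an SME.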

Future CADI development
The principle is simple; getting data from an SME is hard. But:
- The approach can be used for any domain where multiple variables have to be integrated and quick SME-type decisions are needed.
- The example just shown related to Armour; Infantry or Air Assault may require a different index.
- The example ran as an Excel overlay; it could be programmed as a stand-alone tool.
- Part of the Cranfield Cognitive Toolset (CCT) can help with sorting the levels into order of difficulty.

The next two quick assessments: How Successful? How Good (Planning)?
The concept here is to use a survey and assessment tool that:
- is fast to use
- is simple to use
- collects subjective data
- informs the trainee (it must be formative)
- uses standard criteria

We have the assessment tool. Example questions:
- How well did these orders convey your intent?
- How well was the terrain used?
Each is rated on a scale from “Very well” to “Poorly”.

The Tool Appearance
[Figure: a slider for “How well was the terrain used?”, with anchors running from “Poorly” through “OK” to “Very well”]

The Question Set
This must be developed in collaboration with SMEs. It must relate well to:
- the Combat Estimate (or 6-Step Estimate)
- the 5-Paragraph model of Orders
The questions may need to vary between Armour, Infantry, and other Arms, and between different levels of Command.

The Question Format
Questions are always phrased as a matter of degree:
- How clearly are Routes and Locations indicated?
- To what extent are waypoint timings achievable?
- How well is artillery support de-conflicted?
- How well is air support co-ordinated?
- How clearly are movement bounds shown?
- To what extent are there omissions?
- How well is the terrain being used?

The Tool Output
The response is collected as a subjective value. Within the tool it is converted into percentage data, which can be turned into charts, tables, or traffic-light colours. Hence, an EXCON or DS could say: “Your plan only got a rating of 55%. You were weak on ‘Use of Terrain’, and you will see how this affected your subsequent performance.” The trainee is now prepared for the detailed AAR.
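A minimal sketch of that conversion, assuming a 0–10 slider and invented traffic-light cut-offs (the slides do not give the tool’s actual ranges or thresholds):

```python
# Illustrative conversion of a subjective slider response into percentage
# data and a traffic-light colour, as described above. The 0-10 slider range
# and the 40%/70% colour cut-offs are assumptions, not the tool's values.

def to_percentage(raw, slider_min=0.0, slider_max=10.0):
    """Map a raw slider position onto a 0-100% scale."""
    return 100.0 * (raw - slider_min) / (slider_max - slider_min)

def traffic_light(pct):
    """Bucket a percentage into a traffic-light colour (assumed cut-offs)."""
    if pct >= 70.0:
        return "green"
    if pct >= 40.0:
        return "amber"
    return "red"

# Hypothetical responses to two of the questions from the question set
responses = {"Use of Terrain": 5.5, "Orders conveyed intent": 8.0}
report = {q: (to_percentage(r), traffic_light(to_percentage(r)))
          for q, r in responses.items()}
```

With data in this shape, the per-question percentages feed charts or tables directly, and the colour buckets give the EXCON or DS the at-a-glance traffic-light view before the detailed AAR.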

The Cranfield Cognitive Toolset
One of several freeware tools developed by Cranfield University, Defence College of Management and Technology, at the Defence Academy of the UK.

Conclusion
Coarse-grained, formative AAR has the potential to enhance the training process, BUT:
- it must be easy to use
- it must be easy to impart to the trainee
- exercises should be graded for difficulty
Given these, the trainee can get maximum benefit from a detailed AAR.