Command Post of the Future Limited Objective Experiment-1. Presented to: Information Superiority Workshop II: Focus on Metrics, 28-29 March 2000. Presented by: Dr. Richard E. Hayes, Evidence Based Research, Inc.

Presentation transcript:

Command Post of the Future Limited Objective Experiment-1
Presented to: Information Superiority Workshop II: Focus on Metrics, 28-29 March 2000
Presented by: Dr. Richard E. Hayes, Evidence Based Research, Inc., Spring Hill Road, Suite 250, Vienna, VA

2 Agenda
- HEAT and Quality of Information
- CPOF and Quality of Awareness
- Summary

3 Headquarters Effectiveness Assessment Tool (HEAT)
- HEAT is a set of consistent procedures that measure the effectiveness of a military headquarters or command center
- Quantitative, objective, and reproducible assessment of the quality of C2 processes and the overall effectiveness of directives
- Applied more than 50 times to exercises and real-world applications in more than 200 command centers
- HEAT views a headquarters as analogous to an adaptive control system that influences its environment through plans (directives)
- A headquarters can be judged by the viability of its plans
  – Remain in effect for the intended period
  – Include contingencies
- Diagnostic scores for the quality of C2 processes

4 HEAT Analytic Structure
[Diagram: headquarters functions Monitor, Understand, Develop Alternative Actions, Predict Consequences, Decide, and Direct, together with Awareness, linked by Query and Inform flows to the environment: own and enemy forces, physical, political, and economic factors.]

5 HEAT Information Measures
- Completeness: Are all the required elements present?
- Correctness: Does the information match the truth?
- Consistency: Is the information the same across all nodes?
- Precision: Was the information presented at the proper level of precision?
- Currency (Latency): How old is the information?
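As an illustration only, the following Python sketch shows how the first two of these measures could be scored against a ground-truth record; the field names and scoring rules are hypothetical, not the actual HEAT procedures.

```python
# Hypothetical sketch: scoring completeness and correctness of a reported
# situation picture against ground truth. Field names and scoring rules are
# illustrative, not the actual HEAT procedures.
from dataclasses import dataclass

@dataclass
class Element:
    name: str        # e.g., "enemy armor battalion"
    location: str    # e.g., a grid reference
    status: str      # e.g., "advancing"

def completeness(reported: dict[str, Element], required: set[str]) -> float:
    """Fraction of required information elements that are present at all."""
    return len(required & reported.keys()) / len(required)

def correctness(reported: dict[str, Element], truth: dict[str, Element]) -> float:
    """Fraction of reported elements whose content matches ground truth."""
    if not reported:
        return 0.0
    matches = sum(1 for key, elem in reported.items() if truth.get(key) == elem)
    return matches / len(reported)

# Usage: score a single headquarters snapshot
truth = {"armor_bn": Element("enemy armor battalion", "NK1234", "advancing")}
report = {"armor_bn": Element("enemy armor battalion", "NK1234", "halted")}
print(completeness(report, {"armor_bn"}))  # 1.0 -- the element is reported
print(correctness(report, truth))          # 0.0 -- its status does not match truth
```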

6 Command Post of the Future
The goal of CPOF is to shorten the commander's decision cycle to stay ahead of the adversary's ability to react.
Command Post of the Future (CPOF) is a DARPA program that aims to:
- Increase Speed and Quality of Command Decisions
  – Faster recognition and better understanding of the changing battlefield situation
  – Faster and more complete exploration of available courses of action
- Provide More Effective Dissemination of Commands
  – COA capture for dissemination of commander's intent
  – Status and capability feedback from deployed operators
- Enable Smaller, More Mobile and Agile Command Structures
  – More mobile, distributed command element
  – Smaller support tail and reduced deployment requirements

7 Limited Objective Experiment-1 (LOE-1)
Hypothesis: Tailored visualizations will improve Situation Awareness
MOE:
- Correctness of Situation Awareness comprehension
- Quality of Pattern Recognition
- Level of Entity Retention

8 LOE-1 Scenario Space
[Chart: the four test scenarios (Situation 4, Situation 5, Situation 10, Situation 13) arrayed across the scenario space, spanning Force-on-Force to Insurgency and Less Complex to More Complex.]

9 Data Collection
- 157 valid runs over 5 days (98% of design goal of 160)
- Data Collection Vehicle: Debrief Questionnaire
- Questions divided into four major elements: overall Situation Assessment, Patterns, Elements, and a Sketch of the situation
- Military experience data collected for each subject
- Run randomization compensated for variations between coders and subjects
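The slide does not describe the actual randomization scheme; purely as an assumption, the sketch below shows one simple way run order could be counterbalanced across subjects, and checks the 98% completion figure (the 20-subject, 8-condition breakdown is invented for illustration).

```python
# Illustrative sketch only: one possible counterbalancing scheme, not the
# actual LOE-1 design. Subject count and condition labels are assumptions.
import itertools
import random

conditions = list(itertools.product(
    ["Control", "CPOF"],                                             # treatment
    ["Situation 4", "Situation 5", "Situation 10", "Situation 13"],  # scenario
))

random.seed(1)
subjects = [f"S{i:02d}" for i in range(1, 21)]              # assumed 20 subjects
# each subject sees all 8 treatment/scenario pairs in an independent random order
schedule = {s: random.sample(conditions, len(conditions)) for s in subjects}

runs_planned = len(subjects) * len(conditions)              # 20 x 8 = 160 runs
runs_valid = 157
print(f"valid runs: {runs_valid}/{runs_planned} = {runs_valid / runs_planned:.1%}")
# -> valid runs: 157/160 = 98.1%
```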

10 Top Level Measures
- Overall Situation Awareness
  – Picture of the battlespace: enemy posture and intentions, friendly possibilities for favorable action
- Pattern Recognition
  – Dynamics of the battlespace: relative positions, strength, and capabilities
- Element Retention
  – Locations, status, events over time
- Ability to draw a general Sketch with relative positioning of important entities in space and time

11 Debrief Questionnaire
1. You have three minutes to brief your commander on your assessment of the situation; what do you tell him?
1a. In this situation, what are the friendly: Opportunities? Risks?
1b. What are the adversary's: Offensive capabilities? Defensive capabilities? Vulnerabilities? Intentions?
1c. What are the vulnerabilities for own forces?
2a. If you are not certain about adversary intentions and capabilities, what are the different possibilities?
2b. If you are not certain, what information would you require to resolve this uncertainty?
3a. What specific elements or element combinations lead you to your situation assessment?
3b. Please make a sketch of the situation, including the specific elements or element combinations cited in your previous answer.
3c. Was any information you would find valuable missing from the presentation of the situation?
4. What was good and/or bad about this display? What would you like to see added, removed, or changed?

12 Metrics
- Overall
  – Prompted: % of key components; wrong inferences; added inferences
  – Unprompted: % of key components; wrong inferences; added inferences
- Patterns: % of pre-identified; wrong; added
- Elements: % of important elements; wrong; added
- Sketch: % of pre-identified elements; wrong; added; temporal pattern; geographic pattern
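A hypothetical Python sketch of how one of these scores might be computed from a debrief response; the component names and the exact scoring rules are invented for illustration and are not the LOE-1 coding scheme.

```python
# Hypothetical scoring sketch for the "Overall" metric: percent of
# pre-identified key components cited, plus counts of wrong and added
# inferences. Component names and rules are illustrative only.
def score_overall(response: set[str], key_components: set[str],
                  ground_truth: set[str]) -> dict[str, float]:
    cited = response & key_components                     # key components mentioned
    wrong = response - ground_truth                       # statements contradicted by truth
    added = (response - key_components) & ground_truth    # true but not pre-identified
    return {
        "pct_key_components": 100 * len(cited) / len(key_components),
        "wrong_inferences": len(wrong),
        "added_inferences": len(added),
    }

key = {"enemy main effort in north", "friendly flank exposed"}
truth = key | {"bridge at NK1234 intact"}
answer = {"enemy main effort in north", "bridge at NK1234 intact", "enemy withdrawing"}
print(score_overall(answer, key, truth))
# {'pct_key_components': 50.0, 'wrong_inferences': 1, 'added_inferences': 1}
```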

13 CPOF Technologies Significantly Outperform Control in Overall Scores
- Prompted: N=157, p=.007
- Unprompted: N=157, p=.058
Interpretation: CPOF technologies generated
  – Better Situation Awareness (higher mean, x̄)
  – Performance that improves further when responses are prompted
[Chart: overall-score distributions for Control and CPOF Technologies, annotated with mean (x̄) and standard deviation (s).]
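The slides report p-values for the Control vs. CPOF comparisons but do not name the statistical test used; the sketch below shows one plausible analysis (a two-sample Welch t-test on per-run overall scores, via SciPy) with synthetic demo data, not LOE-1 results.

```python
# Sketch of one plausible analysis for this comparison. The test choice
# (Welch's two-sample t-test) and the scores are assumptions: the slide does
# not state how its p-values were computed, and the data below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=55, scale=12, size=78)   # synthetic Control scores
cpof = rng.normal(loc=62, scale=12, size=79)      # synthetic CPOF scores (157 runs total)

t_stat, p_value = stats.ttest_ind(cpof, control, equal_var=False)
print(f"mean Control = {control.mean():.1f}, mean CPOF = {cpof.mean():.1f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```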

14 CPOF Technologies Significantly Outperform Control in Complex Situations
- Unprompted, Prompted: N=78, p=
Interpretation: CPOF Technologies generated better Situation Awareness (higher mean, x̄) in complex situations
[Chart: score distributions for Control and CPOF Technologies, annotated with mean (x̄) and standard deviation (s).]

15 Treatment B Outperforms Treatment A in Overall Situation Awareness in Situation 13
- Prompted: N=29, p=.002
- Unprompted: N=29, p=.073
Interpretation: Treatment A used an icon visualization scheme that subjects stated was confusing
[Chart: score distributions for Treatment A and Treatment B, annotated with mean (x̄) and standard deviation (s).]

16 The Force-on-Force Visualization Treatments
[Screenshots of the three visualizations compared in the force-on-force scenarios:]
- Today's Technology: allegiance color coding
- Treatment B: blobs and allegiance color coding
- Treatment A: function color coding

17 Summary
- Information does not equal awareness
- Natural layers: Overall, Patterns, and Elements
- Measures must always be developed in a military context
[Diagram: the Overall, Patterns, and Elements layers, labeled with Incorrect and Richness.]