A Prescriptive Adaptive Test Framework (PATFrame) for Unmanned and Autonomous Systems: A Collaboration Between MIT, USC, UT Arlington and Softstar Systems.

A Prescriptive Adaptive Test Framework (PATFrame) for Unmanned and Autonomous Systems: A Collaboration Between MIT, USC, UT Arlington and Softstar Systems
Dr. Ricardo Valerdi, Massachusetts Institute of Technology
August 11, 2010

Outline
- PATFrame team
- The challenge
- Test & evaluation decisions
- PATFrame features
- Use case
"Anything that gives us new knowledge gives us an opportunity to be more rational." - Herb Simon

Sponsors and Transition Partners [logo slide]

PATFrame Team: Tejeda, Cowart, Hess, Edwards, Deonandan, Valerdi, Kenley, Medvidovic, Ferreira, Ligett

The Challenge: Science Fiction to Reality
"You will be trying to apply international law written for the Second World War to Star Trek technology."
Singer, P. W., Wired for War: The Robotics Revolution and Conflict in the 21st Century (Penguin, 2009)

Science & Technology Background

Current state of UAS T&E:
- UAS T&E is focused on single systems
- One-shot planning for T&E and manual test strategy adaptation
- Value-neutral approach to test prioritization
- Autonomy not a key consideration
- Function-based testing
- Traditional acquisition process
- Physics-based, hardware-focused test prioritization and execution

Future state of UAS T&E:
- Systems of systems introduce complex challenges
- Accelerated & automated test planning based on rigorous methods
- Value-based approach to test prioritization
- Autonomy as a central challenge
- Mission-based testing
- Rapid acquisition process
- Multi-attribute decision making to balance cost, risk, and schedule of autonomous, software-intensive systems of systems (illustrated in the sketch below)
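One way to read the "multi-attribute decision making" item concretely is as a weighted-sum score over candidate tests. The following minimal sketch is an illustration only: the attribute names, weights, and cost figures are all invented, not PATFrame's published algorithm.

```python
# Illustrative only: attributes, weights, and figures are invented, not PATFrame's.
from dataclasses import dataclass

@dataclass
class CandidateTest:
    name: str
    value: float          # mission value of the information the test yields (0-1)
    risk_burndown: float  # fraction of open technical risk the test retires (0-1)
    cost: float           # $K to execute
    duration: float       # days on the range

def score(t: CandidateTest, w_value=0.4, w_risk=0.4, w_cost=0.1, w_sched=0.1,
          max_cost=500.0, max_days=30.0) -> float:
    """Weighted sum: reward value and risk burn-down, penalize cost and schedule."""
    return (w_value * t.value + w_risk * t.risk_burndown
            - w_cost * t.cost / max_cost - w_sched * t.duration / max_days)

tests = [
    CandidateTest("sense-and-avoid, two USVs", 0.90, 0.70, cost=120, duration=5),
    CandidateTest("automated coupling",        0.60, 0.50, cost=40,  duration=2),
    CandidateTest("full-fleet COLREGS run",    0.95, 0.80, cost=400, duration=20),
]
for t in sorted(tests, key=score, reverse=True):  # highest-scoring tests first
    print(f"{t.name}: {score(t):.2f}")
```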

PATFrame Objective
To provide a decision support tool encompassing a prescriptive and adaptive framework for UAS SoS testing.
- PATFrame will be implemented as a software dashboard that enables improved decision making for the UAS T&E community
- Focused on addressing BAA topics TTE-6 (Prescribed System of Systems Environments) and MA-6 (Adaptive Architectural Frameworks)
- The three-university team (MIT-USC-UTA) draws on experts in test & evaluation, decision theory, systems engineering, software architectures, robotics, and modeling
Based on Valerdi, R., Ross, A. and Rhodes, D., "A Framework for Evolving System of Systems Engineering," CrossTalk - The Journal of Defense Software Engineering, 20(10), 28-30.

Three Stages in Decision Making
[Diagram: Simon's three stages of decision making, commonly given as intelligence, design, and choice.]
Simon, H. (1976), Administrative Behavior (3rd ed.), New York: The Free Press.

Prescriptive Adaptive Test Framework
[Diagram: a prescriptive, adaptive test strategy and test infrastructure interacting with the system under test.]

Time Scale for Testing Decisions
[Figure: testing decisions arranged by time scale. Test Execution (real time): minutes; Data Collection, analysis & reprioritization: hours; Test Planning (in an SoS environment): days; Test Development (i.e., design for testability): months; Long-Term Planning Decisions/investments: years. The PATFrame scope is marked across this spectrum.]

PATFrame Initial Focus: Testing an Autonomous System in an SoS Environment
[Figure: a three-axis difficulty space. Axes: complexity of the system under test (single system to SoS), autonomy of the system under test (human to AI), and net-centricity of the environment (none to SoS). Marked regions: testing a system in an SoS environment, testing an SoS, and testing an SoS in an SoS environment (the ultimate goal), with annotations for the net-centric focus, the UAST focus, the DARPA Urban Grand Challenge, and the use case (UAS in SoS).]

Prescribed System of Systems Environment
- Normative. Goal: construct the theoretical best SoS test. Metric set at best levels: Successful SoS Test = f(metricA, metricB, etc.)
- Descriptive. Goal: capture actual SoS tests. Metrics at state-of-the-practice levels: actual SoS tests include metricA, metricC, etc.
- Prescriptive. Goal: a synthetic framework for SoS testing at the single- and multi-program level.
[Figure: for each metric (MetricA, MetricB, MetricC, ..., MetricN), bars compare the normative (best) level, the descriptive state-of-the-practice level, the descriptive actual level, and the potential of a new test A, i.e., test A's contribution to state-of-the-practice success, bounded by the limit of MetricA.]
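The normative-descriptive-prescriptive split can be read as a gap analysis: compare observed (descriptive) metric levels against the theoretical best (normative) levels, then prescribe where a test program should improve. A toy sketch, with invented metric names and levels:

```python
# Toy gap analysis; metric names and levels are invented for illustration.
normative = {"metricA": 0.95, "metricB": 0.90, "metricC": 0.80}  # theoretical best
actual    = {"metricA": 0.70, "metricC": 0.75}                   # observed in tests

# The largest gaps to the normative best are the prescriptive priorities.
gaps = {m: best - actual.get(m, 0.0) for m, best in normative.items()}
for metric, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{metric}: gap to normative best = {gap:.2f}")
```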

Integrated Test Management

Primary inputs to PATFrame:
- Test requirements
- SUT requirements, capabilities, & architecture
- Schedule, resource, and cost constraints
- Mission needs and goals
- Test resources & associated architecture (infrastructure)

Primary outputs for UASoS T&E planners:
- Feasible T&E test planning options
- UASoS test strategy estimated cost
- UASoS test strategy, recommended tests & their sequencing
- UASoS test strategy risks
- Undesirable emergent behavior
- Recommended and automatic adaptations to UASoS T&E issues

Current Efforts: Ontology and Architecture Frameworks
- Ontology
- Meta-modeling
- Discrete event simulation

Current Efforts: Normative Models
- Cost modeling
- Real options
- Value-based testing

PATFrame Workflow

Inputs:
- UASoS & test settings
- UASoS reqts & capabilities
- UASoS architecture
- Test objectives (including test types / list of tests)
- Test resources & settings
- Interdependencies of major inputs

Data processing:
- Discrete event simulation
- Cost model
- Real options analysis
- Design of experiments analysis
- Value-based analysis (algorithm)
- DSM analysis

Outputs:
- Cost estimate
- List of prioritized tests
- List of predicted undesirable emergent behaviors / conditions
- List of suggested adaptations to undesirable emergent behaviors / conditions
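A minimal data-structure sketch of this workflow's contract follows. The field names paraphrase the slide; run_patframe and its internals are placeholders, not the real tool's schema.

```python
# Sketch only: field names paraphrase the slide; run_patframe is a placeholder.
from dataclasses import dataclass

@dataclass
class PATFrameInputs:
    uasos_and_test_settings: dict
    reqts_and_capabilities: list
    architecture: dict
    test_objectives: list      # including test types / list of tests
    test_resources: dict
    interdependencies: list    # interdependencies of the major inputs

@dataclass
class PATFrameOutputs:
    cost_estimate: float
    prioritized_tests: list
    predicted_emergent_behaviors: list
    suggested_adaptations: list

def run_patframe(inp: PATFrameInputs) -> PATFrameOutputs:
    # Placeholder pipeline per the slide: discrete event simulation, cost
    # model, real options analysis, design of experiments, value-based
    # analysis, and DSM analysis, each consuming `inp` and filling outputs.
    return PATFrameOutputs(cost_estimate=0.0, prioritized_tests=[],
                           predicted_emergent_behaviors=[],
                           suggested_adaptations=[])
```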

The next six slides repeat this workflow diagram, highlighting each data-processing element in turn: the cost model, the design of experiments analysis, the dependency structure matrix (DSM) analysis, the value-based analysis, the real options analysis, and the discrete event simulation. The inputs and outputs are identical to the overall PATFrame workflow above. A sketch of DSM-based test sequencing follows.
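For the DSM analysis step, one standard operation is sequencing tests so that prerequisites run first. Below is a minimal stand-in using a binary dependency structure matrix and the Python standard library's topological sort; the test names and dependencies are invented.

```python
# Stand-in for a DSM sequencing step; dsm[i][j] == 1 means test i depends on test j.
from graphlib import TopologicalSorter  # Python 3.9+

tests = ["sensor calibration", "single-USV sense & avoid",
         "two-USV coupling", "fleet COLREGS scenario"]
dsm = [
    [0, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 0],
]
deps = {tests[i]: {tests[j] for j in range(len(tests)) if dsm[i][j]}
        for i in range(len(tests))}
print(list(TopologicalSorter(deps).static_order()))
# -> ['sensor calibration', 'single-USV sense & avoid',
#     'two-USV coupling', 'fleet COLREGS scenario']
```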

Use Case 1: Define and Prioritize Tests
Operational thread: NAVSEA unmanned surface vehicles (USVs) must comply with the International Regulations for Preventing Collisions at Sea (COLREGS)*
T&E thread:
- Action to avoid collision shall be: positive, obvious, made in good time**
- Validate the USVs' ability to self-adapt: learn, sense & avoid, perform automated coupling, optimize adaptive software***
*Hansen, E. C., USV Performance Testing, April 14.
**Part B, Sect. I, Sec. 8, Convention on the International Regulations for Preventing Collisions at Sea, IMO (International Maritime Organisation).
***Engineering Autonomous System Architectures.

Use Case 1: Define and Prioritize Tests
- Use case name: Test selection and prioritization for NAVSEA UASoS
- Goal: Define and prioritize a set of tests for an unmanned & autonomous SoS comprised of NAVSEA's fleet of unmanned surface vehicles (USVs)
- Summary: An SoS cannot be exhaustively tested, so the tests chosen must provide the most value within the allocated time and budget
- Actors: Test planner personnel, program manager, scheduler, range safety officer, owners of USVs, regulatory organization(s)
- Components: Dependency Structure Matrix (DSM) modeling interface, DSM clustering algorithm, XTEAM meta-modeling environment, value-based testing algorithm, LVC environment
- Normal flow:
  1. Input information about each USV, such as architecture, sensors, communication attributes, etc.
  2. Integrate necessary LVC assets
  3. Input the types of possible tests to assess COLREGS compliance
  4. Input desired confidence intervals and compliance criteria for COLREGS
  5. PATFrame outputs a prioritized set of tests to perform and the expected level of confidence of COLREGS compliance after each test (see the sketch below)
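Step 5's "expected level of confidence of COLREGS compliance after each test" could, for instance, be computed with the rule of three: after n scenario runs with zero violations, the 95% lower confidence bound on the compliance probability is roughly 1 - 3/n. A sketch under that assumption (not necessarily PATFrame's actual statistic):

```python
# Rule of three: with n clean runs (zero COLREGS violations observed), the
# 95% lower confidence bound on the compliance probability is about 1 - 3/n.
def lower_bound_compliance(n_clean_runs: int) -> float:
    return max(0.0, 1.0 - 3.0 / n_clean_runs)

for n in (10, 30, 100, 300):
    print(f"after {n:3d} clean runs: compliance >= "
          f"{lower_bound_compliance(n):.1%} (95% conf.)")
```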

Use Case 1: Define and Prioritize Tests

Testing to Reduce SoS Risks vs. the Risks of Performing Tests on an SoS

SoS risks:
- What are the unique risks for UASs? For UASs operating in an SoS environment?
- How do you use testing to mitigate these risks?
- What metrics are you using to measure the level of risk?

Risks of testing an SoS:
- What unique programmatic risks impact your ability to test UASs? To test UASs operating in an SoS environment?
- What methods do you use to mitigate these risks?
- What metrics are you using to measure the level of programmatic risk in testing?

Example PATFrame Tool Concept
- Question: When am I done testing?
- Technology: Defect estimation model (trade quality for delivery schedule)
- Inputs: Defects discovered
- Outputs: Defects remaining, cost to quit, cost to continue (see the sketch below)
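A defect estimation model of this kind is often built on an exponential discovery curve: if defects are discovered at a rate proportional to the number remaining, the remaining count decays exponentially, and "cost to quit" is the expected field cost of the escaped defects. A sketch with invented parameters (the actual PATFrame model is not specified here):

```python
# Sketch of a when-am-I-done model; N0, k, and the costs are invented examples.
import math

N0 = 120.0            # estimated total defects (fit from discovery history)
k = 0.15              # discovery rate per week of testing
cost_per_week = 50.0  # $K per week to keep testing
cost_per_escape = 25.0  # $K expected field cost per escaped defect

def defects_remaining(t_weeks: float) -> float:
    """Exponential-decay defect model: remaining = N0 * exp(-k * t)."""
    return N0 * math.exp(-k * t_weeks)

for t in (4, 8, 12, 16):
    quit_cost = defects_remaining(t) * cost_per_escape
    print(f"week {t:2d}: ~{defects_remaining(t):5.1f} defects remaining, "
          f"cost to quit ~${quit_cost:,.0f}K vs ${cost_per_week:.0f}K/week to continue")
```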

When am I done testing?

Technical Specifications (* = to be validated with stakeholders at workshop #3)

Parameter: Automation level in predicting undesirable emergent behavior
- Current performance level: Manual
- Current target*: Semi-automated (moderate: based on identified best practices and rules from SMEs)
- Ultimate goal*: Semi-automated (extensive: current target plus collected patterns in SUTs, test infrastructure, and associated test settings)

Parameter: Probability of automatically predicting actual undesirable emergent behavior before testing (emergent behavior will occur)
- Current performance level: No capability

Parameter: Rate of falsely predicting emergent behavior automatically and before testing (emergent behavior won't occur: false positive)
- Current performance level: No capability
- Current target*: 1 per 100 scenarios
- Ultimate goal*: 1 per 1000 scenarios

PATFrame Technology Maturation Plan

1. Applied Research Phase (TRL 3-4)
- 1A. Define Prescribed SoS Environments: Critical UAS SoS T&E concepts are validated through a normative, descriptive, and prescriptive framework.
- 1B. Implement Adaptive Architectural Frameworks: Basic ideas are implemented in a prototype and integrated through a UAS SoS T&E ontology model, a system dynamics model, and SoS architecture notions for dynamic adaptation of SoS T&E.

2. Development Phase (TRL 4-5)
- 2A. Develop Decision Support System: The prescriptive framework (PATFrame) is developed as a decision support system to enable better outcomes of UAS T&E.
- 2B. Refine Use Cases: Use cases identified by transition partners are used to validate PATFrame in relevant environments. Simulations are performed in specific operational domains (ground, air, space, water, underwater).

3. Deployment Phase (TRL 5-6)
- 3A. Conduct Focused Experiments: Readiness is demonstrated through focused experiments in simulated operational environments (i.e., Future Combat Systems).
- 3B. Deploy Decision Support System: PATFrame, with documented test performance demonstrating agreement with analytical predictions, is deployed across Army, Navy, and Air Force facilities engaged in UAS T&E.