
1 AFOTEC Case Study: F-22 Air Combat Simulation (ACS) Verification and Validation (V&V)
FAA V&V Conference, 20 Oct 2011
Jeff Olinger, Technical Director

2 Overview
AFOTEC Background
VV&A Process
ACS Description/Capabilities
V&V Requirements
V&V Plan/Execution
VV&A Results
Lessons Learned

3 AFOTEC Mission
AFOTEC tests and evaluates new war-fighting capabilities in operationally realistic environments, influencing and informing national resource decisions.
- Not all Air Force programs: ACAT I, ACAT II, and programs on DOT&E oversight
- Realistic battlespace: as realistic as possible and practical, using live, virtual, and constructive simulation as required
- Fact-based (or knowledge-based), decision-quality information for decision makers
- Effectiveness: degree of mission accomplishment when used by representative personnel
- Suitability: degree to which a system can be placed in field use, considering availability, compatibility, transportability, interoperability, reliability, maintainability, safety, and training
- Mission capability: additionally, AFOTEC determines the overall mission capability of the system. This includes evaluation of capabilities provided, missions supported, operational impacts, and decisions supported: the overall degree of mission accomplishment considering operational cost, aspects of the mission, and operational conditions (Fully Mission Capable, Mission Capable, Partially Mission Capable, Not Mission Capable, and Not Determined)

4 Scope & Responsibilities
Independent operational test agency for the AF; reports directly to CSAF
Operationally tests all Acquisition Category (ACAT) I, II, & oversight programs; required by USC Title 10
Guidance & direction from the DoD 5000 series
ACAT I: > $365M RDT&E or > $2.19B procurement
ACAT II: > $140M RDT&E or > $660M procurement

AFOTEC answers only to CSAF; its commander reports directly to CSAF. AFOTEC is part of the acquisition community, but independent by law. ACAT I programs are major defense acquisition programs (MDAPs) requiring eventual expenditure of more than $365 million for RDT&E or more than $2.19 billion for procurement. ACAT II programs are major systems requiring eventual expenditure of more than $140 million for RDT&E or more than $660 million for procurement. The OSD DOT&E oversight list considers: acquisition category level; potential for becoming an acquisition program (such as an Advanced Concept Technology Demonstration project or pre-MDAP); stage of development or production; whether the program is subject to DAES reporting; congressional and DoD interest; programmatic risk (cost, schedule, performance); past history of the developmental command with other programs; relationship to other systems as part of a system-of-systems; and technical complexity of the system. NSS was rescinded in Mar 09 by USD(AT&L); space programs now follow the DoD 5000 series with waivers as approved, and space updates to DoD have been submitted. AFOTEC is organizationally independent: not separate/"black hat"; it works with users, but without influence. The ACAT dollar thresholds are illustrated in the sketch below.
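A minimal sketch of the ACAT dollar thresholds quoted above, in Python. This is illustrative only: the function name is invented, the figures are the then-year thresholds from this slide, and actual ACAT designation involves considerably more than dollar amounts.

```python
# Illustrative only: classify a program against the ACAT I/II dollar
# thresholds quoted on this slide (then-year $ millions). Actual ACAT
# designation involves far more than these thresholds.

def acat_by_dollars(rdte_musd: float, procurement_musd: float) -> str:
    """rdte_musd / procurement_musd: expenditures in $ millions."""
    if rdte_musd > 365 or procurement_musd > 2190:
        return "ACAT I (MDAP)"
    if rdte_musd > 140 or procurement_musd > 660:
        return "ACAT II (major system)"
    return "below ACAT II dollar thresholds"

print(acat_by_dollars(rdte_musd=400, procurement_musd=500))  # ACAT I (MDAP)
print(acat_by_dollars(rdte_musd=150, procurement_musd=700))  # ACAT II (major system)
```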

5 Transparent Planning and Operations
Roles & missions: operational testers test the "as is" system; developmental testers test the "to be" system.
Government developmental test exists to insert user inputs early into the design and development cycle. Contractual agreements require trained government DT professionals as partners with industry; otherwise, unconstrained cost growth and delays in fielding can occur.

6 Roles and Responsibilities
DOT&E: issues DoD OT&E policy & procedures; approves operational test plans for test adequacy; reviews & analyzes results of OT&E for MDAPs; provides independent assessments to SecDef, USD(AT&L), & Congress
AFOTEC: Initial Operational Test & Evaluation, Qualification Operational Test & Evaluation, Follow-on Test & Evaluation, Operational Utility Evaluation (OUE), Operational Assessment (OA)
MAJCOMs: Force Development Evaluation, Tactics Development & Evaluation, Weapons System Evaluation Program, OUE, OA

The Director, Operational Test & Evaluation (DOT&E) is the principal staff assistant and senior advisor to the Secretary of Defense on operational test and evaluation (OT&E) in the Department of Defense. DOT&E is responsible for issuing DoD OT&E policy and procedures; reviewing and analyzing the results of OT&E conducted for each major DoD acquisition program; providing independent assessments to SecDef, the Under Secretary of Defense for Acquisition, Technology and Logistics (USD(AT&L)), and Congress; making budgetary and financial recommendations to the SecDef regarding OT&E; and providing oversight to ensure that OT&E for major DoD acquisition programs is adequate to confirm the operational effectiveness and suitability of the defense system in combat use.

According to chapter 2, "evaluations" collect, analyze, and report data against stated criteria with a high degree of analytical rigor and are used to support full-rate production (FRP) or fielding decisions. "Assessments" usually collect and analyze data with less analytical rigor, need not report against stated criteria, and cannot be the sole source of T&E data for FRP or fielding decisions.

Initial Operational Test and Evaluation (IOT&E). IOT&E is conducted only by the Air Force Operational Test and Evaluation Center (AFOTEC). AFOTEC determines the operational effectiveness and suitability of the items under test using production or production-representative articles with stabilized performance and operationally representative personnel. Additionally, AFOTEC resolves the mission capability of the system. Tests are conducted under operational conditions, including combat mission scenarios that are as operationally realistic as possible. IOT&E determines whether operational requirements and critical operational issues (COIs) have been satisfied and assesses system impacts on peacetime and combat operations. A dedicated phase of IOT&E is required for new ACAT I and II programs according to Title 10 §2399. Dedicated IOT&E is also required for all OSD OT&E oversight programs according to DODI. The determination of appropriate types of operational testing for subsequent modifications and upgrades, as well as applicability to other types of programs, is addressed according to paragraph 4.6.

Qualification Operational Test and Evaluation (QOT&E). QOT&E is a tailored type of IOT&E conducted only by AFOTEC. It is used to evaluate military-unique portions and applications of COTS, NDI, and GFE for military use in an operational environment when little or no government-funded R&D takes place. PMs cannot disregard T&E of COTS, NDI, and GFE simply because these items came from pre-established sources. QOT&E supports the same kinds of decisions as IOT&E. See paragraph 5.15 for more information on COTS, NDI, and GFE.

Follow-on Operational Test and Evaluation (FOT&E). By definition, FOT&E is the continuation of operational test and evaluation (OT&E) after IOT&E or QOT&E and is conducted only by AFOTEC. It answers specific questions about unresolved COIs and test issues; verifies the resolution of deficiencies or shortfalls determined to have substantial or severe impacts on mission operations; or completes T&E of those areas not finished during OT&E. AFOTEC OT&E reports will document known requirements for FOT&E. More than one FOT&E may be required.

Multi-Service Operational Test and Evaluation (MOT&E). MOT&E can be IOT&E, QOT&E, or FOT&E when two or more military Services are involved. It can also be a multi-Service FDE if a MAJCOM is the lead test organization. See the Memorandum of Agreement (MOA) on Multi-Service Operational Test and Evaluation (MOT&E) and Operational Suitability and Definitions, and paragraphs 4.7, 4.8, and 7.9. If MAJCOMs are involved with multi-Service testing without AFOTEC, they should use this MOA as a guide.

7 V&V Foundation
Different V&V requirements for different applications: technical documentation; data sources; software; test capabilities
Common elements: purpose and process description; documentation standards; agent-based activities to support accreditation; criteria to assess risk of use

By way of introduction to Air Force V&V, we need to recognize that different applications have different V&V requirements. What is done for technical documentation may not be as robust as what is done for software V&V, and software V&V resulting from application of the systems engineering "V-model" may be completely different from V&V processes for test capabilities, even though the test capability is software intensive, as was the case for the F-22 ACS. There are some common elements, however. The "why" and "how" of V&V is important for decision-making purposes. Policy and standards exist in the Defense Department to document the V&V activities; typically, the standards are tailored for the specific V&V application. Most if not all V&V applications employ agent-based activities to accomplish the effort, with the agents accountable for producing the V&V plan(s), executing the V&V processes, and producing the V&V report(s). Lastly, all V&V efforts establish criteria to assess the risk of using the application for a specific purpose. These criteria reflect the amount of risk acceptable to the decision maker.

8 Air Force VV&A Process
[Process flow diagram, extracted from AFI Fig 1: a requirement review feeds the V&V agent(s), who produce a V&V plan; a Technical Review Working Group (TRWG) of the accreditation agent, V&V agent(s), model manager, domain/subject matter experts, and other experts (as required) reviews the plan; the V&V agent(s) implement the plan; the TRWG reviews the V&V final report; an accreditation report follows; and the accreditor accredits the M&S. Additional V&V requirements feed back into model development, upgrades, and the V&V backlog.]

9 ACS Fundamental Requirements
Installed system performance: production-representative system
Software-in-the-loop: actual software complexity & limitations; realistic loading on sensor allocation / timelines
Man-in-the-loop: man-machine interface critical to overall evaluation; thinking / reactive adversary for realistic operations
Seamless mission flow & realistic relationship between operational tasks

10 ACS Use in F-22 IOT&E
Specifically created to support the operational effectiveness evaluation during F-22 Engineering & Manufacturing Development (EMD)
Simulation requirements for IOT&E developed early in the EMD program
Extensive usage during IOT&E preparation & execution: pilot training; test development exercises; culminated in "Test-the-Test" missions
Formal verification & validation (V&V); accreditation by AFOTEC
5 weeks / 152 trials during formal evaluation

11 ACS Description
The ACS is a high-fidelity, man-in-the-loop (MITL) simulation that AFOTEC uses to evaluate the effectiveness of the F-22 in combat scenarios that cannot be simulated on the open-air range. The ACS simulates many systems, each of which consists of several technical subsystems. Although the ACS is a simulation, important elements that are key portions of aerial engagements are real: live operators man foreground blue and red aircraft, a blue Airborne Warning and Control System (AWACS) controller position, and key posts in the red integrated air defense system (IADS). The F-22 OFP is loaded and operated as it is in the aircraft. All systems interact with each other (weapon attacks) through emissions and transmissions produced within the engagement environment (infrared [IR], radio frequency [RF], and visual).

The centerpiece of the ACS consists of manned F-22 cockpits surrounded by backlit geodesic panels that simulate the pilot's full field of view. In addition, the ACS has manned interactive control stations (MICSs). MICSs use large, high-definition graphics terminals to provide switched forward/rear out-the-window views and depictions of aircraft controls and displays. Each MICS also provides hands-on-throttle-and-stick (HOTAS) controls to its pilots. There are stations designed for live red GCI and air defense operations center controllers and a live blue AWACS controller. In addition to the manned aircraft and controller stations, the ACS can generate background aircraft taking part in peripheral engagements with the F-22s.

12 ACS Capabilities
The ACS has the capability to simulate different threat air-superiority fighters, blue fighters, other airborne platforms, and ground-based players. These other players include ground-attack and airborne aircraft with radars, a variety of blue and red air-to-air missiles, and an IADS laydown including early warning (EW) and ground control intercept (GCI) radars, surface-to-air missile (SAM) sites, and SAMs.

13 V&V Requirements
Cost: contractor proprietary
Schedule: 3 years; focus on reuse and re-hosting of existing software/models
Performance: 10% rule; functional and operating fidelity

AFOTEC was involved early in the ACS requirements definition and development process to ensure the ACS represented the installed performance seen on the F-22 during flight testing. Regrettably, the cost of the ACS V&V effort cannot be disclosed, since the capability is not government-owned, but the effort took about three years to accomplish. This short timeframe led to a focus on reusing and re-hosting existing software and models, especially for non-F-22 components of the ACS such as the Digital Integrated Air Defense System (DIADS), the real-time SAM (RTSAM), and the other blue and red aircraft models.

V&V performance requirements were developed for individual ACS components and for the ACS as a whole. This strategy allowed an easier examination of the technical performance of those components before they were combined into a complicated tactical engagement involving multiple aircraft and ground radars, operations centers, and SAMs. If the individual pieces of the ACS could be shown to be properly modeled, AFOTEC was more confident in the accuracy of the complete simulation. At the functional performance level, V&V requirements were based on measures of performance such as radar-mode detection range and angle-tracking accuracy. At the operating or mission level of performance, measures of effectiveness such as the number of valid missiles launched per sortie, types of missiles launched, and launch ranges were used.

Because of the sheer number of measures calculated during ACS runs and the complicated interactions between ACS components, using sensitivity analyses to determine error tolerances was impractical. Instead, AFOTEC used a three-step approach to determine fidelity acceptance. The first step screened the measures produced by the ACS to identify cases that differed from reference data by 10 percent or more (the "10% rule"). After identifying these cases, AFOTEC then tried to find a reasonable explanation for the differences; a reasonable explanation might be that the ACS test case was not conducted at the same flight conditions as the open-air test, thus producing different results. Finally, if AFOTEC could not identify a reasonable explanation and resolve the discrepancy, we decided either to accept less stringent fidelity or to initiate corrective action to improve agreement between test measurements and ACS performance. The screening step is illustrated in the sketch below.
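A minimal sketch of the 10% rule screening step, in Python. Everything here is an assumption for illustration: the record layout, the measure names, and the zero-reference guard are not from the briefing, and steps two and three of the approach remain human judgments rather than code.

```python
from dataclasses import dataclass

@dataclass
class MeasureCase:
    name: str          # e.g., "detection_range_nm" (hypothetical name)
    acs_value: float   # value produced in the ACS trial
    ref_value: float   # flight-test / HITL reference value

def screen_ten_percent(cases, tolerance=0.10):
    """Step 1: flag cases whose ACS result differs from the reference
    by the tolerance (10%) or more. Steps 2-3 (seek an explanation,
    then accept reduced fidelity or correct) are analyst judgments."""
    flagged = []
    for c in cases:
        if c.ref_value == 0:
            continue  # degenerate reference; set aside for manual review
        rel_diff = abs(c.acs_value - c.ref_value) / abs(c.ref_value)
        if rel_diff >= tolerance:
            flagged.append((c, rel_diff))
    return flagged

# Hypothetical usage:
cases = [MeasureCase("detection_range_nm", 42.0, 50.0),
         MeasureCase("angle_track_error_deg", 0.51, 0.50)]
for case, diff in screen_ten_percent(cases):
    print(f"{case.name}: {diff:.0%} difference -- needs an explanation")
```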

14 ACS Models
Requirements identification is key to VV&A success; includes models, modifications, and interfaces
Threat Model Analysis Program (TMAP)
Fidelity requirements focused on System Under Test (SUT) capabilities
Create models / modify ACS; verify stand-alone models and modifications
New/modified models integrated into ACS; verify integrated performance
As model suppliers, IPCs* must work closely with ACS: support fidelity of interaction with other ACS models; real-time requirements
* IPC: Intelligence Production Center (NASIC, MSIC, ONI)

15 V&V Plan
Accredit the ACS for a specific purpose: evaluate aspects of F-22 performance not obtainable on the open-air range (OAR); the ACS provides a "test for score" environment
V&V each ACS element and the ACS as a whole:
- Aircraft (F-22)
- Digital Integrated Air Defense System (DIADS)
- Blue and red airborne players/threats
- Real-time surface-to-air missile (RTSAM)
- Background (player density)
- Environment (ECM, clutter, weather)
- Endgame
Separate plans, single report approved by the Commander

At AFOTEC, test capabilities are accredited for a specific purpose. In the case of the ACS, the purpose was twofold. First, to use the capability for mission rehearsal by executing scenarios/events that required launching and detection of air-to-air and surface-to-air missiles, expenditure/activation of countermeasures, and endgame processing; these events by themselves represent a challenge to testing on the open-air range, because pilots will not actually launch missiles or activate and expend missile countermeasures, and targets will not really explode and disappear from sensor displays. Second, to evaluate the F-22's performance in air superiority, which centers on lethality and survivability, that is, interactions among the various players. The V&V agents developed separate plans for each element of the ACS and for the ACS as a whole while producing a single report approved by the AFOTEC Commander.

16 V&V Plan
Used the Simulation, Test and Evaluation Process (STEP)
Goal: give the Commander sufficient information to make a sound decision about using the ACS for F-22 OT&E

AFOTEC followed the Simulation, Test and Evaluation Process (STEP), advocated by the Department of Defense (DoD) as the most desirable method for accrediting simulations used to develop and verify weapon systems. For many of the ACS players and components, we had the opportunity to compare ACS performance to actual friendly and threat equipment performance through both open-air range (OAR) and hardware-in-the-loop (HITL) testing.

17 V&V Execution
Verification: prime contractor and major subcontractor efforts; government labs; intelligence centers
Validation: conceptual model validation; results validation; correlation-to-testing validation; face validation
Validation Working Groups (VWGs)

The prime contractor and major subcontractors who developed or re-hosted the models wrote software development plans following standard software practices as part of their verification process. In most cases, they had independent teams accomplishing the acceptance testing. Conceptual model validation involved internal examinations of M&S assumptions, architecture, and algorithms in the context of the intended use. Results validation involved comparisons of ACS results with historical/analytical results from other M&S. Correlation-to-testing validation was AFOTEC's preferred technique for the F-22 component because bench or flight-test data was available for comparison; however, it carried the most risk because it depended on the F-22 test schedule and on the capabilities of many organizations to produce data. For face validation, AFOTEC took advantage of the fact that the ACS is a MITL simulation, with the pilots and controllers who operate the real systems able to assess how well the simulation matches. The work was performed by three VWGs aligned with the ACS simulation framework, the F-22 systems, and the threat and friendly systems product centers. The VWGs executed the V&V plans for the eight ACS components previously discussed.

18 Validation & Accreditation
Validate performance: Validation Working Groups (VWGs); stakeholders supported by appropriate subject matter experts
Correlation to the Open Air Range (OAR) is the capstone event: Test-the-Test (TTT)
Correct defects; re-verify/re-validate as required
Accreditation by AFOTEC: V&V process supports accreditation with continuous documentation

19 Test-the-Test (TTT) Process
TTT is the final step in validation
OAR modeled with anticipated test conditions; TMAP followed for exploited or surrogate threats in the OAR
Executed with the same procedures, safety limits, etc. as the OAR
Participants same as the OAR: OT&E & Aggressor pilots, etc.
Results compared following OAR testing: quantitative metrics (mission-level and supporting MOEs); face validation based on participant comments
Deltas are investigated and explained: differences in test conditions or procedures identified; changes to models assessed & implemented as required (see the sketch after this list)
TTT re-fly conducted if necessary
Correlation of virtual & OAR results must satisfy the accreditation authority
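A hedged sketch of the "deltas" comparison between OAR and virtual results, reusing the 10% screening threshold from slide 13 as the investigation trigger. The per-trial data shape and the MOE name are assumptions for illustration; the actual correlation criteria rested with the accreditation authority.

```python
from statistics import mean

def compare_moe(oar_trials, acs_trials, moe_name, tolerance=0.10):
    """Compare one mission-level MOE across OAR and virtual TTT trials;
    flag deltas at or above the tolerance for investigation and
    explanation (assumes a nonzero OAR mean)."""
    oar_avg, acs_avg = mean(oar_trials), mean(acs_trials)
    rel_delta = abs(acs_avg - oar_avg) / abs(oar_avg)
    return {"moe": moe_name,
            "oar_mean": oar_avg,
            "acs_mean": acs_avg,
            "rel_delta": rel_delta,
            "investigate": rel_delta >= tolerance}

# Hypothetical per-sortie counts of valid missile launches:
print(compare_moe([2, 3, 2, 4], [2, 2, 3, 3], "valid_launches_per_sortie"))
```

A flagged MOE would then follow the same triage as the component-level screen: explain the difference (test conditions, procedures), change the models if required, and re-fly the TTT if necessary.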

20 V&V Results
Functional verification by contractor teams
ACS validated by AFOTEC: 98 component experiments; simulated sorties flown over 21 weeks
Limitations (all minor): four for F-22 system models; four for blue airborne models; two for the Digital Integrated Air Defense System; one each for red airborne models, background players, endgame, and environment
ACS accredited for use in F-22 OT&E

The ACS verification effort entailed five contractor teams and independent agents. AFOTEC validation was completed for each component and for the air combat simulation as a whole by comparing component simulation performance with the installed performance of those same systems demonstrated during open-air testing (98 component experiments) and by comparing integrated performance measures observed during open-air trials with trials conducted in the ACS. Only 14 minor limitations, where ACS components did not achieve the 10% rule, were noted. Consequently, the ACS was accredited for use in F-22 OT&E.

21 Lessons Learned
Attributes for VV&A success:
- Models created with fidelity focused on replicating the installed system performance required for mission-level evaluation
- Know the conditions/variables that drive sensor performance
- Identify data sources that support validation; DT scope & methodology are the backbone of validation
- Know SUT capabilities & the user CONEMP
- VV&A must occur at the subsystem, system, and system-of-systems levels

22 Summary
Complex V&V effort: player interactions; man-machine interfaces
Required to accomplish the F-22 evaluation: represent a realistic operational environment; mitigate test range limitations
Keys to success: early influence; leverage existing guidance and practices; teamwork; document the work

In summary, the ACS is a complicated simulation that allows the integration of live and digital pilots into a dense player and signal environment. Hundreds of background players (airborne and ground-based) contribute to the operational realism of the ACS for F-22 effectiveness evaluations. The ACS V&V agents defined a robust V&V strategy based on planned and opportunistic reuse of existing code. The ACS was V&V'ed at both the functional level and the mission level before it was finally accredited. Our keys to success: be involved early, leverage existing guidance and practices, work as a team, and (most importantly) document the work so others can take advantage of reusing the capability. Questions?


24 Questions?

