1 NAVAIR Pt. Mugu Executing Agent
Test & Evaluation/Science & Technology
Net-Centric Systems Test (NST) Focus Area Overview
NAVAIR Pt. Mugu Executing Agent
Gil Torres

2 T&E/S&T Program Overview
The T&E/S&T program:
- Exploits new technologies and processes to meet important T&E requirements
- Expedites the transition of new technologies from the laboratory environment to the T&E community
- Leverages commercial equipment, modeling and simulation, and networking innovations to support T&E
- Examines emerging requirements derived from joint service initiatives
- Identifies needed technology areas and develops a long-range roadmap for technology insertion
- Leverages research efforts from the highly developed technology base in DoD labs, test centers, other government agencies, industry, and academia

3 TRMC Organization
[Organization chart] OSD Test Resource Management Center: Director; Deputy Director, Joint Investment, Policy, & Planning; Deputy Director, Budget & Resources; Deputy Director, Strategic Planning & Policy. Programs: S&T Program (including the Net-Centric Systems Test Focus Area) and CTEIP Program (including JMETC, Netcentric Test Components, and InterTEC), with a net-centric technology feed from the focus area to InterTEC.

4 Net-Centric Systems Test (NST) Focus Area Overview
[Diagram: Net-Centric Test Environment, spanning tactical data and GIG services & data, with technology gaps identified]
Build: Recreate the Net-Centric Test Battlespace
- Characterize & replicate JNO
- Emulate networks & services
- Construct NST infrastructure
- Sim/stim battlespace components
Manage: Manage the Net-Centric Test Environment
- Dynamically manage & monitor the test environment
- Automated test planning & reporting
- Access & interface with the net-centric environment
Measure: Measure & Analyze the Net-Centric Test Environment
- Measure & verify NR-KPP compliance
- Assess joint mission effectiveness
- Automate analysis and visualization

5 NST Vision
Accelerate the delivery of mission-ready net-centric capabilities to the warfighter.
[Diagram: today's net-centric environment vs. tomorrow's "plug & test" net-centric environment]
- Today: stove-piped information, centralized control, unique software solutions, data not shared, inefficiency
- Tomorrow: net-centricity, decentralized control, enterprise services, shared data, autonomous agents, enterprise architecture, Web 2.0
- NST portfolio: T&E advanced technologies/capabilities spanning Build, Manage, and Measure, supported by a Broad Agency Announcement, the NST Reference Architecture, the SME COI, and the Technology Lab
- Drivers: joint architectures, net-centric standards, net-centric policy, NST T&E needs, and technology gaps

6 NST T&E Needs (1 of 2) Recreate the Net-Centric Test Battlespace
- Emulate tactical edges
- Test mission threads with limited participants
- Integration/responsiveness of virtual components with live components
- Integration of cyber/information operations with kinetic simulations
- Representation of the network infrastructure and network environment
- Representation of the cyber threat to the NSUT
- Simulate/stimulate irregular warfare for the urban environment
- Verification & validation (V&V) of the Synthetic Battle Environment (SBE)

7 NST T&E Needs (2 of 2) Measure & Analyze Net-Centric Test Environment
- Monitor/analyze critical chaining issues
- Improve near-real-time analysis
- Improve instrumentation and analysis of cyber/IO
Manage the Net-Centric Test Battlespace:
- Quality of service to critical nodes
- Synchronize synthetic environments
- Manage the test environment for net-based systems
- Manage tactically dynamically configured networks

8 NST Challenges: “Technologies to...”
Recreate the Net-Centric Test Battlespace:
- Flexible, scalable LVC environment for JMe testing
- GIG & SOA sim/stim capabilities
- Integrate & validate net-centric simulations & SoS
- Emulate red cyber warfare capabilities
- Automate the C2 decision process
Measure & Analyze the Net-Centric Test Environment:
- Analyze joint mission threads in near real time
- Evaluate net-centric data & services
- Automated NR-KPP analysis
- Measure joint mission effectiveness (JMe)
- Analyze & visualize information assurance & operations
Manage the Net-Centric Test Battlespace:
- Dynamically manage the test infrastructure
- Automated planning and scenario development
- Cross-domain solution over a distributed test infrastructure

9 NST Roadmap – Recreate the Net-Centric Test Battlespace
[Roadmap chart, FY06-FY20]
T&E/S&T investments: NE2S (Network Effects Emulation System); TRCE (TENA in a Resource Constrained Environment).
Needs addressed: emulate tactical edges; test mission threads with limited participants; technologies to V&V the NST environment; integration/responsiveness of virtual components with theatre/live components; representation of the cyber threat to the NSUT; sim/stim irregular warfare (NE2S); representation of the network infrastructure and environment (TRCE); integrate the Internet into a family of tested nets; integrate IO with kinetic simulations.
Challenges: flexible, scalable LVC environment for JMe testing; emulate red cyber warfare capabilities; integrate & validate net-centric simulations & SoS; GIG & SOA sim/stim capabilities; automate the C2 decision process.
T&E capability: recreate the Net-Centric Test Battlespace.

10 NST Roadmap – Measure and Analyze the Net-Centric Test Environment
[Roadmap chart, FY06-FY20]
T&E/S&T investments: NECM (NST Evaluation Capability Module); NEIV (Net-Centric Environment Instrumentation & Visualization); NSAT (NR-KPP Solution Architecture Toolset).
Needs addressed: monitor/analyze critical chaining issues; improve near-real-time analysis; analyze the impact of cyber and IO.
Challenges: analyze joint mission threads in near real time; measure joint mission effectiveness (JMe); evaluate net-centric data & services; analyze & visualize information assurance & operations; automated NR-KPP analysis.
T&E capability: measure and analyze the Net-Centric Test Environment.

11 NST Roadmap – Manage the Net-Centric Test Battlespace
[Roadmap chart, FY06-FY20]
T&E/S&T investments: MENSA (Middleware Enhancements for Net-Centric Simulation Architecture); ANSC (Analyzer for NST Confederations); MLSCLS (Multi-Level Security Cross Layer Scheme); PAM (Policy-Based Adaptive Network Management); RRM (Rapid Reconfiguration Module).
Needs addressed: synchronize synthetic environments (MLSCLS); manage the test environment for net-based systems (PAM); manage tactically dynamically configured networks (RRM); QoS to critical nodes.
Challenges: dynamically manage the test infrastructure; cross-domain solution over a distributed test infrastructure; automated planning and scenario development.
T&E capability: manage the Net-Centric Test Battlespace.

12 T&E/S&T Project Critical Factors
T&E Need (Gaps):
- Addresses a known T&E technological gap
- Demonstrates understanding of current T&E capabilities and techniques
- Demonstrates understanding of the capabilities and techniques the future of T&E will require
S&T Challenge:
- What must be done to effectively address the T&E need
- Efficiencies and test-capacity improvements resulting from the technology developed or matured
- Project results will significantly advance technology and thereby significantly enhance current test capabilities
High Risk / High Payoff:
- Improves testing rigor and capabilities relevant to net-centric technology improvements
- Reduces the cost of government operations
- Provides an improved capability to test operators

13 T&E/S&T Project Critical Factors
Wide Application:
- The technology can be applied at multiple sites, on multiple systems, and in multiple organizations simultaneously
- The technology effectively supports a distributed LVC net-centric test event
Transition Strategy:
- Identify potential partners and future consumers of the technology (government lab, agency, test activity, industry, etc.)
- Identify the intended use of the technology and a plan to obtain buy-in from potential transition partners
- Demonstrate a concise, sensible plan that raises the probability of a successful transition to the identified partners

14 Project Execution Plan (PEP)
PEP required contents:
- Purpose and work scope
- T&E needs
- S&T challenges
- Technology readiness levels
- Budget
- Project schedule
- Project specifications
- Phase exit criteria
- Project transition
- List of deliverables

15 Project Specifications
Parameter | Current Performance Level | Current Target | Ultimate Goal | Achieved
End-to-end jitter performance improvement | Up to 200% (7 hops, 70% load, across Continental U.S.) | Less than 20% (high-priority video traffic) | Less than 5% |
End-to-end packet-drop performance improvement | Up to 100% | Less than 1% | |
End-to-end delay performance improvement | 200-500 msec | <100 msec (high-priority traffic) | ~50 msec |
Network fault recovery time | Days/hours/minutes | 30 sec | 15 sec |
Test-exercise disruption caused by network problems | Minutes/hours/days | 1 sec | |
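The table's improvement percentages and times refer to measured end-to-end network metrics. As a rough illustration of how such metrics are derived, the sketch below computes mean one-way delay and RFC 3550-style jitter from per-packet send/receive timestamps; the log format and values are assumptions for illustration, not part of the project specification.

```python
from statistics import mean

def delay_and_jitter(send_times, recv_times):
    """Mean one-way delay and mean jitter, where jitter is taken as the
    mean absolute difference between consecutive per-packet delays."""
    delays = [r - s for s, r in zip(send_times, recv_times)]
    jitter = mean(abs(delays[i] - delays[i - 1]) for i in range(1, len(delays)))
    return mean(delays), jitter

# Invented sample: delays drifting within the 200-500 msec "current" band.
send = [0.0, 0.1, 0.2, 0.3]
recv = [0.20, 0.45, 0.50, 0.62]
d, j = delay_and_jitter(send, recv)
print(f"mean delay {d * 1000:.0f} msec, mean jitter {j * 1000:.0f} msec")
```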

16 Netcentric Systems Test Technology Investments & Transitions (1 of 7)
Dynamic Utility for Collaborative Architecture-Centric T&E (DUCAT), Georgia Tech Research Institute
- Gap: inability to visualize and analyze, in real time, C2 processes, integrated architectures, and dynamic, adaptive test architectures
- Description: represent test artifacts and test architectures in XML; use data mining to capture C2 interactions and visualize mission threads as they execute (see the sketch below)
- Status: complete. Developed and demonstrated advanced data-mining and visualization techniques that work in near real time with agile C2 in heterogeneous networks; prototype evaluation completed September 2009; transitioned to the NSAT project (1QFY10)
Network Ready Architecture Evaluator (NetRAE), Raytheon
- Gap: inability to perform T&E of net-enabled weapon systems operating in a service-oriented architecture environment
- Description: provide a persistent, extensible infrastructure for testing against the Net-Ready KPP
- Status: complete. Delivered a prototype toolset that extracts critical data from systems architectures; the toolset assesses weapon-system service-oriented architectures to establish Net-Ready KPP compliance
HWIL = Hardware-In-the-Loop; OAR = Open Air Range
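DUCAT's internal formats are not given in this brief, but the slide's core idea, test artifacts and architectures expressed as XML and then mined to reconstruct mission threads, can be sketched with standard-library tools. The XML schema below is invented for illustration.

```python
import xml.etree.ElementTree as ET

# Invented artifact: nodes of a test architecture plus timed C2 message
# exchanges between them.
DOC = """
<testArchitecture>
  <node id="C2-1" role="command"/>
  <node id="SHOOTER-1" role="weapon"/>
  <exchange from="C2-1" to="SHOOTER-1" msg="targeting-order" t="12.4"/>
  <exchange from="SHOOTER-1" to="C2-1" msg="weapon-status" t="13.1"/>
</testArchitecture>
"""

root = ET.fromstring(DOC)
# Replay the exchanges in time order to reconstruct the mission thread.
for ex in sorted(root.iter("exchange"), key=lambda e: float(e.get("t"))):
    print(f'{ex.get("t")}s  {ex.get("from")} -> {ex.get("to")}: {ex.get("msg")}')
```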

17 Netcentric Systems Test Technology Investments & Transitions (2 of 7)
Flexible Analysis Services (FAS), JITC/Northrop Grumman
- Gap: no flexible analysis service that adapts to changes in tactical, C2, and emerging net-centric protocols and systems that dynamically optimize information exchange
- Description: research services for collecting, parsing, transforming, analyzing, and visualizing information from diverse protocols (a pipeline sketch follows this slide)
- Status: complete. Visualization and reporting (FY09): conducted research, expanded Phase 2 prototypes, and developed visualization and reporting prototypes; prototyped flexible, user-definable message parsing, data transformation, and analysis services (1QFY10)
Analyzer for Netcentric Systems Test Confederations (ANSC), SRI
- Gap: inability to automate complex pre-event test planning and remove human error in developing an LVC-based test environment
- Description: develop and demonstrate prototype analyzer software that streamlines pre-event specification and integration of systems or components into a test confederation
- Status: Phase 3. Iterate Phase 2 tasks with the InterTEC Spiral 3 test event; install the analyzer and knowledge bases (KBs) in the TIE Lab; SMEs to add resource descriptions to selected KBs; prototype demo for InterTEC and DISA FDCE (3QFY10)
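FAS itself is not documented here; the sketch below illustrates the "flexible, user-definable message parsing" idea in miniature. Parsers are registered per protocol, so the analysis stage adapts to new message formats without modification. The protocol name and record layout are assumptions.

```python
REGISTERED_PARSERS = {}

def parser(protocol):
    """Register a parse function under a protocol name."""
    def wrap(fn):
        REGISTERED_PARSERS[protocol] = fn
        return fn
    return wrap

@parser("csv-track")
def parse_csv_track(raw):
    track_id, lat, lon = raw.split(",")
    return {"track": track_id, "lat": float(lat), "lon": float(lon)}

def analyze(protocol, raw_messages):
    # The analysis stage is protocol-agnostic: it sees only parsed records,
    # so supporting a new message format means registering one new parser.
    parse = REGISTERED_PARSERS[protocol]
    return [parse(m) for m in raw_messages]

print(analyze("csv-track", ["T1,34.12,-119.05", "T2,34.20,-119.01"]))
```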

18 Netcentric Systems Test Technology Investments & Transitions (3 of 7)
Network Effects Emulation System (NE2S), JFCOM, Suffolk, VA
- Gap: inability to simulate and analyze network effects in a joint context, and to create, instrument, and analyze the impact of effects on shared situational awareness in a net-centric environment
- Description: develop an enterprise tool capable of simulating a wide range of network- and host-based effects that can be centrally managed and controlled (see the sketch below)
- Status: Phase 2. Researching the types of effects and methods to be emulated, and the capability to control emulated effects such as memory exhaustion and high CPU load; prototype demo of a federated joint live-virtual-constructive effects capability (3QFY11)
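How NE2S implements its effects is not detailed in this brief. As one hedged example of the general technique, Linux's tc/netem queueing discipline can impose delay, jitter, and loss on an interface, which is the kind of centrally controllable network effect the slide describes. Root privileges are required, and the interface name is a placeholder.

```python
import subprocess

def degrade_link(dev="eth0", delay_ms=100, jitter_ms=20, loss_pct=1.0):
    """Apply an emulated delay/jitter/loss profile to a network interface."""
    subprocess.run(
        ["tc", "qdisc", "add", "dev", dev, "root", "netem",
         "delay", f"{delay_ms}ms", f"{jitter_ms}ms",
         "loss", f"{loss_pct}%"],
        check=True)

def restore_link(dev="eth0"):
    """Remove the emulated effects, restoring normal behavior."""
    subprocess.run(["tc", "qdisc", "del", "dev", dev, "root"], check=True)
```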

19 Netcentric Systems Test Technology Investments & Transitions (4 of 7)
Middleware Enhancements for Netcentric Simulation Architecture (MENSA), JPL
- Gap: inability to dynamically optimize information delivery, minimize network congestive failure, and overcome unreliable network environments
- Description: improve network efficiency; improve reliability through redundancy; improve utilization through dynamic coding (a compression/QoS sketch follows this slide)
- Status: Phase 1. Development environment covering TENA platforms; development of compression enhancements and QoS hooks; demo during InterTEC Spiral 3 and improvement of network quality of service for iNET (2QFY09)
NST Evaluation Capability Module (NECM), Visense
- Gap: no automated tool providing an architecture-driven JMe capability across an LVC distributed environment that works with the U.S. Joint Forces Command (USJFCOM) JACAE tool suite and operational methodology
- Description: automate joint mission thread and net-enabled UJT analysis and event planning
- Status: Phase 1. JACAE tool and schema technical exchange with the NECM team in October; NECM software client and data host (3QFY10)
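Two of the middleware ideas named for MENSA, compression enhancements and QoS hooks, can be sketched with the Python standard library. This is illustrative only: MENSA's actual TENA middleware interfaces are not shown in this brief, and DSCP marking (shown here on Linux) is a common convention rather than the project's documented approach.

```python
import socket
import zlib

EF_DSCP = 46  # Expedited Forwarding, a DSCP class commonly used for priority traffic

def mark_priority(sock):
    # QoS hook: set the IP TOS/DSCP bits so routers can prioritize the flow.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_DSCP << 2)

def send_compressed(sock, payload: bytes, dest):
    # Compression enhancement: shrink the payload before it hits the network.
    sock.sendto(zlib.compress(payload), dest)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
mark_priority(sock)
send_compressed(sock, b"track update " * 50, ("127.0.0.1", 9999))
```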

20 Netcentric Systems Test Technology Investments & Transitions (5 of 7)
Multi-Level Security Cross Layer Scheme (MLSCLS), Johns Hopkins University Applied Physics Laboratory
- Gap: inability to support efficient, flexible cross-domain/multilevel security (MLS) in distributed test infrastructures
- Description: provide guaranteed access to the wireless medium using a single channel, without fixed infrastructure, with physical-layer solutions as part of a cross-layer security scheme
- Status: Phase 2. Research, feasibility, characterization, and selection among alternative flexible, efficient architectures; fills a long-range iNET requirement for test scenarios needing MLS and ad hoc mobile protocols (2QFY2011)
TENA in a Resource Constrained Environment (TRCE), SAIC
- Status: Phase 2. Analyze technologies for effective network utilization; develop proof-of-concepts for key software (SW) components; prototype technologies evaluated in an "Alpha" release of the middleware (4QFY2011)

21 Netcentric Systems Test Technology Investments & Transitions (6 of 7)
NR-KPP Solutions Architecture Toolkit (NSAT), GBL Systems Corp.
- Gap: inability to automate testing of the Net-Ready Key Performance Parameter (NR-KPP) solutions-architecture element
- Description: provide automated compliance and conformance testing; plan mission threads for net-centric systems test; visualize and analyze mission-thread execution through architecture views and tactical message exchanges (a toy compliance check follows this slide)
- Status: Phase 1. Define the NSAT vocabulary/ontology and associated project capabilities and interfaces; create a SOA service for solutions architecture; create standardized NST inference services; supports DoDAF solutions-architecture assessments and joint interoperability certification within the emerging net-centric warfare (NCW) environment (4QFY10)
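A toy version of the automated compliance testing NSAT aims at: check a declared architecture's information exchanges against a required set. Real NR-KPP criteria and DoDAF data models are far richer; all names below are invented.

```python
# Required information exchanges (producer, consumer, message); invented.
REQUIRED_EXCHANGES = {
    ("C2", "Weapon", "targeting-order"),
    ("Sensor", "C2", "track-report"),
}

def check_compliance(declared_exchanges):
    """Return (compliant?, missing exchanges) for a declared architecture."""
    missing = REQUIRED_EXCHANGES - set(declared_exchanges)
    return not missing, missing

ok, missing = check_compliance([("Sensor", "C2", "track-report")])
print("compliant" if ok else f"missing exchanges: {missing}")
```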

22 Netcentric Systems Test Technology Investments & Transitions (7 of 7)
Policy-Based Adaptive Network & Security Management Technology for NST (PAM), JPL
- Gap: inability to automate network infrastructure configuration and recovery based on dynamic policies and cross-domain data sharing
- Description: provide "mission plan to network operations" automation and enable policy-based dynamic information sharing with intellectual property protection (see the policy sketch below)
- Status: Phase 2. R&D on reasoning front-end software with policy-based network-management software in an emulated network environment; demo planned to improve InterTEC event execution; prototype demo of the policy-based adaptive network and security management software system (4QFY11)
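The "mission plan to network operations" automation attributed to PAM can be caricatured as declarative policies evaluated against observed network state, each firing a reconfiguration action. The policy format, thresholds, and action names below are assumptions for illustration.

```python
# (condition, action) policies: each condition inspects observed network
# state; thresholds and action names are invented.
POLICIES = [
    (lambda s: s["link_loss_pct"] > 5.0, "reroute-critical-traffic"),
    (lambda s: s["latency_ms"] > 250, "raise-priority-queue-weight"),
]

def evaluate(state):
    """Return the reconfiguration actions whose policy conditions fire."""
    return [action for condition, action in POLICIES if condition(state)]

print(evaluate({"link_loss_pct": 7.2, "latency_ms": 120}))
# -> ['reroute-critical-traffic']
```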

23 Questions?

24 T-E Web Monthly Reports
Timely submission is critical to communicating project status effectively.
Submission to the Deputy EA is required by the 5th of the month for the prior month.
Content:
- Executive summary of the project
- Funding
- O&E activities for the month just completed
- Significant accomplishments
- Planned efforts for the next month
- Problems/issues
- Brief explanation of the use of funds

25 FAM Monthly Reports
Focus Area Monthly (FAM) reports:
- The EA is required to submit to the PMO on the last Friday of each month
- Highlight significant events or issues in projects, positive and negative
- Not a substitute for more timely notification to the PMO of highly significant issues or events, such as issues requiring immediate action or events that could result in "bad press"
- FAM reports are now auto-generated from T-E Web, so reporting significant accomplishments and issues each month is very important
- Keep FAM reports simple: no lengthy descriptions of normal, scheduled activities, just good news and problems

26 Back-Up Slides

27 Weekly Activity Reports
Due to the EA every Wednesday by noon Pacific Time.
Sample content:
Netcentric Systems Test (NST)
NST Evaluation Capability Module (NECM): The NECM project is developing an automated object-relationship graph algorithm (sketched below). This technology will automate the arrangement of operational activities, control logic, sequence flows, and . . . This algorithm enables improved user interaction to create a . . . During the week of 11 January 2010, the NECM team implemented an initial Joint Personnel Recovery (JPR) joint mission thread (JMT) . . . Future efforts will include . . .
Status/Assessment: Green. Early engagement of transition partners to define appropriate use cases to verify useful test technologies; the project is operating on schedule and on budget.
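The sample report names an object-relationship graph algorithm that arranges operational activities and their control/sequence flows. A minimal hedged sketch: treat activities as graph nodes and flows as predecessor edges, then order them topologically so upstream activities are arranged first. The JPR mission-thread activities below are invented.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# activity -> set of activities that must precede it (invented JPR thread)
flows = {
    "execute-recovery": {"plan-mission", "locate-personnel"},
    "locate-personnel": {"report-isolating-event"},
    "plan-mission": {"report-isolating-event"},
}
print(list(TopologicalSorter(flows).static_order()))
# e.g. ['report-isolating-event', 'locate-personnel', 'plan-mission', 'execute-recovery']
```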

28 T&E/S&T Program Roles and Responsibilities
T&E/S&T Program Manager:
- Provides guidance to focus areas to ensure a disciplined project selection and funding process
- Provides program funds for administering and executing the T&E/S&T Program
- Reviews technical execution of focus areas and projects to ensure T&E/S&T Program objectives are being met
- Reviews financial execution of focus areas and projects to ensure best value and fiscal responsibility
- Provides TESTWeb and the T&E/S&T Acquisition Community Connection (ACC) Special Interest Area to support the exchange of financial, technical, and programmatic information throughout the T&E/S&T Program
T&E/S&T Program Office Staff:
- Supports the T&E/S&T Program Manager
- Reviews focus areas and projects to ensure technical objectives are being met
- Reviews focus areas and projects to ensure financial objectives are being met
- Provides recommendations to the T&E/S&T Program Manager on program-related matters
- Interfaces with EAs, Deputies, WG members, and PIs on behalf of the T&E/S&T Program Manager as required

29 T&E/S&T Program Roles and Responsibilities
Focus Area Executing Agents:
- Primary point of contact for the T&E/S&T Program Office within the focus area
- Maintain awareness of DoD T&E needs within the focus area and of related technology developments
- Actively seek out high-payoff T&E/S&T projects that address critical DoD T&E needs and mitigate identified T&E risks
- Issue Requests for Information (RFIs) and Broad Agency Announcements (BAAs) through contractual channels
- Responsible for technical and financial execution of projects approved by the T&E/S&T Program Manager
- Chair the focus area working group (WG) and maintain tri-service WG T&E and S&T representation; coordinate replacements with the T&E/S&T Program Manager
- Responsible for maintaining current project and focus area information in TESTWeb and the T&E/S&T ACC Special Interest Area

30 T&E/S&T Program Roles and Responsibilities
Focus Area Working Group Members:
- Act as government leads on projects where there is no intrinsic government lead
- Coordinate T&E/S&T projects with Service-related efforts
- Identify and prioritize T&E needs
- Support the EA on source selection efforts
- Review ongoing focus area projects
- Identify transition opportunities for projects
Project Principal Investigators:
- Execute projects approved by the T&E/S&T Program Manager
- Conduct research efforts
- Provide technical progress reports to the EA
- Identify and actively pursue transition opportunities for T&E/S&T projects
- Responsible for maintaining accurate financial information in TESTWeb

31 T&E/S&T Program Roles and Responsibilities
Government Lead:
- Attends major focus area reviews (when able)
- Provides guidance to the project to ensure a disciplined engineering approach
- Reviews technical execution of the project to ensure NST Program objectives are met and gaps are satisfied
- Provides a quick review of monthly reports to ensure accuracy
- Interfaces with EAs, Deputies, WG members, and Principal Investigators as required
- Maintains awareness of identified T&E needs within the NST focus area and of related technology developments
- Supports the EA on source selection efforts
- Supports the EA in identifying transition opportunities for the project

32 Publications
- Professional publications are encouraged by the TRMC and PMO: they provide opportunities for technology transition and raise awareness of the T&E/S&T Program in new communities
Public release of technical information:
- EAs adhere to local procedures for release of technical information in the EA's focus area
- Send copies of all technical information to be released to the T&E/S&T Principal Scientist prior to release; use the ACC Special Interest Area
- Include an acknowledgement of the T&E/S&T Program in all publications. Example: "The authors would like to thank the Test Resource Management Center (TRMC) Test and Evaluation/Science and Technology (T&E/S&T) Program for their support. This work is funded (or is partially funded) by the T&E/S&T Program through the (put your organization's name here) contract number xxxxxx."

33 Release of Programmatic Information
- The PMO must review and approve the release of programmatic information prior to transmission outside the program
- Applies to detailed project status, funding levels and execution, schedules, problem areas, and potential management issues
- Also applies to solicitations (BAAs, PRDAs, RFIs)
- Not intended to restrict information exchange between EAs or within core focus area working groups
- Applies to both written and verbal release of information
- For publications, the PMO requires a minimum of 10 days before the EA's next suspense on the publication
- For a verbal release or a Request for Information, contact the PMO for approval before information is released; alternatively, refer the requestor to the PMO for the programmatic information

34 Technology Readiness Levels (TRL)
1. Basic principles observed and reported: Lowest level of technology readiness. Scientific research begins to be translated into applied research and development. Examples might include paper studies of a technology's basic properties.
2. Technology concept and/or application formulated: Invention begins. Once basic principles are observed, practical applications can be invented. The application is speculative, and there is no proof or detailed analysis to support the assumption. Examples are still limited to paper studies.
3. Analytical and experimental critical function and/or characteristic proof of concept: Active research and development is initiated. This includes analytical studies and laboratory studies to physically validate analytical predictions of separate elements of the technology. Examples include components that are not yet integrated or representative.
4. Component and/or breadboard validation in a laboratory environment: Basic technological components are integrated to establish that the pieces will work together. This is relatively "low fidelity" compared to the eventual system. Examples include integration of ad hoc hardware in a laboratory.
5. Component and/or breadboard validation in a relevant environment: Fidelity of the breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so that the technology can be tested in a simulated environment. Examples include "high fidelity" laboratory integration of components.

35 TRL (Cont)
6. System/subsystem model or prototype demonstration in a relevant environment: A representative model or prototype system, well beyond the breadboard tested for TRL 5, is tested in a relevant environment. Represents a major step up in a technology's demonstrated readiness. Examples include testing a prototype in a high-fidelity laboratory environment or in a simulated operational environment.
7. System prototype demonstration in an operational environment: Prototype near or at the planned operational system. Represents a major step up from TRL 6, requiring demonstration of an actual system prototype in an operational environment, such as an aircraft, vehicle, or space. Examples include testing the prototype in a test-bed aircraft.
8. Actual system completed and "flight qualified" through test and demonstration: The technology has been proven to work in its final form and under expected conditions. In almost all cases, this TRL represents the end of true system development. Examples include developmental test and evaluation of the system in its intended weapon system to determine whether it meets design specifications.
9. Actual system "flight proven" through successful mission operations: Actual application of the technology in its final form and under mission conditions, such as those encountered in operational test and evaluation. In almost all cases, this is the end of the last "bug fixing" aspects of true system development. Examples include using the system under operational mission conditions.

36 Definitions
Breadboard: an experimental model composed of representative system components, used to prove the functional feasibility (in the lab) of a device, circuit, system, or principle without regard to the final configuration or packaging of the parts. Breadboard performance may not be representative of the final desired system performance, but it indicates that the desired performance can be achieved.
Brassboard: a refined experimental model that provides system-level performance in a relevant environment. The brassboard should be designed to meet, at a minimum, the threshold performance measures for the final system.
Prototype: a demonstration system built to perform in an operational environment. A prototype device meets or exceeds all performance measures, including size, weight, and environmental requirements, and is suitable for complete evaluation of electrical and mechanical subsystem form, fit, and function design and for characterization of full system performance.

37 Accounting Definitions
Obligations: the amount of an order placed, contract awarded, service rendered, or other transaction that legally reserves a specified amount of an appropriation or fund for expenditure.
Accruals: the costs incurred during a given period, representing liabilities incurred for goods and services received, other assets acquired, and performance accepted prior to payment being made.
Disbursements: the charges against available funds representing actual payment, evidenced by vouchers, claims, or other documents approved by competent authority.
Expenditures: the total of disbursements plus accruals (see the sketch below).
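The last definition is the one formula on this slide, and it is worth making concrete: expenditures are disbursements plus accruals. The figures below are made up for illustration.

```python
# Invented period figures, in dollars.
disbursements = 1_250_000  # actually paid out this period
accruals = 300_000         # goods/services received but not yet paid

# Expenditures = disbursements + accruals, per the definitions above.
expenditures = disbursements + accruals
assert expenditures == 1_550_000
```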

