PMT-352 Systems Engineering Seminar: DragonFly JRATS Simulation (Version 2.2, 3-26-15)


1 PMT-352 Systems Engineering Seminar: DragonFly JRATS Simulation (Version 2.2, 3-26-15)

2 Introduction and Objectives
Apply the systems engineering technical management processes and effectively implement the technical processes and the overall system acquisition process. This seminar is designed to help prevent teams from making cost/performance tradeoffs in an ad hoc manner, without appreciating the rigor of the SE process model. It is intended to provide an appreciation for how the SE process would actually be applied to a system like the JUGV, using aspects of the DragonFly simulation.

3 Technical Management Processes
- Technical Planning: Ensures that the SE processes are applied properly throughout a system's life cycle. Includes defining the scope of the technical effort required to develop, field, and sustain the system, as well as providing critical quantitative inputs to program planning and life-cycle cost estimates.
- Requirements Management: Ensures bidirectional traceability from the high-level requirements down to the system elements through the lowest level of the design (top down), and from any derived lower-level requirement up to the applicable source from which it originates (bottom up).
- Configuration Management: Establishes and maintains the consistency of a system's functional, performance, and physical attributes with its requirements, design, and operational documentation throughout the system's life cycle.
- Interface Management: Ensures interface definition and compliance among the system elements, as well as with other systems. Documents all internal and external interface requirements and requirements changes in accordance with the program's Configuration Management Plan.
- Technical Data Management: Identifies, acquires, manages, maintains, and ensures access to the technical data and computer software required to manage and support a system throughout the acquisition life cycle.
- Decision Analysis: Transforms a broadly stated decision opportunity into a traceable, defendable, and actionable plan. Employs procedures, methods, and tools, such as trade studies, for identifying, representing, and formally assessing the important aspects of alternative decisions to select an optimal decision.
- Technical Assessment: Compares achieved results with defined criteria to provide a fact-based understanding of the current level of product knowledge, technical maturity, program status, and technical risk. Includes methods such as technical reviews and technical performance measures (TPMs).
- Risk Management: Involves the mitigation of program uncertainties that are critical to achieving cost, schedule, and performance goals at every stage of the life cycle. Encompasses identification, analysis, mitigation, and monitoring of program risks.

4 Technical Processes
- Stakeholder Requirements Definition: Involves the translation of requirements from relevant stakeholders into a set of top-level technical requirements. The process helps ensure each individual stakeholder's requirements, expectations, and perceived constraints are understood from the acquisition perspective.
- Requirements Analysis: Involves the decomposition of the top-level requirements captured by the Stakeholder Requirements Definition process into a clear, achievable, verifiable, and complete set of system requirements.
- Architecture Design: Involves a trade and synthesis process that translates the outputs of the Stakeholder Requirements Definition and Requirements Analysis processes into a system allocated baseline: the physical architecture of the system, the specifications describing the functional and performance requirements for each configuration item, and the interfaces that compose the system.
- Implementation: Involves two primary efforts: detailed design and realization. Outputs include the detailed design down to the lowest system elements in the system architecture, and the fabrication/production procedures.
- Integration: Incorporates lower-level system elements into a higher-level system element in the physical architecture.
- Verification: Provides evidence that the system or system element performs its intended functions and meets all performance requirements listed in the system performance specification and the functional and allocated baselines. Verification answers the question, "Did you build the system correctly?"
- Validation: Provides objective evidence that the capability provided by the system complies with stakeholder performance requirements, achieving its use in its intended operational environment. Validation answers the question, "Is it the right solution to the problem?"
- Transition: Moves a system element to the next level in the physical architecture. For the end-item system, it is the process to install and field the system to the user in the operational environment.

5 Defense Acquisition Guide Systems Engineering Process Model (diagram): the eight technical processes (Stakeholder Requirements Definition, Requirements Analysis, Architecture Design, Implementation, Integration, Verification (DT), Validation (OT), and Transition), surrounded by the eight always-ongoing technical management processes (Technical Planning, Requirements Management, Risk Management, Decision Analysis, Data Management, Technical Assessment, Configuration Management, and Interface Management).

6 Start at the Beginning (Stakeholder Requirements Definition; Requirements Management Process; Interface Management Process)
The JRATS capabilities documents (ICD and Draft CDD) and the JRATS CONOPS are the JCIDS documents that provide the primary input to the SE process. Requirements immediately start to evolve when we add the CONOPS. The JRATS architectural views are an important part of the SE process: they identify the "context," identify interfaces, and clarify "what's required."

7 Stakeholder Requirements Definition (JUGV)
Detailed stakeholder capability needs are turned into good technical requirements. The JCIDS documents (Draft CDD, CONOPS) detail the operational requirements, Key Performance Parameters (KPPs), and other required capabilities. Other requirements are derived from statutory, regulatory, certification, design-consideration, and interface requirements.

ID   | System Requirement                                                                                                       | Traces to
SR1  | The JUGV shall be capable of autonomously attacking enemy targets.                                                       | Draft CDD 1
SR2  | The JUGV shall be capable of attacking enemy targets with a probability of kill of 0.75 (T), 0.90 (O).                   | Draft CDD 1
SR3  | The JUGV shall be capable of carrying and launching anti-armor guided missiles.                                          | Draft CDD 1, CONOPS
SR4  | The JUGV shall be capable of conducting operations by remote control using a line-of-sight communication data link.      | Draft CDD 2
SR6  | The JUGV shall be capable of identifying friendly targets with single-target accuracy of 0.90 (T), 0.99 (O).             | Draft CDD 6
SR8  | The JUGV system shall be certified to operate in accordance with DIACAP.                                                 | Draft CDD (Regulatory)
SR9  | The JUGV shall be certified for electromagnetic spectrum supportability in accordance with DoDI.                         | Draft CDD Para. 10 (Regulatory)
SR10 | The JUGV system shall be capable of being physically maintained by 90% of both the male and female population.           | Draft CDD Para. 14.c(4) (Consideration)
SR11 | The JUGV system shall comply with MIL-STD-464, Interface Standard for Electromagnetic Environmental Effects for Systems. | Draft CDD Para. (Consideration)
SR12 | The JUGV shall be capable of LASER designation of targets for the DragonFly UAV.                                         | Draft CDD, CONOPS (Interface)

Examples: SR8 and SR9 are statutory/regulatory and certification requirements (cybersecurity, spectrum supportability); SR10 and SR11 are design considerations (Human Systems Integration, electromagnetic environmental effects); SR12 is an interface requirement (DragonFly Unmanned Air Vehicle (UAV)).
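A table like this is, in effect, a small traceability database. As an illustration only (not part of the seminar materials), here is a minimal sketch of how such records might be captured for automated checks; the structure and names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    """One row of the stakeholder requirements table."""
    req_id: str      # e.g., "SR1"
    text: str        # the "shall" statement
    traces_to: list  # source documents (CDD paragraphs, CONOPS, etc.)
    category: str    # "KPP", "Regulatory", "Consideration", "Interface", ...

requirements = [
    Requirement("SR1", "The JUGV shall be capable of autonomously attacking "
                "enemy targets.", ["Draft CDD 1"], "KPP"),
    Requirement("SR8", "The JUGV system shall be certified to operate in "
                "accordance with DIACAP.", ["Draft CDD (Regulatory)"], "Regulatory"),
]

# Basic completeness check: every requirement must trace to at least one source.
untraced = [r.req_id for r in requirements if not r.traces_to]
print("Requirements with no source trace:", untraced or "none")
```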

8 Technical Assessment Process (Stakeholder Requirements Definition, JUGV)
As part of the Technical Assessment process, the program office plans and holds technical reviews. One of the most important is the System Requirements Review (SRR). The SRR ensures that the PMO, user, and contractor all have a common understanding of, and agreement on, the system-level technical requirements and the costs, schedule, and risks associated with realizing a system that meets those requirements.

9 Requirements Analysis (JUGV)
As an example, consider three top-level requirements that were part of the output of the Stakeholder Requirements Definition process:

SR1: The JUGV shall be capable of autonomously attacking active or passive enemy targets.
SR2: The JUGV shall be capable of attacking targets with a probability of kill of 0.75 (T), 0.90 (O).
SR3: The JUGV shall be capable of carrying and launching anti-armor guided missiles.

Functional analysis extracts the top-level functions (verbs) and determines their performance requirements and constraints:

Function (what must the system do?) | Performance (how well?) | Constraints
Attack enemy targets                | Pk = 0.75 (T), 0.90 (O) | Autonomously; active or passive targets
Carry missiles                      | How many?               | Anti-armor guided missiles
Launch missiles                     | How quickly?            | Anti-armor guided missiles

Good requirements are verifiable, clear, concise, consistent, traceable, feasible, and necessary. This analysis may cause us to reassess how the requirements are stated.

10 Requirements Analysis (JUGV)
Another way to "see" this process is with a Functional Flow Block Diagram (FFBD), the primary functional analysis technique. The FFBD:
- Indicates the logical and sequential relationships among functions.
- Shows the entire "network of actions" and their "logical sequence."
- Does NOT prescribe a time duration for or between functions; however, a timeline analysis will be done based on the FFBD.
- Does NOT show "how" a function is to be performed; however, the "how" will have to be identified for each block.

11 Requirements Analysis Using a Functional Flow Block Diagram (JUGV)
Top level (divide all functions into logical groups): 1.0 Load, 2.0 Start, 3.0 Transit to Op Area, 4.0 Conduct Mission Operations, 5.0 Transit to Base Area, 6.0 Shutdown.
Second level (decomposing 4.0): 4.1 Communicate AND (4.2 Conduct R&S OR 4.3 Target & Attack OR 4.4 Detect Mines).
Third level (decomposing 4.3 Target & Attack): Detect, Locate, Track, Identify, Decide, Designate, (Arm Weapon AND Launch Weapon), Guide Weapon, Kill Target, Safe Launcher.
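Because an FFBD is just a directed graph of function blocks, it can be encoded and walked in a few lines. A sketch under the assumption that a simple adjacency structure suffices for the top level (the AND/OR gates at lower levels would need richer node types):

```python
# Hypothetical encoding of the top-level FFBD: each block maps to
# (gate_type, successor_blocks). "SEQ" means plain sequence; "AND"/"OR"
# would mark parallel or alternative branches at lower levels.
ffbd = {
    "1.0 Load":                       ("SEQ", ["2.0 Start"]),
    "2.0 Start":                      ("SEQ", ["3.0 Transit to Op Area"]),
    "3.0 Transit to Op Area":         ("SEQ", ["4.0 Conduct Mission Operations"]),
    "4.0 Conduct Mission Operations": ("SEQ", ["5.0 Transit to Base Area"]),
    "5.0 Transit to Base Area":       ("SEQ", ["6.0 Shutdown"]),
    "6.0 Shutdown":                   ("SEQ", []),
}

# Walk the sequence from the start block. Note that, like the FFBD itself,
# this encodes logical order only, not time duration.
block = "1.0 Load"
while block:
    _gate, successors = ffbd[block]
    print(block)
    block = successors[0] if successors else None
```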

12 Architecture Design: Logical Decomposition (JUGV)
Group functions so that they can be realized by a physical component or sub-system; that is, "allocate" them into logical groups. This is a first cut at a design. Decide which function groups can be COTS/NDI/GFE and which must be developed. For functions that could be realized in a number of ways, conduct a trade-off analysis to decide on the best solution. TMPs in play throughout Architecture Design: Risk Management, Interface Management, Decision Analysis.

13 Architecture Design: Logical Decomposition, Example: Target & Attack Function (JUGV)
An initial cut at a physical architecture; trade-off analysis is required among the specific options:
- Sense Target: COTS RADAR system (Detect, Locate, Track)
- Identify Target: GFE IFF system (Identify)
- Designate Target: NDI LASER designator (Designate Target, Guide Weapon)
- Store and Launch Weapon: GFE launcher (Arm Weapon, Launch Weapon, Safe Launcher)
- Target and Control Weapon: GFE targeting computer H/W with developed targeting S/W (Decide on Attack, Guide Weapon)
- Kill Target: GFE missiles (Kill Target)

14 Architecture Design: Functional Allocation Table, JUGV Targeting and Attack Sub-system
The functional architecture (verbs) is allocated to the physical architecture (nouns). This traceability (Requirements Management) "loops" back to the requirements and helps shape the WBS.

Function  | Radar | Targeting Computer & SW | LASER Designator | IFF | Missile Launcher | Missile
Detect    |   X   |                         |                  |     |                  |
Locate    |   X   |            X            |                  |     |                  |
Track     |   X   |            X            |                  |     |                  |
Identify  |       |                         |                  |  X  |                  |
Designate |       |            X            |         X        |     |                  |
Decide    |       |            X            |                  |     |                  |
Arm       |       |            X            |                  |     |        X         |
Launch    |       |            X            |                  |     |        X         |
Guide     |       |            X            |         X        |     |                  |
Kill      |       |                         |                  |     |                  |    X
Safe      |       |            X            |                  |     |        X         |
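An allocation table like this can be checked mechanically. A sketch (the X placements follow the table above; the code is illustrative, not DragonFly tooling) that flags unallocated functions and lists each component's functional load:

```python
# Function-to-component allocation from the table above (X marks).
allocation = {
    "Detect":    ["Radar"],
    "Locate":    ["Radar", "Targeting Computer & SW"],
    "Track":     ["Radar", "Targeting Computer & SW"],
    "Identify":  ["IFF"],
    "Designate": ["Targeting Computer & SW", "LASER Designator"],
    "Decide":    ["Targeting Computer & SW"],
    "Arm":       ["Targeting Computer & SW", "Missile Launcher"],
    "Launch":    ["Targeting Computer & SW", "Missile Launcher"],
    "Guide":     ["Targeting Computer & SW", "LASER Designator"],
    "Kill":      ["Missile"],
    "Safe":      ["Targeting Computer & SW", "Missile Launcher"],
}

# Traceability check: no orphan functions.
orphans = [f for f, comps in allocation.items() if not comps]
print("Unallocated functions:", orphans or "none")

# Invert the mapping to show each component's functional load.
components = {}
for func, comps in allocation.items():
    for c in comps:
        components.setdefault(c, []).append(func)
for comp, funcs in sorted(components.items()):
    print(f"{comp}: {', '.join(funcs)}")
```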

15 Architecture Design: N² Diagram, To Analyze Interfaces and Interactions (JUGV)
Problems occur at "interfaces." Identifying them in advance is crucial to effective Risk Management. N² ("N-squared") diagrams are used to identify and analyze interface requirements between physical components or functions. The components (or functions) are shown on the diagonal; the interfaces between them are then identified in the off-diagonal cells. Interface types and requirements are identified for each component; they could include:
- Electrical
- Mechanical
- Hydraulic
- Heating/cooling
- User interface
TMPs in play throughout Architecture Design: Risk Management, Interface Management.

16 JUGV N² Diagram: To Analyze Interfaces and Interactions
Components on the diagonal: RADAR, Targeting Cmptr, IFF, LASER Designator, Missile Launcher, Missile. Interfaces identified in the off-diagonal cells include:
- RADAR / Targeting Cmptr: target location; RADAR mode command
- Targeting Cmptr / IFF: interrogation command; target identification
- Targeting Cmptr / LASER Designator: LASER mode command; target location; LASER mode status
- LASER Designator / Missile: encoded LASER signal
- Targeting Cmptr / Missile Launcher: arm/safe command; launch command; guidance data; launcher status; missile status; LASER lock status
- Missile Launcher / Missile: arm/safe command; guidance data; missile status; LASER lock status
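An N² matrix can be generated directly from a list of directed interfaces. A small sketch (component names and data items are from the slide; the flow directions are assumptions for illustration):

```python
# Directed interfaces: (from_component, to_component, data items).
interfaces = [
    ("RADAR", "Targeting Cmptr", "target location"),
    ("Targeting Cmptr", "RADAR", "RADAR mode command"),
    ("Targeting Cmptr", "IFF", "interrogation command"),
    ("IFF", "Targeting Cmptr", "target identification"),
    ("Targeting Cmptr", "LASER Designator", "LASER mode command; target location"),
    ("LASER Designator", "Targeting Cmptr", "LASER mode status"),
    ("LASER Designator", "Missile", "encoded LASER signal"),
    ("Targeting Cmptr", "Missile Launcher",
     "arm/safe command; launch command; guidance data"),
    ("Missile Launcher", "Targeting Cmptr",
     "launcher status; missile status; LASER lock status"),
]

components = ["RADAR", "Targeting Cmptr", "IFF", "LASER Designator",
              "Missile Launcher", "Missile"]

# Components sit on the diagonal; cell [i][j] holds the i -> j interface.
n = len(components)
matrix = [["" for _ in range(n)] for _ in range(n)]
for i, name in enumerate(components):
    matrix[i][i] = name.upper()
for src, dst, data in interfaces:
    matrix[components.index(src)][components.index(dst)] = data

for row in matrix:
    print(" | ".join(cell or "-" for cell in row))
```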

17 Architecture Design: Design Trade Study / Trade-off Analysis (JUGV)
Example: a trade study to choose the combination of COTS RADAR and GFE targeting computer that optimizes our system according to the selection criteria, which are defined as follows:
- Weight
- Range
- Power requirements
- Life-cycle cost
Decision Analysis starts here: define candidate solutions; define assessment criteria; assign weights to the criteria; assign an MOE or MOP to each candidate solution; run a sensitivity analysis on the results.
Note: the DragonFly simulation SW will provide the component data for you to use as input to the selection criteria categories for each combination of RADAR and targeting computer.
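The scoring behind such a trade study is usually a simple weighted sum. A sketch with invented weights and normalized scores (the real component data come from the DragonFly simulation):

```python
# Selection criteria weights (hypothetical; chosen to sum to 1.0).
weights = {"weight": 0.20, "range": 0.30, "power": 0.15, "lcc": 0.35}

# Normalized scores (0-10) for each RADAR/targeting-computer combination;
# the values here are invented for illustration only.
options = {
    "Option 3": {"weight": 7, "range": 8, "power": 6, "lcc": 8},
    "Option 6": {"weight": 8, "range": 7, "power": 7, "lcc": 7},
    "Option 9": {"weight": 5, "range": 9, "power": 4, "lcc": 5},
}

def total_score(scores, weights):
    """Weighted sum of criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank candidates from best to worst.
for name, scores in sorted(options.items(),
                           key=lambda kv: -total_score(kv[1], weights)):
    print(f"{name}: {total_score(scores, weights):.2f}")
```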

18 Architecture Design: Example Trade Study Results for RADAR/Targeting Computer Selection (Decision Analysis)
Trade-off analysis for the COTS RADAR and GFE targeting computer combinations: nine different options were considered, based on the components available in the DF simulation.

19 Architecture Design: Trade Study Results for RADAR/Targeting Computer Selection (Decision Analysis)
Plotting each option's score against the threshold (T), objective (O), and LCC constraint defines the trade space. Options 3 and 6 are within our cost and performance criteria. What's the best choice? Would it still be the best choice if the weights were changed? Sensitivity analysis asks: what is the impact on the scores if we change one of the weights?
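A sensitivity check simply perturbs one weight at a time, renormalizes, and rescores to see whether the ranking flips. A self-contained sketch, reusing the same hypothetical numbers as the previous example:

```python
weights = {"weight": 0.20, "range": 0.30, "power": 0.15, "lcc": 0.35}
options = {
    "Option 3": {"weight": 7, "range": 8, "power": 6, "lcc": 8},
    "Option 6": {"weight": 8, "range": 7, "power": 7, "lcc": 7},
}

def score(opt, w):
    """Weighted sum of criterion scores."""
    return sum(w[c] * opt[c] for c in w)

def perturb(w, criterion, delta):
    """Shift one weight by delta, clamp at 0, renormalize to sum to 1."""
    out = dict(w)
    out[criterion] = max(0.0, out[criterion] + delta)
    total = sum(out.values())
    return {c: v / total for c, v in out.items()}

# Does the best choice change if life-cycle cost matters more or less?
for delta in (-0.10, 0.0, +0.10):
    w = perturb(weights, "lcc", delta)
    best = max(options, key=lambda o: score(options[o], w))
    print(f"LCC weight {w['lcc']:.2f}: best option = {best}")
```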

20 Architecture Design: JUGV Evolving Technical Baseline (example of requirements traceability down to the JUGV Targeting Computer configuration item; Technical Planning)

Functional Baseline: What the system must do (functions) and how well it must do it (performance) at the "system" level. Defines the interfaces and dependencies among the functions, functional groups, and the environment. Main artifacts: System Performance Spec and Subsystem/Segment Specs. Associated technical review: System Functional Review (SFR).

Allocated Baseline: Describes how system-level functional and performance requirements are allocated to physical components (hardware items, software items, and users), and describes the interfaces among system components and external systems/environment. Main artifacts: Hardware Configuration Item Performance Spec, Computer Software Configuration Item Requirements Spec, Interface Requirements Spec. This is the "Design To" baseline. Associated technical review: Preliminary Design Review (PDR).

Product Baseline: Describes in detail how to fabricate components and code software; how to manufacture, operate, and maintain the system and its components; and how to train the various users. Main artifacts: detailed item specifications, material specifications, process specifications, and various drawings and manuals. This is the "Build To" baseline. Associated technical review: Critical Design Review (CDR).

21 Architecture Design: JUGV Evolving Technical Baseline (continued)
The same baseline chart, viewed through the technical management processes in play throughout Architecture Design: Technical Assessment, Configuration Management, Requirements Management, and Technical Planning. Tracing requirements through the technical reviews identifies requirements "creep" and increases confidence in meeting stakeholder expectations. The baselines provide the common reference point for Configuration Management and the basis for "verification" activities.

22 Decomposition of Requirements and Traceability from Baseline to Baseline
Example: JUGV Targeting Software Configuration Item.

Functional Baseline, "system" performance requirement (JUGV system):
- SR1: The JUGV shall be capable of autonomously attacking enemy targets.

Allocated Baseline, "item" performance specification (Targeting Software CI; the performance of the CIs that make up the system):
- IPS 2.1: The Targeting Software shall cyclically update the JUGV track database with the combat identification of all targets.
- IPS 2.2: The Targeting Software shall fuse IFF track data with RADAR track data.
- IPS 2.3: The Targeting Software shall cyclically apply a rules-of-engagement algorithm to each track in the track database.

Product Baseline, item detail specification (Targeting Software ROE module; the details of the components and modules that make up the CIs): pseudo code, flow charts, use case diagrams, sequence diagrams, state diagrams, structure diagrams, database structure, data definitions, data storage requirements, etc. This completes the "design."
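Baseline-to-baseline traceability is naturally a parent/child tree that can be walked upward from any specification. A minimal sketch (SR1 and the IPS identifiers are from this slide; the detail-spec name is hypothetical):

```python
# Child requirement -> parent requirement, one level up per baseline.
parent = {
    "IPS 2.1": "SR1",   # allocated baseline traces to functional baseline
    "IPS 2.2": "SR1",
    "IPS 2.3": "SR1",
    "ROE Module Detail Spec": "IPS 2.3",  # product baseline traces to allocated
}

def trace_up(req_id):
    """Walk from any requirement up to its system-level source."""
    chain = [req_id]
    while chain[-1] in parent:
        chain.append(parent[chain[-1]])
    return " -> ".join(chain)

print(trace_up("ROE Module Detail Spec"))
# ROE Module Detail Spec -> IPS 2.3 -> SR1
```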

23 Architecture Design: Crafting Technical Performance Measures (TPMs), a critical step for Systems Engineers (Technical Assessment, Risk Management)
TPMs are selected attributes that are measurable through analysis from the early stages of design and development. They:
- Allow Systems Engineers and PMs to track progress over time.
- Are an integral part of the logical decomposition and the architecture design process.
- Provide a mechanism for facilitating early awareness of problems.
- Should be based on parameters that drive costs, are on the critical path, or represent high-risk factors.
- Add a third dimension, "technical achievement," to the cost and schedule strengths of EVM.
Example: the Probability of Kill KPP (next slide).

24 Example JUGV TPM
- Measure of Effectiveness (KPP): Probability of Kill (Pk), the percentage of time that an attack on a single target renders the target incapable of performing its mission.
- Measures of Performance: target track accuracy (meters); max weapon range (meters); targeting data update rate (milliseconds).
- Technical Performance Measures: targeting algorithm running time (milliseconds); targeting software memory utilization (% of total RAM capacity).
The TPM chart plots targeting algorithm running time (ms) by quarter (1QFY1 through 2QFY2) against the threshold and objective values, showing planned progress over time with tolerance bands alongside actual, measured progress.
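Tracking a TPM against its plan reduces to comparing each measurement with the planned value and its tolerance band. A sketch with invented numbers for the targeting-algorithm running time:

```python
# (quarter, planned ms, tolerance ms, actual measured ms) -- values invented.
tpm_history = [
    ("1QFY1", 90.0, 15.0, 102.0),
    ("2QFY1", 80.0, 12.0, 88.0),
    ("3QFY1", 70.0, 10.0, 83.0),   # outside tolerance: early problem signal
    ("4QFY1", 65.0,  8.0, 70.0),
]

THRESHOLD = 75.0  # required value (T); the objective (O) would be tighter

for quarter, planned, tol, actual in tpm_history:
    in_band = abs(actual - planned) <= tol
    meets_t = actual <= THRESHOLD
    status = "OK" if in_band else "outside tolerance band"
    print(f"{quarter}: planned {planned} ms, actual {actual} ms "
          f"({status}; threshold {'met' if meets_t else 'not met'})")
```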

25 Implementation
Involves two primary efforts: "design" (detailed design down to the lowest system elements) and "realization" (fabrication/production of actual products). Plans, designs, analyses, requirements development, and drawings are realized into actual products. Ensure the detail design is properly captured in the design-phase artifacts. For each system component, decide: make, buy, or reuse? The make/buy/reuse decision drives the implementation strategy applied to the JUGV WBS; the Acquisition Strategy documents how these decisions are to be made, and the SEP ties in. Verify that each element, whether bought, made, or reused, meets its specification. Example: in the Targeting & Attack segment of the JUGV WBS, each component (RADAR, targeting H/W, targeting S/W, IFF, LASER, launcher, missile) is tagged Make, Buy, or Re-use.

26 Integration (putting the pieces together), JUGV
Integration happens at each and every level in the system architecture. It starts with basic parts, such as resistors, and then proceeds up through:
- Components (e.g., printed circuit boards)
- Subassemblies (e.g., actuators)
- Assemblies (e.g., antenna)
- Subsystems (e.g., the Target & Attack subsystem)
- Segments (e.g., the mission system segment)
- Full systems (e.g., the JUGV)
The schematic block diagram of the Targeting & Attack sub-system shows the RADAR, IFF, LASER designator, launcher, and missile connected to the targeting computer over interfaces including Ethernet, RS-422, MIL-STD-1553B, and MIL-STD-1760 links, plus encoded LASER energy from the designator to the missile. TMPs in play: Configuration Management, Interface Management.

27 Integration (putting the pieces together), JUGV
The Targeting & Attack sub-system (IFF, LASER, targeting computer, RADAR, launcher) must in turn be integrated with the JUGV's other subsystems: vehicle control & navigation, vehicle chassis, communications, electrical and mechanical subsystems, engine and drive train, and the remote operator. Integration within a sub-system is a major task; integrating sub-system to sub-system is a HUGE task, with major challenges in Interface Management and Configuration Management.

28 Verification: "Confirming system elements meet the design-to or build-to spec. Did you build it right?" Think "DT."
"System" verification spans three dimensions and requires early and continuous planning:
- Top to bottom: down to the lowest piece-part of the WBS (JUGV system; drive train; chassis; Target & Attack with its RADAR, H/W, S/W, S&R, IFF, LASER, launcher, and missile; survivability), whether make, buy, or re-use.
- Throughout all phases: development, qualification, acceptance, and operations & maintenance.
- Across methods.

29 Verification (JUGV)
Verification matrices are developed for each level of the system. Each requirement is assigned one or more verification methods (Inspection, Analysis, Demonstration, Test) and one or more verification requirements. Example:

System/Item Requirement: SR1. The JUGV shall be capable of autonomously attacking active or passive enemy targets. Methods: Demonstration, Test.
Verification Requirement: VSR1D. Provide evidence that the JUGV is capable of autonomous attack by demonstrating, in a representative operational environment with simulated friendly and hostile forces, an attack against simulated hostile targets. The demonstration will show that the JUGV system is capable of distinguishing between friendly and non-friendly units and of launching weapons to engage only non-friendly units.

You never really have a "good" requirement until you have verification requirements to go with it. Verification requirements provide the basis for the TEMP (Technical Planning). As they are developed and refined, the SEP should be updated to describe how test results will feed back into the system design, as well as the tools used for tracking and maintaining traceability of verification requirements.
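A verification cross-reference matrix is easy to keep machine-checkable. A sketch (SR1 and VSR1D are from the slide; the structure and the quality gate are illustrative):

```python
# Requirement -> (verification methods, verification requirement IDs).
vcrm = {
    "SR1": ({"D", "T"}, ["VSR1D"]),
    # An entry like "SR2": (set(), []) would be flagged INCOMPLETE below.
}

METHODS = {"I": "Inspection", "A": "Analysis", "D": "Demonstration", "T": "Test"}

# Quality gate: every requirement needs at least one method and one
# verification requirement -- otherwise it is not yet a "good" requirement.
for req, (methods, vreqs) in vcrm.items():
    names = ", ".join(METHODS[m] for m in sorted(methods)) or "NONE"
    status = "OK" if methods and vreqs else "INCOMPLETE"
    print(f"{req}: methods = {names}; verification reqs = {vreqs} [{status}]")
```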

30 Validation: "Confirming system elements meet stakeholder requirements." Did you get the requirements right? Think "OT."
Example JUGV Operational Test design:
- The JCIDS process produces the JUGV CDD and JUGV CONOPS, which supply the KPPs (Probability of Kill, Ao) and the operational scenarios.
- The T&E WIPT develops the TEMP, including the Critical Operational Issues: Can the JUGV kill its intended targets? Can the JUGV be maintained in an operational environment?
- The operational testers perform a mission task analysis. For "attack by fire an enemy force or position": identify the enemy, fix the enemy location, engage the enemy with weapons. For "provide maintenance support": identify failed components; remove and replace failed components in the field.
- From these flow the Measures of Effectiveness (e.g., target detection range), Measures of Suitability (e.g., mean time to fault locate), and Measures of Performance (e.g., RADAR range, built-in test (BIT) false alarm rate), which in turn drive the OT framework/OT plan, the OT data requirements, and the OT resource requirements.

31 Verification reduces Validation risk.

Development Tests (Verification)                          | Operational Tests (Validation)
Controlled by program manager                             | Controlled by independent agency
One-on-one tests                                          | Many-on-many tests
Controlled environment                                    | Realistic/tactical environment with operational scenario
Contractor environment                                    | No system contractor involvement
Trained, experienced operators                            | User troops recently trained
Precise performance objectives and threshold measurements | Performance measures of operational effectiveness and suitability
Test to specification                                     | Test to operational requirements
Developmental, engineering, or production-representative test article | Production-representative test article

32 Transition
Transition moves a system element to the next level of the physical architecture; for the end item, it installs and fields the system to the user in the operational environment.
- Sub-system to system (e.g., the JUGV Targeting & Attack sub-system into the JUGV system): focus on integration ("Integration II"; Interface Management).
- System to user (the JUGV system to the user): focus on operational integration and the Integrated Logistics Support elements (training and maintenance plans, supply provisions, technical publications, support equipment, PHS&T, etc.; Technical Data Management).

33 Recursive and Iterative Systems Engineering
- Iterative: the application of a process to the same product or set of products to correct a discovered discrepancy or other variation from requirements.
- Recursive: the repeated application of processes to design next-lower-layer system products, or to realize next-upper-layer end products, within the system structure (system, sub-system, components).

34 Recursive and Iterative Systems Engineering: Vee Model
Down the left leg (definition): stakeholder requirements, CONOPS, and validation planning; system performance specification and verification planning; configuration item performance specifications and verification planning; configuration item detail specifications and verification procedures. At the bottom: fabricate, code, buy, or reuse. Up the right leg (realization): inspect and test to the detail specification; assemble configuration items and verify to the CI performance specification; integrate the system and verify to the system specification; validate the system against stakeholder requirements and the CONOPS. The legs map to the technical processes (Stakeholder Requirements Definition, Requirements Analysis, Architecture Design, Implementation, Integration, Verification, Validation, Transition), with the eight technical management processes (Technical Planning, Requirements Management, Risk Management, Decision Analysis, Data Management, Technical Assessment, Configuration Management, Interface Management) in play throughout.

