1 OPS DAS SE CPM JCIDS PPBE DoDAF Improved Harmonization with Systems Engineering -Initial Discussion- February 2012

2 Purpose
Streamline and improve the alignment between DoD Architectural Descriptions (AD) and Systems Engineering (SE), including models (views and viewpoints), model data, artifacts, documents, and data items. There are two objectives:
- Establish standard cross-DoD relationships between Architectural Descriptions (AD) and SE artifacts, and between ADs and SE documents and processes.
- Eliminate redundancy in producing and maintaining Architectural Descriptions (AD) and SE artifacts and documents.
March 2012

3 Initial Approach
First objective: Establish standard cross-DoD relationships between:
- Architectural Descriptions (AD) and SE artifacts
- ADs and SE documents and processes
Approach:
- Survey current SE policy and guidance
- Analyze primary SE document standards and guidance
- Establish common requirement types that comprise SE artifacts and their use in SE processes
- Formulate relationships between SE artifacts and ADs: models (DoDAF) and data (DM2)
March 2012

4 Common Semantics Required to Manage Requirements, Design and Test Complexity
March 2012

5 Common Lexicon Facilitates Auditable Traceability and Reduces Ambiguity
Vision, Guidance, Capability, Activity, Information, Rules, Desired Effect, Conditions, Locations, Measures, Resources, Performers, Organizations, PersonTypes, Skill, Standards, Agreements, Project, MeasureTypes, Systems, Services
March 2012
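To make the idea concrete, here is a minimal sketch (Python, illustrative only; the helper name and tagging scheme are hypothetical and not part of the briefing) of how a shared lexicon could be captured as an enumeration and used to tag both architecture data and SE requirement statements, so traceability queries work over one vocabulary.

```python
from enum import Enum

class LexiconTerm(Enum):
    """Subset of the common lexicon terms listed on this slide."""
    CAPABILITY = "Capability"
    ACTIVITY = "Activity"
    PERFORMER = "Performers"
    RESOURCE = "Resources"
    MEASURE = "Measures"
    CONDITION = "Conditions"
    DESIRED_EFFECT = "Desired Effect"
    RULE = "Rules"
    STANDARD = "Standards"
    SERVICE = "Services"

def tag_statement(statement: str, terms: set) -> dict:
    """Attach lexicon terms to a requirement or architecture statement so AD
    data and SE artifacts can be cross-referenced through the same vocabulary."""
    return {"text": statement, "terms": sorted(t.value for t in terms)}

# One SE 'shall' statement tagged with the lexicon concepts it touches.
req = tag_statement(
    "The system shall process a personnel change request in less than 0.1 seconds.",
    {LexiconTerm.PERFORMER, LexiconTerm.ACTIVITY, LexiconTerm.MEASURE},
)
print(req["terms"])  # -> ['Activity', 'Measures', 'Performers']
```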

6 Primary DoD SE Requirements Documents Examined?
USAF-2010: Systems Requirements Document (SRD)
Army-1999: System Performance Specification (SPS)
Navy-2008: Systems Design Specification (SDS)
OSD contracting Data Item Descriptions (DIDs), in current use per MIL-STD-961:
- Operational Concept Description (OCD)
- System/Subsystem Specification (SSS)
- System/Subsystem Design Description (SSDD)
- Software Requirements Specification (SRS)
- Software Design Description (SDD)
- Software Product Specification (SPS)
March 2012

7 Primary DoD SE Requirements Document Analysis
An initial analysis of Component (Army, Navy, etc.) SE policies, directives, guidance, and standards revealed:
- A multitude of guidance documents
- Components generally follow the contents of the DIDs prescribed by MIL-STD-961E (or the original MIL-STD-490) for primary requirements documents
- Artifacts from primary SE requirements documents are replicated in numerous supporting documents in various forms
- Regardless of Component, SE documents generally address common requirement types (next slide)
March 2012

8 SE Primary Requirements Documents -Common Requirement Types-
Requirement Type: Example
Operating Environment: The system shall operate under the conditions -50°F to +120°F ambient air temperature.
Operational Capabilities: The unit shall perform air assault operations under the conditions listed in Table 2.1.
System Performance Metrics: The system shall process a personnel change request in less than 0.1 seconds.
System Interface Requirements: The system shall exchange Call For Fire with Field Artillery via MIL-STD-2167.
System Functional Requirements: The system shall present, to the operator, patient medical records based on military ID.
Support (Non-Functional) Requirements: The system shall have a mean time between system abort of not less than 310 hours.
Verification and Test: The system shall be tested for salt fog per MIL-STD-810F Method 509.4.
Traceability: All requirements in Section 3.3 of this document shall be traced to a requirement in the CDD.
March 2012
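A minimal sketch (Python; identifiers such as "SRD-3.3.1" and "CDD-KPP-2" are hypothetical) of how a requirement record could carry one of these common requirement types plus a trace link to its parent CDD requirement, which is what the Traceability row requires.

```python
from dataclasses import dataclass
from enum import Enum, auto

class RequirementType(Enum):
    OPERATING_ENVIRONMENT = auto()
    OPERATIONAL_CAPABILITY = auto()
    PERFORMANCE_METRIC = auto()
    INTERFACE = auto()
    FUNCTIONAL = auto()
    SUPPORT_NON_FUNCTIONAL = auto()
    VERIFICATION_AND_TEST = auto()

@dataclass
class Requirement:
    req_id: str                 # e.g. "SRD-3.3.12"
    req_type: RequirementType   # one of the common requirement types above
    text: str                   # the 'shall' statement
    traces_to: list             # parent requirement IDs, e.g. in the CDD

def untraced(reqs: list) -> list:
    """Requirements with no parent trace, the gap the Traceability row forbids."""
    return [r for r in reqs if not r.traces_to]

reqs = [
    Requirement("SRD-3.3.1", RequirementType.PERFORMANCE_METRIC,
                "The system shall process a personnel change request in less than 0.1 seconds.",
                ["CDD-KPP-2"]),
    Requirement("SRD-3.3.2", RequirementType.INTERFACE,
                "The system shall exchange Call For Fire with Field Artillery.",
                []),
]
print([r.req_id for r in untraced(reqs)])  # -> ['SRD-3.3.2']
```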

9 Establishing Relationships -System Specifications- -Common Requirement Types-
March 2012

10 Establishing Relationships -Primary Documents-
March 2012

11 Analysis to Date Summary of Issues
Redundancy and Ambiguity
- Ambiguous relationship between Architectural Descriptions (AD) and Systems Engineering (SE) products
- Inadequate specificity in SE document standards (e.g., DIDs) relative to architecture data, models, and descriptions used in requirements and architecture documents (e.g., ICD, CDD, ISPs)
Governance
- Ambiguity regarding the role of AD in SE processes (requirements documents, MIL-STD-881C)
- Inconsistent precedence and use of DoDAF models in SE documents
- SE specification documents are not standardized across Components (e.g., SDS, SRD, PS, MIL-STD-961 DIDs)
- Inconsistencies in OSD SETR Guidelines and SE Guidance relative to DoDAF and architecture*
Traceability
- Inadequate traceability between AD and SE artifacts
- Inadequate traceability below the document level (not consistent among Components, Programs, and Projects)
*Defense Acquisition Program Support Methodology, Version 2.0, January 9, 2009: "4.1.3.C2: The technical system architecture descriptions should use mandated Operational View (OV), System View (SV), and Technical View (TV) products as described in the DoDAF, and should be integral to the system design. There should be System Description Documents (SDDs) and System Capability Specifications (SCSs) that address those for the system and major subsystems."
March 2012

12 Next: Systems Engineering and Architecture Harmonization and Efficiency
Notional Systems Development "V" (figure): DoDAF viewpoints, JCIDS documents, Systems Engineering Technical Reviews, and typical SE work products overlaid on the acquisition model decisions and milestones -- Capabilities Based Assessment (CBA), Materiel Solution Analysis (MSA), Technology Development (TD), Engineering & Manufacturing Development (E&MD), and O&M, with Milestones A, B, and C -- and on the V's design and verification legs (system validation, system verification, subsystem verification, component verification, build/unit test, prototyping).
DoDAF Viewpoints: All (AV), Capabilities (CV), Operational (OV), Data and Information (DIV), Systems (SV), Services (SvcV), Standards (StdV)
JCIDS Documents: Initial Capabilities Document (ICD), Capability Development Document (CDD), Capability Production Document (CPD), Information Support Plan (ISP)
System Engineering Technical Reviews: System Requirements Review (SRR), System Functional Review (SFR), Preliminary Design Review (PDR), Critical Design Review (CDR), Test Readiness Review (TRR), System Verification Review (SVR)
Typical Systems Engineering Work Products: System Requirements Document (SRD), Operational Concept Description (OCD), System Capability Specification (SCS), System Performance Specification (SPS), System Design Specification (SDS), System/Subsystem Specification (SSS), System/Subsystem Design Description (SSDD), Software Requirements Specification (SRS), Software Design Description (SDD), Software Product Specification (SPS), Data Base Design Document (DBDD), Interface Requirements Specification (IRS), Interface Control Document (ICD) / Interface Design Document (IDD), Test and Evaluation Master Plan (TEMP)
March 2012
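As a rough illustration of how the figure's pairings could be captured for tooling, the sketch below maps a few SETRs to the work products and DoDAF views that appear grouped with them on the slide. The exact pairings are inferred from the figure and should be treated as notional, not as authoritative entry criteria.

```python
# Notional mapping of SETR reviews to work products and DoDAF views,
# inferred from the figure; adjust to program-specific tailoring.
SETR_MAP = {
    "SRR": {"products": ["SRD", "OCD", "SPS", "SCS"], "views": ["OV", "DIV-1"]},
    "SFR": {"products": ["SSS", "SDS", "SRS", "IRS"], "views": ["SV", "SvcV", "DIV-2", "StdV"]},
    "PDR": {"products": ["SSS", "SDD", "IDD"],        "views": ["SV", "SvcV", "DIV-2", "StdV"]},
    "CDR": {"products": ["SSDD", "IDD", "DBDD"],      "views": ["DIV-3"]},
}

def entry_products(review: str) -> list:
    """Return the work products expected to be in hand at a given review."""
    return SETR_MAP[review]["products"]

print(entry_products("SRR"))  # -> ['SRD', 'OCD', 'SPS', 'SCS']
```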

13 Way Forward -Opportunities-
March 2012

14 Back Ups March 2012

15 SE Primary Requirements Documents -Common Requirement Types-
Requirement Type: Example
Operating Environment: The system shall operate at SECRET High.
Operational Capabilities: The unit shall perform air assault operations under the conditions listed in Table 2.1.
System Performance Metrics: The system shall process a personnel change request in less than 0.1 seconds.
System Interface Requirements: The system shall exchange Call For Fire with Field Artillery via MIL-STD-2167.
System Functional Requirements: The system shall present, to the operator, patient medical records based on military ID.
Support (Non-Functional) Requirements: The system shall have a mean time between system abort of not less than 310 hours.
Verification and Test: All human interfaces shall be tested for compliance with MIL-STD-1472.
Traceability: All requirements in Section 3.3 of this document shall be traced to a requirement in the CDD.
March 2012

16 DoD/Industry SE Guidance
March 2012

17 SETR Milestones March 2012

18 Establishing Relationships -Interface Specifications-
Update March 2012

19 Top 5 Systems Engineering Issues in the Defense Industry
1. Key Systems Engineering practices known to be effective are not consistently applied across all phases of the program life cycle.
2. Insufficient Systems Engineering is applied early in the program life cycle, compromising the foundation for initial requirements and architecture development.
3. Requirements are not always well managed, including the effective translation from capabilities statements into executable requirements to achieve successful acquisition programs.
4. Collaborative environments, including SE tools, are inadequate to effectively execute SE at the joint capability, System-of-Systems (SoS), and system levels.
5. The quantity and quality of Systems Engineering expertise is insufficient to meet the demands of the government and the defense industry.
Mr. Gary Blohm, Director, US Army RDECOM Communications-Electronics Research, Development and Engineering Center, 12 March 2010
March 2012

20 Summary of Issues and Improvement Opportunities -From POA&M-
Issue: The relationship between AD and SE products is ambiguous, resulting in redundant effort.
Opportunity: DoDAF 2's greater semantic precision compared to prior DoDAF versions can be used to clarify the relationship.
Issue: The relationship between DoDAF models and SE documents is too coarse, resulting in inadequate traceability between AD and SE artifacts.
Opportunity: The DM2 provides a means to define the relationship.
Issue: Governance is not standardized across the Components, indicating ambiguity regarding the role of AD in SE processes.
Opportunity: DoDAF 2's disambiguation through the DM2 and current model description technical edits offer an opportunity for standardization.
Issue: The precedence of DoDAF models and SE documents is inconsistent.
Opportunity: DoDAF 2's reification model provides an opportunity to specify precedence of model and artifact types at an appropriate reification level.
Issue: SE documents are not standardized across Components.
Opportunity: Mapping to the unambiguous and precise DM2 and DoDAF 2's reification levels can lead to standard definitions of SE documents.
Issue: Traceability below the document level is not consistent among Components, Programs, and Projects.
Opportunity: The DM2 can provide explicit relationships at the document content level that can substantially improve requirement traceability between the various reification levels and associated documents and artifacts.
March 2012
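The last opportunity, explicit relationships at the document content level, can be pictured with a short sketch: trace links keyed to individual statements rather than whole documents, so a requirement can be audited back through the reification levels. All identifiers are hypothetical.

```python
# Trace links keyed at the statement level (child -> parents), not the document level.
TRACES = {
    "SSS-3.2.4": ["SRD-3.3.1"],   # subsystem spec statement -> system requirement
    "SRD-3.3.1": ["CDD-KPP-2"],   # system requirement -> capability requirement
    "CDD-KPP-2": ["ICD-GAP-1"],   # capability requirement -> capability gap
    "ICD-GAP-1": [],              # root of the chain
}

def trace_chain(req_id: str) -> list:
    """Walk parent links from one statement up to its root, flagging breaks."""
    chain = [req_id]
    while TRACES.get(req_id):
        req_id = TRACES[req_id][0]
        chain.append(req_id)
    if req_id not in TRACES:
        chain.append("<untraced>")  # broken link below the document level
    return chain

print(trace_chain("SSS-3.2.4"))  # -> ['SSS-3.2.4', 'SRD-3.3.1', 'CDD-KPP-2', 'ICD-GAP-1']
```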

21 DoD/Industry SE Guidance
DAU-2001, OSD-2008, DISA-2011, Industry INCOSE-2010, Navy SysCOMs-2004, Army AMC-2007, Navy ASN RDA-2006, USAF SMC-2005
March 2012

22 Problem: How to establish and maintain consistency between the numerous products
(Figure) Product groups that must be kept consistent -- Warfighter Requirements, Program/Acquisition, Verification, Supporting GFI, and Organizational Interfaces -- with correlated data (CD) shared among them: policy, MIL-STDs, handbooks, pamphlets, CBA, MSA, ICD, CDD, CPD, ILSP, TEMP, WBS/IMP/IMS, ISP, SEP, SRD, SPS, SDS, SSS, IRS, Information Assurance Certifications, TDS/AS, DT&E/OT&E, SETR criteria and checklists, exit criteria, V&V plans and reports, etc.
March 2012

23 March 2012

24 March 2012

25 DoDAF 2.0 Conceptual Data Model

26 Scoping Architectures to be "Fit-for-Purpose"
The architect is the technical expert who translates the decision-maker's requirements into a set of data that can be used by engineers to design possible solutions. Establishing the scope of an architecture is critical to ensuring that its purpose and use are consistent with specific project goals and objectives. The term "Fit-for-Purpose" is used in DoDAF to describe an architecture (and its views) that is appropriately focused (i.e., responds to the stated goals and objectives of the process owner, is useful in the decision-making process, and responds to internal and external stakeholder concerns). Meeting intended objectives means taking those actions that either directly support customer needs or improve the overall process undergoing change. At each tier of the DoD, goals and objectives, along with any corresponding issues, should be addressed according to the established scope and purpose (e.g., Departmental, Capability, SE, and Operational), as shown in the notional diagram in the figure below.
March 2012

27 DoDAF Meta-model Groups Mapping to Viewpoints and DoD Key Processes
Metamodel Data Group | Viewpoints | DoD Key Processes
(Viewpoints: AV, CV, DIV, OV, PV, StdV, SvcV, SV. Key Processes: JCIDS (J), DAS (D), PPBE (P), Systems Engineering (S), Operations (O), Portfolio Management - IT and Capability (C).)
Performer | CV, OV, PV, StdV, SvcV, SV | J, D, P, S, O, C
Activity | OV | J, O, C
Resource Flow | AV, CV, DIV, OV, PV, StdV | J, S, O
Data and Information | AV, DIV |
Capability | CV, PV, SV, SvcV |
Services | CV, StdV, SV | P, S, C
Project | AV, CV, PV, SvcV, SV | D, P, S, C
Training/Skill/Education | OV, SV, SvcV, StdV |
Goals | CV, PV | J, D, P, O, C
Rules | OV, StdV, SvcV, SV | J, D, S, O
Measures | SvcV, SV | J, D, S, O, C
Location | | P, S, O
March 2012
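A sketch (Python, illustrative only) of how a few rows of this mapping could be held as data so a tool can suggest which viewpoints to develop for a given DoD key process.

```python
# A few rows of the mapping table as data: group -> (viewpoints, key processes).
# J=JCIDS, D=DAS, P=PPBE, S=Systems Engineering, O=Operations, C=Portfolio Management.
GROUP_MAP = {
    "Performer":     ({"CV", "OV", "PV", "StdV", "SvcV", "SV"}, {"J", "D", "P", "S", "O", "C"}),
    "Activity":      ({"OV"},                                   {"J", "O", "C"}),
    "Resource Flow": ({"AV", "CV", "DIV", "OV", "PV", "StdV"},  {"J", "S", "O"}),
    "Measures":      ({"SvcV", "SV"},                           {"J", "D", "S", "O", "C"}),
}

def viewpoints_for(process: str) -> set:
    """Union of viewpoints whose metamodel groups support the given key process."""
    return set().union(*(views for views, procs in GROUP_MAP.values() if process in procs))

print(sorted(viewpoints_for("S")))  # viewpoints tied to Systems Engineering in these rows
```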

28 Establishing the Scope for Architecture Development
March 2012

29 March 2012

30 The DM2 Conceptual Data Model -key concepts-
Activity: Work, not specific to a single organization, weapon system or individual, that transforms inputs (Resources) into outputs (Resources) or changes their state.
Resource: Data, Information, Performers, Materiel, or Personnel Types that are produced or consumed.
Materiel: Equipment, apparatus or supplies that are of interest, without distinction as to its application for administrative or combat purposes.
Information: The state of something of interest that is materialized -- in any medium or form -- and communicated or received.
Data: Representation of information in a formalized manner suitable for communication, interpretation, or processing by humans or by automatic means. Examples could be whole models, packages, entities, attributes, classes, domain values, enumeration values, records, tables, rows, columns, and fields.
Architectural Description: Information describing an architecture, such as an OV-5b Operational Activity Model.
Performer: Any entity - human, automated, or any aggregation of human and/or automated - that performs an activity and provides a capability.
Organization: A specific real-world assemblage of people and other resources organized for an on-going purpose.
System: A functionally, physically, and/or behaviorally related group of regularly interacting or interdependent elements.
Person Type: A category of persons defined by the role or roles they share that are relevant to an architecture.
Service: A mechanism to enable access to a set of one or more capabilities, where the access is provided using a prescribed interface and is exercised consistent with constraints and policies as specified by the service description. The mechanism is a Performer. The capabilities accessed are Resources -- Information, Data, Materiel, Performers, and Geo-political Extents.
Capability: The ability to achieve a Desired Effect under specified (performance) standards and conditions through combinations of ways and means (activities and resources) to perform a set of activities.
Condition: The state of an environment or situation in which a Performer performs.
Desired Effect: A desired state of a Resource.
Measure: The magnitude of some attribute of an individual.
Measure Type: A category of Measures.
Location: A point or extent in space that may be referred to physically or logically.
Guidance: An authoritative statement intended to lead or steer the execution of actions.
Rule: A principle or condition that governs behavior; a prescribed guide for conduct or action.
Agreement: A consent among parties regarding the terms and conditions of activities that said parties participate in.
Standard: A formal agreement documenting generally accepted specifications or criteria for products, processes, procedures, policies, systems, and/or personnel.
Project: A temporary endeavor undertaken to create Resources or Desired Effects.
Vision: An end that describes the future state of the enterprise, without regard to how it is to be achieved; a mental image of what the future will or could be like.
Skill: The ability, coming from one's knowledge, practice, aptitude, etc., to do something well.
March 2012
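The core relationships among these concepts (a Performer performs Activities that consume and produce Resources; a Capability is the ability to achieve a Desired Effect under Conditions through activities and resources) can be sketched as simple types. This is an illustrative rendering of the definitions above, not the DM2 schema itself.

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str                   # Data, Information, Performers, Materiel, or Personnel Types

@dataclass
class Activity:
    name: str
    consumes: list = field(default_factory=list)   # input Resources
    produces: list = field(default_factory=list)   # output Resources

@dataclass
class Condition:
    name: str                   # state of the environment or situation

@dataclass
class Capability:
    name: str
    desired_effect: str                                # a desired state of a Resource
    activities: list = field(default_factory=list)    # the 'ways'
    resources: list = field(default_factory=list)     # the 'means'
    conditions: list = field(default_factory=list)    # conditions under which it applies

@dataclass
class Performer(Resource):
    performs: list = field(default_factory=list)   # Activities this Performer performs
    provides: list = field(default_factory=list)   # Capabilities this Performer provides
```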

31 DoD Systems Engineering Technical Reviews (SETRs)
DoD SETR Checklists The updated DAG will also describe and refer to these technical review risk assessment checklists. The checklists accessible in the TR CLM are being updated for DoD usage. Seven of the checklists have been updated, and are now accessible on the SE COP. User comments and recommendations for checklist improvements are solicited. NOTE: OSD has established the policy that all of the checklists are intentionally "locked" to preclude minor question changes that may potentially change an evaluation score of "red" to something less problematic ("yellow" or "green"). Most of the Service Technical Authorities endorse this policy. If the checklists were unlocked, any program or evaluator could change the wording of a question to evoke a satisfactory response, thus potentially eliminating oversight during a technical review. The checklists are set up so they can be tailored to exclude questions that are not applicable to a given program. This can be accomplished by selecting "NA" for any given question. The checklist programming will ignore those question(s) when summing totals of each category of responses. March 2012
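The tailoring behavior described above, where a question marked "NA" is left out of the category totals, can be pictured with a short sketch; the questions and color values are assumptions, since the actual checklist workbooks are not reproduced here.

```python
from collections import Counter

def score_checklist(answers: dict) -> Counter:
    """Sum responses per category, ignoring questions tailored out with 'NA'."""
    return Counter(a for a in answers.values() if a != "NA")

answers = {
    "Q1: Requirements baselined?": "GREEN",
    "Q2: ICD/CDD traceability established?": "YELLOW",
    "Q3: Software safety plan approved?": "NA",   # not applicable to this program
    "Q4: Architecture views complete?": "RED",
}
print(score_checklist(answers))  # NA excluded from the totals
```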

32 Joint Test and Evaluation Methodology (JTEM)
DEVELOPMENT STANDARD OPERATING PROCEDURE (SOP), Version 2, January 15, 2011 (Joint Test and Evaluation Methodology (JTEM) Joint Test and Evaluation project) https--acc.dau.mil-adl-en-US file MeasuresDevelopmentSOPv2_ pd March 2012

33 Measures Framework Relationship Diagram
Capability Hypothesis If one has a combination of means and ways under a set of standards and conditions, then one can perform tasks and achieve desired effects. DEVELOPMENT STANDARD OPERATING PROCEDURE (SOP), Version 2, January 15, 2011 (Joint Test and Evaluation Methodology (JTEM) Joint Test and Evaluation project) https--acc.dau.mil-adl-en-US file MeasuresDevelopmentSOPv2_ pd March 2012
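Read as a conditional, the hypothesis relates means and ways under standards and conditions to achievable effects. A minimal sketch (illustrative field values drawn from examples elsewhere in this briefing) of that relationship as a record a measures framework could evaluate:

```python
from dataclasses import dataclass

@dataclass
class CapabilityHypothesis:
    means: list         # systems, personnel, materiel
    ways: list          # tasks / activities
    standards: list     # required performance levels
    conditions: list    # environment in which performance is measured
    desired_effects: list

    def statement(self) -> str:
        return (f"If one has {', '.join(self.means)} employed via {', '.join(self.ways)} "
                f"under {', '.join(self.standards)} and {', '.join(self.conditions)}, "
                f"then one can achieve {', '.join(self.desired_effects)}.")

h = CapabilityHypothesis(
    means=["fire support system"], ways=["process call for fire"],
    standards=["response < 0.1 s"], conditions=["-50°F to +120°F"],
    desired_effects=["timely fires"],
)
print(h.statement())
```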

34 DoDAF 2.0 Associations Used in Measures Framework
DEVELOPMENT STANDARD OPERATING PROCEDURE (SOP), Version 2, January 15, 2011 (Joint Test and Evaluation Methodology (JTEM) Joint Test and Evaluation project) https--acc.dau.mil-adl-en-US file MeasuresDevelopmentSOPv2_ pd March 2012

35 Required Relationships for Mission-Based Assessment
Mission-Based Test and Evaluation Assessment Process Guidebook, April 1, 2011 March 2012

36 Complex Task Model March 2012
Mission-Based Test and Evaluation Assessment Process Guidebook, April 1, 2011 March 2012

37 System/SoS Scoring Table
Mission-Based Test and Evaluation Assessment Process Guidebook, April 1, 2011 March 2012

38 Aggregate System/SoS Scoring Table
Mission-Based Test and Evaluation Assessment Process Guidebook, April 1, 2011 March 2012

