
1 Systems Engineering = Best Practices
Nicholas M. Torelli
Deputy Director, Human Capital and Specialty Engineering
Systems and Software Engineering (SSE), Office of the Deputy Under Secretary of Defense for Acquisition and Technology
Presented to the 2009 Systems Engineering Summit: Harnessing Best Practices
Sponsored by the Huntsville Regional Chapter of INCOSE, US Army, RDECOM, AMRDEC, DAU, and NASA
March 3, 2009

2 DoD Vision for Systems Engineering
Systems engineering principles and disciplines are fully accepted and assimilated into the DoD acquisition workforce, positioning the DoD for acquisition excellence and leading to a stronger national defense.

3 SSE's 2009 Priorities
Acquisition workforce development
Renewed focus on early application of systems engineering to affect affordability and total ownership cost
Increased level of visibility for our role in developmental test and evaluation
Systems of Systems engineering tools and techniques
Integration of program protection activities into acquisition oversight to address the cyber threat
Measuring results of our efforts

4 DoD SE Best Practice Continuum
Policy & Guidance
Developing the Workforce & Advancing SE Practice
Program Support & Assessment
Teamwork & Collaboration

5 Overview of Acquisition Policy Changes*
Mandatory Materiel Development Decision (MDD)
Mandatory competing prototypes before MS B
Mandatory PDR and a report to the MDA ("the sliding PDR"): PDR Report to the MDA if before MS B; formal PDR Assessment by the MDA if after MS B
Configuration Steering Boards at Component level to review all requirements changes
Renewed emphasis on manufacturing during system development: re-titles the SDD phase to EMD with two sub-phases, Integrated System Design and System Capability and Manufacturing Process Demonstration; establishes consideration of manufacturing maturity at key decision points
Mandatory system-level CDR with an initial product baseline, followed by a Post-CDR Report to the MDA
Post-CDR Assessment by the MDA between EMD sub-phases
[Life-cycle diagram: Strategic Guidance, Joint Concepts, and the CBA feed the JCIDS process (ICD, CDD, CPD); the acquisition framework runs from the MDD through Materiel Solution Analysis, Technology Development (MS A/B), Engineering and Manufacturing Development with PDR and CDR, the Full Rate Production Decision Review, Production and Deployment (MS C), and O&S.]
* DoDI 5000.02, 8 December 2008

6 Acquisition Policy Opportunities for SE
Early SE engagement with programs
Program Support Reviews (PSRs)
Pre-MS A/B/C risk reduction activities (e.g., technical risk assessment in AoAs, competitive prototyping)
SE technical reviews: informed trades for feasible solutions
Developmental test and evaluation: integrated DT/OT; updated T&E strategy at MS A

7 New Systems Engineering Enclosure
Codifies several previous SE policy memoranda
Codifies a number of SE-related policies and statutes issued since 2003:
  Environmental Safety and Occupational Health
  Modular Open Systems Approach
  Data Management and Technical Data Rights
  Item Unique Identification
  Spectrum Supportability
  Corrosion Prevention and Control
Introduces new policy on Configuration Management

8 DoD Guidance
Defense Acquisition
Systems Engineering
Developmental Test and Evaluation (DTE)
Modeling and Simulation (M&S)
Safety
System Assurance

9 DoD SE Best Practice Continuum
Policy & Guidance
Developing the Workforce & Advancing SE Practice
Program Support & Assessment
Teamwork & Collaboration

10 OSD Systems Engineering Strategy
[Chart: the OSD SE strategy and its drivers. NRC Study recommendations: workforce development. DSB DT&E Report and 231 Report recommendations: integrated DT/OT, reliability improvement, early SE, access to relevant data for evaluation and decision making. OSD policy and guidance thrusts: DT&E revitalization, reliability, Milestones A/B critical, correct SE staffing, Component development planning, pre-MS A analysis. Enhanced SE pre-MS B. Systemic analysis recommendations*: achievable acquisition strategy, enhanced gate review process, enhanced staff capabilities.]
Our strategy for systems engineering in the Department is driven by external and internal assessments. The National Research Council study of early systems engineering confirmed the need for, and potential impact of, early application of systems engineering and recommended the four items listed at the top left of this chart. These recommendations are embodied in our OSD SE strategy: enhance SE engagement prior to MS B (and MS A); ensure that we are training, educating, and providing adequate resources to meet the complex system acquisition needs of today; and conduct research and studies in areas where we know systems engineering needs improvement. In addition to this external study, we have analyzed findings from over 40 assessments of major defense acquisition programs, determined root causes, and developed several key recommendations to address these issues. These recommendations, which emerge from actual acquisition program issues, are highly synergistic with the external recommendations and our OSD strategy.
*Based on 3,700 program assessment findings from 40 Program Support Reviews

11 Workforce Development
The Defense Acquisition Community
126,033 government and military certified professionals
500,000+ defense industry personnel
SSE: Functional Leader for the SE, T&E, and PQM workforce

12 SE Human Capital Strategy
FY08 NDAA Section 852, DoD Acquisition Workforce Development Fund: more than $300M per year across DoD for SE, PQM, and T&E initiatives to recruit, train, and retain the workforce
DoD Human Capital Initiative: published annex for the SPRDE, PQM, and T&E career fields
SE "core competency" assessment effort; completion expected Summer 2009
"Program Systems Engineer" career path
Partnership with the INCOSE SE Certification Program: "CSEP-Acq" aligned with the Defense Acquisition Guidebook; equivalency granted for DAU courses SYS101 and SYS202
Expanding potential for industry "certifications": what the future could hold

13 Human Capital Initiatives (Defense Acquisition Workforce Development Fund 1)
1 Based on NDAA Section 852, Defense Acquisition Workforce Development Act

14 Notional DoD Systems Engineering Population
[Chart: notional distribution of the DoD systems engineering population, plotting workforce size against workforce age and highlighting the cohort over age 55.]

15 Notional DoD Systems Engineering Workforce Strategy
[Chart: the notional workforce strategy overlaid on the size-versus-age distribution. Recruit interns and journeymen at the younger end and Highly Qualified Experts at the senior end (over 55); train through mentoring; retain through rotational training assignments, education with industry, qualification cards, incentives, and (perhaps) hazardous duty pay for PSEs.]

16 NDIA Systems Engineering Division E&T Committee
Mission: Strengthen systems engineering capabilities through education, training, and experience across government, industry, and academia.
2009 Task: To address the issue of "self-proclaimed" systems engineers, the E&T Committee is attempting to determine the core essential set of competencies that distinguish systems engineers from domain engineers.
Contact (Government Committee Chair): Dr. Don Gelosh, CSEP-Acq, ODUSD(A&T) SSE/HC

17 SE Competency Model
[Diagram: the systems engineer composite competency map and the factors that affect the SE competency profile. Units of competence (Analytical Problem Solving; SE Technical, with elements such as Stakeholder Requirements Analysis, Requirements Analysis, Architecture Design, and Implementation; SE Technical Management, with elements such as Decision Management, Risk Management, Configuration Management, and Technical Planning; and Professional/Interpersonal) are broken into competence elements, each carried at a proficiency level. The profile varies with systems thinking (essence of systems: behavior, structure, goal, method), life-cycle view (Materiel Solution Analysis, Technology Development, EMD, Production & Deployment, Operations & Support), domain knowledge, and competency growth over time; a minimum essential SE criteria set anchors the map.]
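The flattened diagram above is easier to reason about as a data structure. Below is a minimal, hypothetical Python sketch (the names, proficiency scale, and example levels are illustrative, not the official competency model) in which units of competence hold competence elements, each carrying a required proficiency level per life-cycle phase; the final query shows how a "minimum essential" cut for one phase could be extracted.

```python
# Hypothetical representation of a composite SE competency map (illustrative only).
from dataclasses import dataclass, field

PHASES = [
    "Materiel Solution Analysis",
    "Technology Development",
    "EMD",
    "Production & Deployment",
    "Operations & Support",
]

@dataclass
class CompetenceElement:
    name: str
    # Assumed proficiency level (e.g., 1-5) required in each life-cycle phase.
    level_by_phase: dict = field(default_factory=dict)

@dataclass
class UnitOfCompetence:
    name: str
    elements: list = field(default_factory=list)

# Example: the SE Technical Management unit with a few of its elements.
tech_mgmt = UnitOfCompetence(
    name="SE Technical Management",
    elements=[
        CompetenceElement("Risk Management", {p: 3 for p in PHASES}),
        CompetenceElement("Configuration Management", {p: 2 for p in PHASES}),
        CompetenceElement("Technical Planning", {p: 4 for p in PHASES}),
    ],
)

def minimum_essential_criteria(unit, phase, threshold):
    """Return elements whose required level in a phase meets the threshold."""
    return [e.name for e in unit.elements if e.level_by_phase.get(phase, 0) >= threshold]

print(minimum_essential_criteria(tech_mgmt, "EMD", 3))
```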

18 Ongoing Development Acquisition Community (DoD and Industry)
[Diagram: research to practice. The ongoing-development acquisition community (DoD and industry) provides lessons learned and challenges; policy, guidance, and education and training flow back. The SERC, with its governance structure, collaborators, co-sponsors, government PMO, tasking activities, industry associations, and academia, delivers new and improved SE methods, processes, and tools (MPTs).]

19 Need for SE Research
State of the practice must keep up with the application needs of DoD acquisition: methods, processes, and tools to enable effective acquisition and sustainment of systems; leveraging modeling and simulation for SE
Size and complexity of modern systems drive the need for commensurate extensions of SE: to systems of systems (SoS), to complex systems, to "architecting," to net-centric sets of services; validation and verification challenges
SE "theory and practice" should be inclusive of, and establish linkages to, challenged sub-specialty areas: software engineering, reliability engineering, system safety, costing, etc.

20 Stevens-SERC Team
Stevens Institute of Technology (lead university)
University partners: Auburn University; Air Force Institute of Technology; Carnegie Mellon University; Fraunhofer Center at UMD; Massachusetts Institute of Technology; Missouri University of Science and Technology (S&T); Pennsylvania State University; Southern Methodist University; Texas A&M University; Texas Tech University; University of Alabama in Huntsville; University of California at San Diego; University of Maryland; University of Massachusetts; University of Southern California; University of Virginia; Wayne State University
The DoD Systems Engineering Research Center will be responsible for systems engineering research that supports the development, integration, testing, and sustainability of complex defense systems, enterprises, and services. Stevens has MOUs with AFIT, NPS, and DAU to develop enhanced SE courseware for competency development within DoD. Further, SERC members are located in 11 states, near many DoD facilities and all DAU campuses.

21 Initial SERC Research Tasks
"Assessing Systems Engineering Effectiveness in Major Defense Acquisition Programs (MDAPs)" (OSD)
  USC task lead, with support from Stevens, the Fraunhofer Center, MIT, and the University of Alabama in Huntsville
  Characterize SE effectiveness within the context of DoD acquisition and identify methods of measurement suitable for application to project execution organizations (e.g., defense contractors), program management organizations (e.g., project managers and program executive offices), and oversight organizations (e.g., OUSD(AT&L))
"Evaluation of Systems Engineering Methods, Processes, and Tools (MPT) on Department of Defense and Intelligence Community Programs" (NSA)
  Stevens task lead, with support from USC, the University of Alabama in Huntsville, and the Fraunhofer Center
  Examine and recommend areas for advancing current SE methods, processes, and tools (MPTs) as they are applied across the DoD acquisition life cycle, focusing on three development environments: individual weapon systems, SoS, and network-centric systems

22 Ways Your Organizations Can Become Involved…
Identify SE research challenges
Provide funding to sponsor research as a Tasking Activity
Collaborate on DoD-sponsored research: identify pilot programs to participate, provide relevant acquisition information, identify subject matter experts
Make use of research findings to improve systems engineering on acquisition programs

23 DoD SE Best Practice Continuum
Policy & Guidance
Developing the Workforce & Advancing SE Practice
Program Support & Assessment
Teamwork & Collaboration

24 Our World of Stakeholders
DoD Components and Agencies: product centers; program executive offices; acquisition programs (PMs and CSEs); practicing PMs and SEs; other DAWIA career fields
Industry: first-tier contractors (e.g., Lockheed Martin, Boeing, Northrop Grumman, Raytheon); second-tier contractors; practicing SEs; SE tool vendors
Education and training institutions: DoD organizations (DAU, AFIT, NPS); US universities with SE programs (~60); SE short course providers
Professional and industrial associations: DAU Alumni Association; INCOSE; NDIA SE Division and other association divisions; IEEE Standards Committee and Systems Council; AIAA SE Committee; TechAmerica; ISO

25 DoD Challenge: “A”cquisition Process
[Diagram: the three overlapping processes of Big "A" acquisition. Requirements (ICD, CDD, CPD; future threat and op tempo), Budgeting (planning, programming, and budgeting process; FYDP; POM process; appropriation and authorization of funds), and Acquisition (acquisition policy, guidance, and oversight; MDAP decision authority) intersect at the acquisition "sweet spot."]

26 DoD Goal: Increase the Overlap
[Diagram: the same Requirements, Budgeting, and Acquisition circles drawn with greater overlap, enlarging the acquisition "sweet spot."]

27 Systems & Software Engineering Organization Overview
Strategic Initiatives: cyber and system assurance; program protection; system of systems SE; early systems engineering; research and development; study management; UARC; SE tools
Engineering and Test Policy and Guidance: systems engineering policy and guidance; test and evaluation policy and guidance; software and system assurance policy; Systems Engineering Plan approval; Test and Evaluation Master Plan approval; Program Protection Plan approval; standards; T&E oversight list
Human Capital and Specialty Engineering: education and training (SPRDE, T&E, PQM, SW); specialty engineering (RAM, risk management, VE and RTOC, HSI, modeling and simulation, CMMI, targets); Lean Six Sigma training/certification; special projects and task management (JATs, DSTs, etc.)
Acquisition Systems Engineering and Test Support: technical support to acquisition programs (e.g., AS, TDS, CAIG, AoA, OIPTs); Program Support Reviews; DAES database analysis and support; measurements and analysis; communications and outreach; technical community leadership (SE Forum, T&E Executive Summit, etc.); outreach to Services, industry, and academia; partnerships with the international community and associations

28 Opportunities for SSE Engagement
[Diagram: SSE engagement strategy. SSE activity areas: Policy & Guidance (systems engineering, DT&E); Program Support (Program Support Reviews; OIPT, T&E and SE WIPTs; AOTR, Post-CDR Review & Assessment); Workforce Planning (competency models, certification requirements, education and training); Emerging Concepts (systems of systems, SE research); Outreach (SE Forum). Engagement targets: statutory direction (Congress, SecDef); AT&L direction; requirements developers (ICD, CDD, CPD); Service Acquisition Executives and PEOs (DAB, ITAR, DSAB, OIPT); program offices (PSR, SEP, TEMP, technical reviews); prime and second-tier contractors; and the education and collaboration infrastructure of professional/industry associations (NDIA, INCOSE, AIA, ITEA, TechAmerica, etc.), DAU, academic institutions, and the SERC (CSEP-Acq, research, industry-university SE workforce initiatives, industry-government projects).]

29 DoD Human Systems Integration (HSI)
Problem: HSI is not being consistently integrated within DoD acquisition, resulting in increased ownership costs and lower system effectiveness.
Approach:
  Assign DUSD(A&T) executive authority to define policy, as necessary, to implement changes.
  Maximize the good work that has been done (policies, plans, methods, tools, and standards).
  Engage DoD stakeholders and strengthen industry affiliations, including NDIA, TechAmerica, and INCOSE.
  Identify gaps and apply the necessary resources to resolve gaps or barriers to successful HSI implementation.
Cross-DoD and technical community collaboration

30 Collaboration with INCOSE
July 19-23, 2009
International panel: "What Defines a Systems Engineer? Comparing and Contrasting Global Perspectives on Systems Engineering Competency"
Moderator: Dr. Don Gelosh, Human Capital Strategy and Planning, OSD Directorate of Systems and Software Engineering
Panelists:
  Dr. Arthur Pyster, Distinguished Research Professor, School of Systems and Enterprises, Stevens Institute of Technology; Deputy Executive Director, DoD Systems Engineering Research Center; member of the INCOSE Board of Directors
  Dr. John Snoderly, Program Director, Systems Engineering, Defense Acquisition University
  Samantha Brown, President-Elect of INCOSE and Systems Engineering Innovation Centre (SEIC), Loughborough, UK
  Mark Kupeski, Director, Complex Systems Integration, IBM Global Business Services
  Dr. U. Dinesh Kumar, Professor in Quantitative Methods and Information Systems, Indian Institute of Management Bangalore
  Professor Stephen Cook, Director, Defence and Systems Institute, University of South Australia

31 INCOSE Multi-Level Professional Certification
Senior Level: ESEP, Expert Systems Engineering Professional
Discipline Extensions: CSEP-Acq, CSEP with US DoD Acquisition
Foundation Level: CSEP, Certified Systems Engineering Professional
Entry Level: ASEP, Associate Systems Engineering Professional

32 CSEP with US DoD Acquisition Extensions
Targeted toward systems engineers who support or work in a US Department of Defense acquisition environment
Same core CSEP experience, education, and knowledge requirements
Additional acquisition knowledge items tested
Available since July 2008
Discipline extensions: beyond the core knowledge covered in the CSEP, INCOSE certification also allows for extensions that cover specific disciplines in more detail. There is currently one CSEP discipline extension; others may be added in the future.

33 DoD Best Practices Clearinghouse (BPCh)
The DoD Acquisition Best Practices Clearinghouse (BPCh) facilitates the selection and implementation of systems engineering and software acquisition practices appropriate to the needs of individual acquisition programs. The BPCh uses an evidence-based approach, linking to existing resources that describe how to implement various best practices.
What is the DoD AT&L Best Practices Clearinghouse, and what will be unique about it? The BPCh will be a major focusing effort across DoD to capture, organize, share, and build upon the practices, and implementation experiences, that make up the Big "A" domain. The BPCh will leverage other best-practice systems in DoD, but will be unique in capturing context and associated practice evidence. This will make the knowledge more relevant to users with unique needs and circumstances: the needs and resources of a large DoD platform system differ from those of a small but critical subsystem. The BPCh, like the AKSS, will use OSD, Service, Agency, and industry best-practice sites identified by the Knowledge Providers Network as "golden sources," augmented by user inputs from the AT&L workforce and from students involved in formal training at DAU and other DoD training organizations.
Allows users to search and select practices based on specific program risks (lower costs, shorten program timeline, improve program quality), as illustrated in the sketch below
Practices are "vetted" by SMEs (DAU senior faculty and others)
Practices and evidence are open to the public
Will eventually capture practices across all AT&L domains
Will organize practices under a standard "taxonomy" developed and managed by DAU for DoD
Will leverage and integrate assets from all DoD knowledge sharing systems, thus reducing system life cycle cost
Will integrate tightly with DoD knowledge communities for maximum sharing and feedback to leadership to support continuous process improvement
A "one size fits all" approach to practice recommendations is not assumed
People often learn better from contextualized stories than from expert advice
Supports users in making data-driven decisions (an AT&L goal)
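As a rough illustration of the "search and select practices by program risk" idea above, here is a minimal Python sketch. The record layout, risk tags, and ranking by evidence count are assumptions for illustration only, not the actual BPCh schema or interface.

```python
# Hypothetical practice records: each carries risk tags, evidence items, and a vetting flag.
practices = [
    {"name": "Integrated DT/OT planning", "risks": {"schedule", "test"},
     "evidence_count": 7, "vetted": True},
    {"name": "Early reliability growth program", "risks": {"reliability", "cost"},
     "evidence_count": 4, "vetted": True},
    {"name": "Unvetted candidate practice", "risks": {"cost"},
     "evidence_count": 1, "vetted": False},
]

def select_practices(program_risks, require_vetted=True):
    """Return practices matching any of the program's risks, most evidence first."""
    hits = [p for p in practices
            if p["risks"] & program_risks and (p["vetted"] or not require_vetted)]
    return sorted(hits, key=lambda p: p["evidence_count"], reverse=True)

# A program worried about cost and reliability gets the vetted practices, best-evidenced first.
for p in select_practices({"cost", "reliability"}):
    print(p["name"], "-", p["evidence_count"], "evidence items")
```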

34 We Need You!
Browse the Clearinghouse for applicable best practices
Provide feedback on practices
Submit additional evidence for an existing practice
Submit a lead or a possible best practice
Tell others about the Clearinghouse
Consider becoming an editor to help vet practices submitted in your area of expertise

35 DoD SE Best Practice Continuum
Policy & Guidance
Developing the Workforce & Advancing SE Practice
Program Support & Assessment
Teamwork & Collaboration

36 Defense Acquisition Program Support Methodology (DAPS) v1.1
OSD's Program Support Reviews: "the source of systemic root cause analysis data."
Rigor in process ensures integrity. The review process (Plan, Conduct, Analyze, Report) is supported by a tailorable, scalable methodology (DAPS v1.1) and by tools and materials: templates, criteria questions, training materials, execution guidance, and subject matter expertise.
© 2006 by Carnegie Mellon University

37 Program Support Review Data Set (since March 2004)
PSRs/NARs completed: 57
AOTRs completed: 13
Nunn-McCurdy certifications: 13
Participation on Service-led IRTs: 3
Technical assessments: 13
Reviews planned for CY08: PSRs/NARs: 10; AOTRs: 1; Nunn-McCurdy: 1
Data derived from a diverse and broad program set

38 OSD Systemic Analysis: Data Model
Use the collective set of program findings to identify systemic issues at the root cause level, and develop recommendations that mitigate problems at their source (a toy roll-up is sketched below).
Tactical (program and portfolio management): program review findings, program effects and root causes, program-unique solutions.
Strategic (management): systemic issues, systemic root causes, and corrective actions fed back to the DoD acquisition community through policy/guidance, education and training, best practices, other processes (JCIDS, etc.), oversight (DAB/ITAB), and execution (staffing).
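To make the data model concrete, here is a toy Python sketch of the roll-up it describes: program review findings are tagged to systemic issues, systemic issues trace to root causes, and root causes drive corrective actions. All program names, issues, and mappings are illustrative, not actual PSR data.

```python
# Illustrative roll-up from findings to root causes to corrective actions.
from collections import defaultdict

findings = [
    {"program": "Program A", "text": "IMS not resourced", "issue": "Unrealistic schedule"},
    {"program": "Program B", "text": "No integrated master schedule", "issue": "Unrealistic schedule"},
    {"program": "Program C", "text": "SE staff gaps at PDR", "issue": "Inadequate staffing"},
]

issue_to_root_cause = {
    "Unrealistic schedule": "Acquisition strategy not executable",
    "Inadequate staffing": "Insufficient staff capability",
}

root_cause_to_actions = {
    "Acquisition strategy not executable": ["Policy/guidance update", "Enhanced gate reviews"],
    "Insufficient staff capability": ["Education & training", "Workforce recruiting"],
}

# Roll findings up to root causes, then list the corrective actions each drives.
by_root_cause = defaultdict(list)
for f in findings:
    by_root_cause[issue_to_root_cause[f["issue"]]].append(f["program"])

for cause, programs in by_root_cause.items():
    print(f"{cause}: seen on {programs} -> actions: {root_cause_to_actions[cause]}")
```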

39 Deriving Systemic Issues…
Start from PSR findings on 44 programs; isolate the "negative" findings.
Use "Core" and "Systemic" root cause tagging to cull the data into focused subsets.
Analyze and trend using Pareto charting and pivot tables to produce preliminary results.
Hypothesis testing: re-analyze and review.
A systemic issue is one that occurred on 50% or more of the programs.
A minimal version of this counting-and-threshold step is sketched below.
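The sketch below shows only the mechanics of the step described above: count the distinct programs on which each tagged issue appears, rank the counts Pareto-style, and keep issues seen on 50% or more of the programs. The findings, tags, and four-program set are toy data (the actual analysis drew on 44 programs).

```python
# Toy illustration of culling tagged findings into systemic issues via a 50% threshold.
from collections import Counter

TOTAL_PROGRAMS = 4  # toy program set; the real analysis spanned 44 programs

# (program, systemic-issue tag) pairs distilled from "negative" PSR findings.
tagged_findings = [
    ("P01", "Unrealistic schedule"), ("P02", "Unrealistic schedule"),
    ("P01", "Requirements instability"), ("P03", "Staffing shortfalls"),
    ("P02", "Staffing shortfalls"), ("P04", "Unrealistic schedule"),
]

# Count distinct programs per issue (repeated findings on one program count once).
unique_pairs = {(prog, issue) for prog, issue in tagged_findings}
programs_per_issue = Counter(issue for _, issue in unique_pairs)

# Pareto-style: most frequent first, then keep issues seen on >= 50% of programs.
threshold = 0.5 * TOTAL_PROGRAMS
systemic = [(issue, n) for issue, n in programs_per_issue.most_common() if n >= threshold]
print("Systemic issues (>= 50% of programs):", systemic)
```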

40 SRCA Milestones & Results
Sep 07 - Feb 08: OUSD/SSE root cause analysis yields systemic issues and preliminary recommendations
Mar 08 - Oct 08: OUSD/SSE-sponsored NDIA Task Group develops executable acquisition recommendations
Oct 08 - Dec 08: NDIA Task Group concludes its effort and proposes actions for both government and industry
Three recommendation areas:
  Acquisition Strategy & Planning (ASP): early and sufficient planning
  Enhanced Staff Capability (ESC): people with the right skills, in the right numbers
  Decision Gate Review (DGR): clear, enforceable execution criteria

41 New Opportunities for Independent Reviews
What's relevant:
  Mandatory Milestone A for all "major weapon systems"
  MS B after a system-level PDR* and a PDR Report to the MDA
  EMDD with a Post-CDR* Report and MDA Assessment
  PSRs and AOTRs in policy
[Diagram: the acquisition life cycle from Strategic Guidance, Joint Concepts, and the CBA (ICD, CDD, CPD) through the MDD, Materiel Solution Analysis, Technology Development (MS A/B), Engineering and Manufacturing Development and Demonstration, the Full Rate Production Decision Review, Production and Deployment (MS C), and O&S, with potential independent technical reviews (PSRs and AOTRs) and the OTRR* marked along the way.]
Program Support Reviews (PSRs): all ACAT ID and IAM programs. Purpose: to inform the MDA on technical planning and management processes through risk identification and mitigation recommendations, and to support OIPT program reviews and others as requested by the MDA.
Assessments of Operational Test Readiness (AOTRs): all ACAT ID and special interest programs. Purpose: to inform the MDA, DOT&E, and CAE of the risk of a system failing to meet operational suitability and effectiveness goals, and to support the CAE's determination of materiel readiness for IOT&E.
This is a good-news chart for OSD, a view not necessarily shared by the field and major programs. The new policy finally makes PSRs formal, where our first SE policy memo (Feb 2004) made only a somewhat casual reference to our assessment of programs. AOTRs were born of Congressional frustration with the large number of programs failing OT; the AOTR was designed to provide independent input to the program's CAE to inform his or her certification of OT readiness, as well as to the MDA and DOT&E.
* PDR: Preliminary Design Review; CDR: Critical Design Review; OTRR: Operational Test Readiness Review

42 DoD SE Best Practice Continuum
Policy & Guidance
Developing the Workforce & Advancing SE Practice
Program Support & Assessment
Teamwork & Collaboration

43 Always Our Focus
The Mission: Delivering Timely and Affordable Capabilities to the Warfighter
The Defense Acquisition Community: 126,033 government and military certified professionals; 500,000+ defense industry personnel
For additional information:

44 DoD Guidance (1/2) http://www.acq.osd.mil/sse/pg/guidance.html
Defense Acquisition
  Defense Acquisition Guidebook (DAG)
  Integrated Defense Acquisition, Technology, & Logistics Life Cycle Management Framework, Version 5.3.2, December 8, 2008
Systems Engineering
  DAG Chapter 4, Systems Engineering
  Systems Engineering Plan (SEP) Preparation Guide, Version 2.01, April 2008
  SEP Frequently Asked Questions (FAQs), February 8, 2008
  SE WIPT Brief: An Overview of Technical Planning and Systems Engineering Plan (SEP) Development, Version 1.0, March 20, 2008
  SE Working Integrated Product Team (WIPT) Generic Charter Template
  Systems Engineering Guide for Systems of Systems, Version 1.0, August 2008
  Systems Engineering Assessment Methodology, Defense Acquisition Program Support (DAPS), Version 2.0, October 21, 2008
  Guide for Integrating Systems Engineering into DoD Acquisition Contracts, Version 1.0, December 11, 2006
  Risk Management Guide for DoD Acquisition, 6th Edition, Version 1, August 2006
  Integrated Master Plan / Integrated Master Schedule Preparation and Use Guide, Version 0.9, October 21, 2005
  Program Manager's Guide: A Modular Open Systems Approach (MOSA) to Acquisition, Version 2.0, September 2004
  Modular Open Systems Approach (MOSA) Program Assessment and Rating Tool (PART), Version 1.02
  DoD Guide for Achieving Reliability, Availability and Maintainability, August 1, 2005
  Program Reliability and Maintainability Review Template, Version 1.0, August 15, 2008
  Designing and Assessing Supportability in DOD Weapon Systems: A Guide to Increased Reliability and Reduced Logistics Footprint, October 24, 2003
  Technical Review Checklists

45 DoD Guidance (2/2) http://www.acq.osd.mil/sse/pg/guidance.html
Developmental Test and Evaluation (DTE)
  DAG Chapter 9, Integrated Test and Evaluation
  Guide on Incorporating Test and Evaluation into DoD Acquisition Contracts, final draft for final approval version, September 2008
Modeling and Simulation (M&S)
  DoD M, DoD Modeling and Simulation (M&S) Glossary, January 15, 1998
  Recommended Practices Guide for Verification, Validation, and Accreditation (VV&A), Build 3.0, September 2006
  Guide for Modeling and Simulation for the Acquisition Workforce, Version 1.01, October 2008
Safety
  System Safety: ESOH Management Evaluation Criteria for DoD Acquisition, Version 1.1, January 2007
  Joint Systems Safety Review Guide for USSOCOM Programs, Version 1.1, October 12, 2007
  Unmanned System Safety Guide for DoD Acquisition, Version 0.92, June 27, 2007
  Joint Weapons and Laser Safety Review Guide (Draft), Version 0.92, September 4, 2007
  ESOH in Acquisition: Integrating ESOH into SE, Version 3.0, January 2008
System Assurance
  DAG Chapter 7, Acquiring Information Technology and National Security Systems
  DAG Chapter 8, Intelligence, Counterintelligence, and Security Support
  DoD M, Acquisition Systems Protection Program, March 16, 1994
  Engineering for System Assurance, Version 1.0, October 2008

46 DoD Best Practices Clearinghouse (BPCh) Program Goals
Useful Information: help finding, selecting, and implementing practices appropriate to the user's situation; fills the gap between "what" and "how"
Active Knowledge Base: not just another practice list; experience data is updated, expanded, and refined, encouraging organic growth
A Single Source: for answers about practices, how to apply them, when they are good to use, lessons learned, and risks to avoid
Living Knowledge Repository: integration with DoD communities of practice (ACC) and experts; validated practices; consistent, verifiable information

