Integrity - Service - Excellence
Business & Enterprise Systems
AF Systems Engineering Assessment Model (AF SEAM) Validation Assessment


AF Systems Engineering Assessment Model (AF SEAM) Validation Assessment Out-Brief
Program: (INSERT NAME)
Current: 14 January 2015

Overview
- AF SEAM Overview and History
- AF SEAM Goals
- Policy
- AF SEAM Practices and Composition
- Validation Assessment Overview
- Validation Assessment Scoring and Summaries
- Validation Assessment Results

What is AF SEAM?
Overview
- Single AF-wide process improvement tool used for the assessment and improvement of systems engineering processes in a project or across an organization
- Composite of industry and DoD best practices
- Promotes consistent understanding and application of SE
- Facilitates a gap analysis of an organization's SE processes
History
- Baseline released (August 2008) – Version 1.0
- Became policy with AFMCI (October 2009)
- Update released (September 2010) – Version 2.0

Why We Need AF SEAM
- Lack of disciplined systems engineering application has been a major contributor to poor program performance
- Many problems have surfaced repeatedly in AF programs:
  - Poor requirements development and management
  - Poor planning fundamentals
  - Lack of integrated risk and issue management
  - Lack of rigorous process application
  - Failure to deliver mission capabilities

AF SEAM Goals
- Ensure a consistent understanding of systems engineering
- Ensure core SE processes are in place and being practiced at the program/project level
- Document repeatable SE "best practices" across the AF
- Identify opportunities for continuous improvement
- Clarify roles and responsibilities
- Improve program performance and reduce risk
AF SEAM is NOT an appraisal of product quality.
AF SEAM is NOT a report card on personnel or the organization.
The Validation Assessment has the further goal of providing an independent "audit" of process and practice usage, with the additional intent of continuous process improvement.

AFMC Policy
AFMCI, Implementing OSS&E and Life Cycle Systems Engineering, Change 2 (11 February 2011):
- "Programs listed in the Air Force Systems Information Library (AFSIL) shall use AF SEAM as a self assessment tool to evaluate the organization's capability to perform SE processes. AF SEAM assessments shall be conducted annually." (Para 1.6)
- "Organizations are encouraged to assess their programs managed under common processes within a single assessment. The assessment of common programs shall be at the organizational Division level or lower." (Para 1.6)

BES Directorate Policy
AF PEO BES Policy:
- ALL programs required to build a Business Process Directory (BPD) Tailoring Worksheet (TWS) shall complete the AF SEAM Self-Assessment annually
- FoS/SoS: if consolidated under a single TWS, perform a single AF SEAM Self-Assessment
- Includes all ACAT and sustainment programs
Validation Assessments:
- All ACAT I/II/III programs will be subject to validation assessments
- Other programs will be selected for validation assessments by the Director of Engineering (DoE)

AF SEAM Pedigree
AF SEAM Foundation:
- Capability Maturity Model Integration (CMMI®)
- Defense Acquisition Guidebook (DAG)
- AFI – Life Cycle Systems Engineering
- ANSI/EIA 632 – Processes for Engineering a System
- IEEE/EIA 731 – Systems Engineering Capability Model
- ISO/IEEE – Systems Engineering-System Life Cycle Processes
- INCOSE – Systems Engineering Standards
- IEEE 1220 – Application and Management of the Systems Engineering Process

AF SEAM Practices
Specific Practices:
- Unique to each process area
- Informative material: Description, Typical Work Products, Other Considerations, References, Local References
Generic Practices (GP1 – GP7):
- Same questions apply to all process areas
- Informative material: Description, Typical Work Products
- Facilitate successful achievement of specific practices and process area goals

AF SEAM Practice Composition

Process Area                               Goals  Specific   Generic   Total
                                                  Practices  Practices Practices
Configuration Management (CM)                3        8          7       15
Decision Analysis (DA)                       1        5          7       12
Design (D)                                   3       14          7       21
Manufacturing (M)                            4       12          7       19
Project Planning (PP)                        3       15          7       22
Requirements (R)                             4       13          7       20
Risk Management (RM)                         3        7          7       14
Transition, Fielding & Sustainment (TFS)     4       15          7       22
Tech Mgmt & Control (TMC)                    4       15          7       22
V & V (V)                                    4       10          7       17
IA SE Integration (IA)                       4        9          7       16
Totals:                                     37      123         77      200

(Per-area counts that were lost in extraction are restored from the process-area slides later in this briefing.)
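The totals row of the table can be cross-checked from the per-area counts. A quick sketch (the area abbreviations match the table; counts are taken from the per-area slides in this briefing):

```python
# Each process area maps to (goals, specific practices, generic practices).
areas = {
    "CM":  (3, 8, 7),  "DA":  (1, 5, 7),  "D":   (3, 14, 7),
    "M":   (4, 12, 7), "PP":  (3, 15, 7), "R":   (4, 13, 7),
    "RM":  (3, 7, 7),  "TFS": (4, 15, 7), "TMC": (4, 15, 7),
    "V":   (4, 10, 7), "IA":  (4, 9, 7),
}

# Sum each column, then add specific + generic for the grand total.
goals, specific, generic = (sum(col) for col in zip(*areas.values()))
print(goals, specific, generic, specific + generic)  # 37 123 77 200
```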

Validation Assessment Overview
An AF SEAM Validation Assessment is an independent assessment of a project's/program's self-assessed implementation of SE practices, processes, and procedures.
Who's involved:
- The Validation Assessment Team – led by the BPD CCB; includes matrixed process area SMEs independent of the program office
- Project/Program Office Team
- Prime Contractor Team (as appropriate)
The Validation Assessment is an opportunity to:
- Ensure the existence of disciplined systems engineering processes; validate the project/program office's demonstrated ability to execute processes
- Identify strengths / best practices exercised by programs/projects
- Identify opportunities for program/project or process improvement

Scoring Methodology
Same methodology as the Self-Assessment: compare project/program processes (how you do things) to the practice (a process standard, or what should be done) and answer:
- (1) = YES – the process completely satisfies the practice
- (0) = NO – the process does not satisfy, or only partially satisfies, the practice
- N/A – the practice does not apply, whether because of the uniqueness of the program, timing, or other circumstances
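The roll-up from individual answers to a process-area rating is not spelled out on this slide. A minimal sketch of one plausible implementation, assuming (as the later per-area slides' percentage ratings suggest) that a rating is the share of YES answers among applicable practices, with N/A excluded from the denominator:

```python
def area_rating(answers):
    """Rate a process area as the percentage of applicable practices
    answered YES. `answers` holds 1 (YES), 0 (NO), or None (N/A);
    N/A practices are excluded from the denominator entirely."""
    applicable = [a for a in answers if a is not None]
    if not applicable:  # every practice was N/A
        return None
    return 100.0 * sum(applicable) / len(applicable)

# Example: 13 YES, 1 NO, 1 N/A across 15 practices
print(area_rating([1] * 13 + [0] + [None]))  # about 92.9 (13 of 14 applicable)
```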

Documenting Findings
- The Validation Team made honest assessments of each practice
- All findings were discussed before final entries were recorded
- Practices were scored YES only if the project/program fully complied with the practice
- If a practice was partially met and there is an opportunity to improve, it was scored NO and explained
- Any practice scored NO should be translated into a program risk
- All findings were adjudicated and agreed upon before preparing this briefing and the final assessment report

Specific Practices Summary
INSERT VALIDATION ASSESSMENT SPECIFIC PRACTICES SUMMARY TABLE FROM THE AFSAT TOOL

Generic Practices Summary
INSERT VALIDATION ASSESSMENT GENERIC PRACTICES SUMMARY TABLE FROM THE AFSAT TOOL

Combined Summary
INSERT VALIDATION ASSESSMENT COMBINED SUMMARY TABLE FROM THE AFSAT TOOL

Specific Practice Results
Configuration Management
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of Configuration Management is to establish and maintain the integrity of the product's technical baseline while accommodating change and providing a clear, concise, and valid description of the product to concerned parties. There were 3 Process Goals assessed, broken down into 8 Specific Practices and 7 Generic Practices; a total of 15 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Decision Analysis
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of Decision Analysis is to analyze possible decisions using a formal process that evaluates identified alternatives against established criteria. There was 1 Process Goal assessed, broken down into 5 Specific Practices and 7 Generic Practices; a total of 12 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Design
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of Design is to conceive and proof an integrated solution that satisfies product requirements. There were 3 Process Goals assessed, broken down into 14 Specific Practices and 7 Generic Practices; a total of 21 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Information Assurance & Systems Engineering Integration
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of the Information Assurance (IA) process area is to ensure that acquisition program offices include IA requirements as part of the mainstream DAS requirements process and follow standard Systems Engineering (SE) practices to ensure compliance with DoD 8500 series directives. There were 4 Process Goals assessed, broken down into 9 Specific Practices and 7 Generic Practices; a total of 16 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Manufacturing
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of the Manufacturing process is to prepare for and produce the required product. There were 4 Process Goals assessed, broken down into 12 Specific Practices and 7 Generic Practices; a total of 19 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Project Planning
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of Project Planning is to establish and maintain plans that define project activities. There were 3 Process Goals assessed, broken down into 15 Specific Practices and 7 Generic Practices; a total of 22 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Requirements
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of the Requirements process area is to develop and analyze operational user, product, and product-component requirements; to assure consistency between those requirements and the project's technical plans and work products; and to manage requirements evolution through the life cycle of the product. There were 4 Process Goals assessed, broken down into 13 Specific Practices and 7 Generic Practices; a total of 20 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Risk Management
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of Risk Management is to identify potential problems before they occur, so that risk-handling activities may be planned and invoked as needed across the life of the product or project to mitigate adverse impacts on achieving objectives. There were 3 Process Goals assessed, broken down into 7 Specific Practices and 7 Generic Practices; a total of 14 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Transition, Fielding, and Sustainment
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of the Transition, Fielding & Sustainment process is to prepare for and execute the support, maintenance, repair, and disposal of a product while ensuring it is safe, suitable, and effective while fielded and operated. Sustainment is the planning, programming, and executing of a support strategy; it includes specific activities in all phases of a product lifecycle, from product concept formulation to demilitarization and disposal. There were 4 Process Goals assessed, broken down into 15 Specific Practices and 7 Generic Practices; a total of 22 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Technical Management and Control
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of Technical Management and Control is to provide an understanding of the program's technical progress so that appropriate corrective actions can be taken when the program's performance deviates significantly from the plan. There were 4 Process Goals assessed, broken down into 15 Specific Practices and 7 Generic Practices; a total of 22 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Specific Practice Results (Cont'd)
Verification and Validation
Validation Assessment Rating: (Enter the rating from AFSAT) 100%
Self-Assessment Rating: (Enter the rating from AFSAT) 100%
Description: The purpose of Verification is to ensure that work products meet their specified requirements. The purpose of Validation is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment. There were 4 Process Goals assessed, broken down into 10 Specific Practices and 7 Generic Practices; a total of 17 practices were addressed within this process area.
Assessor Comments: Enter assessor comments for this Process Area.
Strengths: Enter strengths or "None identified".
Improvement Opportunities: Enter improvement opportunities or "None identified".
Recommendations: Enter recommendations or "None identified".

Consolidated Strengths and Improvement Opportunities
INSERT VALIDATION ASSESSMENT CONSOLIDATED STRENGTHS AND IMPROVEMENT OPPORTUNITIES TABLE FROM THE AFSAT TOOL

Validation Assessment Reporting
- Program results briefed to Division Leadership and the Director of Engineering (DoE) only
- Division Director and DoE should determine whether the results require briefing to BES Directorate Leadership
- Final Validation Assessment Report will be prepared and distributed to the Program Office and Division leadership
- BPD CCB will use Program Validation Assessment raw data to compile BES Directorate statistics (overall organizational health):
  - Identify systemic and organizational strengths / improvement opportunities
  - Analyze and implement organizational process changes as necessary
  - Brief BES Directorate Leadership on organizational results/trends

Summary
EXECUTIVE SUMMARY: The overall validation assessment of (Program Name) shows that the program team is/is not following well-structured systems engineering processes throughout the lifecycle of the program. The overall rating average was NN%, indicating the (Program Name) program has a (high, moderate, or low) degree of implementation of the standard SE processes addressed in AF SEAM and the BES Process Directory (BPD).
Closing Remarks?