
1 Program Success Metrics
Al Moseley, DSMC - School of Program Managers
How will you measure your program's success?
PMSC, 8 Dec 2009

2 Backdrop
ASA(ALT) tasking [Claude Bolton (March 2002)]:
– There are still too many surprises using traditional metrics: poorly performing/failing programs being briefed "real time" to Army Senior Leadership
DAU (with Industry representatives) was asked to:
– Identify a Comprehensive Method to Better Determine the Probability of Program Success
– Recommend a Concise "Program Success" Briefing Format for Use by Army Leadership
Objective: provide a tool that would:
– Allow Program Managers to More Effectively Run Their Programs
– Allow Army Leadership to Manage the Major Program Portfolio by Exception

3 PSM Tenets
What defines success? Program Success is a holistic combination of:
– Internal Factors: Requirements, Resources, Execution
– Selected External Factors: Fit in the Capability Vision, and Advocacy
– These "Level 1 Factors" apply to all programs, across all phases of the acquisition life cycle
Program Success Probability is determined by:
– Evaluation of the program against selected "Level 2 Metrics" for each Level 1 Factor
– "Roll up" of subordinate Level 2 Metrics to determine each Level 1 Factor's contribution
– "Roll up" of the Level 1 Factors to determine the program's overall success probability
[Diagram: the traditional factors (Cost, Schedule, Performance) are rolled into the Internal Factors (Requirements, Resources, Execution); together with the External Factors (Fit in Capability Vision, Advocacy) they form the Level 1 Factors that combine into Program Success.]
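The briefing does not spell out the roll-up arithmetic; a weighted-average scheme like the one implied by the Air Force PoPS weights on slide 30 is a plausible reading. A minimal sketch, assuming that scheme; every metric score and the roll-up rule itself are illustrative, not from the source:

```python
# Hypothetical two-stage roll-up: Level 2 metric scores (0.0-1.0) combine
# into Level 1 factor scores, which combine into a 0-100 success score.
# Weights below mirror the Post-Milestone B column on slide 30.

LEVEL1_WEIGHTS = {
    "Requirements": 20,
    "Resources": 20,
    "Execution": 20,
    "Fit in Vision": 15,
    "Advocacy": 25,
}

def roll_up(level2_scores: dict[str, float]) -> float:
    """Average the Level 2 metric scores for one Level 1 factor."""
    return sum(level2_scores.values()) / len(level2_scores)

def program_success(factors: dict[str, dict[str, float]]) -> float:
    """Weighted roll-up of the Level 1 factors into an overall score."""
    return sum(LEVEL1_WEIGHTS[name] * roll_up(metrics)
               for name, metrics in factors.items())

score = program_success({
    "Requirements": {"Parameter Status": 0.8, "Scope Evolution": 0.7},
    "Resources": {"Budget": 0.9, "Manning": 0.9, "Contractor Health": 0.6},
    "Execution": {"EVM": 0.7, "Risk": 0.6, "Testing": 0.8, "Maturity": 0.7},
    "Fit in Vision": {"DoD Vision": 0.9, "Army Vision": 0.7},
    "Advocacy": {"OSD": 0.7, "Congressional": 0.6, "Warfighter": 0.8},
})
print(f"Program success score: {score:.1f} / 100")  # 74.5
```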

4 PSM Status
Agency / Status / Comments:
– Army: Web-enabled application across Army ACAT I/II programs (Apr 05); primary Army program metric/process. Implementation complete Apr 05.
– Air Force: PoPS (Probability of Program Success) piloted at AF acquisition centers (Mar-Apr 06); selected by the AF Acquisition Transformation Action Council (ATAC) as the metric to manage all USAF programs (28 Apr 06). Implementation complete Mar 07.
– Navy/USMC: PoPS piloted programs; Navy PoPS Handbook, Guidebook & Spreadsheets for various Gates. Implementation complete Sep 08.
– OSD (USD(AT&L)): Establish common program health measures; small working group established to determine feasibility of migrating toward a common PoPS configuration among all three components. PoPS Initiative memo, 18 Nov 09.
– DHS (Dept of Homeland Security): Segments of DHS implemented PSM as the primary program reporting metric. Implementation complete Feb 07.

5 PSM Status (Cont'd)
Published guides:
– Army PoPS Operations Guide, 2005
– U.S. Air Force PoPS Spreadsheet Operations Guide, July 2007
– Navy PoPS Handbook, Guidebook & Spreadsheets, September 2008
Program Success Metrics information: DAU Acquisition Community of Practice
"…PoPS. This was a process to assess, in a very disciplined fashion, the current state of a program's health and to forecast the probability of success of the program as it moves through the acquisition process." -- Col William Taylor, USMC, PEO Land Systems

6 [Graphic-only slide; no transcript text.]

7 Key Attributes of PSM
– Conveys program assessment process results concisely and effectively
– Uses a summary display organized like a Work Breakdown Structure
[Diagram: Program Success at Level 0, Factors at Level 1, Metrics at Level 2.]
– Relies on information keyed with colors and symbols, which is easier to absorb
– Minimizes slides, making more efficient use of acquisition leaders' time

8 PROGRAM SUCCESS PROBABILITY SUMMARY
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy; Program Life Cycle Phase: ___________
Program Success (2)
Internal Factors/Metrics:
– Program Requirements (3): Program Parameter Status (3); Program Scope Evolution
– Program Resources: Budget; Manning; Contractor Health (2)
– Program Execution: Contract Earned Value Metrics (3); Contractor Performance (2); Fixed Price Performance (3); Program Risk Assessment (5); Sustainability Risk Assessment (3); Testing Status (2); Technical Maturity (3)
External Factors/Metrics:
– Program "Fit" in Capability Vision (2): DoD Vision (2) [Transformation (2), Interoperability (3), Joint (3)]; Army Vision (4) [Current Force (4), Future Force]
– Program Advocacy: OSD (2); Joint Staff (2); War Fighter (4); Army Secretariat; Congressional; Industry (3); International (3)
Legends:
– Colors: G: On Track, No/Minor Issues; Y: On Track, Significant Issues; R: Off Track, Major Issues; Gray: Not Rated/Not Applicable
– Trends: Up Arrow: Situation Improving; (number): Situation Stable (for # Reporting Periods); Down Arrow: Situation Deteriorating

9 PROGRAM SUCCESS PROBABILITY SUMMARY: Program Parameter Status (detail)
[Same summary chart as slide 8, annotated to explain the Program Parameter Status metric.]
– What does this metric do? Evaluates program status in meeting performance levels mandated by warfighters.
– What does the metric contain? Usually contains all KPPs, and can include non-KPPs if the PM believes it is important to include them.
– How often is this metric updated? Quarterly.
– What denotes a Green, Yellow, or Red?
GREEN (8 to 10 points): Performance requirements are clearly understood, are well managed by the warfighter, and are being well realized by the PM. KPP/selected non-KPP threshold values are met by the latest testing results (or latest analysis if testing has not occurred).
YELLOW (6 to <8 points): Requirements are understood but are in flux (emergent changes from the warfighter); warfighter management and/or PM execution of requirements has created some impact to the original requirements set (de-scope, or modification of original Objective/Threshold values, has occurred or is occurring). One or more KPPs/selected non-KPPs are below threshold values in pre-Operational Assessment testing (or analysis if OA testing has not occurred).
RED (<6 points): "Killer Blow", or requirements flux/"creep" has resulted in significant real-time changes to the program plan requiring program rebaselining/restructure. One or more KPPs/selected non-KPPs are below threshold values as evaluated during OA/OPEVAL testing.
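The point bands above map to colors mechanically. A minimal sketch; the band edges come from the slide, while the function name and killer-blow flag are hypothetical:

```python
def parameter_status_color(points: float, killer_blow: bool = False) -> str:
    """Map a Program Parameter Status score (0-10) to a stoplight color.

    Bands from the briefing: GREEN 8-10, YELLOW 6 to <8, RED <6.
    A "killer blow" forces RED regardless of the numeric score.
    """
    if killer_blow:
        return "RED"
    if points >= 8:
        return "GREEN"
    if points >= 6:
        return "YELLOW"
    return "RED"

assert parameter_status_color(9.0) == "GREEN"
assert parameter_status_color(7.5) == "YELLOW"
assert parameter_status_color(8.5, killer_blow=True) == "RED"
```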

10 REQUIREMENTS - PROGRAM PARAMETER STATUS
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y (3); Predictive: Y
[Chart: for each parameter, a diamond is positioned along a bar to show where the item stands within its threshold-to-objective range; an open diamond shows status as of the last brief (e.g. 12/06). Example parameters: Combat Capability; C4I Interoperability (Strategic, Theater, Force Coordination, Force Control, Fire Control); Endurance; Sustained Speed; Cost; Manning (Non-KPP).]
Comments:

11 REQUIREMENTS - PROGRAM SCOPE EVOLUTION
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y; Predictive: Y
Requirement / Funded Pgm (Budgeted/Obligated) / Schedule (Used/Planned):
– Original: CDD/CPD (date); $#.#B / NA; NA / 120 months
– Current: CDD/CPD (date); $#.#B / $#.#B; 170 / 210 months
Requirement trend categories: Stable, Increased, Descoped
Comments:

12 RESOURCES - BUDGET
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: G; Predictive: G
[Table: for each appropriation (RDT&E,A; OPA; APA; WPA; O&M,A; MILCON), sufficiency (R/Y/G) and obligation/expenditure percentages (xx%/yy%) are shown by fiscal year, FY04 through FY12; N/A where an appropriation has no funding in that year.]
Army Goals (Obligated/Expended):
– RDT&E,A: first year 95%/58%; second year 100%/91%
– OP,A: first year 70%/---; second year 85%/---; third year 100%/---
– OM,A: (goals as applicable)
Comments:
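A small sketch of the kind of check this chart supports: comparing actual obligation/expenditure rates against the Army goals above. The RDT&E,A goals are from the slide; the function, the R/Y/G thresholds, and the sample rates are hypothetical:

```python
# Army obligation/expenditure goals by year of appropriation (from the
# slide; RDT&E,A shown). Values are (obligation %, expenditure %) goals.
RDTE_GOALS = {1: (95, 58), 2: (100, 91)}

def budget_flag(year: int, obligated_pct: float, expended_pct: float) -> str:
    """Hypothetical R/Y/G call: G if both rates meet the goal, Y if the
    worst shortfall is within 10 points of goal, R otherwise."""
    obl_goal, exp_goal = RDTE_GOALS[year]
    worst_gap = max(obl_goal - obligated_pct, exp_goal - expended_pct)
    if worst_gap <= 0:
        return "G"
    if worst_gap <= 10:
        return "Y"
    return "R"

print(budget_flag(1, 96, 60))  # G: ahead of both first-year goals
print(budget_flag(2, 93, 85))  # Y: slightly behind second-year goals
```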

13 RESOURCES - MANNING
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: G; Predictive: G
Provides status for several key aspects of program office manning:
– Program Office Billets, Fill Status: covers Civil Service (organic and matrixed), Military, SE/TA, and Laboratory "detailees" performing program office functions; identification of vacant billets and the status of filling them; identification of key specialty/DAWIA certification deficiencies and plans to resolve them
– Program Leadership Cadre Stability: tenure status for the PM / DPM / PM direct reports, looked at individually and as a cadre; are Critical Acquisition Personnel (e.g. the PM) observing mandated tenure requirements (4 years or a successful Milestone Decision)?
Bottom line: is the program office properly resourced to execute its assigned scope of responsibility?

14 PROGRAM SUCCESS PROBABILITY SUMMARY: Manning (detail)
[Same summary chart as slide 8, annotated to explain the Manning metric.]
– What does this metric do? Evaluates the ability of the PM to execute his or her responsibilities.
– What denotes a Green, Yellow, or Red?
GREEN (2 to 3 points): 90% or more of all Program Office authorized/funded billets are filled; 90% or more of all DAWIA-qualified billets are filled with personnel possessing at least the required qualification level; SETA funding levels are below Congressionally mandated limits.
YELLOW (1 to <2 points): 80% to 89% of all Program Office authorized/funded billets are filled; 80% to 89% of all DAWIA-qualified billets are filled with personnel possessing at least the required qualification level; SETA funding levels are at or below Congressionally mandated limits.
RED (<1 point): Less than 80% of all Program Office authorized/funded billets are filled; less than 80% of all DAWIA-qualified billets are filled with personnel possessing at least the required qualification level; SETA funding levels are above Congressionally mandated limits.

15 RESOURCES – CONTRACTOR HEALTH
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y (2); Predictive: Y
Corporate Indicators:
– Company/Group Metrics: Current Stock P/E Ratio; Last Stock Dividends Declared/Passed
– Industrial Base Status (Only Player? One of __ Viable Competitors?): Market Share in Program Area, and Trend (over the last Five Years)
– Significant Events (Mergers/Acquisitions/"Distractors")
Program Indicators:
– Program-Specific Metrics: "Program Fit" in Company/Group; Key Players, Phone Numbers, and their Experience; Program Manning/Issues; Contractor Facilities/Issues; Key Skills; Certification Status (e.g. ISO 9000/CMM Level)
– PM Evaluation of Contractor Commitment to Program: High, Med, or Low

16 EXECUTION – CONTRACT EARNED VALUE METRICS
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y (3); Predictive: Y
Contract: [give short contract title]; Axxxxx-YY-Cxxxx; Contractor Name [Prime or Significant Sub]; YYMMDD
Rebaselining: Date of Last Rebaselining: JAN02; Number of Rebaselinings: 1; Date of Next Rebaselining: MMM YY
Estimates at Completion: KTR's EAC: $104M; PM's EAC plotted on chart
Award Fee: Date of Last Award Fee: MMM YY; Date of Next Award Fee: MMM YY
[Chart 1: cumulative contract spend plot ($M vs. % of total calendar schedule, 04/00 through 08/04) showing TAB, BAC, ACWP, EAC, and EV curves, with annotations CV = $2.0M, SV = $2.9M, EV 42%, 50% spent, TCPI(EAC) = 0.76, and the PM's projected performance at completion for CPI and duration.]
[Chart 2: CPI-vs-SPI quadrant plot with quarterly points from 04/99 through 05/02 (SPI 1.18); quadrants labeled Ahead of Schedule and Underspent, Behind Schedule and Underspent, Ahead of Schedule and Overspent, Behind Schedule and Overspent.]
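The chart annotations (CV, SV, CPI, SPI, TCPI, EAC) are the standard earned value quantities. A minimal sketch of the textbook formulas; the input numbers are illustrative, not taken from the contract above:

```python
# Standard earned value formulas with illustrative inputs (in $M).
PV  = 42.0   # planned value (BCWS): budgeted cost of work scheduled
EV  = 45.0   # earned value (BCWP): budgeted cost of work performed
AC  = 40.0   # actual cost (ACWP): actual cost of work performed
BAC = 90.0   # budget at completion

CV  = EV - AC      # cost variance: positive means underspent
SV  = EV - PV      # schedule variance: positive means ahead of schedule
CPI = EV / AC      # cost performance index
SPI = EV / PV      # schedule performance index
EAC = BAC / CPI    # one common estimate at completion (assumes CPI holds)
# Efficiency required on remaining work to achieve the EAC; with the
# EAC = BAC/CPI method it equals CPI by construction.
TCPI_EAC = (BAC - EV) / (EAC - AC)

print(f"CV={CV:+.1f}  SV={SV:+.1f}  CPI={CPI:.2f}  SPI={SPI:.2f}")
print(f"EAC={EAC:.1f}  TCPI(EAC)={TCPI_EAC:.2f}")
```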

17 EXECUTION – CONTRACTOR PERFORMANCE
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y (2); Predictive: Y
[Chart-only slide; no transcript text.]

18 EXECUTION – FIXED PRICE PERFORMANCE
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: G (3); Predictive: G
– DCMA Plant Rep Evaluation: Major Issues
– Delivery Profile Graphic (Plan vs Actual): Major Issues
– Progress Payment Status: Major Issues
– Other Metrics are Available; Example: Status/Explanation for Production Backlog

19 EXECUTION - PROGRAM RISK ASSESSMENT
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y (5); Predictive: Y
[Chart: risk matrix plotting each issue by Likelihood vs Consequence (Low / Medium / High), with a trend annotation per issue, e.g. issue 5 stable for 3 reporting periods.]
For each plotted issue (1 through 6): a brief description of the issue and the rationale for its rating, plus the approach to remedy/mitigation.
Trends: Up Arrow: Situation Improving; (#): Situation Stable (for # Reporting Periods); Down Arrow: Situation Deteriorating
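A minimal sketch of the likelihood-consequence lookup such a matrix encodes; the 5x5 grid and its Low/Medium/High banding follow a common risk-cube convention and are not taken from the briefing:

```python
# Hypothetical 5x5 risk matrix: rows = likelihood (1-5, low to high),
# columns = consequence (1-5). Banding is a common convention only.
RISK_MATRIX = [
    "LLLMM",   # likelihood 1
    "LLMMH",   # likelihood 2
    "LMMHH",   # likelihood 3
    "MMHHH",   # likelihood 4
    "MHHHH",   # likelihood 5
]

def risk_level(likelihood: int, consequence: int) -> str:
    """Return 'L', 'M', or 'H' for a 1-5 likelihood/consequence pair."""
    return RISK_MATRIX[likelihood - 1][consequence - 1]

print(risk_level(2, 3))  # 'M': moderate likelihood, moderate consequence
print(risk_level(5, 5))  # 'H': near-certain and severe
```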

20 EXECUTION – SUSTAINABILITY RISK ASSESSMENT
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y (3); Predictive: Y
Sustainability Areas (examples): 1: Training; 2: Support Equipment; 3: Publications; 4: Facilities; 5: Maintenance Concept; 6: Supply Support; 7: MTBF/Ao/Reliability
[Chart: risk matrix plotting each sustainability area by Likelihood vs Consequence (Low Risk / Medium Risk / High Risk).]
For each elevated risk (e.g. #4, #5, #6): a brief description of the issue and the rationale for its rating, plus the approach to remedy/mitigation.

21 EXECUTION – TESTING STATUS
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: G (2); Predictive: G
– Contractor Testing (e.g. Qualification, Integration): Status (R/Y/G); Major Points/Issues
– Developmental Testing: Status (R/Y/G); Major Points/Issues
– Operational Testing: Status (R/Y/G); Major Points/Issues
– Follow-On Operational Testing: Status (R/Y/G); Major Points/Issues
– Special Testing: Status (R/Y/G) (could include LFT&E, Interoperability Testing (JITC), etc.); Major Points/Issues
– TEMP Status
– Other (DOT&E Annual Report to Congress, etc., as necessary)

22 EXECUTION – TECHNICAL MATURITY
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y (3); Predictive: Y
[Chart-only slide: technical maturity plotted against program milestones (Program Initiation, CDR, Milestone C).]

23 PROGRAM "FIT" IN CAPABILITY VISION
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y (2); Predictive: Y
Area (Examples) / Status / Trend:
– DoD Vision: G (2)
  – Transformation: G (2)
  – Interoperability: Y (3)
  – Joint: G (3)
– Service/Agency Vision: Y (4)
  – Current Force: Y (4)
  – Future Force: (N/A) (N/A)
– Other: (N/A) (N/A)
– Overall: Y (2)

24 PROGRAM ADVOCACY
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
Current: Y; Predictive: Y
Area (Examples) / Status / Trend:
– OSD: Y (2); (Major point)
– Joint Staff: Y (2); (Major point)
– Warfighter: Y (4); (Major point)
– Service Secretariat: G; (Major point)
– Congressional: Y; (Major point)
– Industry: G (3); (Major point)
– International: G (3); (Major point)
– Overall: Y

25 EXECUTIVE SUMMARY
Program Acronym, ACAT XX; PEO XXX; COL, PM; Date of Review: dd mmm yy
– Program Success (2)
– Program Requirements (3)
– Program Resources
– Program Execution
– Program Fit in Capability Vision (2)
– Program Advocacy
Comments/Recap: the PM's "closer slide"; includes PEO and Service Staff review comments

26 "Killer Blow" Concept
An action taken by a decision maker in the chain of command (or an "Advocacy" player) that renders the program non-executable until remedied. It results in immediate "Red" coloration of the overall Program Success metric until remedied.
[Diagram: the Level 0/1/2 hierarchy (Program Success / Factors / Metrics), with the example "Congress zeroes out program" striking the Congress metric under the Advocacy factor.]

27 "Killer Blow" Concept (Cont'd)
When a non-executable situation exists, the affected Level 2 factor score is zero (0) and a "Killer Blow" is recorded. Color that metric Red, the factor above it Red, and the Program Success block Red.
[Diagram: example under Requirements, with the Program Parameter metric scored 0 because a KPP cannot be met and program restructure/rebaselining is required; the Red coloration propagates from Program Parameter (Level 2) through Requirements (Level 1) to Program Success (Level 0).]
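Extending the roll-up sketch from slide 3, the override is simple to express: a zero-scored Level 2 metric forces its factor, and the Program Success block, to Red regardless of the weighted score. The function names and the Green/Yellow bands here are hypothetical:

```python
def factor_color(level2_scores: dict[str, float]) -> str:
    """Color one Level 1 factor from its Level 2 scores (0.0-1.0 each).
    A zero score is a killer blow and forces Red; the Green/Yellow
    bands below are illustrative, not from the briefing."""
    if any(score == 0 for score in level2_scores.values()):
        return "RED"  # killer blow: non-executable until remedied
    avg = sum(level2_scores.values()) / len(level2_scores)
    return "GREEN" if avg >= 0.8 else "YELLOW" if avg >= 0.6 else "RED"

def program_success_color(factors: dict[str, dict[str, float]]) -> str:
    """A killer blow anywhere turns the Program Success block Red;
    otherwise report the worst factor color (a simplifying assumption)."""
    if any(0 in metrics.values() for metrics in factors.values()):
        return "RED"
    colors = [factor_color(m) for m in factors.values()]
    return "RED" if "RED" in colors else "YELLOW" if "YELLOW" in colors else "GREEN"

# Example: a KPP that cannot be met zeroes the Program Parameter metric.
factors = {
    "Requirements": {"Program Parameter": 0.0, "Scope Evolution": 0.8},
    "Resources": {"Budget": 0.9, "Manning": 0.9},
}
print(program_success_color(factors))  # RED despite healthy Resources
```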

28 Backups

29 PROGRAM SUCCESS PROBABILITY SUMMARY (backup copy of slide 8; content identical)

30 Air Force PoPS Calculation Aligned with Acquisition Phases
Factor weights by acquisition phase (100 pts max per phase):
– Program Planning: Program Requirements 20; Program Resources 18; Program Planning 22; Fit in Vision 15; Advocacy 25
– Pre-Milestone B: Program Requirements 25; Program Resources 16; Program Execution 24; Fit in Vision 15; Advocacy 20
– Post-Milestone B: Program Requirements 20; Program Resources 20; Program Execution 20; Fit in Vision 15; Advocacy 25
– Post-Milestone C: Program Requirements 16; Program Resources 25; Program Execution 30; Fit in Vision 9; Advocacy 20
– Sustainment*: Program Requirements 5; Program Resources 35; Program Execution 55; Fit in Vision 1; Advocacy 4
* Sustainment is a new add as of Jul 07

31 Frequency of Data Input