PROGRAM SUCCESS PROBABILITY. John Higbee, DAU, 2 June 2004.

Presentation transcript:

1 PROGRAM SUCCESS PROBABILITY John Higbee DAU 2 June 2004

2 STARTING POINT
Tasking from ASA(ALT) Claude Bolton (March 2002):
- Despite Using All the Metrics Commonly Employed to Measure Cost, Schedule, Performance and Program Risk, There are Still Too Many Surprises (Poorly Performing/Failing Programs) Being Briefed "Real Time" to Army Senior Leadership
DAU (with Industry Representatives) was Asked to:
- Identify a Comprehensive Method to Better Determine the Probability of Program Success
- Recommend a Concise "Program Success" Briefing Format for Use by Army Leadership

3 PROCESS PREMISE
- Classical Internal Factors for Cost, Schedule, Performance and Risk (Largely Within the Control of the Program Manager) Provide an Important Part of the Program Success Picture – But NOT the WHOLE Picture
  - Program Success also Depends on External Factors (Largely Not Within the PM's Control, but That the PM Can Influence by Informing/Using Service/OSD Senior Leadership)
- Accurate Assessment of Program Success Probability Requires a Holistic Combination of Internal and External Factors
  - Internal: Requirements, Resources, and Execution
  - External: Fit in the Vision, and Advocacy
- Next Step – Develop an Assessment Model/Process Using Selected Metrics for Each Factor, Providing an Accurate "Program Pulse Check"
  - The "Five Factors" are Consistent Across All Programs and All Acquisition Cycle Phases
  - Metrics for Each Factor are Tailorable by the PM/PEO to the Specific Program Situation (Program Type/Phase of the Acquisition Process) – "Don't Force Everyone into a Size 4 AAA Shoe…"

4 BRIEFING PREMISE
Significant Challenge – Develop a Briefing Format That:
- Conveys Program Assessment Process Results Concisely and Effectively
- Is Consistent Across Army Acquisition
Selected Briefing Format:
- Uses a Summary Display Organized Like a Work Breakdown Structure: Program Success (Level 0); Factors (Level 1); Metrics (Level 2)
- Relies on Information Keyed with Colors and Symbols, Rather Than Dense Word/Number Slides (Easier to Absorb)
- Minimizes the Number of Slides for More Efficient Use of Leadership's Time – Don't "Bury in Data"!

5 PROGRAM SUCCESS PROBABILITY SUMMARY
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy | Program Life Cycle Phase: ___________
Level 0: Program Success (2); Level 2 detail is carried in the supporting Program "Smart Charts".

INTERNAL FACTORS/METRICS
- Program Requirements (3): Program Parameter Status (3); Program Scope Evolution
- Program Resources: Budget; Manning; Contractor Health (2)
- Program Execution: Contract Earned Value Metrics (3); Contractor Performance (2); Fixed Price Performance (3); Program Risk Assessment (5); Sustainability Risk Assessment (3); Testing Status (2); Technical Maturity (3)

EXTERNAL FACTORS/METRICS
- Program "Fit" in Capability Vision (2)
  - DoD Vision (2): Transformation (2); Interoperability (3); Joint (3)
  - Army Vision (4): Current Force (4); Future Force
- Program Advocacy: OSD (2); Joint Staff (2); War Fighter (4); Army Secretariat; Congressional; Industry (3); International (3)

Legend
- Colors: G: On Track, No/Minor Issues; Y: On Track, Significant Issues; R: Off Track, Major Issues; Gray: Not Rated/Not Applicable
- Trends: Up Arrow: Situation Improving; (number): Situation Stable (for # Reporting Periods); Down Arrow: Situation Deteriorating
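A minimal sketch, not part of the briefing, of how this Level 0/1/2 summary and its color/trend legend could be captured in code. The factor and metric names come from the slide above; the Metric/Factor dataclasses, field names, and trend encoding are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Color legend exactly as given on the slide.
COLOR_LEGEND = {
    "G": "On Track, No/Minor Issues",
    "Y": "On Track, Significant Issues",
    "R": "Off Track, Major Issues",
    "Gray": "Not Rated/Not Applicable",
}

@dataclass
class Metric:                                   # Level 2
    name: str
    color: str = "Gray"                         # one of COLOR_LEGEND's keys
    stable_periods: Optional[int] = None        # "(n)": stable for n reporting periods
    trend: str = "stable"                       # "improving" | "stable" | "deteriorating"

@dataclass
class Factor:                                   # Level 1
    name: str
    metrics: List[Metric] = field(default_factory=list)

# Level 1 factors and Level 2 metrics as laid out on the summary slide.
internal = [
    Factor("Program Requirements",
           [Metric("Program Parameter Status"), Metric("Program Scope Evolution")]),
    Factor("Program Resources",
           [Metric("Budget"), Metric("Manning"), Metric("Contractor Health")]),
    Factor("Program Execution",
           [Metric("Contract Earned Value Metrics"), Metric("Contractor Performance"),
            Metric("Fixed Price Performance"), Metric("Program Risk Assessment"),
            Metric("Sustainability Risk Assessment"), Metric("Testing Status"),
            Metric("Technical Maturity")]),
]
external = [
    Factor('Program "Fit" in Capability Vision',
           [Metric("DoD Vision"), Metric("Army Vision")]),
    Factor("Program Advocacy",
           [Metric("OSD"), Metric("Joint Staff"), Metric("War Fighter"),
            Metric("Army Secretariat"), Metric("Congressional"),
            Metric("Industry"), Metric("International")]),
]
```

The point of the structure is simply that Level 1 and Level 0 displays can be generated from the Level 2 entries, so the PM maintains one set of metric ratings and the summary chart follows from it.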

6 REQUIREMENTS – PROGRAM PARAMETER STATUS
(Chart: each parameter is shown as a diamond positioned along its Threshold-to-Objective bar, with a second marker showing status as of the last brief (mm/yy, e.g. "01/03").)
Example parameters: Combat Capability; C4I Interoperability (Strategic, Theater, Force Coord., Force Control, Fire Control); Endurance; Cost; Manning (Non-KPP); Sustained Speed
Comments:
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y (3); Predictive: Y

7 REQUIREMENTS – PROGRAM SCOPE EVOLUTION

            Requirement    Funded Pgm (Budgeted/Obl)    Schedule, CE to FUE (Used/Planned)
Original    ORD (date)     $#.#B / NA                   NA / 120 Months
Current     ORD (date)     $#.#B / $#.#B                170 / 210 Months

Requirement evolution (select one): Stable / Increased / Descoped
Comments:
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y; Predictive: Y

8 RESOURCES – BUDGET
(Table: Obligation%/Expenditure% by appropriation and fiscal year, FY01 through FY09, with entries shown as Xx%/yy% placeholders and N/A where not applicable. Appropriations: RDT&E,A; OPA; APA; WPA; O&M,A; MILCON. A SUFF (sufficiency) R/Y/G rating accompanies each line.)
Army Goals (Obl/Exp), by year of the appropriation, as legible on the slide:
- RDT&E,A: First Year 95%/58%; Second Year 100%/91%
- OPA: First Year 70%/---; Second Year 85%/---; Third Year (value garbled)/---
- OM,A: (values not legible)
Comments:
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: G; Predictive: G
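For illustration only, the comparison implied by this slide is each appropriation's obligation and expenditure rates against the Army goals for that year of the appropriation. The goal values below are the ones legible on the slide; the function and dictionary names are assumptions.

```python
# Army obligation/expenditure goals in percent, keyed by (appropriation, year of
# appropriation), limited to the values legible on the slide.
GOALS = {
    ("RDT&E,A", 1): (95, 58),
    ("RDT&E,A", 2): (100, 91),
    ("OPA", 1): (70, None),     # None: no expenditure goal shown
    ("OPA", 2): (85, None),
}

def rate(amount_done: float, amount_budgeted: float) -> float:
    """Obligation or expenditure rate as a percentage of the budgeted amount."""
    return 100.0 * amount_done / amount_budgeted

def meets_goal(appn: str, year: int, obligated_pct: float, expended_pct: float) -> bool:
    """True if both the obligation and (where defined) expenditure goals are met."""
    obl_goal, exp_goal = GOALS[(appn, year)]
    ok_obl = obligated_pct >= obl_goal
    ok_exp = exp_goal is None or expended_pct >= exp_goal
    return ok_obl and ok_exp

# Example: a first-year RDT&E,A line that is 96% obligated and 60% expended.
print(meets_goal("RDT&E,A", 1, 96, 60))   # True
```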

9 RESOURCES – MANNING
(Chart: program office manning over time – OCT 00, MAR 01, OCT 01, MAR 02, OCT 02, MAR 03)
What Key Billets are Vacant?
- DPM Billet Still Vacant (Estimate Fill in Two Months)
- Lead Software Engineer (Emergent Loss) – Tech Director Filling In; Need S/W-Experienced GS-14 ASAP
Is the Program Office Adequately Staffed? Yes (Except as Noted Above)
Comments:
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: G; Predictive: G

10 RESOURCES – CONTRACTOR HEALTH
Corporate Indicators
- Company/Group Metrics: Current Stock P/E Ratio; Last Stock Dividends Declared/Passed
- Industrial Base Status (Only Player? One of __ Viable Competitors?)
  - Market Share in Program Area, and Trend (over the Last Five Years)
- Significant Events (Mergers/Acquisitions/"Distractors")
Program Indicators
- Program-Specific Metrics: "Program Fit" in Company/Group; Program ROI (if Available)
- Key Players, Phone Numbers, and Their Experience
- Program Manning/Issues
- Contractor Facilities/Issues
- Key Skills
- Certification Status (e.g. ISO 9000/CMM Level)
PM Evaluation of Contractor Commitment to Program: High, Med, or Low
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y (2); Predictive: Y

11 EXECUTION – CONTRACT EARNED VALUE METRICS
Contract: [give short contract title], Axxxxx-YY-Cxxxx, YYMMDD, Contractor Name [Prime or Significant Sub]
Rebaselining: Date of Last Rebaselining: JAN02; Number of Rebaselinings: 1; Date of Next Rebaselining: MMM YY
Estimates at Completion: KTR's EAC: $104M; PM's EAC, with PM's Projected Performance at Completion for CPI and Duration
Award Fee: Date of Last Award Fee: MMM YY; Date of Next Award Fee: MMM YY
Key figures from the charts: CV = $2.0M; SV = $2.9M; TCPI(EAC) = 0.76; EV = 56%; % Spent = 50%
(Chart 1: cumulative $M versus calendar time, 04/00 to 08/04, showing TAB, BAC, ACWP, EAC, and EV against Total Spent and Total Calendar Schedule.)
(Chart 2: CPI versus SPI quadrant plot by reporting month, 04/99 through 05/02, quadrants labeled Ahead of Schedule and Underspent, Behind Schedule and Underspent, Ahead of Schedule and Overspent, Behind Schedule and Overspent; an SPI value of 1.18 is shown.)
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y (3); Predictive: Y
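The figures on this slide (CV, SV, CPI/SPI, TCPI, EAC, percent spent) follow the standard earned value relationships, sketched below with the conventional formulas. The function name, argument names, and the sample numbers are assumptions for illustration, not the slide's actual contract data.

```python
def evm_metrics(bcws: float, bcwp: float, acwp: float, bac: float, eac=None) -> dict:
    """Standard earned value calculations.

    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed (actual cost)
    bac:  budget at completion
    eac:  estimate at completion; if not given, use the common CPI-based estimate
    """
    cv  = bcwp - acwp              # cost variance (negative means overspent)
    sv  = bcwp - bcws              # schedule variance (negative means behind schedule)
    cpi = bcwp / acwp              # cost performance index
    spi = bcwp / bcws              # schedule performance index
    if eac is None:
        eac = bac / cpi            # one common independent EAC
    tcpi_eac = (bac - bcwp) / (eac - acwp)   # efficiency required to achieve the EAC
    return dict(cv=cv, sv=sv, cpi=cpi, spi=spi, eac=eac, tcpi_eac=tcpi_eac,
                pct_spent=acwp / bac, pct_earned=bcwp / bac)

# Hypothetical contract ($M): roughly half the budget spent, slightly more earned.
print(evm_metrics(bcws=47.5, bcwp=50.4, acwp=48.4, bac=90.0))
```

A TCPI above 1.0 means the remaining work must be performed more cost-efficiently than work to date in order to hit the stated EAC, which is one reason the slide shows both the contractor's and the PM's estimates.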

12 EXECUTION – CONTRACTOR PERFORMANCE
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y (2); Predictive: Y

13 EXECUTION – FIXED PRICE PERFORMANCE
- DCMA Plant Rep Evaluation – Major Issues
- Delivery Profile Graphic (Plan vs. Actual) – Major Issues
- Progress Payment Status – Major Issues
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: G (3); Predictive: G

14 EXECUTION – PROGRAM RISK ASSESSMENT
(Chart: Likelihood versus Consequence risk matrix with Low, Medium, and High regions; Issues #1 through #6 are plotted on the matrix, each with its trend marker.)
For each issue: a brief description of the issue and the rationale for its rating, plus the approach to remedy/mitigation.
Trends: Up Arrow: Situation Improving; (#): Situation Stable (for # Reporting Periods); Down Arrow: Situation Deteriorating
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y (5); Predictive: Y
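As a rough illustration of the likelihood/consequence matrix on this slide, a rating function might look like the sketch below. The 1-5 scales and the specific Low/Medium/High cut lines are assumptions, since the slide does not define numeric boundaries; a real program would use the matrix from its risk management plan.

```python
def risk_level(likelihood: int, consequence: int) -> str:
    """Map a (likelihood, consequence) pair on an assumed 1-5 scale to Low/Medium/High."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("likelihood and consequence must be between 1 and 5")
    score = likelihood * consequence
    if score >= 15 or (consequence == 5 and likelihood >= 3):
        return "High"
    if score >= 6:
        return "Medium"
    return "Low"

# Example: an issue rated likelihood 4, consequence 4 lands in the High region.
print(risk_level(4, 4))   # High
```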

15 EXECUTION – SUSTAINABILITY RISK ASSESSMENT
(Chart: Likelihood versus Consequence matrix with Low Risk, Medium Risk, and High Risk regions; the sustainability areas are plotted on the matrix.)
Sustainability Areas (Examples): Overall Assessment; 1: Training; 2: Support Equipment; 3: Publications; 4: Facilities; 5: Maintenance Concept; 6: Supply Support; 7: MTBF/Ao/Reliability
Risks #4, #5, #6: Brief Description of the Issue and Rationale for Its Rating; Approach to Remedy/Mitigation
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y (3); Predictive: Y

16 EXECUTION – TESTING STATUS
- Contractor Testing (e.g. Qualification, Integration) – Status (R/Y/G): Major Points/Issues
- Developmental Testing – Status (R/Y/G): Major Points/Issues
- Operational Testing – Status (R/Y/G): Major Points/Issues
- Follow-On Operational Testing – Status (R/Y/G): Major Points/Issues
- Special Testing – Status (R/Y/G) (Could Include LFT&E, Interoperability Testing (JITC), Etc.): Major Points/Issues
- TEMP Status
- Other (DOT&E Annual Report to Congress, etc. – As Necessary)
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: G (2); Predictive: G

17 EXECUTION – TECHNICAL MATURITY
(Chart: technical maturity plotted against program milestones – Program Initiation, CDR, Milestone C. The detail template is at backup slide 26.)
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y (3); Predictive: Y

18 PROGRAM "FIT" IN CAPABILITY VISION
Area (Examples)        Status    Trend
DoD Vision             G         (2)
  Transformation       G         (2)
  Interoperability     Y         (3)
  Joint                G         (3)
Army Vision            Y         (4)
  Current Force        Y         (4)
  Future Force         (N/A)     (N/A)
Other                  (N/A)     (N/A)
Overall                Y         (2)
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y (2); Predictive: Y

19 PROGRAM ADVOCACY
Area (Examples)        Status    Trend     Note
OSD                    Y         (2)       (Major Point)
Joint Staff            Y         (2)       (Major Point)
War Fighter            Y         (4)       (Major Point)
Army Secretariat       G                   (Major Point)
Congressional          Y                   (Major Point)
Industry               G         (3)       (Major Point)
International          G         (3)       (Major Point)
Overall                Y
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: Y; Predictive: Y

20 FINDINGS / ACTIONS
- Program Success (2)
- Program Requirements (3)
- Program Resources
- Program Execution
- Program Fit in Capability Vision (2)
- Program Advocacy
Comments/Recap – PM's "Closer Slide"
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy

21 STATUS/FUTURE PLANS
Status
- Multiple Acquisition Staffs (Navy, Air Force, USD(AT&L), NSA, MDA and Space Acquisition Executive) Have Requested the Product and are Reviewing/Considering It for Use
- Multiple DoD and Industry Program Managers (including F/A-22, THAAD, In-Service CVN) Have Adopted It as an Assessment/Reporting Tool
- Some International Interest (UK National Audit Office; Australian DMO)
- OCT 2002 – ASA(ALT) Briefed on Effort; Expressed Intent to Implement Program Success Factors Across Army
- DEC 2002 – Program Success Factors Pilot Commences in Two Army Programs in PEO (IEW&S)
- JULY 2003 – Army Decides to Phase-Implement Program Success Factors Across Army Acquisition; ALTESS Begins Automation Effort on Army AIM System
- DEC 2003 – ASA(ALT) Signs Out PSF Implementation Memo
- JAN 2004 – PSF Application Goes "Live" on AIM
- MAR 2004 – First Four Programs (BLACKHAWK, COMANCHE, WIN-T and FBCB2) Submit PSF Reports

22 BACKUP SLIDES

23 QUANTIFICATION PROCESS
- First Three Factors (Requirements, Resources, and Execution) Represent How the Program is Operating – Nominally 60% in Aggregate
- Last Two Factors (Fit in Strategic Vision and Advocacy) Represent Whether or Not the Program Should/Could be Pursued – Nominally 40% in Aggregate
- First Three Factors (in Aggregate) Have a "Greater Effect" on Program Success than the Last Two Factors, but NOT a "Much Greater Effect"
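A minimal sketch of the nominal roll-up described above. The 60%/40% aggregate split comes from the slide; the equal per-factor weights and the 0-100 factor scores are illustrative assumptions, since the briefing gives only the aggregate split.

```python
# Nominal aggregate weights from the slide: internal factors 60%, external 40%.
# The per-factor split inside each group is an assumption for illustration.
WEIGHTS = {
    "Requirements":  0.20,
    "Resources":     0.20,
    "Execution":     0.20,   # internal factors: 0.60 in aggregate
    "Fit in Vision": 0.20,
    "Advocacy":      0.20,   # external factors: 0.40 in aggregate
}

def program_success_score(factor_scores: dict) -> float:
    """Weighted roll-up of factor scores (each 0-100) into an overall 0-100 score."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[name] * factor_scores[name] for name in WEIGHTS)

# Example: strong internal performance, weaker advocacy.
scores = {"Requirements": 85, "Resources": 80, "Execution": 75,
          "Fit in Vision": 90, "Advocacy": 55}
print(program_success_score(scores))   # 77.0
```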

24 PROBABILITY OF PROGRAM SUCCESS "BANDS"
Green (80 to 100)
- Program is On Track for Providing Originally-Scoped Warfighting Capability Within Budgeted Cost and Approved Schedule
- Issues are Minor in Nature
Yellow (60 to <80)
- Program is On Track for Providing Acceptable Warfighting Capability With Acceptable Deviations from Budgeted Cost and Approved Schedule
- Issues May Be Major but are Solvable within Normal Acquisition Processes
Red (<60, or Existing "Killer Blows" in Level 2 Metrics)
- Program is OFF Track: Acceptable Warfighting Capability will NOT be Provided, or Will ONLY be Provided with Unacceptable Deviations from Budgeted Cost and Approved Schedule
- Issues are Major and NOT Solvable within Normal Acquisition Processes (e.g. Program Restructure Required)
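The banding above maps directly to a simple function. The thresholds (Green 80 to 100, Yellow 60 to <80, Red below 60 or any existing killer blow) are from the slide; the function shape and names are assumptions.

```python
def success_band(score: float, killer_blow: bool = False) -> str:
    """Map a 0-100 program success score to the Green/Yellow/Red bands on the slide."""
    if killer_blow or score < 60:
        return "Red"      # off track; not solvable within normal acquisition processes
    if score < 80:
        return "Yellow"   # on track with acceptable deviations; issues solvable
    return "Green"        # on track for originally-scoped capability, cost, schedule

print(success_band(77.0))                     # Yellow
print(success_band(92.0, killer_blow=True))   # Red: a killer blow overrides the score
```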

25 "KILLER BLOW"
"Killer Blow" at the Sub-Factor (Level 2) Level:
- An Action Taken by a Decision Maker in the Chain of Command (or an "Advocacy" Player) Resulting in Program Non-Executability Until Remedied – For Example, Zeroing of the Program Budget by a Congressional Committee/Conference
- Results in Immediate "Red" Coloration of the Associated Level 2, Level 1, and Overall Program Success Metrics Until Remedied
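A sketch of the killer-blow override described above: a killer blow on any Level 2 metric forces the associated Level 2, Level 1, and overall ratings to Red until it is remedied. The worst-color roll-up rule and all names below are assumptions used only to show the override.

```python
def roll_up_color(metric_colors, killer_blow_metrics=()):
    """Roll Level 2 metric colors up to a Level 1 (or Level 0) color.

    Any metric flagged as a killer blow forces "R" regardless of the other colors;
    otherwise the roll-up takes the worst color present. The severity ordering
    (R > Y > G > Gray) is an assumption for illustration.
    """
    severity = {"Gray": 0, "G": 1, "Y": 2, "R": 3}
    if killer_blow_metrics:
        return "R"
    return max(metric_colors, key=lambda c: severity[c])

# Example: a congressional mark zeroing the budget is a killer blow on the "Budget"
# metric, so Program Resources (and, by the same rule, the overall Program Success
# rating) goes Red until the mark is remedied.
print(roll_up_color(["G", "G", "Y"], killer_blow_metrics=["Budget"]))   # R
```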

26 EXECUTION – TECHNICAL MATURITY
Critical Technology Maturity
- Table: Critical Technology | Description/Issue | TRL | G/Y/R
Program Design Maturity
- Engineering Drawings (G/Y/R): Percentage of Drawings Approved/Released for Use; Issues
Program Integration/Production Factors
- Table: Integration/Production Factor | Description/Issue | IRL/PRL | G/Y/R
Program Production Maturity
- Key Production Processes (G/Y/R): Percentage of Key Production Processes Under Statistical Process Control; Issues
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: G (2); Predictive: G

27 EXECUTION – CONTRACTOR PERFORMANCE
Contractor Performance Assessment (Drawn From CPARS/PPIMS, etc.)
- Last Evaluation: Provide a Summary of the Evaluation Last Provided to the Contractor, Along with the PM's Evaluation of Current Status; Highlight Successes as Well as Areas of Concern
- Performance Trend (over the Contract Period of Performance): Highlight Successes as Well as Areas of Concern
Award/Incentive Fee History
- Summary of Actual Award/Incentive Fees Provided to the Contractor; If Different than Specified in the Fee Plan, Discuss Reasons and Actions Indicated by the Situation
- Are Fee Awards Consistent with Contractor Performance Assessments?
Program Acronym, ACAT XX | PEO XXX | COL, PM | Date of Review: dd mmm yy
Status – Historical: G (2); Predictive: G