AIR FORCE STUDIES BOARD 1 Optimizing U.S. Air Force and Department of Defense Review of Air Force Acquisition Programs
RADM Rand Fisher, Chair
Dr. Dan Stewart, Vice Chair

AIR FORCE STUDIES BOARD 2 Bottom Line
Unless reviews are conducted better than they are now, Air Force and DOD attempts to address poor acquisition program performance with additional reviews will fail
The Air Force and DOD need to:
– Engage in timely planning for reviews
– Align reviews with program decision points and milestones
– Before creating new reviews, determine whether existing reviews could accomplish objectives
– Staff review teams with subject matter experts
– Document all review outputs

AIR FORCE STUDIES BOARD 3 Background
DOD spends over $300 billion per year to develop, produce, field, and sustain weapon systems
– Air Force over $100 billion per year
Large cost overruns and schedule delays lead to loss of confidence in acquisition system and people
“DOD is not receiving expected returns on its large investment in weapon systems. Our analysis does not show any improvements in acquisition outcomes as programs continue to experience increased costs and delays in delivering capabilities to the warfighter. In fact, when compared to the performance of the fiscal year 2000 portfolio of major defense acquisition programs, cost and schedule performance for current programs is actually worse. Without improved acquisition outcomes in the future, achieving DOD’s transformational objectives in a constrained fiscal environment is highly unlikely.” (GAO, 2008)

AIR FORCE STUDIES BOARD 4 Background
DOD response: More reviews

AIR FORCE STUDIES BOARD 5 Background
Significant increase in the number and frequency of management reviews at the program, service, and OSD levels since 1996
– Separate milestone reviews for evolutionary acquisition increments
– Air Force reviews such as Sufficiency Reviews and IPAs
– OSD-level reviews such as CSBs and PSRs
– Specialty reviews (logistics, manufacturing, technical readiness)
– Previously discretionary reviews made mandatory

AIR FORCE STUDIES BOARD 6 Background

AIR FORCE STUDIES BOARD 7 Background
Reviews and prereviews required at multiple levels—both vertical and horizontal
– Only PM experiences full breadth and depth of review process—program office must support all
– Overall magnitude of review efforts significantly increases program office workload and diverts attention from day-to-day management of program
No evidence of earlier work focused on impact of review process on resources spent by the program office or effect of diverting PM’s attention from day-to-day management of his or her programs

AIR FORCE STUDIES BOARD 8 Background
SAF/AQ requested Air Force Studies Board (NRC) investigate:
How can Air Force and DOD review of Air Force acquisition programs be made more effective and its cost and burden on the program manager lessened?

AIR FORCE STUDIES BOARD 9 Statement of Task
Examine program reviews and assessments that Air Force space and non-space system acquisition programs undergo
Assess resources required to accomplish reviews
Assess contribution reviews make to successful acquisition
Identify overlaps
Evaluate options to increase cost-effectiveness and lessen workforce impact of reviews
Recommend changes that Air Force and DOD should make

AIR FORCE STUDIES BOARD 10 Study Committee
RAND H. FISHER, Chair, The Aerospace Corporation
J. DANIEL STEWART, Vice Chair, University of Tennessee
JOHN A. BETTI, Department of Defense (retired)
CHRISTOPHER L. BLAKE, Lockheed Martin Aeronautics
CLAUDE M. BOLTON, JR., Defense Acquisition University
ALLAN V. BURMAN, Jefferson Solutions
JOHN T. DILLARD, U.S. Naval Postgraduate School
CHARLES E. FRANKLIN, Raytheon (retired)
CHARLES L. JOHNSON II, Boeing

AIR FORCE STUDIES BOARD 11 Study Committee
LESLIE KENNE, LK Associates
ANDREW P. SAGE, George Mason University
MARK SCHAEFFER, ManTech SRS Technologies
GEORGE R. SCHNEITER, Consultant
ROBERT J. SKALAMERA, RJS Consulting
RICHARD SZAFRANSKI, Toffler Associates
RANDALL S. WEIDENHEIMER, Northrop Grumman Mission Systems
REBECCA A. WINSTON, Winston Strategic Management Consultants

AIR FORCE STUDIES BOARD 12 Approach
Literature review
Presentations to committee and interviews
Comparative matrix
Survey of PMs and PEOs
Program success orientation + PM perspective = key question:
Can changes in number, content, or sequence of reviews help PM execute program more successfully?

AIR FORCE STUDIES BOARD 13 Findings
Reviews are essential elements of program success
– Facilitate program execution, technical and programmatic support, problem discovery and resolution
– Inform decisions
– Share awareness
– Engender program advocacy
Reviews are not “free”—there are significant costs
– Money
– Time spent preparing, presenting, follow-up
– Attention diverted from executing program

AIR FORCE STUDIES BOARD 14 Findings
Many reviews do not contribute to program in proportion to their costs
– In every case, those interviewed or surveyed cited significant costs to carry out reviews; most also noted adverse impact on other responsibilities
– Several survey respondents cited reviews that had no positive impact on program cost, schedule, or performance; some even cited reviews with negative impact

AIR FORCE STUDIES BOARD 15 Findings
No one in Air Force or DOD responsible for monitoring number, workload, costs, effectiveness, or impact of reviews
– Many survey respondents described DOD staff as a stove-piped bureaucracy
Domain “czars” have purview over breadth of programs but are not horizontally integrated for knowledge sharing or synergy
PMs have to prepare a separate information brief for each
OSD staff does not integrate information across domains for optimal decision making by MDA

AIR FORCE STUDIES BOARD 16 Findings
Sequencing, timing, and frequency of reviews often not tied to program schedule to most effectively support program execution
– Survey respondents suggested least beneficial reviews could be more effective if conducted less frequently and at more appropriate time in program’s life cycle (generally earlier)
– Speakers gave examples of requirements reviews being conducted after contracts had been awarded

AIR FORCE STUDIES BOARD 17 Findings
Reviews often not attended by right personnel (review principals, key stakeholders, and subject matter experts) or, in some cases, attended by too many personnel
– Many reviews conducted without “right” people present
– Survey respondents noted that more effort should be given to ensuring that right subject matter experts and appropriate senior officials attend program reviews and that number of attendees be limited to those who can add value to meeting

AIR FORCE STUDIES BOARD 18 Findings
For some reviews, number of preparatory reviews is excessive and reviews do not contribute value to program’s management
– Many PMs stated that proliferation of meetings and premeetings was taking time away from management of programs
– Elimination of IIPT reviews leading to more individual premeetings with Joint Staff, program management offices, and OSD
– Sharing of responsibilities between NII/AT&L offices cited as another factor
– “The problem isn’t the review... it’s the numerous premeetings.”

AIR FORCE STUDIES BOARD 19 Findings
PMs spending time on multiple reviews with similar objectives
– Committee matrix named 31 formal reviews—10 identified as duplicating or partially duplicating other reviews
– Survey respondents believed that selected reviews could be combined

AIR FORCE STUDIES BOARD 20 Findings
Purpose, scope, information needs, key issues, and expected outcomes of many reviews not specified
– Often ill-defined, based on presumed agendas or issues of the day
– Often no metrics for assessing effectiveness of review
Many PMs found that PSRs and IPAs added value
– Comprehensive in nature
– Well-defined processes, outcomes, and metrics
– Socialized with PMs and staff
– Conducted by subject matter experts
– Well documented

AIR FORCE STUDIES BOARD 21 Findings
Reviews focus on single system instead of the system of systems of which it is a part; reviews that attempt to address larger system-of-systems perspective often unable to cope with complex interfaces among programs
– Seventy percent of ACAT I PMs responding to survey characterized amount of external interface of their programs with other efforts as extensive
– Survey write-in responses and PM discussions with full committee noted that some reviews did not take into account connections with and dependencies on other programs for mission accomplishment

AIR FORCE STUDIES BOARD 22 Conclusions
Reducing number of reviews or combining them can increase time available to PMs to more effectively manage their programs
Reviews could be more effective if sequenced and timed to provide information needed for program execution
Required attendance at program review meetings is neither clearly communicated nor effectively controlled
Streamlining or combining reviews and associated prebriefs in both vertical and horizontal directions could increase efficiency

AIR FORCE STUDIES BOARD 23 Conclusions
Important that program review planning be accomplished in thoughtful, purposeful manner, with standard approach to address need for communication of expectations and outcomes
Review format and design need to reflect greater complexity and interrelationships in current programs to ensure that system of systems works across organizational constructs

AIR FORCE STUDIES BOARD 24 Key Recommendation
Engage in timely planning for reviews
– Governance process directed by SAE
– Owner of review process including reviews, policies, control of review proliferation, pre- and post-review mechanisms
– Deliberate planning and direction to PM and OPR well in advance
– At minimum, review direction to include objectives with metrics, materials to be supplied, criteria for success
– Review team report with findings, recommendations, lessons learned
– PM closeout report with action plan; open, closed, in-process items; issues or risks
– SAE tracking of review process metrics (a data-structure sketch of these artifacts follows this slide)
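As an illustration only, the review direction and PM closeout artifacts above lend themselves to structured records. Below is a minimal sketch in Python; every type and field name is an assumption drawn from the bullet text, not a format the study prescribes:

```python
# Hypothetical sketch: structured records for the review direction and
# PM closeout report the briefing describes. Field names are assumptions
# based on the bullets above, not a prescribed DOD format.
from dataclasses import dataclass, field
from enum import Enum

class ItemStatus(Enum):
    OPEN = "open"
    IN_PROCESS = "in-process"
    CLOSED = "closed"

@dataclass
class ReviewDirection:
    """Direction issued to the PM and OPR well in advance of the review."""
    review_name: str
    objectives_with_metrics: dict[str, str]  # objective -> metric for success
    materials_to_supply: list[str]
    success_criteria: list[str]

@dataclass
class ActionItem:
    description: str
    status: ItemStatus = ItemStatus.OPEN

@dataclass
class PMCloseoutReport:
    """PM closeout report; the SAE's process metrics would roll up from these."""
    review_name: str
    action_plan: str
    items: list[ActionItem] = field(default_factory=list)
    issues_or_risks: list[str] = field(default_factory=list)
```

Pairing each objective with a metric in a single mapping makes the "objectives with metrics" requirement explicit: under this sketch, a review cannot be directed without stating how its success will be measured.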

AIR FORCE STUDIES BOARD 25 Key Recommendation
Align reviews with program decision points and milestones
– Minimize number of reviews preceding decision points and milestones
– Ensure review content is pertinent
– Alignment may allow reviews to be consolidated
– Could reduce costs and burden on PM and staff

AIR FORCE STUDIES BOARD 26 Key Recommendation
Before creating new reviews, determine whether existing reviews could accomplish objectives
– Determine whether broadening stakeholders for a given review could accomplish objectives rather than adding new review
– Apply same criteria to all prereviews
– Stakeholders should work together to consolidate prereview process
– Establish guidance for managing prereviews so they have minimal impact on schedule, cost, and program management staff
– Encourage OSD to do the same

AIR FORCE STUDIES BOARD 27 Key Recommendation
Staff review teams with subject matter experts
– Maintain roster of experts in standard technical areas, taking into account back-ups, to guarantee expert is available to attend review
– Prepare process guidance document for selection, formation, and use of subject matter expert teams
– Consider objectives of review when staffing review team
– Ensure continuity of effort—availability of subject matter experts not only during review but also for periods before and after review

AIR FORCE STUDIES BOARD 28 Key Recommendation
Document all review outputs
– It’s a best practice to capture lessons learned, identify root causes of problems and risks, and document findings, observations, and recommendations made during review
Review team report and PM closeout report
– Create database for storing and sharing lessons learned; needs to be searchable and updated regularly (a minimal sketch follows this slide)
SAE is suggested owner
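As a sketch of what "searchable and updated regularly" could mean in practice (purely an assumption; the briefing names no technology), the lessons-learned store might be a single full-text-indexed table. This example assumes Python's standard sqlite3 module with SQLite's FTS5 extension available; the table and field names are hypothetical:

```python
# Hypothetical sketch of a searchable lessons-learned store. Assumes
# SQLite's FTS5 full-text extension is available; the schema is an
# illustration, not part of the study's recommendation.
import sqlite3

conn = sqlite3.connect("lessons_learned.db")
conn.execute(
    "CREATE VIRTUAL TABLE IF NOT EXISTS lessons USING fts5("
    "program, review_type, finding, root_cause, recommendation)"
)

def add_lesson(program, review_type, finding, root_cause, recommendation):
    """Record one lesson from a review team report or PM closeout report."""
    conn.execute(
        "INSERT INTO lessons VALUES (?, ?, ?, ?, ?)",
        (program, review_type, finding, root_cause, recommendation),
    )
    conn.commit()

def search_lessons(query):
    """Full-text search across all stored lessons."""
    return conn.execute(
        "SELECT program, review_type, finding, recommendation "
        "FROM lessons WHERE lessons MATCH ?",
        (query,),
    ).fetchall()

# Example: capture a lesson at review closeout, retrieve it later by keyword.
add_lesson(
    "Program X", "requirements review",
    "Review conducted after contract award",
    "Review timing not aligned with program schedule",
    "Schedule requirements reviews before contract award",
)
print(search_lessons("contract award"))
```

Under this sketch, "updated regularly" reduces to calling add_lesson as part of each review team report and PM closeout report, and the full-text index keeps the store searchable without further effort.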

AIR FORCE STUDIES BOARD 29 Concluding Thoughts
Together, these recommendations form a gold standard for conduct of reviews
If implemented and rigorously managed by SAE, review effectiveness and efficiency can be increased
Recommendations exemplify continual learning process that builds from one review to next
Bottom line is to help PMs successfully execute their programs

AIR FORCE STUDIES BOARD 31 Backup Slides

AIR FORCE STUDIES BOARD 32 Statement of Task
The NRC will:
1. Review the prescribed program reviews and assessments that U.S. Air Force space and non-space system acquisition programs in all Department of Defense (DOD) acquisition categories (ACATs) are required to undergo, consistent with the various phases of the acquisition lifecycle, to verify appropriate planning has occurred prior to concept decision, Milestone/Key Decision Point (KDP) A, Milestone/KDP B, and Milestone/KDP C.
2. Assess each review and the resources required to accomplish it, including funding, manpower (people and know-how), work effort, and time.
3. Assess the role and contribution that each review and the combined reviews make to successful acquisition.

AIR FORCE STUDIES BOARD 33 Statement of Task (cont.)
4. Identify cases where different reviews have shared, common, or overlapping goals, objectives, content, or requirements.
5. Identify and evaluate options for streamlining, tailoring, integrating, or consolidating reviews of programs to increase the cost-effectiveness and lessen the workforce impact of the reviews as a whole, including examination and discussion of review processes used by other agencies (such as the National Aeronautics and Space Administration and the Department of Energy), the other military departments (the U.S. Army and the U.S. Navy), and industry.
6. Recommend changes that the Air Force and DOD should make to the reviews of Air Force programs, including review goals, objectives, content, and requirements.