1 Commander, Electronic Systems Center
Acquisition Physics 101: Everything I Learned About Acquisition, I Learned From Hollywood
Lt Gen Ted F. Bowlds, Commander, Electronic Systems Center
2 Acquisition Physics 101
Everything I learned about acquisition I learned from Hollywood
"Each problem that I solved became a rule which served afterwards to solve other problems." - Rene Descartes (1596-1650), "Discours de la Methode"
3 Does anyone believe commercial industry has the secret formula to perfect program acquisition?
4 Commercial Example from Aerospace
Boeing's July prediction at the roll-out event:
First flight: September 2007
First commercial passengers: May 2008
As of December 2008: first flight expected 2nd Quarter 2009, 1st delivery 1st Quarter 2010
Announced delays of at least 20 months
Technical – Schedule
5 Commercial Example from Computer/Software
March 2006: Microsoft plans to delay the consumer launch of its much-anticipated Windows Vista operating system to January 2007.
It had originally aimed to launch Vista - the first major update since Windows XP was introduced five years ago - in the second half of 2006.
A version will now be available for corporate customers from November; Vista will then be rolled out for consumers after the holiday shopping season.
Announced delays of at least 10 months, quickly followed by system patches
Technical – Schedule
6 Commercial Example from Construction
The Big Dig was estimated in 1985 at $2.8 billion.
A July 17, 2008 article in The Boston Globe stated, "In all, the project will cost an additional $7 billion in interest, bringing the total to a staggering $22 billion."
On January 23, 2008, it was reported that the consortium that oversaw the project would pay $407 million in restitution for its poor oversight of subcontractors (some of whom committed outright fraud), as well as for its primary responsibility in the death of a motorist.
Cost – Schedule
7 So, how do commercial program management experiences compare to AF acquisition programs?
8 Sources of Weapon System Cost Growth: Analysis of 35 Major Defense Acquisition Programs

Cost Growth (%)
Category                            Development   Procurement   Total
Errors                                   19.6         14.7       14.6
  Cost Estimate                          18.0          8.4       10.1
  Schedule Estimate                       1.0          0.9        1.0
  Technical Issues                        0.6          5.4        3.5
Decisions                                30.7         57.4       41.6
  Requirements                           17.5          9.5       12.9
  Affordability                          -1.9         -0.5       -1.3
  Quantity                                4.3         40.8       21.9
  Schedule                                6.0         10.0        8.9
  Inter- or intraprogram transfers        4.8         -2.4       -0.7

Total (development plus procurement) cost growth is dominated by decisions, which account for more than two-thirds of the growth. Most decision-related cost growth involves quantity changes (22 percent), requirements growth (13 percent), and schedule changes (9 percent). Cost estimation (10 percent) is the only large contributor in the errors category. Growth due to financial and miscellaneous causes is less than 4 percent of the overall growth.
The dominant influence of decisions is somewhat unexpected, because previous studies have reported nearly the reverse. A clear contributor is our inclusion of quantity changes, which are responsible for more than one-third of total cost growth and more than half of the total for the decisions category. Also somewhat unexpected is the small contribution of technical issues to average cost growth. Such issues contributed to cost growth in only a few programs.
Sources of Weapon System Cost Growth: Analysis of 35 Major Defense Acquisition Programs, 2008, RAND Project Air Force
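As a quick cross-check of the RAND figures above (an illustration, not part of the briefing), the development-column subcategories do sum to each reported category total:

```python
# Development-column values transcribed from the RAND table above.
errors = {"cost estimate": 18.0, "schedule estimate": 1.0, "technical issues": 0.6}
decisions = {
    "requirements": 17.5,
    "affordability": -1.9,
    "quantity": 4.3,
    "schedule": 6.0,
    "inter-/intraprogram transfers": 4.8,
}

# Subcategories should reproduce the category totals reported in the table.
errors_total = round(sum(errors.values()), 1)
decisions_total = round(sum(decisions.values()), 1)
print(errors_total, decisions_total)  # 19.6 30.7, matching the table
```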
9 Well then, who should we turn to for advice on how to improve Air Force acquisition?
10 Let's give Hollywood a shot at acquisition advice! There have been numerous DOD studies over the past decades on how to improve acquisition; maybe it's time for guidance from a different source…
11 Acquisition Advice from Hollywood
Program Manager's Responsibility
Clear and Executable Requirements
Technical Realism
Accurate Cost
Believable Schedule
Effective Communication
12 Lt. Col. Kirby Yorke’s View on Program Manager Responsibilities Regardless of all the “help” and guidance you might receive, in the end, it is all the Program Manager’s fault.
13 Ben Hur’s View of Working In A Program Office "We keep you alive to serve this ship (program). So row well and live."
14 John 'Bluto' Blutarsky Comments on Setting Requirements The user may be responsible for establishing requirements, but the program manager is responsible for ensuring they are executable.
15 Initial Requirements Definition/Tradeoff: The Value of Initial Requirements and Concept Definition
Actively Manage Requirements
The initial requirements definition and tradeoff phase is rarely performed with sufficient rigor. In many agencies, responsibility for requirements definition, resource allocation, and acquisition is spread across multiple organizations without a process for making explicit tradeoffs among cost, schedule, and performance. The importance of spending sufficient time and resources in this initial phase cannot be overemphasized. This figure (from the 1992 Lessons Learned, Cost/Schedule Assessment Guide, National Aeronautics and Space Administration, by W. Gruhl) shows acquisition program cost performance as a function of the amount spent on initial requirements and concept definition. Performance improves dramatically when a significant proportion (up to 15 percent) of the total program cost is for requirements and concept definition.
"Performance improves dramatically when a significant proportion … of the total program cost is for requirements and concept definition."
Information Technology Acquisition: A Common Sense Approach, Defense AT&L, Mar-Apr 09, Alfred Grasso
16 Police Chief Martin Brody on Spiral Requirements There is a greater chance for success with smaller, more manageable capability releases.
17 Smaller, More Manageable Capability Releases
Small Diameter Bomb (SDB)
Phase I: Fixed Target Variant - MS B to IOC: ~3 Years
Phase II: Moving Target Variant - MS B Planned FY10
Remote Ops Video Enhanced Receiver (ROVER)
I: Video a/a (Predator to AC-130)
II: Video a/g downlink (Aircraft to JTACs)
III: Multi-band
eROVER III: Enhanced reception
IV: Added S-band, other improvements
V: Handheld design with encryption capability

SDB DETAILS:
Joseph Diamond, the Air Force's program executive officer for weapons, told Defense Daily in an interview yesterday, Mar 01: "The idea is to reduce the cycle time to bring something to the warfighter quickly. The two-phased approach is definitely going to be the approach we take."
SDB I: Based on the source selection decision by the SDB Program Executive Officer, on October 17, 2003, the SDB contracting officer awarded a cost-plus-award-fee-type contract (FA C-0019) to Boeing for the Phase I (fixed targets) and carriage system portion of the SDB Program. After the contract was awarded, the Air Force issued an addendum to the sole-source justification and approval to include the previously deferred Phase II requirements. On November 10, 2004, Lockheed filed a formal protest with the Government Accountability Office concerning the SDB procurement because of Ms Druyun's bias toward Boeing. On February 18, 2005, the Government Accountability Office sustained the Lockheed protest.

ROVER DETAILS:
ROVER: Allowed AC-130 gunships to receive live video streams from MQ-1 Predator unmanned armed reconnaissance platforms as the gunships approached the target area.
The Predator would approach the target area at the same angle as the gunship so as to give the gunship crew an exact preview of what they would soon see.
ROVER II: Enabled JTACs to view the video from overhead Predators.
ROVER III: Comprises a ruggedized laptop computer, two antennas and a receiver, as well as the FalconView software for displaying the imagery and a classified database for generating GPS targeting coordinates. In addition to Predators, ROVER III can now receive imagery from platforms such as the Army's Hunter and Shadow unmanned aerial vehicles, the Navy's P-3 reconnaissance aircraft, as well as the Air Force's LITENING targeting pod.
ROVER IV: An evolution of ROVER III that adds an S-band antenna and the capacity to shut off certain bands, if necessary, so they do not interfere with the imagery transmission in a particular band at a given time.
ROVER V: Will be a self-contained unit, with all of the antennas, video receiver element, screen and interfaces embedded in the device. Will be able to connect to laptop computers for increased functionality.

Total dollars for ROVERs (total cost to date): $141,709,532
-- ROVER 2 - Qty $5,680,000
-- ROVER 3/eROVER - Qty $84,633,182
-- ROVER 4 - Qty $46,578,109
-- ROVER 4 - Qty 20 on order - $670,000
-- ROVER 5 - Qty 84 on order - $4,148,241
TOTAL - Qty 4,031 - $141,709,532
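The per-variant dollar figures above can be cross-footed against the stated total; a minimal check (added for illustration, not part of the original notes):

```python
# ROVER costs as quoted in the notes above; they should sum to the stated total.
rover_costs = {
    "ROVER 2": 5_680_000,
    "ROVER 3/eROVER": 84_633_182,
    "ROVER 4": 46_578_109,
    "ROVER 4 (20 on order)": 670_000,
    "ROVER 5 (84 on order)": 4_148_241,
}
total = sum(rover_costs.values())
print(f"${total:,}")  # → $141,709,532, matching the TOTAL line
```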
18 Requirements Stability: Requirements Changes, Research and Development Cost Growth, and Delays in Providing Initial Operational Capabilities
REPORT EXCERPT:
Programs that changed key system requirements after starting development had added instability. Twenty-two of the 52 programs in our assessment that provided data on requirements changes had at least one change (addition, reduction, or deferment) in a key performance parameter since development start. The average increase in research and development costs over first estimates for these 22 programs was more than three times greater than for those programs with no requirements changes. The average delay in the delivery of initial operational capabilities was also twice as long for programs with changes in key performance parameters as for programs with no requirements changes (see fig. 5). Further, 6 programs with requirements changes experienced a decrease in planned quantities of 25 percent or more, compared to only 2 programs without requirements changes.
GAO SP, Assessments of Major Weapon Programs, Mar 09
19 HAL 9000 on Technology
While we would like to believe otherwise, technology is not perfect; expect errors and have a plan B.
20 Technology Readiness Levels: Cost and Schedule Experiences on Product Developments
REPORT EXCERPT:
The TRLs were pioneered by the National Aeronautics and Space Administration and adopted by the Air Force Research Laboratory to determine the readiness of technologies to be incorporated into a weapon or other type of system. The Joint Advanced Strike Technology program--from which the JSF program evolved--made extensive use of TRLs to assess early maturity levels for many of the current JSF technologies. The program also identified TRL 7 as the acceptable readiness level for low-risk transition into the engineering and manufacturing development phase.
We also used TRLs in our prior work when, at the request of the Chairman of the Senate Armed Services Committee's Subcommittee on Readiness and Management Support, we assessed the impact of technology maturity on product outcomes. During that work, we reviewed commercial and DOD experiences in incorporating 23 different technologies into new product and weapon system designs. The table [above] shows that cost and schedule problems arose when programs started when technologies were at low readiness levels; conversely, it shows that programs met product objectives when the technologies were at higher levels of readiness when the programs were started.
GAO/T-NSIAD, Joint Strike Fighter Acquisition, 16 Mar 00
21 Technology Maturity
Weapon Systems Acquisition Reform Act of 2009, Section 103: Technological Maturity Assessments
For years now, the Government Accountability Office (GAO) has reported that successful commercial firms use a "knowledge-based" product development process to introduce new products. Although DOD acquisition policy embraces this concept, requiring that technologies be demonstrated in a relevant environment prior to program initiation, the Department continues to fall short of this goal. Last spring, GAO reviewed 72 of DOD's 95 major defense acquisition programs (MDAPs) and reported that 64 of the 72 fell short of the required level of product knowledge. According to GAO, 164 of the 356 critical technologies on these programs failed to meet even the minimum requirements for technological maturity. Section 103 would address this problem by making it the responsibility of the Director of Defense Research and Engineering (DDR&E) to periodically review and assess the technological maturity of critical technologies used in MDAPs.
The DDR&E's determinations would serve as a basis for determining whether a program is ready to enter the acquisition process.
Weapon Systems Acquisition Reform Act of 2009: Introduced 23 Feb 09 by Sen Carl Levin (D-MI)
Requires, with respect to Department of Defense (DOD) weapon systems acquisition organization: (1) a report on systems engineering capabilities; (2) the establishment of a Director of Developmental Test and Evaluation; (3) an assessment of the technological maturity of critical technologies of major defense acquisition programs (MDAPs); (4) the establishment of a Director of Independent Cost Assessment; and (5) the Joint Requirements Oversight Council to seek and consider input from commanders of combatant commands in identifying joint military requirements.
Requires, with respect to DOD weapon systems acquisition policy: (1) the Secretary of Defense to develop and implement mechanisms to ensure the consideration of tradeoffs between system cost, schedule, and performance; (2) the milestone decision authority for an MDAP to have received a preliminary design review and conducted a formal post-preliminary design review assessment before an MDAP may receive Milestone B or Key Decision Point B approval; (3) the Secretary to ensure that each MDAP acquisition plan includes measures to maximize competition at both the prime contract and subcontract level throughout the MDAP's life cycle; (4) the Secretary to undertake specified actions in the event of MDAP critical cost growth; (5) addressing organizational conflicts of interest by contractors in the acquisition of major weapon systems; (6) the establishment of an Organizational Conflict of Interest Review Board; and (7) the Secretary to award DOD military and civilian personnel for performance excellence in the acquisition of DOD products and services.
22 Dr Evil's View on Cost Estimating
Don't underestimate the cost of a program; demand funding to support the requirements (80% confidence...maybe).
23 A Typical Air Force Program?
NOTES FROM COL SHIMEL:
A cost distribution that appears to be high confidence but fails to account for the realistic underlying conditions of risk and uncertainty will get you into trouble, no matter how much you are funded.
In this example, NPOESS was originally bid at $6B. It was estimated that for an extra 3% ($200M) the estimate could get to 80% confidence. The government program office countered that they thought it would take $6.4B, and that for 6% ($400M) more funding they could reach 80% confidence. The independent estimators predicted $7.7B, and said it would take an added 10% ($800M) of funding to reach 80% confidence. Actual costs are now past $13B--even after de-scoping significant requirements.
The key to this apparent unchecked growth in costs is a failure to learn about and address the actual conditions of the system that was being developed, not a failure to throw enough money at the program. The 80% confidence level of a poorly defined or understood program is unreasonably expensive. Optimistic plans will not fix that.
A different process drives each of these curves (bidding, budget, estimating). Unless the underlying assumptions are all the same and addressed critically, consistently and honestly, it is foolish to think they will produce consistent, dependable results.
A program manager who optimistically discounts the real issues s/he may face, or fails to communicate them, will never escape a pattern of struggle--a RUT of RISK, UNCERTAINTY and TROUBLE!
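The risk-funding percentages in Col Shimel's NPOESS example can be worked out directly; a sketch (the dollar figures come from the notes above, the computed percentages and growth factor are derived here):

```python
# Each estimator's base estimate and the added funding each said was needed
# to reach 80% confidence (all figures in $B, from the slide notes).
estimates = {
    "contractor bid":       (6.0, 0.2),
    "program office":       (6.4, 0.4),
    "independent estimate": (7.7, 0.8),
}
for who, (base, margin) in estimates.items():
    # Rounds to the 3% / 6% / 10% figures quoted in the notes.
    print(f"{who}: ${base}B + ${margin}B (~{100 * margin / base:.0f}%) for 80% confidence")

# Actual costs "past $13B" imply more than double the original bid.
print(f"growth over bid: >{13.0 / 6.0:.1f}x")  # → >2.2x
```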
24 Cost Estimating (Guessing) “It’s tough to make predictions, especially about the future.”-Yogi Berra
25 Dusty Bottoms on the Importance of a Good Plan (Schedule)
Take the time upfront to develop a believable and executable plan (IMP/IMS) with measurable inch-stones, and use it.
26 What's Driving All The Delays?
WHAT GAO FOUND: REPORT EXCERPTS
Since 2003, DOD's portfolio of major defense acquisition programs has grown from 77 to 96 programs, and its investment in those programs has grown from $1.2 trillion to $1.6 trillion (fiscal year 2009 dollars). The cumulative cost growth for DOD's programs is higher than it was 5 years ago, but at $296 billion, it is less than last year when adjusted for inflation. For 2008 programs, research and development costs are now 42 percent higher than originally estimated, and the average delay in delivering initial capabilities has increased to 22 months. DOD's performance in some of these areas is driven by older programs, as newer programs, on average, have not shown the same degree of cost and schedule growth.
DOD's 2008 portfolio of major defense acquisition programs includes 96 programs—a net increase of 1 from a year ago and 19 since 2003. The total investment in research, development, test and evaluation (RDT&E) and procurement funds for this portfolio is still about $1.6 trillion, while the funding needed to complete the programs in it has decreased by about $89 billion from a year ago. The total cost growth for DOD's portfolio of major defense acquisition programs is higher than it was 5 years ago, but at $296 billion, it is actually less than the 2007 portfolio's cost growth of $301 billion.
Note: 31 Mar AT&L Letter to DEPSECDEF Contends $296B Is a Sensationalized Figure
27 Ground Element MEECN System (GEMS)
140% (~$138M) Overrun (EAC) And 4 Years Late
Heavy Dependency On JTRS Cluster 1 Products – Didn't Materialize
Critical Path Elements, Linkages Insufficient
Not Linked To EVM System Or Accessible Program Metrics
Tasks Would Push To The Right Without Any Flags
Inadequate Planning And Processes Made It Difficult To Manage And Measure Program Progress

GEMS BACKGROUND:
There were a number of contributing factors to that overall program slip and cost overrun. What's pertinent in this particular illustration is that the magnitude and impacts of slips were not immediately obvious to decision makers because of a lack of visibility into a complex IMS and no automated linkage to EVM. Very few people have time to analyze a detailed IMS for an entire development program. We typically create upper-level schedules that theoretically track to that IMS, but they don't always (if there's a manual update process there is human error), and a lot of times they don't indicate when tasks have slipped. The new Tier Schedules we've established roll up the IMS and display baseline slips automatically. A change in the IMS automatically adjusts the Tier schedules and tracks in EVM, so variances are reported.
The systems we use haven't changed; we're doing a better job of using them by integrating them and, of course, establishing a good IMS with task linkage and critical path well defined. We use MS Project for the IMS and SAP for the EVM system. The problem wasn't the systems but how we used them and what we put into them.

Improvements In Work--Program Re-baseline:
Built 7000-line Detailed IMS
Critical Path Defined; Will Link To New EVM Baseline
Feeds Tiered Schedules (Easily Readable For PMs And Decision Makers) So Low-level Task Slips Are Flagged
28 So it all boils down to the basics: cost, schedule, and technical performance, with a clear understanding and acceptance of the risks.
29 Common Symptoms of Nunn-McCurdy "Class of 2007" Programs
Initial program structure from a big-A perspective:
Requirements definition - inability to manage requirements trade-space
Cost estimating
Funding
Schedule
Risk assessment/technology maturity
Program Execution:
Ineffective IMS with lack of critical path
Systems engineering, particularly early in acquisition cycle
Adding new capabilities within existing program baseline
DT&E failed to accurately predict OT&E results
Premature entry into production
No consequences for poor program performance
Lack of program stability – requirements, funding, quantities
30 Is It As Bad As The GAO Says? Mr. Young's Take …
Mar 09 GAO Evaluation of 2008 Portfolio: 96 Programs; $296B Cost Growth
Mr. Young's Analysis:
Low Cost Estimates
Fluid Requirements
Optimistic Schedules
Excessive Certification Stds
Immature Design/Technology
Poor Govt/Industry Performance

Amount    # Programs   Driver                            Conclusion
$95.7B        18       Qty Increases                     Not True Cost Growth
$72.2B         9       Qty Decrease                      Not True Cost Growth
-$57B         39       Negative Growth (Qty Decreases)   Good unit cost control
$166.6B       28       Multiple                          Internal & External
$277.5B       94

EXCERPTS FROM MR. YOUNG's 31 MAR 09 MEMO:
The new GAO report continues to sensationalize the assessed cost growth of $296 billion on 96 programs. This number has been cited by many people as a condemnation of the defense procurement process. I have analyzed the components of this GAO number, and I would suggest that the number is misleading, out-of-date, and largely irrelevant to the current management of DoD programs. First, the GAO analysis is grounded in a dissimilar comparison of programs. Only 58 programs are common between the 2003 and the 2008 portfolios. The GAO analysis highlights 77 programs producing $183 billion of cost growth in the 2003 portfolio and 96 programs producing $296 billion of cost growth in the 2008 portfolio. The GAO states that the "cumulative cost growth is higher than it was five years ago." However, 20 programs in the 2003 portfolio are not included in the 2008 portfolio. More importantly, 39 new programs were included in the 2008 portfolio which were not included in the 2003 portfolio. Thus, a total of 59 programs moved into or out of the portfolio between 2003 and 2008. Therefore, it is not possible to draw meaningful conclusions about trends or performance by comparing these dissimilar portfolios.
I previously wrote you a memo assessing 41 major defense programs. I will use a similar framework to review the 96 programs and the $278 billion of higher cost. In summary:
• $95.7B, 18 programs is a result of increased quantities over the original program baseline.
Higher costs due to increased quantities do not constitute true cost growth and do not reflect a problem with defense acquisition processes or defense industry.
• $72.2B, 9 programs is a result of lowering procurement quantities and slowing program execution. Higher costs due to slower procurement and quantity reductions as a result of budget cuts and program stretches do not constitute true cost growth and do not reflect a problem with defense acquisition processes or defense industry.
• A negative $57B, 39 programs. 27 of these programs are being completed within 10% of their original cost baseline. As I have often expressed to you, our goal is to have no cost growth. However, this small amount of cost growth is a result of a number of factors: DoD weapons programs frequently use exotic materials, include a significant amount of software, are purchased under limited competition, are driven by excessive DoD certification requirements, and are subject to annual budget fluctuations imposed by the Defense Department and the Congress. Limited cost growth under these circumstances would not necessarily reflect a failed DoD acquisition process. The larger, negative number (cost reduction) is primarily attributable to several programs which experienced quantity reductions and were still executed with good unit cost control - generally a result of buying the lower quantity efficiently and not stretching the program.
These three categories cumulatively account for a net cost growth of $110.9 billion, or 40%, of DoD's $278 billion figure and 66 of the 96 programs in the portfolio. Again, these programs do not constitute legitimate cost growth that can be attributed to a failed defense acquisition management process. Indeed, this data makes clear that 66 programs have performed reasonably well, sometimes even in the face of budget churn and quantity changes.
Thus, I would reiterate the view that the $296 billion is a sensational number that is misleading, out-of-date, and irrelevant to the current DoD procurement process. Finally, I have identified a group of programs which have not performed to their initial cost estimate.
• $166.6 billion (net) of cost growth is attributable to 28 programs which experienced cost growth where the dominant feature was not quantity increases or decreases.
"I do not think it is possible to conclude that the DoD acquisition process alone is broken" – Mr. John Young
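Mr. Young's four categories can be reconciled against his headline figures; a small check (computed here, not laid out this way in the memo):

```python
# ($B of cost growth, number of programs) per category quoted above.
categories = {
    "quantity increases": (95.7, 18),
    "quantity decreases": (72.2, 9),
    "negative growth":    (-57.0, 39),
    "multiple drivers":   (166.6, 28),
}
total_b = round(sum(b for b, _ in categories.values()), 1)
total_n = sum(n for _, n in categories.values())
print(total_b, total_n)  # 277.5 94, matching the totals quoted above

# The first three categories net to the $110.9B / 66 programs Young cites.
net_b = round(95.7 + 72.2 - 57.0, 1)
print(net_b, 18 + 9 + 39)  # 110.9 66
```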
31 Captain, Road Prison 36 on the Importance of Communication Clear and constant communication to everyone involved is essential.
32 It only works with determination and leadership. Two quick data points:
33 Amazing Catch - Super Bowl XLII
The defining play of the game: faced with third down and five yards to go from their own 44-yard line with 1:15 remaining, Giants quarterback Eli Manning avoided what looked like a sack and completed a 32-yard pass to wide receiver David Tyree, who made a leaping catch by pinning the ball on his helmet. Four plays later, the Giants scored the winning touchdown with 0:35 left in the game.
Question: Who was the PM for the helmet?
Answer: Doesn't matter; the PM isn't going to get a Super Bowl ring, much less any credit.
Leadership: "If anything goes bad, I did it. If anything goes semi-good, we did it. If anything goes real good, then you did it. That's all it takes to get people to win football games for you." - Paul "Bear" Bryant
34 Do You Recognize This Weapon System?
First F-117 production aircraft
Crashed on its maiden flight
Cause was mis-wired yaw and pitch gyros
Did not change the determination to continue with the program!
Lockheed Skunk Works – Palmdale, CA
35 Summary
Weapon system acquisition is a contact sport; it's not for the timid
It has never been perfect, never will be
Risk management is key, both high risk and low risk
Don't forget the 5 P's… Proper Planning Prevents Poor Performance
Noli nothis permittere te terere*
*Don't let the bastards wear you down