© 2014 The MITRE Corporation. All rights reserved. Approved for Public Release. Distribution Unlimited.
Authors: Gina Guillaume-Joseph, PhD Candidate; Dr. James Wasek; Dr. Enrique Campos-Nanez; and Dr. Pavel Fomin. The George Washington University, School of Engineering and Applied Sciences, Engineering Management and Systems Engineering Dept.
Predictive Analytics is a data-driven technology used to predict and influence the future. We develop a Predictive Model that determines failure points in the Systems Engineering Lifecycle (SELC) and relates them to specific causal factors of testing. Our work uses project data and information to support informed, real-time decisions that combat the financial risks incurred with failed projects.
Ewusi-Mensah (2003) offers an empirically grounded study of software failures and proposes a framework of abandonment factors [1] that highlights risks and uncertainties present in the SELC phases of a software project. Takagi et al. (2005) analyzed the degree of "confusion" [2] in several software projects, using logistic regression analysis to construct a model that characterizes confused projects.
[1] Ewusi-Mensah, Kweku. Software Development Failures. MIT Press, 2003.
[2] Takagi, Yasunari, Osamu Mizuno, and Tohru Kikuno. "An Empirical Approach to Characterizing Risky Software Projects Based on Logistic Regression Analysis." Empirical Software Engineering, volume 10, number 4, December 2005.
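The logistic-regression approach Takagi et al. describe can be sketched in a few lines. This is an illustrative toy, not the authors' actual model: the feature names, synthetic data, and training setup below are all assumptions made for demonstration.

```python
import math

# Hypothetical testing-phase risk indicators per project, each scaled to
# [0, 1]: [test coverage shortfall, requirements churn].
# Label: 1 = project ended "confused"/failed, 0 = succeeded. All synthetic.
projects = [
    ([0.8, 0.9], 1), ([0.7, 0.6], 1), ([0.9, 0.7], 1),
    ([0.2, 0.1], 0), ([0.1, 0.3], 0), ([0.3, 0.2], 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    """Probability of failure given weights w, bias b, features x."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Fit by stochastic gradient descent on the logistic loss.
w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(2000):
    for x, y in projects:
        err = predict(w, b, x) - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

risky = predict(w, b, [0.85, 0.80])  # high-risk indicators -> p near 1
safe = predict(w, b, [0.15, 0.20])   # low-risk indicators -> p near 0
```

In practice one would fit such a model with a statistics package on real project histories; the point here is only the shape of the technique: risk indicators in, failure probability out.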
This work introduces the Project Testing Confidence Metric (PtcM) and the corresponding Predictive Model. The Model, developed from data on software project failures and successes, is based on a framework that identifies significant influencing failure factors and their impact on the four major phases of the SELC.
Of the SELC phases, failure factors in the testing phase have the greatest impact on software project failure. These testing-phase variables are the inputs used to develop the Model.
Software project failures are costly and often result in an organization losing millions of dollars when a poor-quality project is terminated (Jones, 2012). Software engineering is a risky endeavor whose outcome often cannot be predetermined. Software testing is a critical component of mature software engineering; however, project complexities make it the most challenging and costly phase of the Systems Engineering Lifecycle (SELC) (Jones, 2012).
Jones, Capers. "Software Quality Metrics: Three Harmful Metrics and Two Helpful Metrics." June 2012. Retrieved from website: engineering/free%20resources/Software%20Quality%20Metrics%20Capers%20Jones% pdf.
The Predictive Model leverages a development organization's past project performance to predict outcomes of future work. The PtcM uses that data to determine the effectiveness of testing, correlating previous project failure with inadequate testing to isolate those areas for improvement.
The Predictive Model and the resulting PtcM give the organization's leadership insight into which projects in the portfolio to pursue.
The Predictive Model and the PtcM will help mature an organization's testing and quality assurance capabilities by implementing institutional learning. By predicting the likelihood of project failure during the early planning phase, this work will promote a more successful project portfolio for the organization. Our work helps organizations answer the question, "What will happen in the future, and how can we act on this insight?"
Ms. Gina Guillaume-Joseph, The MITRE Corporation; Systems Engineering Ph.D. Candidate, The George Washington University. Contact: