1 University of Southern California, Center for Systems and Software Engineering
Assessing the IDPD Factor: Quality Management Platform Project
Thomas Tan, Qi Li, Mei He

2 Table of Contents
Overview of IDPD
Quality Management Platform (QMP) Project
–Project Information
–Data Collection
–Data Analysis Results
–Discussion
Q&A

3 Incremental Development Productivity Decline (IDPD)
Example: Site Defense BMD Software
–5 builds, 7 years, $100M
–Build 1 productivity: over 300 SLOC/person-month
–Build 5 productivity: under 150 SLOC/PM, including Builds 1–4 breakage, integration, and rework
–318% change in requirements across all builds
IDPD factor = 20% productivity decrease per build
–Similar trends in later unprecedented systems
–Not unique to DoD: a key source of the Windows Vista delays
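The compounding effect of a constant IDPD factor can be sketched in a few lines of Python. The numbers below are the illustrative Site Defense figures from this slide; the 20% per-build decline is the slide's stated factor, not a value computed here:

```python
def projected_productivity(build_1_productivity, idpd_factor, build):
    """Productivity for a given build, assuming a constant per-build
    percentage decline (the IDPD factor) starting from build 1."""
    return build_1_productivity * (1 - idpd_factor) ** (build - 1)

# Site Defense example: build 1 at ~300 SLOC/PM, IDPD factor of 20%
for b in range(1, 6):
    print(f"Build {b}: {projected_productivity(300, 0.20, b):.0f} SLOC/PM")
```

With a 20% factor, build 5 projects to about 123 SLOC/PM, consistent with the observed "under 150 SLOC/PM".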

4 QMP Project Information
Project Information:
–Web-based application
–The system facilitates process improvement initiatives in small and medium-sized software organizations
–6 builds, 6 years, increments of varying duration
–Size after 6th build: 548 KSLOC, mostly in Java
–Average staff on project: ~20

5 Data Collection
–Most data come from release documentation, build reports, and project member interviews/surveys
–Data include product size, effort by engineering phase, effort by engineering activity, defects by phase, requirements changes, project schedule, and COCOMO II driver ratings (rated by project developers and organization experts)
–Data collection challenges:
Incomplete and inconsistent data
Data formats differed depending on who filled in the report
No system architecture documents available

6 Data Analysis – Staffing
Staffing and Personnel Capabilities Ratings

Build  TEAM  PCON  ACAP  PCAP  APEX  LTEX  PLEX
1      NOM   HI    NOM   NOM   HI    HI    VLO
2      HI    HI    NOM   NOM   HI    HI    LO
3      HI    HI    NOM   NOM   HI    HI    NOM
4      HI    HI    NOM   NOM   HI    HI    HI
5      VHI   LO    NOM   NOM   HI    HI    HI
6      VHI   LO    HI    NOM   HI    HI    HI

Staffing was stable for most early builds, and enough talent stayed on the project to overcome the loss of developers
Some staff turnover occurred during builds 5 and 6
Experience gained – application and platform
Team cohesion improved

7 Data Analysis – Phase Effort Distribution
Phase Effort Percentage

Build  Rqt    Design  Impl   Test
1      13.2%  16.5%   51.3%  18.9%
2      13.7%  16.3%   48.5%  21.6%
3      13.9%  16.7%   48.6%  20.8%
4      18.8%  21.5%   28.2%  31.6%
5      2.6%   6.9%    39.5%  50.9%
6      10.9%  6.9%    34.3%  48.0%

Experienced major integration difficulties in build 3 – major drop in productivity (see next slide)
Forced re-architecting in build 4 – increase in requirements and design effort
Re-architecting paid off in builds 5 and 6, which focused primarily on implementation and testing
Testing and fixing were a major source of added integration effort – testing phase effort increased from build to build

8 Data Analysis – Productivity Trends

Build  Size (KSLOC)  Effort (PH)  Productivity (SLOC/PH)  Productivity Variance
1      69.12         7195         9.61                    0.0%
2      64            9080         7.05                    -26.6%
3      19            6647         2.86                    -59.4%
4      39.2          8787.5       4.46                    56.1%
5      207           31684.5      6.53                    46.5%
6      65            16215.1      4.01                    -38.6%

The slope of the trend line is -0.76 SLOC/PH per build
Across the five build-to-build transitions, this corresponds to a 14% average decline in productivity per build
This is smaller than the 20% IDPD factor for the large defense program, most likely because this project is considerably smaller in system size and complexity
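The productivity and variance columns can be recomputed directly from the size and effort figures; a minimal Python sketch, with the data transcribed from the table above:

```python
# (size in KSLOC, effort in person-hours) per build, from the table above
builds = [(69.12, 7195), (64, 9080), (19, 6647),
          (39.2, 8787.5), (207, 31684.5), (65, 16215.1)]

def productivity(size_ksloc, effort_ph):
    """SLOC produced per person-hour of effort."""
    return size_ksloc * 1000 / effort_ph

prods = [productivity(s, e) for s, e in builds]
for i, p in enumerate(prods):
    # variance = percentage change relative to the previous build
    var = (p / prods[i - 1] - 1) * 100 if i else 0.0
    print(f"Build {i + 1}: {p:.2f} SLOC/PH ({var:+.1f}%)")
```

Running this reproduces the table's productivity column (e.g. 9.61 SLOC/PH for build 1) and its build-over-build variance percentages.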

9 Discussion
Staffing stability helps improve team cohesion and developer experience, and thus contributes positively to productivity
Design deficiencies and code breakage cause productivity declines
–In our case study, the development team encountered integration difficulties in build 3, where the original design was insufficient to accommodate additional modules; a re-architecting effort was necessary to put the project back on track, as was done in build 4
–Inserting new code into the previous build adds effort to read, analyze, and test both the new and old code to ensure nothing is broken; this extra effort may be mitigated by experienced staff
Extra testing and fixing effort, particularly regression and integration testing, is inevitable, and the amount of this extra effort increases as the system grows larger

10 Future Research
Use COCOMO II cost drivers to normalize new size and effort:
–Product effort multipliers on size
–Personnel effort multipliers on effort
–Find significant effort multipliers and analyze their impact on productivity
Calibrate equivalent new size:
–Calculate equivalent new size based on CodeCount™ "Diff" for each increment and compare it with actual size
–Use the results to adjust parameters for calculating equivalent new size with integration rework taken into account
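As a sketch of the equivalent-new-size direction, the COCOMO II reuse model converts adapted code into equivalent new SLOC via the Adaptation Adjustment Modifier (AAM). The formula follows the standard COCOMO II definition; the specific parameter values in the example call are hypothetical, not taken from the QMP project:

```python
def equivalent_new_size(adapted_ksloc, dm, cm, im, aa=0, su=30, unfm=0.4):
    """COCOMO II reuse model: equivalent new size of adapted code.

    dm, cm, im : percent of design / code / integration-test modified
    aa         : assessment & assimilation increment (0..8)
    su         : software understanding increment (10..50)
    unfm       : programmer unfamiliarity with the code (0..1)
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return adapted_ksloc * aam

# Hypothetical increment: 100 KSLOC of prior-build code, lightly modified
print(equivalent_new_size(100, dm=10, cm=20, im=30, aa=2, su=20, unfm=0.4))
```

Comparing this model-based equivalent size against CodeCount "Diff" measurements per increment is one way to calibrate the model's parameters, as the slide proposes.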

11 Q & A
Questions? Comments?
Thank you very much

