1 Software Efforts at the NRO Cost Group
21st International Forum on COCOMO and Software Cost Modeling, November 8, 2006

2 Purpose
Explain how the NCG uses USC's code counter data
Introduce the NCG's Software Database
Provide insight into Difference Application results and software trends

3 Background
The NCG maps the USC output files to a CSCI/CSC and Work Breakdown Structure (WBS)
–Mapping is most meaningful when done at the lowest functional level possible within the WBS
Mapping is very labor-intensive if done through Excel spreadsheets or other manual methods

4 CSCI/CSC: DP/DPAP
CSCI/CSC: DP/DPCC
CSCI/CSC: DP/DPCU

5 Software Database
The NCG created a software database which automates the mapping process
The software database is the primary tool for storing ALL NCG software-related data
The database will provide:
–Low-level functional breakout
–Traceability to past programs
–Historical representation of the development process
The database will help us better understand trends for:
–Code counts
–Staffing profiles
–Discrepancy Reports (DRs)
–Schedules

6 Database Functionality
The database allows for the importation of:
–Code counter output files for any language
–Difference Application output files
–CSCI/CSC listings
–Staffing data
–DR data
–Cost and hours
–Programmatic/technical data
The database allows the mapping of output file folder paths to a WBS and CSCI/CSC (a minimal sketch of such a mapping follows below)
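The mapping step can be illustrated with a minimal sketch. Everything in it is hypothetical: the folder prefixes, the row format, and the map_path/roll_up helpers are illustrative stand-ins, not the NCG database schema or the USC counter's actual output format; only the CSCI/CSC names (DP/DPAP, DP/DPCC, DP/DPCU) come from the earlier slide.

```python
# Minimal sketch of mapping code-counter output rows to a CSCI/CSC by folder path.
# Folder prefixes, row format, and helper names are hypothetical examples.

from collections import defaultdict

# Hypothetical mapping of source-tree folder prefixes to CSCI/CSC identifiers.
PATH_TO_CSCI = {
    "src/dp/dpap": "DP/DPAP",
    "src/dp/dpcc": "DP/DPCC",
    "src/dp/dpcu": "DP/DPCU",
}

def map_path(path: str) -> str:
    """Return the CSCI/CSC for a counted file, or 'UNMAPPED' if no prefix matches."""
    for prefix, csci in PATH_TO_CSCI.items():
        if path.lower().startswith(prefix):
            return csci
    return "UNMAPPED"

def roll_up(counter_rows):
    """Aggregate logical SLOC by CSCI/CSC from (path, logical_sloc) rows."""
    totals = defaultdict(int)
    for path, sloc in counter_rows:
        totals[map_path(path)] += sloc
    return dict(totals)

# Example rows as they might appear in a code-counter summary file.
rows = [("src/dp/dpap/main.c", 1_200), ("src/dp/dpcc/ctl.cpp", 800), ("tools/gen.py", 50)]
print(roll_up(rows))   # {'DP/DPAP': 1200, 'DP/DPCC': 800, 'UNMAPPED': 50}
```

Automating this prefix lookup is what replaces the labor-intensive spreadsheet mapping described on the Background slide.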

7 Walkthrough of the Software Database's Key Functionalities

8–32 Database walkthrough screenshots (slide 13: Unmap)
33 Insight into Difference Application Results and Software Trends

34 Introduction
NCG collects datasheet information regarding program "Complexity Attributes"
–This can be defined as "Program Development Environment" data, e.g., number of years of experience of programmers …
We also collect USC output files for LOC and Reuse trends
We collect Staffing Profiles
–Staffing Profiles are broken out by CSCI
We collect Discrepancy Reports (DRs)
–DRs are broken out by Priorities and ranked in some manner
Are there any useful trends if we analyze all the data collectively?

35 CSCI #1 Example
Below is a summary of CSCI #1
–Code reaches a stable point
–Increase in staff to fix DRs
–New code looks proportional to staffing!
–Heritage code written in C and remainder written in C++

36 CSCI #2 Example
Below is a summary of CSCI #2
–Keep an eye on the peak staffing levels
–Heritage code written in C and remainder written in C++

37 CSCI #3 Example
Below is a summary of CSCI #3
–Peak staffing trend?
–Heritage code written in C and remainder written in C++

38 CSCI #4 Example
Below is a summary of CSCI #4
–Low Modified code trend continues
–Heritage code written in C and remainder written in C++

39 CSCI #5 Example
Below is a summary of CSCI #5
–Any guess on why this looks different from the previous trends?

40 Reuse
This CSCI shows the usefulness of breaking out New and Deleted code from the Total code counts
What assessment can be made of this development?
–Unstable requirements?
–Rewriting the same code?
–Code cleanup occurring after each New code delivery?
Total counts show change, but without "Diff" you can't see why
–Notice that there is nearly 5,000 New SLOC and 5,000 Deleted SLOC; that doesn't look stable
–Deleted code size is proportional to the previous New code size
–The code looks stable at this point, but in reality it was dynamic! (A numeric sketch of this effect follows below.)
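A small numeric sketch, with illustrative numbers shaped like the roughly 5,000 New / 5,000 Deleted SLOC case above, shows why total counts alone hide this churn:

```python
# Sketch: why total SLOC can look stable while the code is actually churning.
# Numbers are illustrative only, shaped like the ~5,000 New / ~5,000 Deleted
# SLOC case described on the slide.

baselines = [
    # (baseline, total_sloc, new_sloc, deleted_sloc)
    ("Baseline 1", 50_000,     0,     0),
    ("Baseline 2", 50_200, 5_100, 4_900),
    ("Baseline 3", 50_100, 4_950, 5_050),
]

for name, total, new, deleted in baselines:
    net = new - deleted      # all that the total counts can show
    churn = new + deleted    # what the "Diff" results reveal
    print(f"{name}: total={total:,}  net change={net:+,}  churn={churn:,}")

# The net change is near zero at every baseline, but the churn shows the
# development was anything but stable.
```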

41 More on Reuse
USC "Diff" results can provide better insight into how much Reuse came from heritage programs
Example:
–Program B uses Program A software as a starting point
–Program A metrics: 50,841 Total Logical Code Counts
–Program B is completed and returns the following metrics: 1,937,167 Total Logical Code Counts

42 More on Reuse (cont.)
Run the "Diff" counter on the initial baseline (in this case, Program A) and the final baseline (Program B)
–"Diff" results show:
»New Logical code: 1,918,011
»Deleted Logical code: 31,685
»Modified Logical code: 5,701
»Unmodified Logical code: 13,455
–Compute Reuse from Program A:
»Unmodified (at Program B completion) / Total (at Program B start)
»13,455 / 50,841 = 26%
»26% of Program A was "DIRECT" reuse into Program B
»This is not 26% of Program B!
–This is one way to simplify the Reuse problem (restated in the sketch below)
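Restating the slide's arithmetic as a small function makes the convention explicit; the figures are the ones from the example above.

```python
# The slide's reuse calculation restated as a small function: how much of the
# initial baseline (Program A) survived unmodified into the final baseline
# (Program B). Figures are the ones given on the slide.

def direct_reuse_pct(unmodified_at_end: int, total_at_start: int) -> float:
    """Percent of the initial baseline carried unchanged into the final baseline."""
    return 100.0 * unmodified_at_end / total_at_start

program_a_total = 50_841     # total logical SLOC of Program A (Program B's starting point)
diff_unmodified = 13_455     # unmodified logical SLOC reported by "Diff" at B's completion

pct = direct_reuse_pct(diff_unmodified, program_a_total)
print(f"{pct:.0f}% of Program A was direct reuse into Program B")   # prints 26%
# Note: this is 26% of Program A, not 26% of Program B's 1,937,167 SLOC.
```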

43 Operators / Tokens
Here are 8 types of operators that could be counted (an illustrative counting sketch follows after this list):
–Logical: &&, ||
–Trigonometric: Cos(), Sin(), Tan(), Cot(), Csc(), Sec()
–Log: Log(), Ln()
–Preprocessor: #, Sizeof()
–Math: +, -, *, /, sqrt(), %, ^
–Assignment: =
–Pointers
–Conditional: if, else, switch, case, ?
As well as nesting of loops:
–Level 1 loop
–Level 2 loop
–Level 3 loop
»Level 4 loop
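A minimal sketch of counting a few of these operator categories is shown below. The regular expressions are illustrative only, not the USC code counter's implementation; a real counter would also need to handle comments, string literals, and compound operators such as += and ==.

```python
# Minimal sketch of counting a few operator categories in C/C++ source text.
# Patterns are illustrative approximations, not a production-grade tokenizer.

import re

OPERATOR_PATTERNS = {
    "logical":      r"&&|\|\|",
    "assignment":   r"(?<![=!<>+\-*/%&|^])=(?!=)",   # '=' that is not part of ==, +=, <=, ...
    "math":         r"[+\-*/%^]|\bsqrt\s*\(",
    "conditional":  r"\b(?:if|else|switch|case)\b|\?",
    "preprocessor": r"^\s*#|\bsizeof\s*\(",
}

def count_operators(source: str) -> dict:
    """Count occurrences of each operator category in the given source text."""
    return {name: len(re.findall(pattern, source, flags=re.MULTILINE))
            for name, pattern in OPERATOR_PATTERNS.items()}

sample = """
#include <math.h>
int f(int a, int b) {
    if (a > 0 && b > 0) { a = a + b; }
    return a % 2 ? a : b;
}
"""
print(count_operators(sample))
# {'logical': 1, 'assignment': 1, 'math': 2, 'conditional': 2, 'preprocessor': 1}
```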

44 Complexity Density Ratios
Here are snapshots at various baselines of a CSCI
–If we only looked at the final numbers, we would assume nearly 450,000 logical lines of code
–What do you make of this? A growth of 371,868 logical SLOC
–What productive developers!

45 Complexity Density Ratios (cont.)
Looking at the Complexity Density Ratio for different operators, we can understand more about the development (see the sketch below)
–The code looks different
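The slides do not give a formula for the Complexity Density Ratio. The sketch below assumes it is the count of a given operator type divided by logical SLOC, and uses invented operator counts; the SLOC figures loosely follow the growth to nearly 450,000 logical SLOC on the previous slide, purely to show how the comparison across baselines would work.

```python
# Sketch of a complexity density ratio, assumed here to be "operator count per
# logical SLOC" (the slides do not give the exact definition). SLOC figures are
# approximations of the previous slide's growth; operator counts are invented.

def density_ratio(operator_count: int, logical_sloc: int) -> float:
    """Operators of a given type per logical source line of code."""
    return operator_count / logical_sloc

baselines = {
    "early baseline": {"sloc": 78_000,  "logical_ops": 4_700, "math_ops": 3_900},
    "final baseline": {"sloc": 450_000, "logical_ops": 9_000, "math_ops": 90_000},
}

for name, b in baselines.items():
    print(f"{name}: "
          f"logical density={density_ratio(b['logical_ops'], b['sloc']):.3f}  "
          f"math density={density_ratio(b['math_ops'], b['sloc']):.3f}")

# If the densities shift sharply between baselines, the added code has a different
# "signature" than the original CSCI (here, far more math-heavy), which is a clue
# that software was received from elsewhere rather than written by the same team.
```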

46 Complexity Density Ratios (cont.)
Looking at the SW as an entity composed of many different elements, the Complexity Density Ratio allows us to see the makeup of the SW
–This can be characterized as a signature of the SW development!
In the previous example, the program received SW from another project (not directly associated with the current project)
–The Complexity Density Ratio validates that the SW is different for the ongoing development
This should be taken into account when trying to come up with any productivities
–ANALYST BEWARE!
»DON'T BLINDLY USE DATA

47 Summary
The NCG continues to standardize our code-counting efforts
–Essential for normalizing our data across multiple programs and multiple contractors
The NCG works closely with USC to develop a complete USC Code Counting Tool Suite
–Addressing necessities such as new ways of looking at reuse, complexities, trends, etc.
The NCG has invested extensive resources to use the USC code counter files and parse the USC output files
Our goal is to establish consistency across the Intelligence Community
–This primarily involves our industry contractors

