Introduction to Software Project Estimation I (Overview) Barry Schrag Software Engineering Consultant MCSD, MCAD, MCDBA


1 Introduction to Software Project Estimation I (Overview) Barry Schrag Software Engineering Consultant MCSD, MCAD, MCDBA barryschrag@hotmail.com

2 What Are Your Expectations From this class?

3 Introductions Your background Reason for taking this workshop An unusual fact to remember you by!

4 What we will learn This course provides an overview of software estimation, including the tools, techniques, and key concepts used in estimating size and effort on software projects. Class content, including hands-on exercises, is designed to give team members and Project Managers the resources and skills to apply estimating techniques more accurately at various points in a software engineering process and to interpret the results, reducing risk during the software engineering effort. Prerequisite: Experience in the field of software engineering or project management, or relevant coursework/knowledge.

5 Info Breaks, food, facilities

6 Workshop Format Introduction; exercise without tools; break; Lines of Code lesson; Function Point lesson; exercise applying the Function Point technique; break; lessons learned

7 Software Project Definitions Successful: The project is completed on time and on budget, with all features and functions as specified. Challenged: The project is completed and operational, but over budget, late, and with fewer features and functions than initially specified. Failed: The project is cancelled before completion, never implemented, or scrapped following installation.

8 Project Success is Rare 2004: 15% failed, 51% challenged, 34% succeeded 2000: 23% failed, 49% challenged, 28% succeeded 1995: 40% failed, 33% challenged, 27% succeeded 1994: 31% failed, 53% challenged, 16% succeeded Source: Randy Miller, Microsoft

9 Project Status (Standish 2004)

10 Overview Why care about estimates? Repeatability Accuracy to a date Manage resources and cost FP standards group says: ±4% to 12% LOC estimation error is history dependent: ±200% down to ±5% with PSP/PROBE

11 What is an "estimate"? Our Premise: Effort in hours from size More accurate for your team/process Do engineers estimate? Do architects? Do scientists? Size → Effort hours → Schedule

12 Cone of Uncertainty It is very large during requirements It gets smaller as you proceed through your process!

13 Scenario I Read Exercise 1 (Initial Estimate) Solidify any questions and assumptions Can you make an initial estimate in hours? If not, make a guess! If you can, estimate how many defects there will be

14 Results What were your estimates? What tools would you use to perform this task?

15 Some facts help in estimation Organization: Is there IT support? What is their current and projected load? Is an engineering team in place? What is the engineering methodology? System specs: OS platform, architecture, backend database/file system, etc. Calculation rule specifics? Report output method/architecture in place? Retention rate of data? Quality specification: Performance? Concurrency? Sales: Lead time? Pipeline already full? Project management: Supplier lead time, partner co-operation, partner dependencies

16 Important for Estimation What quality is required? How many defects are acceptable after release? How many defects do we need to find? What is our team size? What is our team productivity (do we know?) How many hours is the team available to work on this project per day? 6? 8? 10? Are vacation days scheduled?

17 An Estimator uses Product Specs (size only) Team Size (size → effort) PSP uses a team size of 1 Hours available per day for work (effort → schedule) Days off (effort → schedule)

18 The Cone of Uncertainty What does it look like? What percent of software projects are on time and on budget? 29% Where does estimation error come from? Guessing Historical Analysis

19 Estimation Error Unknown specs account for 33% of error (McConnell 2006) Lost project hours account for error on schedule estimation, not size estimation Remember Size leads to Effort leads to Schedule

20 Scope Creep 2% increase per calendar month in the design and coding phases (McConnell 2006) Chart: from the end of requirements to the start of the coding phase (Capers Jones, 2002)

21 Estimation Formula Effort in hours is derived from the size of the specs and the size of the team How do we find the size of specs? Only two ways are accepted as better than guessing: Function Points, or Lines of Code using PROBE in the PSP
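A minimal sketch of this size → effort → schedule chain, in Python. All of the numbers used here (FP count, hours per FP, team size, hours per day) are illustrative assumptions, not figures from the course material.

```python
# A minimal sketch of the size -> effort -> schedule chain on this slide.
# All numbers (FP count, hours per FP, team size, hours per day) are
# illustrative assumptions, not figures from the course material.

def effort_hours(size_fp: float, hours_per_fp: float) -> float:
    """Effort in hours from a size measure and a calibrated productivity rate."""
    return size_fp * hours_per_fp

def schedule_days(effort: float, team_size: int, hours_per_day: float) -> float:
    """Calendar working days once the effort is spread across the team."""
    return effort / (team_size * hours_per_day)

effort = effort_hours(size_fp=100, hours_per_fp=15)          # 1500 hours
days = schedule_days(effort, team_size=4, hours_per_day=6)   # 62.5 working days
print(f"Effort: {effort:.0f} h, schedule: {days:.1f} working days")
```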

22 Line of Code Definitions KLOC (thousands of lines) SLOC (source lines) LOC CLOC (comment lines) NCLOC (non-comment lines)

23 System Size: Lines of Code Review Table 1, "The Paradox of Source Code Metrics" Read Reading 1, "Lines of Code as a Metric" Read Reading 2, "A Few Words about the PSP and PROBE"

24 Size estimating LOC/PROBE

25 LOC PSP/PROBE defects A benefit of this method: 30-100% of defects can be found before first compile!

26 LOC Has Problems! No theoretical foundation No relationship between lines of code and program operation C = A+B; C = *A + *B; Complexity and errors can increase with equal LOC

27 System Size: Lines of Code There is no standard way to measure Wide range of estimates for a language: Visual Basic 15 to 41 LOC! LOC counts can be easily misinterpreted and misused. Don't mix LOC counts from different languages and types of code (e.g. test support, product, etc.) You can generate almost any productivity number you want by changing the way you count LOC. Tools? Code Counter Pro can estimate the ratio of comment lines per SLOC
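To make the counting ambiguity concrete, here is a minimal line-counting sketch (not Code Counter Pro, just an illustration). It recognises only Python-style '#' comments; the counting rules differ by language, which is one reason raw LOC comparisons across languages and code types are unreliable.

```python
# A minimal line-counting sketch: classifies blank, comment, and source lines,
# recognising only Python-style '#' comments. Counting rules differ by
# language, which is one reason LOC comparisons across languages mislead.

def count_loc(path: str) -> dict:
    counts = {"blank": 0, "comment": 0, "source": 0}
    with open(path, encoding="utf-8") as f:
        for line in f:
            stripped = line.strip()
            if not stripped:
                counts["blank"] += 1
            elif stripped.startswith("#"):
                counts["comment"] += 1
            else:
                counts["source"] += 1      # NCLOC: non-comment source line
    return counts

counts = count_loc(__file__)
ratio = counts["comment"] / max(counts["source"], 1)   # comment lines per SLOC
print(counts, f"comment/SLOC ratio: {ratio:.2f}")
```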

28 Break Be back in …. Next up Function Points

29 System Size: Function Points 1979, A.J. Albrecht of IBM published a Function Point metric as a ‘measure of software productivity’ or unit of work.

30 An Ideal Size Measure Should Be Measurable Be Accountable Be Precise Be Independent of the measurer

31 System Size: Function Points Albrecht considered five operations The inputs to the application The outputs from it Inquiries by users (define user) The data files that would be updated by the application The interfaces to other applications

32 The generic application (diagram) Elements: data values in, the application, a data store, output of simple data, output of calculated data

33 Modern Function Points After research, empirical weighting factors became a standard The number of inputs was weighted by 4 The Outputs by 5 Inquiries by 4 The data file updates by 10 The interfaces by 7 These weights change based on number of data fields used by each operation
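The arithmetic behind these weights is simple enough to capture in a short sketch. This uses the average weights listed on this slide (EI=4, EO=5, EQ=4, ILF=10, EIF=7); real IFPUG counting adjusts each item's weight by its complexity, which is out of scope here.

```python
# A small sketch of the unadjusted-FP arithmetic using the average weights
# listed on this slide (EI=4, EO=5, EQ=4, ILF=10, EIF=7). Real IFPUG counting
# adjusts each item's weight by its complexity (data elements, file types
# referenced), which this sketch ignores.

AVERAGE_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def unadjusted_fp(counts: dict) -> int:
    """counts maps each operation type (EI, EO, EQ, ILF, EIF) to how many were counted."""
    return sum(AVERAGE_WEIGHTS[op] * n for op, n in counts.items())
```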

34 System Size: Function Points IFPUG (International Function Point Users Group) FP, ISO/IEC 20926:2003, www.ifpug.org COSMIC (COmmon Software Measurement International Consortium) FFP, ISO/IEC 19761:2003, http://www.cosmicon.com FP measures the size of an operation, not the direct complexity of algorithms Abstracted from language or implementation

35 FP Defect Rates are Known Defect origins (defects per FP): Requirements 1.00, Design 1.25, Coding 1.75, Documentation 0.60, Bad Fixes 0.40; TOTAL 5.00

36 Generic Application (diagram) Elements: data values in (External Input), the application, a data store (Internal Logical Files), output of simple data, output of calculated data

37 External Outputs (EO) An elementary process in which derived data passes across the boundary from inside to outside the application A report where data is calculated is an example For elaborated definition see Glossary

38 External Inputs (EI) Is an elementary process in which data crosses the boundary from outside to inside the application For elaborated definition see Glossary

39 External inQuiry (EQ) An elementary process with both input and output components that result in data retrieval from one or more internal logical files and external interface files A report where data is pre-calculated is an example For elaborated definition see Glossary

40 Internal Logical Files (ILF) A group of logically related data within the application boundary Storage location for the users profile, for product, system control info… For elaborated definition see Glossary

41 Rating Logical Files

42 External Interface Files (EIF) Data used for reference purposes which resides entirely outside the application and is maintained by another application This is an Internal Logical File for another application For elaborated definition see Glossary

43 EQ and EO IFPUG is very clear on the difference between EQ and EO EQ should NOT update an ILF, no derived data is created and NO formula or calculations should be performed i.e. an EQ is only a query May require a certified FPA to be sure!

44 Function Point Terms Diagram

45 Note About Terms There may be more than one of ANY of the 5 FP operations Often more than one ILF in even the smallest project Imagine one ILF for Users, another for Invoice, Products, System Control data

46 FPA Exercise (Exercise I) We want to build an internet-based system which signs up users who want service. The system will record the user information. It will look up the service price for the monthly payment and display this to the user before they approve. It will use the payment rates which are held in a legacy system. Once this payment is calculated, it is stored with the user's data and is not re-calculated again. The system must provide two reports: 1) a monthly report which details the invoice for the user, and 2) an on-demand report for management which aggregates the projected income across all signed-up users for the monthly period

47 Exercise I – How Many? Match these concepts to the exercise ILF (Internal Logical Files) EIF (External Interface Files) EI (External Input) EO (External Output) -- calculated EQ (External inQuiry) -- just a query

48 Exercise I - Calculate FP Count The number of External Inputs __*4= __ The External Outputs __*5= __ External inQuiries __*4= __ The Internal Logical Files __*10= __ The External InterFaces __*7= __ TOTAL FP Estimate = __ TOTAL Defects Estimate = FP * 5 = __

49 Exercise I - Calculate FP Count The number of External Inputs 2*4= __ The External Outputs 2*5= __ External inQuiries 2*4= __ The Internal Logical Files 1*10= __ The External InterFaces 1*7= __ TOTAL FP Estimate = __ TOTAL Defects Estimate = FP * 5 = __

50 Exercise I - Calculate FP count The number of External Inputs 2*4 = 8 The External Outputs 2*5 = 10 External inQuiries 2*4 = 8 The Internal Logical Files 1*10 = 10 The External InterFaces 1*7 = 7 TOTAL FP Estimate = 43 TOTAL Defects Estimate = FP * 5 = 215
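As a check, the Exercise I arithmetic can be re-run with the unadjusted_fp() sketch given after slide 33 (assumed to be in scope here) and the 5-defects-per-FP figure from slide 35.

```python
# Checking the Exercise I arithmetic with the unadjusted_fp() sketch from the
# slide 33 example (assumed to be in scope) and the 5-defects-per-FP figure.

exercise_1 = {"EI": 2, "EO": 2, "EQ": 2, "ILF": 1, "EIF": 1}
fp = unadjusted_fp(exercise_1)   # 2*4 + 2*5 + 2*4 + 1*10 + 1*7 = 43
defects = fp * 5                 # 215 estimated defects
print(fp, defects)
```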

51 Exercise I TOTAL FP Estimate = 43 Note that this is an independent measure of the SIZE of the counted functionality!

52 Exercise I hours to release TOTAL FP Estimate = 43 EFFORT = FP * process efficiency Now apply the variables -- 516 hours to release = 43 unadjusted function points * 12 hours/fp Note that 12 is a LOW estimate of process efficiency
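The same arithmetic in code, plus a sensitivity check: 12 hours/FP is the low (efficient) rate quoted here, while 20 hours/FP is the average rate used later in Exercise II.

```python
# Effort from FP at two process-efficiency rates: 12 h/FP (the low rate on
# this slide) and 20 h/FP (the average rate used later in Exercise II).

fp = 43
for hours_per_fp in (12, 20):
    print(hours_per_fp, "h/FP ->", fp * hours_per_fp, "hours to release")
# 12 h/FP -> 516 hours to release
# 20 h/FP -> 860 hours to release
```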

53 Note the Weighting Factors Weighting changes based on complexity but for that use a certified FP analyst! We will assume average complexity Example: Data Elements, see counting practices manual!

54 Using Agile process + FPA Agile: Analyze, Test, Code, Deliver Agile FPA: Analyze, Measure, Test, Code, Deliver Use the count as an opportunity to elaborate on the requirements Track estimates over time

55 Historical Effort Estimation ISBSG (International Software Benchmarking Standards Group) Assumes Average Productivity Staff Months = 0.425 * (FP Count ^ 0.488) * (Maximum Team Size ^ 0.697) Assumes 132 project-focused hours per staff month Assumes 3rd GL, calibrated by 600 projects
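A sketch of that ISBSG average-productivity formula. The constants (0.425, 0.488, 0.697, 132 hours per staff month) come from the slide itself and should be rechecked against a current ISBSG release before being reused; the example input is the 43-FP project from Exercise I with an assumed maximum team size of 3.

```python
# A sketch of the ISBSG average-productivity formula quoted on this slide.
# Constants (0.425, 0.488, 0.697, 132 h/staff month) are taken from the slide.

def isbsg_staff_months(fp_count: float, max_team_size: float) -> float:
    return 0.425 * (fp_count ** 0.488) * (max_team_size ** 0.697)

def isbsg_effort_hours(fp_count: float, max_team_size: float,
                       hours_per_staff_month: float = 132) -> float:
    return isbsg_staff_months(fp_count, max_team_size) * hours_per_staff_month

# Example: the 43-FP project from Exercise I with an assumed max team size of 3.
print(round(isbsg_staff_months(43, 3), 1), "staff months")   # roughly 5.7
print(round(isbsg_effort_hours(43, 3)), "hours")             # roughly 756
```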

56 Historical Measurement Thousands of projects Consistent sizing with FPA Record of time for each activity Trends emerge Some activities are not performed on every project Cost for the activity doesn’t vary based on project type

57 Activities by Project Type The full table charts 25 activities against six project types (End User, MIS, Outsource, Commercial, Systems, Military): Requirements, Prototyping, Architecture, Project Plans, Initial Design, Detailed Design, Design Reviews, Coding, Reuse Acquisition, Package Purchase, Code Inspections, Independent Verification and Validation, Configuration Management, Integration, User Documentation, Unit Testing, Function Testing, Integration Testing, System Testing, Field Testing, Acceptance Testing, Independent Testing, Quality Assurance, Installation and Training, Project Management. Not every activity is performed for every project type. Commonly used activities: End User 5, MIS 16, Outsource 20, Commercial 21, Systems 22, Military 25.

58 National Average Productivity: Work Hours per FP
Activity                                   Minimum   Mode   Maximum
Requirements                                  0.38    0.75     2.64
Prototyping                                   0.53    0.88     5.28
Architecture                                  0.26    0.44     1.32
Project Plans                                 0.09    0.26     0.66
Initial Design                                0.33    0.75     2.64
Detailed Design                               0.44    0.88     5.28
Design Reviews                                0.33    0.59     1.76
Coding                                        0.66    2.64     8.80
Reuse Acquisition                             0.07    0.22     0.33
Package Purchase                              0.09    0.33     0.38
Code Inspections                              0.44    0.88     1.76
Independent Verification and Validation       0.66    1.06     1.76
Configuration Management                      0.04    0.08     0.13
Integration                                   0.26    0.53     0.88
User Documentation                            1.32    1.89     6.60
Unit Testing                                  0.33    0.88     1.89
Function Testing                              0.44    0.88     5.28
Integration Testing                           0.33    0.75     1.76
System Testing                                0.26    0.66     1.32
Field Testing                                 0.26    0.59     1.76
Acceptance Testing                            0.22    0.38     1.76
Independent Testing                           0.44    0.66     1.32
Quality Assurance                             0.44    0.88     4.40
Installation and Training                     0.22    0.38     0.88
Project Management                            0.66    1.32     8.80
Total hours per Function Point                9.50   19.56    69.39
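One way to use this table is to sum the "Mode" (most likely) hours-per-FP rates for only the activities a project actually performs. The rates below are copied from the table; the chosen subset of activities is just an illustrative selection, not a recommended profile.

```python
# Turning the "Mode" hours-per-FP rates from the table into an estimate for
# an illustrative subset of activities. Rates are copied from the table above.

MODE_HOURS_PER_FP = {
    "Requirements": 0.75, "Initial Design": 0.75, "Detailed Design": 0.88,
    "Coding": 2.64, "Unit Testing": 0.88, "Function Testing": 0.88,
    "System Testing": 0.66, "User Documentation": 1.89,
    "Project Management": 1.32,
}

def estimate_hours(fp: float, rates: dict = MODE_HOURS_PER_FP) -> float:
    return fp * sum(rates.values())

print(round(estimate_hours(43)))   # 43 FP * 10.65 h/FP, about 458 hours
```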

59 EFFORT is Estimated by... Wideband Delphi (consensus-based)? NO! Perform organization calibration to get Hours per Function Point Historical data gets better over time

60 Estimation Influences Are Additive Error due to Size → Effort hours → Schedule Error in Size Estimate Error in Effort Estimate Productivity changes due to New team size Work tasks change Hours available to work are altered

61 Estimation Techniques Function Points estimate size independently, and can find effort hours after one use PSP/PROBE Proxy-based estimates guess about a size to find the effort hours, but get better over time See chart slide 24

62 Software Estimation Tools PSP/PROBE is an estimation tool Function Points are an estimation tool LOC counting can be automated, but is only useful for comment lines and PSP Function Points are not easy to automate!

63 Estimation Procedures First Estimate Size Count Function Points as a size measurement Or estimate LOC using PSP/PROBE method Determine Productivity Hours/FP or Hours/LOC Calibrate using local history Total Effort Hours Size FP * Hours/FP or Use PSP/TSP/PROBE to determine total hours

64 Estimation Method Costs Function Point certification PSP/TSP and PROBE training

65 Project Cost Estimation Effort x (Salary + Burden) = Cost COCOMO 82 → COCOMO II (2000) → F2COCOMO The requirements phase and its cost are not estimated by any COCOMO method! Assumes 152 hours of project time per month F2COCOMO: see www.cs.uwaterloo.ca/~adcaine/f2cocomo.pdf
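A sketch of the cost formula on this slide, Effort x (Salary + Burden) = Cost. The hourly salary and burden figures are illustrative assumptions, not course data.

```python
# Cost = Effort x (Salary + Burden), per this slide. The hourly salary and
# burden figures below are illustrative assumptions.

def project_cost(effort_hours: float, hourly_salary: float,
                 hourly_burden: float) -> float:
    return effort_hours * (hourly_salary + hourly_burden)

print(project_cost(effort_hours=516, hourly_salary=60, hourly_burden=25))
# 516 h * 85 $/h = 43860.0
```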

66 Size Issues Using FP Little applicability to effort without historical data Some Standards are in place, ISO, IEEE History is available using ISBSG database

67 Size Issues Using LOC Problems applying it to different code bases, e.g. SQL data-driven applications, XML, XSLT, ASP, VBScript, JScript, ASP.NET No standards Can't convert between size models: from LOC to FP or FP to LOC? NO! The range of LOC/FP is too large to use!

68 Effort Estimation Issues Effort = Size * Productivity Productivity measured as hours/Function Point Use local productivity Data and ISBSG averages Team history and cohesion do affect results Main point - Record hours worked
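The "record hours worked" point is what makes local calibration possible: productivity in hours per FP is simply total hours actually worked divided by total function points actually delivered. The three history records in this sketch are invented for illustration only.

```python
# Calibrating local productivity (hours per FP) from recorded history.
# The history records below are invented for illustration only.

history = [
    {"fp": 43, "hours": 610},
    {"fp": 42, "hours": 870},
    {"fp": 120, "hours": 1500},
]

hours_per_fp = sum(p["hours"] for p in history) / sum(p["fp"] for p in history)
print(round(hours_per_fp, 1))   # local productivity, about 14.5 hours per FP
```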

69 Schedule Estimation Issues Support Documentation QA (How many defects do you want removed?) Change Requests which are implemented Turnover Team History Record hours worked to do your next estimate Process change

70 Process Change Issues On the failure rates of metrics and estimation initiatives (Stephen Kan, 2000) Management buy-in is most critical!

71 Exercise II Read Exercise II in the handout and perform a rough Function Point size estimate using the information given Derive hours to complete product using 12 hours per function point efficiency Estimate defects in product using the US standard of 5 defects per Function Point

72 Scenario II -- what is critical? The user will be presented with a calendar They may choose a date or a holiday They will identify it with a label They will then choose an email notification This data is recorded in the database The system will provide an email notification

73 Scenario II how many items? The user will be presented with a calendar → Calendar Date ILF, EO They may choose a date or a holiday → EI They will identify it with a label → EI They will then choose an email notification → EI This data is recorded in the database → User Event ILF The system will provide an email notification → EO

74 Scenario II EI 3*4 = 12 EO 2*5 = 10 EQ 0 * 4 = 0 ILF 2 * 10 = 20 We are still assuming average complexity! TOTAL Unadjusted Function Points = 42 DEFECTS = 42 * 5 = 210
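The same average weights used in Exercise I verify this count; this is just a re-statement of the slide's arithmetic in code.

```python
# Verifying the Scenario II count with the same average weights as Exercise I.

scenario_2 = {"EI": 3, "EO": 2, "EQ": 0, "ILF": 2, "EIF": 0}
weights = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}
fp = sum(weights[k] * v for k, v in scenario_2.items())
print(fp, fp * 5)   # 42 unadjusted FP, 210 estimated defects
```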

75 Exercise II hours to release TOTAL FP Estimate = 42 EFFORT = FP * process efficiency Now apply the variables -- 840 hours to release = 42 unadjusted function points * 20 hours/fp Remember that 20 is an average estimate of process efficiency

76 Effort Analysis FP count of SIZE is about equal Exercise I FP count = 43 about 516 hours Exercise II FP count = 42 about 840 hours Why is there a large hours difference? Calibrate your process efficiency

77 What did we learn? Overview of Software Estimation knowledge Tools, techniques, and key concepts Size and Effort Resources and skills to more accurately apply estimating techniques

78 Conclusion Choose an estimating technique Make it part of your process at each step and for each change requested It can reveal process efficiency Track error over time and use to predict cone of uncertainty in the next cycle

79 Conclusion -- Costs PSP training cost = 2 weeks (80 hours) After 2 hours of use, your team process is CMMi Level 5 TSP training is available for managers Function Point certification costs = 2 days (16 hours) Will help if you keep history between projects No projections about CMMi level

80 Follow-On Course Function Point Analysis Certification PSP Training Be sure to write it in your course evaluation if you are inclined to work toward either certification

