
1 COCOMO II Estimation Example: Transaction Processing System (TPS) II
Based on Chapter 3 of the COCOMO II book
Used for new microeconomics examples
–Replaces the TPS example in the Software Engineering Economics book

2 Outline
TPS II business context
TPS II system architecture
Overall TPS II COCOMO II estimate
TPS II server COCOMO II estimate

3 TPS II Business Context
Current TPS inadequate for growing workload
–Travel reservations: air, rail, car, hotel
–Top performance: 1000 transactions/second (tr/sec)
–Need 2000 tr/sec soon
–Need growth to 4000 tr/sec
COTS server capability can provide over 2000 tr/sec
–But can't achieve 4000 tr/sec
Consider developing your own server software

4 TPS II System Architecture
Variant on the COCOMO II book architecture
[Diagram: clients (companies, travel agents) -> local concentrators (about 10 per region) -> regional concentrators 1..N -> DB server with financial and services databases]

5 TPS II Concept of Operation
Clients prepare and send travel-itinerary requests to a local concentrator
–Package of air, rail, car, hotel reservation requests
Local concentrators validate requests and forward them to regional concentrator servers
–Usually about 10 local concentrators per region; N regions
Regional concentrators use the DB server to develop a best-match travel itinerary package
–Sent back to clients via local concentrators
–Multiprocessor overhead due to resource contention and coordination

6 TPS II Software Capabilities
Client Software
–Systems functions: command processing, communications, protocol conversion, security and integrity controls, utilities
–User applications: report generation, screen processing, transaction processing, authentication, receipt and display
–Fault diagnosis: built-in testing, fault isolation and recovery management
Server Software
–Systems functions: command processing, communications, protocol conversion, security and integrity controls, resource management, utilities/libraries
–Query processing: database management, file management, persistent database management
–Status monitoring: checksum processing
–Fault diagnosis: built-in testing, fault isolation and recovery management

7 Recommended Estimating Process
Stage 1 – Estimate the size of the job (from requirements)
Stage 2 – Estimate effort using a 1st approach (WBS, etc.)
Stage 3 – Estimate effort using a 2nd approach (COCOMO II, etc.)
Stage 4 – Compare estimates and resolve differences -> final estimate

8 Sizing: Differences Between COCOMO II Book and TPS II
Estimates are for server software
–vs. client software in the book
Reused software: utilities, libraries, plus query processing
User applications software: for server status monitoring, analysis, and display
Size estimates same as in the book
–Systems software: 18 KSLOC new; 10 KSLOC reused
–User applications: 800 SLOC new
–Fault diagnosis: 300 function points new = 24 KSLOC @ 80 SLOC/FP
(KSLOC: thousands of source lines of code)
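The function-point line above folds into SLOC through a language-dependent backfiring ratio; a minimal sketch of the conversion, assuming the 80 SLOC/FP ratio the slide uses:

```python
def fp_to_sloc(function_points, sloc_per_fp):
    """Convert a function-point count to SLOC via a backfiring ratio."""
    return function_points * sloc_per_fp

# Fault diagnosis component: 300 FP at the slide's assumed 80 SLOC/FP
print(fp_to_sloc(300, 80))  # 24000 SLOC = 24 KSLOC
```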

9 Reused Software Sizing: TPS II
Design Modified: DM = 0
Code Modified: CM = 0
Integration Modified: IM = 50
AAF = 0.4(0) + 0.3(0) + 0.3(50) = 15
Software Understanding: SU = 0 since DM = CM = 0
Adaptation Adjustment: AA = 4
–Evaluation, documentation
ESLOC = 10,000 × [4 + 15(1 + 0.02(0)(UNFM))]/100 = 10,000 × 0.19 = 1,900 SLOC
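The slide's numbers drop out of the standard COCOMO II reuse model; a minimal sketch (the AAF > 50 branch follows the published COCOMO II.2000 definition and is included only for completeness):

```python
def equivalent_sloc(asloc, dm, cm, im, su, unfm, aa):
    """COCOMO II reuse model: equivalent new SLOC for adapted code.

    dm, cm, im: percent of design/code/integration modified (0-100)
    su: software understanding increment; unfm: unfamiliarity (0.0-1.0)
    aa: adaptation assessment effort (evaluation, documentation)
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im
    if aaf <= 50:
        return asloc * (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    return asloc * (aa + aaf + su * unfm) / 100

# Slide 9 values: 10,000 reused SLOC, DM = CM = 0, IM = 50, SU = 0, AA = 4
print(equivalent_sloc(10_000, dm=0, cm=0, im=50, su=0, unfm=0.0, aa=4))  # 1900.0
```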

10 Work Breakdown Structure Estimate: TPS II
1. Develop software requirements: 1,600 staff hours (multiplied number of requirements by a productivity figure)
2. Develop software: 22,350 (multiplied source lines of code by a productivity figure)
3. Perform task management: 2,235 (assumed a ten percent surcharge to account for the effort)
4. Maintain configuration control: 1,440 (assumed a dedicated person assigned to the task)
5. Perform software quality assurance: 1,440 (assumed a dedicated person assigned to the task)
TOTAL: 29,065 staff hours
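The WBS figures above can be reproduced from the size estimates on slide 8; a sketch assuming the 2.0 SLOC/staff-hour productivity figure implied by slide 19's comparison:

```python
# Size from slide 8: 18K new systems + 24K fault diagnosis (FP-derived)
# + 0.8K user apps + 1.9K reuse-equivalent (slide 9) = 44.7 KSLOC
size_sloc = 18_000 + 24_000 + 800 + 1_900
develop = size_sloc / 2.0           # assumed 2.0 SLOC per staff-hour -> 22,350 hours
task_mgmt = 0.10 * develop          # ten percent surcharge -> 2,235 hours
reqs, cm, qa = 1_600, 1_440, 1_440  # requirements plus dedicated CM and QA staff
total = reqs + develop + task_mgmt + cm + qa
print(total)  # 29065.0 staff hours, matching the WBS total
```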

11 COCOMO II Estimate Development Scenario
Step 1 – Estimate size
Step 2 – Rate scale factors & effort multipliers
Step 3 – Estimate effort and schedule
Step 4 – Need for further adjustments? (if YES, return to Step 2)
Step 5 – Allocate effort to schedule

12 Scale Factor Ratings and Rationale
PREC – High: The organization seems to understand the project's goals and has considerable experience working related systems.
FLEX – High: Because this organization is in the midst of a process improvement program, we have assumed general conformity to requirements and a premium placed on delivering an acceptable product on time and within cost.
RESL – High: We assumed and checked that the attention placed on architecture and on risk identification and mitigation is consistent with a fairly mature process.
TEAM – Very High: This is a highly cooperative customer-developer team so far. It isn't distributed and seems to work well together.
PMAT – Nominal: A Level 2 rating on the Software Engineering Institute's Capability Maturity Model (CMM) scale is nominal.
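With these ratings, the scale factors set the effort-equation exponent E = B + 0.01·ΣSF. A sketch using the published COCOMO II.2000 calibration values (B = 0.91); these constants are an assumption here, since calibrations vary, so check them against the model definition manual:

```python
# COCOMO II.2000 scale-factor values for the ratings on this slide
ratings = {
    "PREC": 2.48,  # High
    "FLEX": 2.03,  # High
    "RESL": 2.83,  # High
    "TEAM": 1.10,  # Very High
    "PMAT": 4.68,  # Nominal (CMM Level 2)
}
E = 0.91 + 0.01 * sum(ratings.values())  # B = 0.91 in COCOMO II.2000
print(round(E, 4))  # 1.0412 -> effort scales as Size**1.0412
```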

13 Product Cost Driver Ratings and Rationale
RELY – Nominal: Potential losses seem easily recoverable and do not lead to high financial losses.
DATA – Nominal: Because no database information was provided, we have assumed a nominal rating. We should check this with the client.
CPLX – Nominal: Based on the guidelines in Table 20 of the COCOMO II Model Definition Manual and the available infrastructure software, we assume the TPS software is nominal, with the exception of the fault diagnosis software, which we would rate "High" because of the added complexity introduced by its neural network algorithms.
RUSE – Nominal: Again, we assumed nominal because we did not know how to rate this factor.
DOCU – Nominal: We assume the level of required documentation is right-sized to the life-cycle needs of the project; this seems to be an inherent characteristic of the organization's preferred software process.

14 Platform Cost Driver Ratings and Rationale
TIME – Nominal: Execution time is not considered a constraint.
STOR – Nominal: Main storage is not considered a constraint.
PVOL – Nominal: By its nature, the platform seems stable. The rating reflects normal change characteristics of commercially available operating environments.

15 Personnel Cost Driver Ratings and Rationale
ACAP – High: We have commitments to get some of the highest-ranked analysts available. The mix of personnel will be such that we can assume "High" as the norm for the project.
PCAP – Nominal: Unfortunately, we do not know who we will get as programmers, so we assumed a nominal rating.
PCON – High: Turnover in the firm averaged about 3% annually during the past few years. We have doubled this figure to reflect assumed project turnover, based on the firm's usual project experience.
APEX – High: Most of the staff in the organization will have more than 3 years of applications experience in transaction processing systems.
LTEX – Nominal: Because of the Java/C/C++ uncertainties, we assume language and tool experience between 1 and 2 years.
PLEX – High: The mix of staff with relevant platform experience is high as well, based on the information given about the project (more than 3 years of experience).

16 Project Cost Driver Ratings and Rationale
TOOL – High: We assume we will have a strong, mature set of tools that are moderately integrated.
SITE – Low: Because we don't know how to rate this factor, we have conservatively assumed Low for now.
SCED – Nominal: We will adjust this factor to address management's schedule desires after the initial estimates are developed.

17 COCOMO II Nominal Output Screen

18 COCOMO II Schedule Constrained Output Screen

19 Comparing WBS and COCOMO II Estimates
WBS: 29,065 person-hours (PH)
–minus 1,600 for requirements (already done)
–minus 144 for CM, QA (152 vs 160 PH/PM)
–minus 4,470 for above-average people (2.5 vs 2.0 SLOC/PH)
= 22,851 PH
COCOMO II:
–Baseline: 14,045 PH
–17,539 PH (lower experience: APEX 0.88 -> 1.0; PLEX 0.91 -> 1.0)
–20,520 PH (higher complexity: CPLX 1.0 -> 1.17)
= 135.0 PM @ $10K/PM = $1.35M
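The reconciliation arithmetic on this slide, spelled out; the 152 staff-hours/person-month and $10K/PM figures come from the slide itself:

```python
wbs_total = 29_065                   # staff hours, from slide 10
comparable = wbs_total - 1_600 - 144 - 4_470
print(comparable)                    # 22851 PH on a like-for-like basis

cocomo_ph = 20_520                   # after experience and complexity adjustments
pm = cocomo_ph / 152                 # 152 staff hours per person-month
print(round(pm, 1), pm * 10_000)     # 135.0 PM; $1,350,000 at $10K/PM
```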

20 Using COCOMO II for Risk Assessment
Share results with your customers and bosses to manage expectations
–"Techie" managers: ACAP, PCAP, TEAM
–New programming language tools: LTEX, TOOL, PVOL
–New processes: PMAT, RESL
–Estimate sizing growth and credibility: milestone updates
–Personnel turnover: PCON
–Volatile requirements: REVL
–Aggressive schedule: SCED

21 COTS vs. New Development Cost Tradeoff
Build a special version of the server systems functions
–To reduce COTS server software overhead, improve transaction throughput
–Server systems software size: 20,700 SLOC (server systems, library integration, status monitoring)
COTS license tradeoffs vs. number of regional concentrators N
–Need 10N licenses for local concentrators
–$1K each for acquisition, $1K each for 5-year maintenance

22 COTS/New Development Cost Tradeoff Analysis
                       COTS ($K)    New Development ($K)
Cost to acquire        100          606
Integrate & test       100          Included
Run-time licenses      10N          Not applicable
5-year maintenance     10N          151
Total                  200 + 20N    757
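The two cost columns give a simple break-even point in N; a sketch with all figures in $K, taken straight from the table:

```python
def cots_cost_k(n):
    """COTS life-cycle cost in $K: 100 acquire + 100 integrate & test
    + 10N run-time licenses + 10N five-year maintenance."""
    return 200 + 20 * n

NEW_DEV_K = 606 + 151   # acquire/develop + five-year maintenance = 757

breakeven = (NEW_DEV_K - 200) / 20
print(breakeven)  # 27.85 -> COTS is cheaper below ~28 regional concentrators
```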

23 COTS/New Development Cost Tradeoff
[Chart: life-cycle cost in $K (200 to 1200) vs. number of regional concentrators N (10 to 50), comparing the New and COTS options]

24 COCOMO II: Airborne Radar System Example

25 Outline
Overview of the Airborne Radar System (ARS)
Demonstrate progressive usage of different COCOMO sub-models within an evolutionary spiral development process
Cover estimation of reuse, modification, COTS, and automated translation
Show how an aggregate estimate is refined in greater detail

26 ARS Estimation
Use the Applications Composition, Early Design, and Post-Architecture submodels
Two Post-Architecture estimates are demonstrated: top-level and detailed
–scale drivers apply to the overall system in both estimates
–cost drivers are rated for the aggregate system in the top-level estimate (17 ratings)
–cost drivers are refined for each individual software component in the detailed estimate (17 × 6 components = 102 ratings)

27 ARS System Overview

28 Software Components
Radar Unit Control – controls radar hardware
Radar Item Processing – extracts information from radar returns to identify objects
Radar Database – maintains radar object tracking data
Display Manager – high-level display management
Display Console – user input device interface and primitive graphics processing
Built-In Test – hardware monitoring and fault localization

29 COCOMO Coverage in Evolutionary Lifecycle Process
(* both top-level and detailed estimates shown)

30 Prototype Size and Effort
Productivity is "high" at 25 NAP/PM
Effort = NAP / Productivity = 136.3 / 25 = 5.45 PM (about 23.6 person-weeks)
Personnel = 23.6 person-weeks / 6 weeks ≈ 4 full-time personnel
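The Applications Composition arithmetic on this slide, sketched; the 4.33 weeks-per-month conversion is an assumption consistent with the slide's 23.6 person-weeks:

```python
nap = 136.3               # new application points from the prototype sizing
productivity = 25         # NAP per person-month, rated "high"
effort_pm = nap / productivity
weeks = effort_pm * 4.33  # assumed ~4.33 calendar weeks per month
staff = weeks / 6         # spread over the 6-week prototype schedule
print(round(effort_pm, 2), round(weeks, 1), round(staff, 1))  # 5.45 23.6 3.9
```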

31 Scale Factors for Breadboard
Scale Factors: Precedentedness (PREC), Development Flexibility (FLEX), Risk/Architecture Resolution (RESL), Team Cohesion (TEAM), Process Maturity (PMAT)
Ratings: Nominal, Low, High, Nominal

32 Early Design Cost Drivers for Breadboard
Effort Multipliers: Product Reliability and Complexity (RCPX), Required Reuse (RUSE), Platform Difficulty (PDIF), Personnel Capability (PERS), Personnel Experience (PREX), Facilities (FCIL), Schedule (SCED)
Ratings: High, Very High, High, Nominal

33 Breadboard System Size Calculations

34 Early Design Estimate for Breadboard

35 ARS Full Development for IOC
Use the Post-Architecture estimation model
–same general techniques as the Early Design model for the Breadboard system, except for elaborated cost drivers
Two estimates are demonstrated: top-level and detailed
–scale drivers apply to the overall system in both estimates
–cost drivers are rated for the aggregate system in the top-level estimate (17 ratings)
–cost drivers are refined for each individual software component in the detailed estimate (17 × 6 components = 102 ratings)

36 ARS Top-Level Size Calculations

37 Post-Architecture Estimate for IOC (Top-Level)

38 Post-Architecture Estimate for IOC (Detailed)

39 Sample Incremental Estimate

40 Increment Phasing

41 Increment Summary

42 Summary and Conclusions
We provided an overview of the ARS example from Chapter 3
We demonstrated using the COCOMO sub-models for differing lifecycle phases and levels of detail
–the estimation model was matched to the known level of detail
We showed increasing the level of component detail in the Post-Architecture estimates
Incremental development was briefly covered

