
Slide 1: Follow-up Bootstrap Case Study
Hubbard Decision Research, The Applied Information Economics Company

Slide 2: The Measurement Choice
- The follow-up decision was defined as whether to proceed with the project as planned or make a significant reduction in scope by removing functions
- The value-of-information analysis (VIA) of this decision indicated that risk of cancellation was a key variable
- Further calibrated estimates and decomposition were uninformative, and insufficient historical data existed to build an "actuarial" model
- Bootstrapping the chance of cancellation was judged to be the most feasible measurement method
- Additional investments may use this bootstrap model

Slide 3: Bootstrapping Overview
- Historical analysis of IT investments
- First workshop:
  - Review history
  - Identify success factors
  - Confirm possible ranges
- Design test assessments
- Second workshop:
  - Calibrate for binary questions
  - Conduct collaborative assessment
- Independent assessments
- Compute regression model
- Confirm model

Slide 4: Questions for Initial Planning
- Is bootstrapping necessary? (Explain the alternatives and when bootstrapping is a good fit.)
- Hold a kickoff: explain the objectives and approach with specific examples and success stories; studies show that bootstrapped judgments improve on unaided ones
- What is the scope of the portfolio?
- What outcome is to be bootstrapped?
- What historical information is obtainable, and where?
- Who are the decision makers?
- Who will be attending the workshops?
- Schedule the workshops, interviews, and the presentation to validate the model

Slide 5: Project Planning Estimates
- Historical data gathering: 1-2 people, 1-3 days
- Preparation for the 2 workshops: 1-2 people, 2-4 hours each
- Conduct the 2 workshops: 1-2 facilitators plus participants, half a day (3-4 hours) each
- Construct initial bootstrap list: 1 person, 1-3 hours
- Construct final bootstrap list: 1 person, 1-3 hours
- Build regression model: 1-2 people, 4-8 hours
- Prepare presentation to confirm model: 1-2 people, 6-8 hours
- Conduct presentation to confirm model: 1-2 presenters plus participants, 1 hour

Slide 6: Historical Analysis
- Determine the scope of historical data needed
  - How far back do we need data? Up to 30 examples
  - Do we need investment size, duration, status, objective, etc.? (use the standard list)
- Identify historical data available on IT investments
  - Budgeting process/accounting data
  - IT staff memory
  - Any metrics efforts
  - Past strategic IT plans
- Collect investment data
- Consolidate the data into a single table to hand out

Slide 7: First Workshop Objectives
- The first bootstrap workshop is meant to be a free-form brainstorming forum to address the following:
  - Introduce concepts and objectives to new participants
  - Review the historical data and attempt to spot trends and success factors
  - Identify which investments were extreme examples for the variable being bootstrapped
  - List potential predictive variables
  - Determine realistic values of the predictive variables, including combinations of values
  - Define criteria for the bootstrap output
  - Agree on input consolidation rules: shall we just average the group, throw out the highest/lowest, etc.?

Slide 8: Results of First Workshop
- We identified the scope of the portfolio as any randomly chosen investment in this organization's portfolio
- There were 4 participants
- We identified the following variables as pertinent to a follow-up measurement of the chance of cancellation (one possible encoding is sketched below):
  - Is the investment a documented strategic initiative?
  - 90% confidence interval for time remaining (months)
  - Is some part of the investment a compliance requirement?
  - The number of business units involved
  - Is the sponsor business, IT, or corporate?
  - % over budget and % over schedule
  - Test score of staff regarding project plan knowledge
  - Project manager and sponsor evaluations of the project
  - % of deliverables complete
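To make the variable list concrete, here is a minimal sketch of how one investment could be encoded for the later assessment and regression steps. The field names and types are illustrative assumptions, not part of the original case study.

```python
from dataclasses import dataclass

@dataclass
class Investment:
    """One hypothetical IT investment to be rated by the evaluators.
    Field names and types are illustrative, not from the original model."""
    strategic: bool              # documented strategic initiative?
    months_remaining_lo: float   # lower bound of 90% CI for time remaining
    months_remaining_hi: float   # upper bound of 90% CI for time remaining
    compliance: bool             # is any part a compliance requirement?
    business_units: int          # number of business units involved
    sponsor: str                 # "business", "IT", or "corporate"
    pct_over_budget: float       # 0.25 means 25% over budget
    pct_over_schedule: float     # 0.25 means 25% over schedule
    staff_test_score: float      # project-plan knowledge test score
    pm_evaluation: float         # project manager's rating of the project
    sponsor_evaluation: float    # sponsor's rating of the project
    pct_deliverables_done: float # fraction of deliverables complete
```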

Slide 9: Design Test Assessments
- Using the identified predictive variables, generate a list of hypothetical investments (a generator sketch follows this list)
- The range of individual values should reflect the actual portfolio; i.e., you should not have mostly investments over $50 million if that size is rare for this client
- The combination of values in each hypothetical investment should be realistic; i.e., the size and duration should fit each other
- Make sure the list represents investments across the range of possible bootstrapped output values
- Produce a short table that lists each investment's hypothetical values with blanks for evaluator input (perhaps 10 investments)
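A sketch of such a generator, assuming simple uniform and categorical sampling with illustrative ranges; in practice the distributions would be tuned to mirror the client's actual portfolio, and more combination constraints would be enforced than the single one shown here.

```python
import csv
import random

SPONSORS = ["business", "IT", "corporate"]

def random_investment(rng: random.Random) -> dict:
    """Draw one hypothetical investment. All ranges are illustrative
    and should be adjusted to reflect the client's real portfolio."""
    lo = rng.uniform(1, 18)  # lower bound of 90% CI, months remaining
    return {
        "strategic": rng.random() < 0.3,
        "compliance": rng.random() < 0.2,
        "months_remaining_lo": round(lo, 1),
        # Derive the upper bound from the lower bound so the
        # combination of values stays realistic
        "months_remaining_hi": round(lo * rng.uniform(1.5, 3.0), 1),
        "business_units": rng.randint(1, 8),
        "sponsor": rng.choice(SPONSORS),
        "pct_over_budget": round(rng.uniform(-0.1, 0.8), 2),
        "pct_over_schedule": round(rng.uniform(-0.1, 0.8), 2),
        "staff_test_score": rng.randint(40, 100),
        "pct_deliverables_done": round(rng.random(), 2),
    }

def write_assessment_sheet(path: str, n: int = 10, seed: int = 7) -> None:
    """Write n hypothetical investments to a CSV, with a blank
    'p_cancel' column for the evaluator's estimate."""
    rng = random.Random(seed)
    rows = [random_investment(rng) for _ in range(n)]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]) + ["p_cancel"])
        writer.writeheader()
        writer.writerows(rows)  # missing p_cancel values are left blank

write_assessment_sheet("trial_assessments.csv")
```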

Slide 10: Second Workshop
- Calibrate for binary questions
- Present the trial investment list (just 5 investments); explain the values shown and the inputs needed
- Discuss each investment as a group
- Identify changes to the list
- Obtain calibrated estimates for each investment
- Explain the next steps

Slide 11: Prepare Final Bootstrap
- Modify constraints based on findings from the second workshop:
  - Clarify definitions/units of measure
  - Add/drop variables
  - Confirm input ranges
- Generate a new list of hypothetical investments
- The list should be long enough to produce at least 100 responses in total and no fewer than 30 + (number of variables) responses per evaluator (a worked check follows this list)
- Randomize the list order
- Options:
  - Make some investments duplicates (for measuring consistency)
  - Include a few best/worst-case investments
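A worked check of the sample-size rule, assuming every evaluator assesses every investment on the list; the function name is illustrative.

```python
from math import ceil

def min_list_length(n_variables: int, n_evaluators: int,
                    min_total: int = 100, base: int = 30) -> int:
    """Smallest list length meeting both rules of thumb: at least
    min_total responses overall, and at least base + n_variables
    responses per evaluator (assuming everyone rates every item)."""
    return max(base + n_variables, ceil(min_total / n_evaluators))

# With this case study's 12 variables and 4 evaluators:
print(min_list_length(12, 4))  # -> 42; the final list used 48 investments
```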

Slide 12: Calibrated Estimation Results
- Each evaluator assessed the chance of cancellation for 48 investments
- Variance between evaluators was often very large, but might have been smaller had we done the trial evaluation or calibration
- Olympic scoring throws out the highest and the lowest estimate and averages the rest (a sketch follows this list)
- Disagreement among evaluators averaged 16% but was as much as 60%
- The difference between Olympic scores of duplicate investments was 6%
- Nobody stood out as particularly inconsistent or consistent, but Ando and Vinay were clearly more optimistic than Jean-Rene and Cecile
- Clearly, these chances of cancellation are high for any RAVI project
[Chart: chance-of-cancellation estimates (0%-100%) by evaluator (VKU, JRR, CPP, AAN) and the Olympic score]
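A minimal sketch of the Olympic scoring rule as described above: drop the single highest and single lowest estimate, then average the remainder.

```python
def olympic_score(estimates: list[float]) -> float:
    """Consolidate one investment's estimates by discarding the single
    highest and single lowest value and averaging what remains."""
    if len(estimates) < 3:
        raise ValueError("Olympic scoring needs at least 3 estimates")
    trimmed = sorted(estimates)[1:-1]  # drop lowest and highest
    return sum(trimmed) / len(trimmed)

# Four evaluators' chance-of-cancellation estimates for one investment:
print(olympic_score([0.10, 0.25, 0.30, 0.70]))  # -> 0.275
```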

Slide 13: Compute Regression
- Aggregate the inputs of the various estimators
- Convert the inputs into quantities:
  - Use pivot tables on unordered and discrete-but-non-binary variables
  - Graph continuous variables against the output to look for obvious non-linear relationships
- For each output variable (confidence of success, chance of cancellation, etc.), compute a regression model (a minimal fitting sketch follows this list)
- Try combinations of higher-order terms where you think there is a compounding effect
- Size is always a good candidate for higher-order terms
- Compare model error to evaluator inconsistency (model error should be less)
- Test changes in "controllable" success factors; this may identify sub-zones
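A minimal fitting sketch using ordinary least squares, assuming the variables have already been encoded numerically; the squared size term illustrates the kind of higher-order term suggested above. The exact regression procedure used in the case study is not specified, so treat this as one plausible implementation.

```python
import numpy as np

def fit_bootstrap_model(X: np.ndarray, y: np.ndarray):
    """Fit Olympic-scored estimates y against the encoded predictive
    variables X by ordinary least squares. Column 0 of X is assumed
    to hold investment size, so a squared size column is added as an
    example of a higher-order term."""
    design = np.column_stack([np.ones(len(X)), X, X[:, 0] ** 2])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    pred = design @ coef
    ss_res = float(np.sum((y - pred) ** 2))
    ss_tot = float(np.sum((y - y.mean()) ** 2))
    return coef, 1.0 - ss_res / ss_tot  # coefficients, R squared
```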

Slide 14: Confirm Results
- To confirm the results, show each of the following:
  - A plot of the original estimates vs. the model (a plotting sketch follows this list)
  - The test classification chart
  - Actual projects plotted on the classification chart, with discussion of any discrepancies
- Determine the volumes in each zone to check whether the support is realistic
- Present the results to the group
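A minimal sketch of the first confirmation plot, assuming matplotlib; the axis labels follow the chart on the results slide.

```python
import matplotlib.pyplot as plt

def plot_estimates_vs_model(olympic: list[float], modeled: list[float]) -> None:
    """Scatter the Olympic-scored estimates against the model's
    predictions; points near the diagonal indicate agreement."""
    plt.scatter(olympic, modeled)
    plt.plot([0, 1], [0, 1], linestyle="--")  # perfect-agreement line
    plt.xlabel("Olympic score of calibrated estimates")
    plt.ylabel("Model estimate")
    plt.title("Comparison of estimates to model")
    plt.show()
```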

Slide 15: Regression Results
- Each investment was described by 12 variables, but the model reduced this to 8
- After a few regression models were tried, one was found with an R squared of 0.91
- Higher-order variables were added, such as one that considered the level of over-budget only if the investment was neither strategic nor compliance-related (one plausible encoding is sketched below)
- Part of the variance from the Olympic score to the model was due to evaluator inconsistency, not actual error in the model
[Chart: "Comparison of estimates to model", model estimate vs. Olympic score of calibrated estimates, both axes 0 to 1]
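One plausible encoding of that conditional higher-order variable; the exact form used in the original model is not given, so this is an assumption.

```python
import numpy as np

def conditional_over_budget(pct_over_budget: np.ndarray,
                            strategic: np.ndarray,
                            compliance: np.ndarray) -> np.ndarray:
    """Higher-order term: the over-budget percentage counts only when
    the investment is neither strategic nor a compliance requirement."""
    gate = (~strategic) & (~compliance)  # boolean arrays
    return np.where(gate, pct_over_budget, 0.0)
```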

