Evaluating Future Technology Assessment
Jonathan Calof and France Bouthillier
Telfer School of Management, University of Ottawa, and McGill University
The 4th International Seville Conference on Future-Oriented Technology Analysis (FTA), 12 & 13 May 2011
The Methodology – what we did
- Literature review
- Review of NRC TA documents
- Interviews with ITAs (Technical Advisors), ISs (Information Specialists) and clients
The Database
- Over 100 projects reviewed
- 15 users selected for interview based on frequency of TA use
- 15 TA users interviewed
- Projects ranged from short-term oriented TA (under 1-year time horizon) to long-term (40-year time horizon)
- Mix of methodologies and costs
The Definition of FTA that I am using (ForeLearn): Foresight enhances such thinking by gathering anticipatory intelligence from a wide range of knowledge sources in a systematic way and linking it to today's decision making. FORESIGHT is a participative approach to creating shared long-term visions to inform short-term decision-making processes. (www.foresight-network.eu)
Literature review
- Setting the context – why assess performance?
- Measures from the foresight literature
- Measures from the CI literature
- Measures from the services and consulting literature
Why Assess Performance? (Purpose / Related question)
1. To Evaluate: How well is our CI department, group, manager, task force or unit (etc.) performing?
2. To Control: How can CI managers ensure their reports do the right things?
3. To Budget: To what CI programs, people, projects, consultants, vendors or information sources should resources be allocated?
4. To Motivate: How can CI executives motivate their reports, as well as other functional stakeholders, to do the things necessary to improve both CI and the enterprise's performance?
5. To Promote: How can CI managers convince their superiors and other relevant stakeholders that their function is doing a good job?
6. To Celebrate: What CI accomplishments are worthy of the important organizational ritual of celebrating success?
7. To Learn: What CI activities or efforts are working and not working, and why?
8. To Improve: What should be done differently to improve CI performance, and by whom?
Adapted from Behn (2003) in Blenkhorn & Fleisher (2007)
Why assess performance?
- Because the decision makers/funders are asking for it
- Tough budgeting decisions in difficult times
- Pragmatic reality: if we cannot come up with a methodology for doing this ourselves, we will end up being evaluated by someone else's wrong measures
Impacts of FTA (Ladikas and Decker, 2004): issue dimension by impact dimension (I. Raising knowledge; II. Forming attitudes/opinions; III. Initialising action)
Technological/scientific aspects:
- I. Scientific assessment: (a) technical options assessed and made visible; (b) comprehensive overview of consequences given
- II. Agenda setting: (f) setting the agenda in the political debate; (g) stimulating public debate; (h) introducing visions or scenarios
- III. Reframing of debate: (o) new action plan or initiative to further scrutinise the problem at stake; (p) new orientation in policies established
Societal aspects:
- I. Social mapping: (c) structure of conflicts made transparent
- II. Mediation: (i) self-reflection among actors; (j) blockade running; (k) bridge building
- III. New decision-making processes: (q) new ways of governance introduced; (r) initiative to intensify public debate taken
Policy aspects:
- I. Policy analysis: (d) policy objectives explored; (e) existing policies assessed
- II. Re-structuring the policy debate: (l) comprehensiveness in policies increased; (m) policies evaluated through debate; (n) democratic legitimisation perceived
- III. Decision taken: (s) policy alternatives filtered; (t) innovations implemented; (u) new legislation is passed
EFMN (2005)
(a) Quality of products
- Produce future-oriented material for the system to use
- Development of reference material for policymakers and other innovation actors
- Creating a language and practice for thinking about the future
- More informed STI priorities
- A source of inspiration for policy system actors
(b) Organisation and quality of social interactions
- Aid discussions of the future
- Facilitate thinking out of the box
- Challenge mindsets
- Creation of new networks and clusters, re-positioning of existing ones
- Establishment of communication structures between innovation actors
- Support the empowerment of system actors
- Contribute towards development of actor identities
(c) Impacts in terms of learning effects
- Support system actors to create their own futures
- Creating a shared vision
- Gain insights into complex interactions and emerging drivers of change
- Build trust between system actors
- Detect and analyse weak signals to foresee changes in the future
- Facilitate better understanding of potential disruptive change
- Provide anticipatory intelligence to system actors
- Development of new ways of thinking
- Collective learning through an open exchange of experiences
- Highlighting the need for a systemic approach to both policymaking and innovation
- Stimulation of others to conduct their own foresight exercises after being inspired
- Accumulation of experience in using foresight tools and thinking actively about the future
- Enhanced reputational position and positive image of those actors running a foresight
- Better understanding (and visibility) of a territory's strengths and competencies
(d) Impacts in terms of strategy formulation for action
- Support decision-making
- Improve policy implementation
- Better informed strategies in general
- Using foresight results to evaluate and future-proof strategies
- Better evidence-based policy
- Making the case for increased investments in R&D
- Achievement of long-term reform of the productive system through a raised emphasis on high technology
- Better manage external pressures and challenges
- Overcome path dependency and lock-ins
Competitive Intelligence Literature
- No universal method for measurement
- CI is a service, thus intangible, and has a persuasive effect
- Cause-and-effect relationships cannot always be established due to many factors
Prescott & Bhardwaj (ref. Herring 1999, 15)
- Influencing decision makers
- Improved early warning
- Identifying new opportunities
- Exploiting competitor vulnerabilities
- Sharing of ideas
- Better serving the company's customers
Hard and Soft Measures of CI Success (Simon, 1998)
Hard measures:
- Costs – CI contribution to the bottom line (input): 1. cost of doing the research; 2. cost benefit of CI research; 3. financial gain from ideas
- Quantitative measures (output): 1. clients serviced; 2. projects completed; 3. suggestions submitted; 4. suggestions implemented; 5. projects assisted; 6. number of BI/CI staff; 7. staff productivity; 8. participants in the CI process (direct and indirect)
- Quality measures: 1. intelligence product measures; 2. accuracy of information (validity and reliability); 3. immediate usability of results (no rework)
Soft measures:
- Customer usability: 1. work habits; 2. user-friendly reports; 3. participation on teams; 4. contributions to teams; 5. communication skills; 6. contact follow-ups; 7. customer satisfaction ratings; 8. understanding
- Acceptance and alliance measures: 1. work climate; 2. number of requests for service; 3. number of repeated requests for service; 4. requests for participation in team meetings; 5. referrals from customers; 6. further integration of CI projects
- CI practitioner performance measures – initiative: 1. implementation of new ideas; 2. degree of supervision required; 3. ability to set goals and objectives
Simon (cont.)
Hard measures:
- Time measures: 1. ability to produce timely information; 2. efficiency; 3. time saved by CI; 4. on-time delivery
- CI practitioner performance measures: 1. effective use of resources (resourceful and creative); 2. knowledge of CI methods; 3. resourcefulness
Soft measures:
- Unit and personnel effectiveness measures – feeling/attitude: 1. solicitation for services; 2. attitude changes – clients taking you into confidence or consulting with you; 3. customer loyalty rating; 4. perception of CI contributions; 5. relationship building (sharing of personal information); 6. problem-solver perception
- Personnel development/advancement rewards: 1. job effectiveness; 2. attendance at CI orientation and training programs (as participant or teacher); 3. promotion; 4. pay increases; 5. work accomplishment acknowledgments
CI Measurement according to McGonagle and Vella (2002)
Assignments and projects: 1. meeting objectives; 2. number completed; 3. number completed on time; 4. number requested; 5. increase in number requested by end users; 6. number of follow-up assignments; 7. number of projects assisted; 8. number of suggestions submitted
Budget: 1. comparative cost savings (compared with the cost of an outsider); 2. comparative cost savings (compared with the cost of the untrained); 3. meeting project and function budget constraints
Efficiency: 1. accuracy of analysis; 2. data quality; 3. first-time results (no reworking); 4. meeting project timeline; 5. time for research versus time for response
End users: 1. creating compelling reasons to use CI; 2. effectiveness of implementation of findings; 3. meeting needs; 4. number of referrals; 5. number served
Feedback: 1. written; 2. oral
Financial: 1. cost avoidance; 2. cost savings; 3. financial goals met; 4. linking CI to specific investments; 5. linking CI to investment enhancement; 6. linking CI to specific savings from unneeded investments; 7. revenue enhancement; 8. value creation
McGonagle and Vella (cont.)
Internal relationships: 1. building strong relationships with end-users; 2. formulating relevant strategy and tactics; 3. quality of relationship with end-users; 4. quality of participation on cross-functional teams
New products and services: 1. number developed due to use of CI; 2. cost savings/avoidance in development from use of CI
Performance: 1. profitable growth for the unit or firm; 2. impact on strategic direction of unit or firm; 3. market share gains for unit or firm
Reports and presentations: 1. number; 2. number of follow-ups; 3. production of actionable CI
Sales effectiveness: 1. customer satisfaction; 2. linking to specific customer wins; 3. number of customers retained; 4. number of leads generated; 5. repeat business; 6. improvement in win-loss ratio
Surveys: 1. written; 2. oral
Time: 1. time gained by CI input; 2. projects delivered on time; 3. time saved by input
SCIP Study (2006)
Measures of CI effectiveness (by response percent):
- Customer satisfaction
- Decisions made/supported
- CI productivity/output
- Strategies enhanced
- New products or services
- ROI calculation
- We have no effectiveness measures
The value of CI (by response percent):
- New or increased revenue
- New products or services developed
- Cost savings or avoidance
- Time savings
- Profit increases
- Financial goals met
- We have no value measures
Consulting Performance
- Brought in because, like TA, consulting is an intangible, advice-based service
- Dominant measures: service quality (expected vs. received), satisfaction, value, trust, intention to use
- Virtually all measures are subjective
Taking these streams of literature
- We designed a questionnaire to see whether it could be used to measure the impact of TA
- We used it as an element of an interview to better appreciate the broader issues involved in measuring TA
Initial Questionnaire
1. Which impact do you think CTI has on direct clients? (Scale: 1 = strongly disagree to 5 = strongly agree)
Impact on decision makers (ITA or business analysts):
- Client made decision in a more effective way (effectiveness)
- Client made decision more rapidly (timeliness)
- Client made better decision (appropriateness)
- Client made decision with more confidence (confidence)
- Client's analysis was confirmed (reassurance)
Financial impact:
- Client was able to save time (need to quantify)
- Client was able to save money (need to quantify)
- Client was able to save resources (need to identify which ones)
2. Which impact do you think CTI has on indirect clients? (Scale: 1 = strongly disagree to 5 = strongly agree)
Impact on decision makers:
- Client made decision in a more effective way (effectiveness)
- Client made decision more rapidly (timeliness)
- Client made better decision (appropriateness)
- Client made decision with more confidence (confidence)
- Client's analysis was confirmed (reassurance)
Financial impact:
- Client was able to save time (need to quantify)
- Client was able to save money (need to quantify)
- Client was able to save resources (need to identify which ones)
- Client was able to reduce costs
- Client was able to avoid costs
- Client was able to increase revenues
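As a minimal sketch of how the 1-5 responses to questionnaire items like these could be aggregated, the snippet below averages scores per impact dimension. The dimension labels come from the slides; the item wordings, mapping, and function are illustrative assumptions, not part of the study's actual analysis.

```python
from statistics import mean

# Hypothetical mapping from questionnaire item to impact dimension;
# the dimension names (effectiveness, timeliness, ...) are from the slides.
ITEM_DIMENSION = {
    "made decision more effectively": "effectiveness",
    "made decision more rapidly": "timeliness",
    "made better decision": "appropriateness",
    "made decision with more confidence": "confidence",
    "analysis was confirmed": "reassurance",
}

def dimension_scores(responses):
    """responses: list of dicts mapping item -> Likert score (1-5).
    Returns the mean score per impact dimension across respondents."""
    by_dim = {}
    for resp in responses:
        for item, score in resp.items():
            dim = ITEM_DIMENSION.get(item)
            if dim is not None:
                by_dim.setdefault(dim, []).append(score)
    return {dim: mean(scores) for dim, scores in by_dim.items()}

# Example with two (made-up) respondents
responses = [
    {"made decision more rapidly": 4, "made better decision": 5},
    {"made decision more rapidly": 2, "made better decision": 3},
]
print(dimension_scores(responses))
```

Averaging per dimension rather than per item is one common way to report Likert data; the study itself used the items within interviews rather than as a scored instrument.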
ITAs and Other Clients: Power Users
- We asked: what are the benefits?
- We showed them the projects they had commissioned and asked for the benefits they received
- We then gave them the questionnaire and asked if it captured the benefits they had received
- The questionnaire was then revised based on the clients' comments
Results of the Interviews – General
- Clients raved about the service
- Lots of stories of positive impacts
- Clear indication of difficulty in measuring impact due to:
  - Mediating variables (type of client, etc.)
  - Indirect nature of intelligence (intelligence is only one input into the decision)
  - TA officers do not control implementation of the assessments
  - Time: some TA recommendations affect decisions that can take up to 20 years for full project realization
- Lots of indirect flow producing significant benefit that is hard to measure
- Lots of direct flow that will be difficult to link directly to the decisions
Flow of TA – Direct (investment decision); actors in the flow:
- Director: invest / don't invest
- Lead ITA: does it go forward for investment?
- TBA/IS: market and technical intelligence scan
- BA: business case
- Other ITAs: technical assessment
- SME: "I want investment"
- Province: potential funder
- Other federal; others
Flow of intelligence – Direct (other); actors in the flow:
- Client: internal or external
- TBA/IS: market and technical intelligence scan
- SME: "I want investment"
- Province: potential funder
- OGDs
- Other NRC personnel
But there is also secondary benefit
- Indirect benefit
- Intelligence spillovers (evidenced by looking at the actual flow of intelligence during a project)
- Network benefits
Examples of secondary impact arising from projects
- ITA reads the TBA report and integrates it in discussions with others and in decision making
- TBA gives a speech based on information developed for a TA report
- TBA does a report using information gathered during a previous report
- ITA/client takes the intelligence and puts it in a public report
- Lots of back and forth with the client during the project, so focusing and intelligence are passed on during the process, including helping the client focus and improving the intelligence gathered
- ITA/IS sends e-mail/newsletter updates to their network based on information gathered during the project
- Participants learn during the process
Flow of intelligence – Indirect (investment decision); actors in the flow:
- Director: invest / don't invest
- Lead ITA: does it go forward for investment?
- TBA/IS: market and technical intelligence scan
- BA: business case
- Other ITAs: technical assessment
- SME: "I want investment"
- Province: potential funder
- Other federal; others
Flow of intelligence – Indirect (other decisions); actors in the flow:
- Client: internal
- TBA/IS: market and technical intelligence scan
- SME
- Province: potential funder
- OGDs
- Other NRC personnel
- Others
There was also informal intelligence – no project
- ITA/other has an informal chat with the TBA (happens in the office, cafeteria, hallway)
- ITA/IS sends an e-mail to people: "information you might like to know"
- Client/other drops by the ITA/IS office, calls them, or e-mails them and asks "what do you think of this idea?"
Other direct and secondary impact questions
- Was considered in making the decision / influenced the policy
- Made a difference
- Policy/decision was successful
- Assess the organization/policy as of a future date and compare it to the same before the project – look for the differences and the success of recommendations/advice
- New networks created
- Knowledge used in other places (spillover)
- Decision makers thinking longer term
- "Futurization" of the organization
- Social improvements
- Clients now doing some intelligence themselves
- "I didn't have the time or patience to do it"
- The TBA provided valuable analysis rather than just information
- "It is a skill set that I don't have"
- "It is a perspective that I do not have"
- Provides information/analysis that I can't do
- Helped me/my client focus in on the right questions
- Provided more than just information
- I/my client could not afford to do it ourselves
- "I am getting busier" (the reality is that more people are asking the TBA to do things, so they must find it valuable)
Evaluating Performance
- Direct economic measures: difficult due to the flow of intelligence
- Attitude measures (asking clients what value they received) are easier to use and are the most common in the literature
- Clients indicated that attitude measures were also the best to use
The Performance Model – So what are we measuring?
Based on the literature and the interviews
- Organization performance: How well is the organization doing in its TA program? How well is it developing? (a general area of measurement – the entire organization)
- Individual performance: Are the TA officers doing their job well? (again, a general area of measurement)
- Project/process performance: Is the TA process being conducted appropriately?
- Output performance: What is the quality of the TA output itself?
- Impact performance: What direct and intended impact did the TA have?
- Secondary impact performance: What indirect, unanticipated impacts arose from the TA?
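The six measurement levels above can be sketched as a simple lookup pairing each level with sample indicators drawn from the later slides. The structure and function below are illustrative assumptions for organizing the model, not part of the original framework.

```python
# Illustrative mapping of the six performance-measurement levels to
# sample indicators taken from the "Measuring the ..." slides.
PERFORMANCE_MODEL = {
    "organization": ["number of projects done", "repeat clients",
                     "client satisfaction"],
    "individual": ["communication skills", "analytical skills"],
    "process": ["followed proper practices", "finished on time"],
    "output": ["quality of recommendations", "readability of the report"],
    "impact": ["made a better decision", "saved time"],
    "secondary_impact": ["intelligence spillovers", "network benefits"],
}

def indicators_for(level):
    """Return the sample indicators recorded for a measurement level
    (empty list for an unknown level)."""
    return PERFORMANCE_MODEL.get(level, [])

print(indicators_for("process"))
```

In practice each level would carry many more indicators; the point is only that the model separates organization, individual, process, output, impact, and secondary impact into distinct measurable areas.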
Measuring impact: Direct and secondary
Revised Questionnaire – still 1-5, but now with a distinction between decision maker and decision recommender. These are the questions on impact and secondary benefit.
Impact on ITAs/Advisors
- I made my recommendation in a more effective way (effectiveness)
- I made my recommendation more rapidly (timeliness)
- I made a better recommendation (appropriateness)
- Made me more confident in my recommendation (confidence)
- My recommendation was validated (reassurance)
- I became aware of important issues that I was not aware of before
- It saved me time
- It saved me money
- It saved me resources
- It reduced uncertainty
- It gave me information that I was able to use in future projects
- It gave me important information that I was previously unaware of
- It gave me important new ideas
- It broadened my knowledge
- It has helped improve service to my clients
- Reduced bias in my decision making/recommendation
- Gave me the information required to improve my client's proposal
- Gave me the information I needed to provide my client with good advice
- Enabled me to do my job better
- Reduced the possibility of errors in my recommendation
- Helped to reduce risk
- Gave me information that is hard for me to get
- Gave me information that I did not know how to get
- Provided me with new networks/contacts
- Could validate hypotheses and intuition; could identify potential markets
Impact on decision makers
- Made the decision in a more effective way (effectiveness)
- Made the decision more rapidly (timeliness)
- Made a better decision (appropriateness)
- I was more confident in my decisions (confidence)
- It validated/confirmed my proposal/plan (reassurance)
- I became aware of important issues to address that I was not aware of before
- It saved time
- It saved money
- It saved resources
- It reduced uncertainty
- It gave me information that I was able to use in future projects
- It broadened my knowledge
- It made my decision less biased
- It made my proposal/plan better
- It enabled me to do my job better
- Reduced the possibility of errors in my decision
- Helped to reduce risk in my decision
- It helped me to avoid making mistakes
- It helped me to pursue an opportunity
- It helped me improve management processes
- It helped me improve productivity
- It helped me improve R&D
- It helped me develop better strategy
- It helped me identify new markets
- It helped me identify new lines of business
- It helped me choose a technology direction
- It helped me choose a new product
- Prevented me from going in the wrong direction
- Prevented me from making the wrong decision
- Provided me with new networks/contacts
- It gave me important information that I was previously unaware of
- It gave me important new ideas
Impact on researchers
- Stopped unproductive research
- Modified a research design
Direct benefit from TA: flows from how the intelligence is developed
Measuring the Organization
- Number of projects done
- Number of projects done on time
- Client satisfaction
- Projects/reports cited
- People reading/ordering reports
- Requests for service
- Repeat clients
- Referral clients
- References to our material/citations
- Requests for speeches
- Client intention to use the service again
- Client intention to use CTI more frequently
- Extent to which client experiences are positive
- Overall perceived quality of CTI services
- Overall cost
- Overall benefits
- Overall cost/benefit
- Extent to which recommendations are accepted
- Number of clients served
- Number of staff
- Projects per staff member
- Staff productivity
- Customer loyalty
- Number of projects assisted
Measuring the Individual
- Acted in a professional manner
- Demonstrated professional conduct
- Has the appropriate skills
- Certification/knowledge testing
- Patents/papers produced
- Number of invitations they get
- Being perceived as a valuable member of the client's team
- Being invited to key meetings
- Knowledge of the client's area
- Understanding of the problem
- Flexibility in adapting to requests
- Communication skills
- Collection skills
- Analytical skills
- Planning skills
Measuring the Process
- Followed proper practices
- Finished on time
- Proper mix of time: 15-20% planning, 25-35% collection, 25-35% assessment, 10-15% communication, 15-20% management
- Used proper analytical techniques
- Proper mix of primary and secondary sources
- Client's needs were understood
- Client's needs were addressed
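The "proper mix of time" item lends itself to a simple check. The sketch below flags phases of a project whose recorded time share falls outside the recommended ranges; the ranges are taken directly from the slide, while the data layout and function name are hypothetical.

```python
# Recommended time-allocation ranges per phase, as stated on the slide
# (note the ranges are guidelines and deliberately overlap 100%).
RECOMMENDED_MIX = {          # phase: (min %, max %)
    "planning":      (15, 20),
    "collection":    (25, 35),
    "assessment":    (25, 35),
    "communication": (10, 15),
    "management":    (15, 20),
}

def check_time_mix(actual):
    """Return {phase: (actual %, (min, max))} for every phase whose
    recorded share of project time falls outside the recommended range.

    `actual` maps phase name -> percentage of project time spent.
    """
    out_of_range = {}
    for phase, (lo, hi) in RECOMMENDED_MIX.items():
        pct = actual.get(phase, 0)
        if not lo <= pct <= hi:
            out_of_range[phase] = (pct, (lo, hi))
    return out_of_range

# Example: a hypothetical project that skipped planning and
# over-weighted collection
project = {"planning": 5, "collection": 50, "assessment": 30,
           "communication": 10, "management": 5}
print(check_time_mix(project))
```

A process audit could run this over each completed project's timesheet data to make the "proper mix of time" criterion operational rather than judgmental.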
Measuring the Output
- Number of recommendations made
- Met the client's expectations
- Project professionally done
- ROI of the output
- Impact on the decision
- Quality of recommendations
- Number of times the report has been used
- Number of times the report has been quoted
- Exceeded expectations
- Readability of the report
- Reliability of the intelligence
- Accuracy of the intelligence (over time)
- Client's overall perception of the quality of the intelligence
- Extent to which recommendations are accepted
- Usability of the results
- User-friendliness of reports
What the Research has driven home
- As a field we need to develop standards of practice that are measurable
- We need to become recognized as a legitimate body of knowledge
For more information: Jonathan Calof, Telfer School of Management, University of Ottawa. Phone: