
1 Using knowledge utilisation theory to demonstrate how commissioned evaluations can influence program design and funding decisions: case studies from consultancy
Wendy Hodge, Principal Consultant

2 This paper
1. Knowledge utilisation
2. The case studies

3 A very brief potted history of knowledge utilisation
“The results of research are worthless if they are not used” (Last, 1989)

4 Focus on evidence-based policy
Assumes that using knowledge will lead to better policy and programs.
Initiatives:
- Topic-specific centres of excellence with structural links to government
- Systematic reviews of evidence, e.g. the Cochrane Collaboration
- Clearing houses, e.g. evidence and practice guidelines and summaries

5 Operating assumptions
- Knowledge is transferred from individual to individual and through organisational structures
- All knowledge is taken up subjectively
- It is not just users and researchers who influence the use of knowledge
- Knowledge is refined and adapted by the user
- Use and generation of knowledge are interdependent and complicated to improve
- Not all knowledge is intended to be directly applicable to policy development or program design

6 Ways evidence is used

7 Instrumental or direct use
When findings/data are used in specific or direct ways, e.g. to directly influence program design or delivery, or to inform policy directions or professional practice.

8 Conceptual use
Involves using research evidence for general enlightenment. Users are exposed to new information and ideas but may not use the information directly.

9 Symbolic or strategic use
Using findings/data to legitimise policy directions or to justify actions taken for other reasons.

10 Predictors of use
“Data are no use if the report on them is too late. They are precious little good if the relevant audience does not comprehend them” (Cronbach, 1977)

11 Predictors of use (dissemination model)
- Decision-makers know about the research
- Interdependence of policy makers and evaluators, e.g. organisational links exist or joint planning
- Good personal relations between key players
- The right people: a credible source
- The right evidence at the right time
- The inherent quality of the evidence
- Whether the evidence conforms to commissioners’ beliefs and previous knowledge
- Whether data are interpreted in a way that suits the needs of the user
- Tells the story: clear, succinct, user-friendly formats that are understood

12 “Context matters – values matter – politics matter.” (Brewer, 1983)
“The interplay between science and policy is commonly neither purely instrumental nor purely political.” (Hertin et al., 2007)

13 Organisational and political predictors of use
- Structure, culture and politics of the user organisation, including assumptions about a program or policy’s worth and service models
- Rewards and incentives for dissemination activity in both the “user” and “researcher” context
- Value placed on evaluation or research evidence in the user context
- Other inputs to policy development or program design: lobbying, negotiations
- Boundaries of policy assessment analysis

14 ARTD cases considered
- Evaluation of a drink driver education program
- Evaluation of a carers program
- Evaluation of a drug education program

15 Case 1 – Evaluation of a drink driver education program
Predictor of use | The evaluation
Findings known to decision makers | High-level interagency senior officer committee + report tabled in parliament
Organisational links | Contract + interagency steering committee for the project involved in planning discussion of findings
Good relations | None at start but built over 2 years
Credible source | Us + academic advisor + guru
The right evidence, the right time | Quasi-experimental design + mixed methods. Timed to meet the budget cycle.
User-friendly report | Evidence synthesised and report structured around evaluation questions
Assumptions of program worth | Program valued; design based on best evidence
Value placed on evaluation | Highly valued; direct client research background

16 Case 2 – Evaluation of a carers program
Predictor of use | The evaluation
Findings known to decision makers | Responsible officers commissioned the evaluation. Able to drive changes to delivery models.
Organisational links | Contract provided formal structure.
Good relations | Fostered by regular informal reporting of progress and findings.
Credible source | Sought evaluation specialists. Previous knowledge of the area and experience in conducting large reviews.
The right evidence, the right time | Extensive consultation with carers and service delivery organisations. Findings delivered in time to inform renewal of 3-year contracts.
User-friendly report | Executive summary identified deficiencies of the service model and suggested changes. Report told the story of carers and what respite was needed.
Assumptions of program worth | High, given the vulnerable nature of the carers; fits with national priorities.
Value placed on evaluation | Moderate to high; previous bad experience.

17 Case 3 – Evaluation of an action enquiry as professional development
Predictor of use | The evaluation
Findings known to decision makers | Responsible officers commissioned the evaluation. Able to drive changes to program structure.
Organisational links | Contract provided formal structure and the Premier’s Panel.
Good relations | Fostered by regular informal reporting of progress and findings.
Credible source | Long-standing clients.
The right evidence, the right time | Qualitative methods. Findings reported verbally initially to inform stage 2 planning.
User-friendly report | Answered evaluation questions; placed in the context of adult learning principles.
Assumptions of program worth | New approach, testing.
Value placed on evaluation | High; value independence.

18 In summary
- Evaluators generate knowledge
- Our clients, policy officers and program designers are users and disseminators of knowledge in their own sphere
- As evaluators, we need to pay attention to the predictors of use under our control
- Policy officers transform evidence to meet their needs
- Policy officers could also actively pay attention to predictors of use within the agency context

19 Contact details
Wendy Hodge, Principal Consultant
Wendy.hodge@artd.com.au
02 9373 9900

