Issues in evaluation
Nick Tilley
Why evaluate?
–To learn lessons for other places and times, though care needs to be taken in replications: they are never exact.
–For accountability, though performance-indicator-driven evaluation can produce perverse incentives.
–To inform scheme adjustments, though it is important to give schemes time to bed in.
The track record of evaluation
–Relatively little is evaluated.
–There is much lying in evaluation.
–There is little competence in evaluation.
–Methodology is heavily debated.
–A great deal of implementation failure is found.
–In the most useful evaluations, researchers have been involved in project design.
Problems for evaluation
–Record keeping regarding crime and disorder.
–Data provision, data protection and data security.
–Data quality.
–Records tracking interventions.
–Political/administrative pressure on evaluators.
–Ideology.
–Technical skills.
What's worth evaluating?
Not everything!
–It's too expensive.
–It's too difficult.
–Nothing will be learned.
Prioritise!
–Where significant decisions are at stake.
–Where there is a chance that evidence will be heard.
–Where competent implementation is likely.
–Where project workers and data custodians will play ball.
–Where research to date is inadequate or insufficient.
Rules for evaluation
–Work out the scheme theory: read and consult.
  –How is the scheme expected to work, and for whom?
  –What side effects might be expected, and for whom?
–Work out what to measure to test the theory.
–Measure properly.
–Don't expect to be able to prove conclusively what works.
–Tell the truth about unwelcome as well as welcome findings.
–Don't make wild generalisations.
–Don't come to premature conclusions.
Example: a scheme for evaluation
–The scheme starts in April 2003.
–The scheme focuses on reducing council house burglary in a local area.
–An evaluation report is asked for in July 2003.
–The local authority (LA) wants to decide whether to cancel the scheme, continue it, or roll it out.
–Was it effective?
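The arithmetic behind the example can be sketched as follows. This is a minimal sketch using entirely hypothetical monthly burglary counts for the scheme area and a comparison area (the figures, the three-month window and the function name are illustrative, not from the scheme itself):

```python
def pct_change(before: float, after: float) -> float:
    """Percentage change from a 'before' count to an 'after' count."""
    return 100.0 * (after - before) / before

# Hypothetical counts: burglaries April-June 2003 vs the same months in 2002.
target_before, target_after = 30, 21          # scheme (target) area
comparison_before, comparison_after = 28, 25  # similar area, no scheme

target_change = pct_change(target_before, target_after)              # -30.0
comparison_change = pct_change(comparison_before, comparison_after)

# Net effect: change in the target area over and above the comparison area.
# With only three months of post-scheme data, counts this small are very
# noisy, which is why a July 2003 report invites a premature conclusion.
net_effect = target_change - comparison_change
```

The point of the sketch is the caveat in the comments: a difference of a handful of burglaries over three months cannot, by itself, tell the LA whether to cancel, continue or roll out.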
Conclusions
–It is easy to lie or mislead with data.
–Some technical skills are needed in evaluation: untutored self-evaluations tend to be very weak and self-serving.
–Side effects, notably diffusion of benefit and displacement, should be explored.
–It is useful to find the active ingredients in initiatives; they will not always be obvious.
–It is dangerous to draw premature conclusions.
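One established way to explore the displacement and diffusion-of-benefit side effects mentioned above is the weighted displacement quotient (WDQ) of Bowers and Johnson, which compares crime changes in a buffer around the action area against changes in the action area itself, both measured relative to a control area. A minimal sketch, with all counts hypothetical:

```python
def wdq(action_before, action_after,
        buffer_before, buffer_after,
        control_before, control_after):
    """Weighted displacement quotient (Bowers & Johnson).

    Assuming the scheme reduced crime in the action area relative to the
    control, a negative WDQ suggests displacement into the buffer and a
    positive WDQ suggests diffusion of benefit."""
    # Change in the buffer, relative to the control area.
    numerator = buffer_after / control_after - buffer_before / control_before
    # Change in the action area, relative to the control (the "success" term).
    denominator = action_after / control_after - action_before / control_before
    return numerator / denominator

# Hypothetical burglary counts before/after the scheme.
score = wdq(action_before=100, action_after=60,   # fell in the action area
            buffer_before=80, buffer_after=70,    # also fell in the buffer
            control_before=200, control_after=200)
print(score)  # 0.25 -> positive: diffusion of benefit, not displacement
```

Note the assumption baked into the interpretation: the sign reading only holds when the action area genuinely improved relative to the control, which is exactly why the scheme's active ingredients still need to be identified separately.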