Slide 1: Impact measurement: great question… lousy answer?
Rob Paton, IVAR Seminar, June 2010
Slide 2: Overview
1. The 'Sunday School' story of measurement…
2. … and why we must go beyond it
3. Signposts for those who want to travel in hope
Slide 3: Measurement challenges for voluntary agencies
[Diagram: INPUTS → PROCESSES → OUTPUTS → OUTCOMES and/or IMPACT, with elements including People + Resources, Partners' contributions, Governance + Management, Processes + Activities, and Social changes]
Slide 4: The 'Sunday School' story of measurement
1. Get clear what you are trying to do
2. Get clear how you will know if (and how far) you are succeeding
3. Systematically gather the information needed for (2), along with operational/contextual data
4. Analyse that information to make decisions and improve performance
Slide 5: The impossibility of 'good measurement' in the social domain
1. Focus versus comprehensiveness
2. Cost versus quality and reliability
3. Consensus versus contestation
4. Necessary stability versus the reality of churn
Hence: measurement weakness and/or dysfunction comes with the territory.
[See: R. Paton (2003) Managing and Measuring Social Enterprises, SAGE]
Slide 6: Measurement fundamentalism
- 'There is no alternative'
- More is better: a failure to acknowledge costs and dangers
- Theologically naïve: positivistic
- Monological (a technocratic priesthood): no interfaith dialogue
Slide 7: Living with the uncertainties
- Measurement is a better question than an answer
- Fundamentalists can be dangerous…
- Some become atheists and cynics…
- Living in hope but without illusions, i.e. taking measurement seriously, but not literally
Slide 8: [2×2 grid: focus on convincing measures (low/high) against focus on cost & practicality (low/high)]
- Low focus on convincing measures, low focus on cost & practicality: no measures, or pointless ones
- Low focus on convincing measures, high focus on cost & practicality: 'data-free measurement', unsystematic and anecdotal
- High focus on convincing measures, low focus on cost & practicality: measurement overload, disproportionate and intrusive
- High focus on convincing measures, high focus on cost & practicality: the Measurement Grail, thoughtfully focussed and ingeniously simple
- In between: 'so-so' measurement, with no-one very satisfied
Slide 10: [Three competing criteria]
- User value: clarity, relevance, timeliness, etc.
- Scientific credibility: research design
- Low cost: ease of collection & analysis; non-intrusive, etc.
Slide 11: Some signposts?
- Recognise the trade-offs
- Work at subsector level
Slide 12: The standardisation dilemma in measurement
[2×2 grid: focus on common attributes (low/high) against focus on distinctiveness (low/high)]
- Low focus on common attributes, low focus on distinctiveness: arbitrary selection
- Low focus on common attributes, high focus on distinctiveness: starting from scratch (using idiosyncratic and untested indicators)
- High focus on common attributes, low focus on distinctiveness: imposed indicators or inappropriate copying (some outputs neglected; innovation stifled)
- High focus on common attributes, high focus on distinctiveness: collaborative learning (selecting from and adding to an array of proven measures)
Slide 13: Some signposts?
- Recognise the trade-offs
- Work at subsector level
- 'Composite impact analyses'?
Slide 14: Composite impact analysis?
- Data-rich (multi-method, qualitative and quantitative, ICT-enabled)
- Analytically assisted: a new division of labour?
- Dialogue-driven
Slide 15: Some signposts?
- Recognise the trade-offs
- Work at subsector level
- 'Composite impact analyses'?
- But still: don't mistake the map for the territory (impact is a mechanistic concept)
Slide 16: Thank you!
Contact me at firstname.lastname@open.ac.uk