
Slide 1: Evaluating Organizational Change: How and Why?
Dr Kate Mackenzie Davey
Organizational Psychology, Birkbeck, University of London
k.mackenzie-davey@bbk.ac.uk

Slide 2: Aims
- Examine the arguments for evaluating organizational change
- Consider the limitations of evaluation
- Consider different methods for evaluation
- Consider difficulties of evaluation in practice
- Consider costs and benefits in practice

Slide 3: Arguments for evaluating organizational change
- Sound professional practice
- Basis for organizational learning
- Central to the development of evidence-based practice
- Widespread cynicism about fads and fashions
- To influence social or governmental policy

Slide 4: Research and evaluation
- Research focuses on relations between theory and empirical material (data)
  – Theory should provide a base for policy decisions
  – Evidence can illuminate and inform theory
  – Show what does not work as well as what does
  – Highlight areas of uncertainty and confusion
  – Demonstrate the complexity of cause-effect relations
  – Understand, predict, control

Slide 5: Pragmatic evaluation: what matters is what works
- Why it works may be unclear
- Knowledge increases complexity
- Reflexive monitoring of strategy links to organizational learning (OL) and knowledge management (KM)
- Evidence and cultural context
- May be self-fulfilling
- Tendency to seek support for policy
- Extent of sound evidence unclear

Slide 6: Why is sound evaluation so rare?
- Practice shows that evaluation is an extremely complex, difficult and highly political process in organizations.
- The question asked may be "how many?", not "what works?"

Slide 7: Evaluation models
1. Pre-evaluation
2. Goal-based (Tyler, 1950)
3. Realistic evaluation (Pawson & Tilley, 1997; Sanderson, 2002)
4. Experimental
5. Constructivist evaluation (Stake, 1975)
6. Contingent evaluation (Legge, 1984)
7. Action learning (Reason & Bradbury, 2001)

"A study should be technically sound, administratively convenient and politically defensible." (Alec Rodger)

Slide 8: 1.1 Pre-evaluation (Goodman & Dean, 1982)
- The extent to which it is likely that A has an impact on B
- Scenario planning
- Evidence-based practice
  – All current evidence thoroughly reviewed and synthesised
  – Meta-analysis (a minimal sketch follows this slide)
  – Systematic literature review
- Formative vs. summative (Scriven, 1967)
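Where pre-evaluation leans on meta-analysis, the core arithmetic is inverse-variance pooling of effect sizes from prior studies. The sketch below is a minimal fixed-effect version; the study values, and the choice of a fixed-effect model, are invented for illustration and are not from the slides.

```python
"""Minimal fixed-effect meta-analysis sketch (hypothetical data)."""
import math

# Hypothetical (effect size d, variance) pairs from earlier change studies.
studies = [(0.40, 0.04), (0.15, 0.02), (0.55, 0.09)]

# Inverse-variance weights: more precise studies count for more.
weights = [1.0 / var for _, var in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval for the pooled effect.
low, high = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled d = {pooled:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
```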

Slide 9: 1.2 Pre-evaluation issues
- Based on theory and past evidence: not clear it will generalise to the specific case
- Formative: influences planning
- Argument: to understand a system you must intervene (Lewin)

Slide 10: 2.1 Goal-based evaluation (Tyler, 1950)
- Objectives used to aid planned change
- Can help clarify models
- Goals from benchmarking, theory or pre-evaluation exercises
- Predict changes
- Measure pre- and post-intervention
- Identify the interventions
- Were objectives achieved? (see the sketch after this slide)
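A minimal sketch of the goal-based logic: measure before and after, then ask whether each stated objective was reached. The Goal structure, measure names and numbers are hypothetical; a real goal-based evaluation would also have to defend who set the targets (see slide 11).

```python
"""Minimal goal-based evaluation sketch (hypothetical goals and scores)."""
from dataclasses import dataclass

@dataclass
class Goal:
    measure: str    # what is measured pre- and post-intervention
    pre: float      # baseline score
    post: float     # score after the intervention
    target: float   # objective set before the change (higher is better here)

    def achieved(self) -> bool:
        return self.post >= self.target

# Hypothetical objectives for a change programme.
goals = [
    Goal("staff satisfaction (1-7 scale)", pre=4.1, post=4.9, target=4.5),
    Goal("units produced per week", pre=100.0, post=104.0, target=110.0),
]

for g in goals:
    status = "achieved" if g.achieved() else "not achieved"
    print(f"{g.measure}: {g.pre} -> {g.post} (target {g.target}): {status}")
```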

Slide 11: 2.2 Difficulties with goal-based evaluation
- Who sets the goals? How do you identify the intervention?
  – Tendency to managerialism (unitarist)
  – Failure to accommodate value pluralism
  – Over-commitment to scientific paradigm
  – What is measured gets done
  – No recognition of unanticipated effects
  – Focus on single outcome, not process

Slide 12: 3.1 Realistic evaluation: conceptual clarity (Pawson & Tilley, 1997)
- Evidence needs to be based on clear ideas about concepts
- Measures may be derived from theory
- Examine definitions used elsewhere
- Consider specific examples
- Ensure all aspects are covered

Slide 13: 3.2 Realistic evaluation. Towards a theory: what are you looking for?
- Make assumptions and ideas explicit
- What is your theory of cause and effect?
  – What are you expecting to change (outcome)?
  – How are you hoping to achieve this change (mechanism)?
  – What aspects of the context could be important?

Slide 14: 3.3 Realistic evaluation: context-mechanism-outcome
- Context: what environmental aspects may affect the outcome?
  – What else may influence the outcomes?
  – What other effects may there be?

Slide 15: 3.4 Realistic evaluation: context-mechanism-outcome
- Mechanism: what will you do to bring about this outcome?
  – How will you intervene (if at all)?
  – What will you observe?
  – How would you expect groups to differ?
  – What mechanisms do you expect to operate?

Slide 16: 3.5 Realistic evaluation: context-mechanism-outcome
- Outcome: what effect or outcome do you aim for?
  – What evidence could show it worked?
  – How could you measure it?
(A context-mechanism-outcome configuration is sketched after this slide.)
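One way to make a realistic-evaluation theory explicit is to record each context-mechanism-outcome (CMO) configuration as structured data. The representation below is an assumption for illustration, not Pawson & Tilley's own notation, and all field values are invented.

```python
"""Sketch of one context-mechanism-outcome (CMO) configuration as data."""
from dataclasses import dataclass, field

@dataclass
class CMOConfiguration:
    context: str      # circumstances the programme runs in
    mechanism: str    # how the intervention is meant to work
    outcome: str      # the change it should produce
    evidence: list[str] = field(default_factory=list)  # how you would know (Python 3.9+)

cmo = CMOConfiguration(
    context="Ward with stable staffing and supportive management",
    mechanism="Team training builds shared problem-solving habits",
    outcome="Fewer handover errors within six months",
    evidence=["incident reports pre/post", "staff interviews"],
)
print(cmo)
```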

Slide 17: 4.1 Experimental evaluation: explain, predict and control by identifying causal relationships
- Theory of causality makes predictions about variables, e.g. training increases productivity
- Two randomly assigned matched groups: experimental and control
- One group experiences the intervention, one does not
- Measure outcome variable pre-test and post-test (longitudinal)
- Analyse for statistically significant differences between the two groups (see the sketch after this slide)
- Outcome linked back to modify theory
- "The gold standard"
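A minimal sketch of the design these bullets describe: random assignment to two groups, simulated pre/post change scores, and an independent-samples t-test on the between-group difference. The effect size, sample sizes and use of SciPy are assumptions for illustration.

```python
"""Simulated two-group pre/post experiment with a t-test (invented data)."""
import random

from scipy import stats  # assumes SciPy is installed

random.seed(1)

def change_scores(n: int, effect: float) -> list[float]:
    """Post-minus-pre change for n people; `effect` is the true shift."""
    return [effect + random.gauss(0.0, 1.0) for _ in range(n)]

# Random assignment is simulated by drawing the two groups independently;
# the intervention group gets a hypothetical true effect of +0.8.
treated = change_scores(n=30, effect=0.8)
control = change_scores(n=30, effect=0.0)

# Independent-samples t-test on the change scores.
t, p = stats.ttest_ind(treated, control)
print(f"t = {t:.2f}, p = {p:.4f}")
```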

Slide 18: 4.2 Difficulties with experimental evaluation in organizations
- Difficult to achieve in organizations
- Unitarist view
- Leaves out unforeseen effects
- Problems with continuous change processes
- Summative, not formative
- Generally, at best, quasi-experimental

Slide 19: 5.1 Constructivist or stakeholder evaluation
- Responsive evaluation (Stake, 1975) or fourth-generation evaluation (Guba & Lincoln, 1989)
- Constructivist, interpretivist, hermeneutic methodology
  – Based on stakeholder claims, concerns and issues
  – Stakeholders: agents, beneficiaries, victims

Slide 20: 5.2 Response to an IT implementation (Brown, 1998)

The ward
  – Goal: Improve quality to patients
  – Outcome: Waste of time and energy on a pointless system
Laboratory
  – Goal: Improve quality for ward staff
  – Outcome: No improvement to adequate systems
IT team
  – Goal: Clinical and financial benefits
  – Outcome: Technically competent system, but a misconceived project

Slide 21: 5.3 Constructivist evaluation issues
- No one right answer
- Demonstrates complexity of issues
- Highlights conflicts of interest
- Interesting for academics
- Difficult for practitioners to resolve

Slide 22: 6. A contingent approach to evaluation (Legge, 1984)
- Do you want the proposed change programme to be evaluated? (stakeholders)
- What functions do you wish its evaluation to serve? (stakeholders)
- What are the alternative approaches to evaluation? (researcher)
- Which of the alternatives best matches the requirements? (discussion)

Slide 23: 7. Action research: identify good practice (Reason & Bradbury, 2001)
- Responds to practical issues in organizations
- Engages in collaborative relationships
- Draws on diverse evidence
- Value orientation: humanist
- Emergent, developmental

Slide 24: Problems with realist models
- Tendency to managerialise
- Over-commitment to scientific paradigm
- Context stripping
- Over-dependence on measures
- Coerciveness: truth as non-negotiable
- Failure to accommodate value pluralism
- Every act of evaluation is a political act; it is not tenable to claim it is value-free

Slide 25: Problems with the constructionist approach
- Evaluation judged by whom, for whom, and in whose interests?
- Identify different views, then what?
- Who has power?
- Leaves decisions open
- May lead to ambiguity

Slide 26: Why not evaluate?
- Expensive in time and resources
- De-motivating for individuals
- Contradiction between scientific evaluation models and supportive, organizational learning models
- Individual identification with the activity
- Difficulties in objectifying and maintaining commitment
- Off-the-shelf external evaluation can be inappropriate and unhelpful

Slide 27: Why evaluate? (Legge, 1984)
- Overt:
  – Aids decision making
  – Reduce uncertainty
  – Learn
  – Control
- Covert:
  – Rally support/opposition
  – Postpone a decision
  – Evade responsibility
  – Fulfil grant requirements
  – Surveillance

Slide 28: Conclusion
- Evaluation is very expensive, demanding and complex
- Evaluation is a political process: be clear about why you do it
- Good evaluation always carries the risk of exposing failure
- Therefore evaluation is an emotional process
- Evaluation needs to be acceptable to the organization

Slide 29: Conclusion 2
- Plan, and decide which model of evaluation is appropriate
- Identify who will carry out the evaluation and for what purpose
- Do not overload the evaluation process: judgment or development?
- Evaluation can give credibility and enhance learning
- Informal evaluation will take place whether you plan it or not

