What evaluation attributes, stakeholder characteristics & contextual factors are important for evaluation influence? Sarah Appleton-Dyer Dr Janet Clinton, Dr Peter Carswell & Dr Rob Mc Neill AES Conference; Sydney, August 2011

Overview Identify the aims of the presentation. Describe the research used to inform the presentation. Present a theory of evaluation influence within a partnership context. Use survey and interview data to highlight the evaluation attributes, stakeholder characteristics and contextual factors that are important for evaluation influence. Get some feedback.

Background

Evaluation influence The multiple pathways and mechanisms through which evaluation can hope to influence attitudes and action, due to exposure to evaluation findings or to participation in evaluation (Mark & Henry, 2004). Mark, M. M., & Henry, G. T. (2004). The Mechanisms and Outcomes of Evaluation Influence. Evaluation, 10: 35-48

Aims To identify the types of influences experienced by population health partnership members. To identify the factors that facilitate evaluation influence.

Research design
Phase 1: Literature review – overview of evaluation influence, informing the conceptual model.
Phase 2: Survey with partnership members.
Phase 3: Site visits with partnerships – in-depth understanding of evaluation influence.
Phase 4: Data integration and analysis – similarities and differences are explored; new insights and understandings.

Conceptual model

Study 1 Online survey of 187 PHP (population health partnership) members, April to May 2010 (6 weeks). Respondents were mostly female (71%), aged 35 to 49 (33%), and NZ European (66%), Māori (18%) or Pacific (3%). Half the partnerships comprised 2 to 5 organisations (50%); 55% involved high levels of collaboration; 65% provided programs or services and 57% shared information between partners. Respondents covered a range of roles and initiatives.

Evaluation 63% were currently involved in evaluation or had been over the last 6 months (n=187). 30% had received feedback. 62% of the evaluations were underpinned by an evaluation theory.

Study 2 Site visits with 4 partnerships, identified in Study 1: document analysis and interviews, Nov 2010 to March 2011. Partnerships spanned up to 8 organisations and 5 to 23 members, covering a range of roles and public health initiatives.

Evaluation Evaluations were current or had taken place within the last 6 months, for a range of reasons: –‘to see if it works, and how it was working’ –‘to improve our program and support learning’ –‘It was mainly driven by the [funders]’. External evaluators were used, with theory-driven and participatory approaches based on the Centers for Disease Control (CDC) model.

Influences on individuals & partnerships ‘It made me understand more about the information we collect. Um, and how we could collect information to show that we are making progress’. Partnership B ‘As a result of the evaluation findings, they agreed to reduce the price for retailers to renew their licence, so they could re-grade’. Partnership C

Data Integration Survey: Kruskal-Wallis and Mann-Whitney U tests; principal components analysis (PCA) and linear regression. Interviews: thematic analysis. Integration: statistically significant results (marked *) and identified themes were brought together.
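The group-comparison tests named on this slide can be sketched in Python. This is a hypothetical illustration only: the study's data are not public, so the group sizes, effect sizes, and variable names below are invented.

```python
# Synthetic sketch of the survey analyses: Kruskal-Wallis across three
# groups, with Mann-Whitney U as a pairwise follow-up. Group definitions
# (low/med/high evaluation capacity building) are assumed for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# e.g. self-rated evaluation-influence scores for three levels of
# evaluation capacity building (synthetic data)
low = rng.normal(3.0, 1.0, 40)
med = rng.normal(3.5, 1.0, 40)
high = rng.normal(4.2, 1.0, 40)

# Kruskal-Wallis: do the three groups differ overall?
h_stat, h_p = stats.kruskal(low, med, high)

# Mann-Whitney U: pairwise comparison of the extreme groups
u_stat, u_p = stats.mannwhitneyu(low, high, alternative="two-sided")

print(f"Kruskal-Wallis H={h_stat:.2f}, p={h_p:.4f}")
print(f"Mann-Whitney U={u_stat:.2f}, p={u_p:.4f}")
```

Nonparametric tests like these fit ordinal Likert-style survey items, where normality cannot be assumed.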

Evaluation Attributes Approach –Theory* –Learning* –Sophistication & quality ‘I don’t think there was a very robust methodological approach. You know the way the scoping was carried out, there wasn’t much planning before it actually happened, so I’m not sure that was very helpful for us.’

Evaluation Attributes (cont.) Feedback –Credibility –Timeliness Evaluation capacity building** Evaluator –External* –Skills & expertise –‘I rely on the credibility and independence of [organisation] when I share the findings, because it is what people base their opinion on’. Partnership C

Stakeholder Characteristics Existing knowledge & attitudes Evaluation readiness* –‘Everyone was on the same page, and we were all involved in the evaluation. It was something that we recognised the value of, and something that we still go back to’. Partnership D Participation*

Partnership Evaluation Behaviour Partnership evaluation readiness –‘Some members are more involved than others. I guess for some it was an extra expense and they would rather just get on with it’. Partnership D Partnership support* Leadership participation*

Partnership Functioning Clarity of purpose and functioning –‘I think if you ask everyone what we do you’ll get a different answer, so we all say yeah that’s good and agree, it just doesn’t seem to happen. We all have a different focus’. Partnership B Adaptation or change processes* –‘I lead that, I love that, I get so excited and I am reading this stuff and writing notes you know. I use the evaluators to help me, so I don’t misinterpret and then we take it to the Steering Group’. Partnership A Shared commitment to partnership Funding & competitive environments

Contextual Characteristics Policy context Funding environment Resources, time & support for evaluation –‘They’ve been very supportive. We’ve been given the time to get involved in this, and it (evaluation) is listened to. So that’s been hugely beneficial for us’. Partnership A Existing attitudes of key decision-makers –‘I rather worry about collecting quantitative data because funders, well, stories on their own for funders are not influential’. Partnership C

Summary Provides support for the model. Highlights the complexity of evaluation influence. Is evaluation an intervention? Evaluation influence should be judged by its inputs and within its context.

Facilitating Influence Evaluation influence is facilitated by: Evaluation theory & focus on learning. Skilled & credible evaluator. Stakeholder evaluation readiness. Stakeholders’ existing functioning and systems. Stakeholders’ capacity to respond to evaluation. Wider support for evaluation.

Acknowledgements Supervision team: Dr Janet Clinton, Dr Peter Carswell and Dr Rob McNeill Population health partnership members and health sector staff from across New Zealand. The University of Auckland Full Doctoral Scholarship.

Sarah Appleton-Dyer

Principal Components Analysis Reduced data set into 6 factors explaining 52.36% of variance in the data: –Partnership functioning. –Evaluation attributes. –Evaluation capacity building. –Partnership evaluation behaviour. –Evaluation readiness. –Evaluation influence.
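A dimension-reduction step like the one reported above can be sketched with scikit-learn. The item counts and responses below are synthetic stand-ins; the real survey items and loadings are not reproduced here (and the slide reports 52.36% explained variance for the actual data).

```python
# Illustrative PCA on synthetic Likert-style survey data: standardise,
# retain 6 components, and report the variance they explain together.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# 187 respondents x 30 items (both numbers assumed for illustration)
X = rng.integers(1, 6, size=(187, 30)).astype(float)
X = StandardScaler().fit_transform(X)  # put items on a common scale

pca = PCA(n_components=6)
scores = pca.fit_transform(X)          # one score per respondent per factor
explained = pca.explained_variance_ratio_.sum()

print(f"6 components explain {explained:.1%} of variance")
print("Factor scores shape:", scores.shape)
```

The resulting factor scores can then feed a regression, as on the next slide. (Survey research often uses exploratory factor analysis rather than PCA; PCA is shown here as the method the slide names.)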

Linear Regression The model accounts for 42.3% of the variance in evaluation influence, and the set of factors predicts evaluation influence significantly. The following factors make a significant individual contribution: –Evaluation capacity building (Beta = .576, p < 0.05). –Evaluation readiness (Beta = .248, p < 0.05).

Potential effects and influences Changes organizations (Williams; Davidson). Develops a learning environment (Preskill). Enhances program success when embedded (Clinton). Influences attitudes, motivation & behaviour at the individual, interpersonal & collective levels (Mark & Henry). Empowers (Fetterman; Patton).