MADELEINE GABRIEL & JULIE DAS 9 FEBRUARY 2010
PURPOSE OF THIS SESSION
- Introductions
- Purpose of our input today
- Our role: we are evaluating the programme, not you as participants in it
- Your role: to feed back your reflections, share local evaluation data and be part of the research
1 THE EVALUATION
WHY EVALUATE THIS ACTION LEARNING PROGRAMME?
- A pilot programme
- Policy importance
- Understand what works, what does not work and why (process)
- Understand what has changed (outcomes)
- Independent feedback on the above
- Implications for roll-out
OUR APPROACH - THE LOGIC MODEL
The Logic Model for the Patient Experience Action Learning programme

Programme logic (Rationale → Activities and inputs → Outputs → Short term outcomes → Longer term outcomes/impact):
- Rationale: Strong policy and other drivers to improve patient experience, but NHS organisations can find this challenging. Action learning supporting NHS staff to plan and implement projects in 'real time' can accelerate development and capture learning about what works that can be cascaded more widely.
- Activities and inputs: Ten-month programme, involving six full-day national workshops, an online network and WebEx coaching sessions, aimed at 50 participants from a range of NHS organisations and across the ten SHA areas.
- Outputs: NHS staff from a range of organisations participate in a range of learning activities, and plan, implement and evaluate projects in their own organisations.
- Short term outcomes: The programme supports participants to learn about patient experience models and tools, develop their own knowledge about good practice, build peer networks, pilot new approaches, and overcome obstacles and challenges faced.
- Longer term outcomes/impact: Participants are able to cascade learning, change perceptions/attitudes within their own teams/organisations, and implement service changes through a patient experience approach. In the longer term, outcomes for patients and patient satisfaction improve.

Evaluation questions

Rationale, design and delivery:
- How clearly are the programme objectives articulated? How well are they understood by participants and their sponsors?
- How are the needs and expectations of individual participants understood, and how far do they shape the way the programme is designed and delivered?
- How effectively are programme objectives translated into learning objectives for specific activities within the programme?
- How appropriate are the design and structure to meeting the programme's objectives? For example, is the timescale sufficient? Is the frequency of workshops and coaching sessions right? Does the group size and composition enable the right type of learning to take place?
- How are participants selected/recruited? Do facilitators have the right level of skills and knowledge? How well is the programme administered? Are patients involved in the programme?

Outputs:
- Have target numbers been engaged, and does participation remain at a high level?
- What types of individual take part? What is their background and experience? Is the programme reaching its target audiences?
- How satisfied are participants with the programme?

Short term outcomes:
- Does participation lead to improvements in individual capability?
- Do participants begin to view fellow participants as a community of practice? To what extent is this further developed outside of the programme? Will it be sustained afterwards?
- What actions do participants take as a result of the programme? Are they able to set up workplace projects? Have they established baseline and metric measures?
- What new approaches, tools and learning are generated by the programme?
- Have participants been able to get buy-in from colleagues to support new approaches? Is there evidence that they have transferred knowledge and skills to others?

Longer term outcomes/impact:
- What new approaches to improving patient experience have been tried out? What has worked well and why? What has been more challenging?
- How has the action learning programme supported participants to succeed, and what other factors have influenced success?
- What difference have these made, or are they likely to make, to patients? Is this likely to lead to longer term change in organisations' ways of working?
OUR RESEARCH METHODS
A mixed-method approach, involving:
- Interviews (stakeholders)
- Observations (workshops and WebEx)
- Survey (participants)
- Case studies (projects)
- Evaluation support (secondary data)
The Plan
- January: Initial observations and planning
- February–March: Case study research part 1 and survey
- April–August: Analysis, interim report, observations
- September–October: Case study research part 2, survey
- October–November: Final report
2 YOUR HELP
Your input is vital!
- Case studies (10): March and October
- Workplace project evaluations (all)
- The survey: February and March (all)
THE NEXT WORKSHOP…
- Guidance on evaluating your projects
- Thinking about the logic model for your evaluations
- Identifying the link from outputs to outcomes
- Measurement and capturing learning
We will be delivering part of the March 2nd workshop.
3 A SMALL TASK...
BE PREPARED
Prepare some bullet points and be ready to discuss the following with regard to your projects:
- The problem you are trying to tackle (be specific!)
- What changes you want to see:
  - in the way you and your team work?
  - for patients?