» Purpose
˃ Share the draft evaluation framework and get input from the broader forum
˃ Share some of the preliminary evaluation findings
» Consider:
˃ What would the coalition like to know (from the M&E) that, if it knew it, would help the coalition do a better job?
˃ Do the proposed framework and results get you there?
˃ Why don't you jot a question or two down before you forget it?
» The M&E is key to the Coalition's sustainability, long-term growth and continued relevance to the learning community it serves.
» It will essentially address three key questions:
˃ How is the Coalition performing?
˃ How can the Coalition be strengthened?
˃ How is the Coalition planning for sustainability?
» Decide purpose
» Specify key evaluation questions
» Determine which criteria will be considered to define "success"
(www.betterevaluation.org)
» Learning: It is anticipated that the M&E will be used by the coalition's project manager and key decision makers to inform its future work and improve its practice.
» Accountability: To prove to its donors that the resources provided to the organization are spent well and are delivering good results.
» Building the knowledge base: To supplement the management of Bridge's knowledge base about how communities of practice like this one can function optimally.
» It is important to consider the relevance, effectiveness, impact, budget and timeline compliance, and sustainability of the Coalition (Organisation for Economic Co-operation and Development – Development Assistance Committee criteria).
» If the coalition is successful we would expect:
˃ People to react positively
˃ People to learn something (skills, knowledge, values and attitudes)
˃ People to change their individual behaviours
˃ Organizations to start transforming
(Donald Kirkpatrick, the Strategic Training Model)
Relevance is the extent to which the coalition and its activities are suited to the priorities and policies of the target group, recipient and donor.
» How have the coalition's objectives and activities changed since inception, and why?
» To what extent are the objectives of the coalition still valid?
» Are the key activities and outputs of the coalition still relevant to the overall goals of the coalition?
» Are the key activities and outputs of the coalition still relevant to the intended
˃ impacts and effects influencing policy and practice (finding a solution to the education crisis)?
˃ impacts in schools?
POSSIBLE EVALUATION METHODS: Interviews / survey with key stakeholders and thematic analysis. Review of documented objective statements over time.
Effectiveness is a measure of the extent to which the coalition attains its objectives.
» To what degree did the coalition support its membership base to achieve and maintain the coalition's values / minimum criteria?
» To what extent were the planned objectives achieved / are likely to be achieved?
» What were the major factors influencing the achievement or non-achievement of the objectives?
POSSIBLE EVALUATION METHODS: Outlined in relation to each of the aspects contained in the logical framework.
ACTIVITIES
» 4 x National workshops per year where each of the five work streams meet

OUTPUTS
» Seventeen participating schools attend 4 x national workshops (30 – 50 participants)
» 2011 – 2012: All participating schools send at least one participant to each of the meetings
» 2013: All participating schools send at least two participants (a leader and an instructional practitioner) to all national workshops
» By August 2012: Involvement of education department stakeholders and education agencies in national workshops

OUTCOMES
» Satisfaction: Members report that the coalition provides a safe space for difficult conversations (trust) (members survey)
» Attitudes: Members feel that the coalition members have a common purpose (analyse meeting notes, members survey)
» Knowledge: Individual participants and member schools as a whole shared and learnt new working practices from one another (analyse meeting notes for evidence of sharing, survey asking for concrete examples rated on a rubric)
» Behaviour: New working practices adopted by participating individuals and schools as a whole (survey asking for concrete examples rated on a rubric)

IMPACTS
» More coalition schools can show strong evidence of maintaining criteria for membership and implementation of principles of excellent practice (annual reports from schools on criteria, stories / case studies for principles)
» Schools improve performance on reported peer review results over time, on an approximately 3-year cycle (peer review reports, where at least two are available for each school)
Impact covers the positive and negative changes produced by the coalition, directly or indirectly, intended or unintended.
» What were the intended / unintended changes that the coalition contributed to at different levels?
» How have individuals and member organizations used the gains from the coalition meetings?
» How many people have been affected?
˃ Participation in meetings / peer review / social media
˃ Number of children in Coalition schools, number of success cases in Coalition schools
POSSIBLE EVALUATION METHODS: Monitoring records on participation and school enrolment figures. Social media monitoring (Twitter, Facebook, Bridge online). Interviews / survey questions.
A consideration of the actual costs required to run the coalition in addition to the budgeted grant amounts, and an indication of the extent to which the coalition met timelines.
» What financial and non-financial contributions did the coalition leverage for its work over time? (e.g. core donor funding versus inputs from schools versus funding from other donors)
» In which ways did the coalition contribute towards using the financial and other contributions optimally (i.e. did the cost of participation go up or down over time)?
» Did the coalition meet key project milestones timeously and, if not, why not?
» Were the meeting outputs made available to all participants timeously and, if not, why not?
POSSIBLE EVALUATION METHODS: Cost analysis, verification of milestone analysis.
Sustainability is concerned with measuring whether the benefits of the coalition are likely to continue after donor funding has been withdrawn.
» Is the coalition still valuable enough to keep participants involved?
» Are the networks that were established significant enough to sustain future maintenance of the coalition?
» Does the coalition have a concrete plan to sustain its activities should current funding cease? (e.g. are network champions developed?)
» Is the model of the coalition scalable / replicable?
» What are the risks to sustainability?
POSSIBLE EVALUATION METHODS: Stories / survey, key informant interviews.
The Coalition is generally effective in delivering the anticipated level of outputs.
» Meetings
˃ The coalition managed to meet as frequently as anticipated in 2012
˃ More than 80% of member schools generally participate in national meetings, which is a good indication of engagement
˃ Representation of the DBE in national meetings has not yet been realised
» Training
˃ The coalition hosted an international training session for peer reviewers, as well as local pre-review training sessions. The number of people affected has not yet been determined
» Peer review
˃ The coalition has not yet managed to meet its target for the number of peer-review meetings for 2012, but is relatively close to the target
» Sharing platforms
˃ There is growth in the uptake of the online platforms
» National workshops – aim for 4 national workshops per year with 30 – 50 participants from the participating schools
˃ 2011:
+ Had two meetings and a launch = 3
+ Average number of schools represented = 14.5 (80.5%)
+ Average number of participants = 22.5
+ No representation yet from the Education Department
˃ 2012:
+ Had three meetings excluding this one = 4 for the year
+ Average number of schools represented = 17.3 (91%)
+ Average number of participants = 25.3
+ No representation yet from the Education Department
» Regional workshops – none yet realised, planned for 2013
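The attendance percentages above come from dividing the average number of schools represented by the size of the membership base. A minimal sketch of that arithmetic, noting that the deck does not state the denominators: the membership figures of 18 schools (2011) and 19 schools (2012) are assumptions chosen because they approximately reproduce the reported percentages (the 2011 result rounds to 80.6% rather than the reported 80.5%, suggesting slightly different rounding in the source).

```python
def attendance_rate(avg_schools_represented: float, member_schools: int) -> float:
    """Average schools represented as a percentage of member schools,
    rounded to one decimal place."""
    return round(avg_schools_represented / member_schools * 100, 1)

# Assumed membership base: 18 schools in 2011, 19 in 2012 (not stated in the deck).
print(attendance_rate(14.5, 18))  # 2011: 80.6 (deck reports 80.5%)
print(attendance_rate(17.3, 19))  # 2012: 91.1 (deck reports 91%)
```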
» Peer review training – aim for 8 people from 7 schools to be trained in New York in 2012; run one 5.5-hour training session prior to each school review
˃ 2012:
+ 8 people from 7 schools trained in New York
+ 6 days of pre-review training conducted
» School peer reviews – aim for 7 – 8 schools per year visited for 2 – 3 days by coalition peer educators
˃ 2012:
+ Four completed, two confirmed and one cancelled
» Social media
˃ Growth in all three platforms – the portal, Facebook and Twitter
˃ Usefulness of the social platforms for advocacy and knowledge management still to be investigated
» Most participants (i.e. those from schools, partners and donors) in the coalition feel that the objectives of the coalition are still very relevant to their organization and to them personally.
» The Coalition is seen as very relevant because it provides a possible alternative answer: "Education in SA remains in crisis and schools like ours are there to deliver quality education to the marginalised."
» Participants are also satisfied that the activities are still relevant to impacting the national debate and improving in-school performance.
» What a donor says: "We are in the process of helping build this sector of schooling."
» What a partner says: "They align with our goals."
» What a school says: "These are issues that MATTER to the national debate about education reform. Too many children are being failed by a system packed with unaccountable adults, rather than a collective of professionals who make themselves responsible for raising achievement."
» National meetings and the peer review process are seen as very relevant
» What a school says: "I have been directly involved in peer review process and I have seen how it positively impacted on schools. Schools in the coalition are aiming at establishing a high quality teaching and learning, better leadership skills and conducive culture and ethos of schools. This is achievable because the peer review process advises and supports schools on how they can improve on these aspects."
» But care should be taken to
˃ foster more interaction – possibly by facilitating smaller regional meetings, so that more teachers can engage, and
˃ also to publish, learn from and advocate consistently
» Most participants indicate that the coalition is very valuable and that they would like to sustain their level of involvement – time permitting
˃ People especially value inputs about peer review and "teaching like a champion", but also about data-informed decision-making, approaches to continuing professional development, instructional leadership, etc.
˃ People feel motivated by interactions with other like-minded organizations
» The financial support from donors makes participation possible for some organizations that would not otherwise be able to take part, and provides for a project manager who can coordinate the Coalition
» Financial support is the key concern for future sustainability: although some participants indicate that they might be willing to pay an affiliation fee, provide a venue or other in-kind support and/or carry some costs (e.g. travel and accommodation), others indicate that they are operating on the "smell of an oil rag"
» What people say:
"The collegial links and the peer review mechanism on its own would be sufficient reason. For our school we have many association links, all of which bring value, even though they take up much time and at times I wonder if we can contribute at optimum level to all of them. The SAESC link would be one of the strongest for us to retain if we ever thought of rationalising our involvements."
"It has contributed to a transformed space at our school. We have focused more on instruction and data driven instruction in particular to target students and lift their performance. The peer reviews have been invaluable – both to participate in and receive. The value of constructive criticism and fresh insight has been very helpful. It has been the most helpful collegial space I have ever worked in over 36 years."