OECD/INFE High-level Principles for the Evaluation of Financial Education Programmes
Adele Atkinson, PhD, OECD
With the support of the Russian/World Bank/OECD Trust Fund
Outline
- Terminology
- Motivation for focusing on evaluation
- The OECD/INFE High-level Principles
- Other OECD/INFE tools developed under the Trust Fund
Terminology
Monitoring and evaluating financial education programmes: what do we mean?
- Checking whether programme targets were met by monitoring inputs and outputs
- Keeping track of the day-to-day inputs and processes involved in delivering the education
- Assessing the outcomes and impact for participants
- Potentially, also analysing the cost-effectiveness of the programme
Motivation for focusing on evaluation
Widespread policy interest in the role of financial education requires answers to pressing questions:
- Does financial education work?
- What makes it work?
- How does it help consumers?
- When is consumer protection required?
Good evaluation allows policy makers to:
- Identify programmes to replicate
- Test different approaches
- See where fine-tuning could be useful
- Show that objectives are being met and reward staff
- Share experiences and learn from others
The challenges faced
46 authorities from 29 countries responded to an OECD/INFE request for information about the extent to which they were evaluating and the challenges they faced; 28 authorities in 23 countries had undertaken evaluations.
The most frequently faced challenges were also reported.
OECD/INFE High-level Principles
3 steps: planning, implementation, reporting
5 key principles (discussed in the following slides):
1. New programmes should be evaluated; try to also evaluate existing programmes
2. Include evaluation in the budget
3. Evaluation should be credible: consider an external evaluator or reviewer
4. Design the evaluation in accordance with the objectives and type of programme
5. Report what worked, and what didn't work
1. Programmes should be evaluated
New programmes:
- Develop a strategy for monitoring and evaluation alongside programme design
- Keep in mind the benefit of collecting information before the programme starts
All programmes:
- Encourage dialogue and collaboration with key stakeholders to ensure clarity and consistency of aims and objectives
- Reassure providers that evaluation is not designed to judge them, but to improve efficiency where appropriate and to identify successful programmes
Planning of FE programmes should include evaluation.
2. Include evaluation in the overall budget
How much money should you include for evaluation?
- Find out how much other evaluations have cost and gather estimates before finalising the programme budget
Managing the evaluation budget:
- When limited funds are available, prioritise certain aspects of evaluation
- Look for ways of reducing costs: e.g. sharing questionnaires, drawing on existing data, international methodology and contacts, and piloting programmes before large-scale roll-out
A good evaluation ensures that resources are being well spent: it is a wise expense!
3. Evaluation should be credible
External evaluators bring skills and independence.
Credibility can also be improved through:
- The use of technology: administrative systems and websites can provide objective data; electronic games can store scores and be used to measure improvement over time
- Well designed instruments: survey, test, interview or focus group questions should be based on good practice
- Corroboration of the findings through analysis of other sources of data, such as pension fund records, credit counselling services, etc.
4. Appropriate evaluation design
- Continuous monitoring: count/measure/quantify how many participants, hours of contact, leaflets distributed, etc.
- Measure change according to programme type and objectives: monitor improved awareness, evaluate behaviour change strategies, test knowledge
- Identify ways of attributing change: create a control group (lottery for places, random marketing of courses) according to programme design
- Undertake comparisons of: knowledge, behaviour and attitudes before vs. after (and long after); participants vs. non-participants; targets vs. achievements; budget vs. expenditure; opinions of providers vs. users
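The comparisons listed on this slide (before vs. after, participants vs. non-participants) can be illustrated with a small calculation. The following is a minimal, hypothetical sketch, not part of the OECD/INFE materials, using made-up test scores to show how a control group helps attribute change to the programme rather than to outside factors (a simple difference-in-differences estimate).

```python
# Hypothetical illustration of a before/after, participants vs. non-participants
# comparison: a simple difference-in-differences estimate of a programme's effect.
# All scores below are invented for the example.

def mean(values):
    """Average of a list of scores."""
    return sum(values) / len(values)

# Knowledge-test scores (0-100) before and after the programme.
participants_before = [42, 55, 48, 60, 51]
participants_after = [58, 70, 63, 72, 66]

control_before = [44, 53, 50, 58, 49]   # e.g. applicants who lost the lottery for places
control_after = [47, 56, 52, 60, 51]

# Change within each group.
participant_change = mean(participants_after) - mean(participants_before)
control_change = mean(control_after) - mean(control_before)

# Difference-in-differences: the change attributable to the programme,
# net of what would have happened anyway.
programme_effect = participant_change - control_change

print(f"Participants improved by {participant_change:.1f} points")
print(f"Control group improved by {control_change:.1f} points")
print(f"Estimated programme effect: {programme_effect:.1f} points")
```

In practice an evaluation would use larger samples and appropriate statistical tests; the sketch only shows the logic of comparing change in participants against change in a comparable control group.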
5. Reporting
Reporting is critical for the future of FE programmes.
- Avoid over-generalisation: get advice on whether findings may apply more widely
- Report the method and limitations of the evaluation
- Disseminate the findings widely: use different styles of reporting (newsletter, academic paper, ...)
- Draw on the findings when making future funding decisions and designing future programmes
- Compare your results to those of other programmes
Evaluation research and tools
- Fact-finding stocktake of programme evaluation amongst INFE members and framework for evaluation
- OECD/INFE set of criteria, principles, guidelines and policy guidance to improve financial education
Measurement and evaluation tools:
- OECD/INFE guides to evaluation
Questions, comments, further information: adele.atkinson@oecd.org
OECD/INFE: www.financial-education.org
Russian Trust Fund: www.finlitedu.org