EVALUATION FOR MODEL ADAPTATION WS3

Presentation transcript:

EVALUATION FOR MODEL ADAPTATION WS3 1st March 2016, Ljubljana With financial support from the DAPHNE programme of the European Commission

Evaluation approach
Two evaluation tasks in SAVE to be coordinated:
1. Evaluation of project implementation in WS3, led by LH9, including the assessment of:
- Quality of the capacity building (training)
- Quality of the model approach
- Cost-efficiency of the model
- Transferability
2. Experimentation assessment, coordinated by LH9 and collected by the partners responsible for experimentation:
- Outcome measures (evaluating the objectives reached by the SAVE model)
- Process measures (evaluating the mechanisms and conditions of success)

Evaluation approach
Based on these results, UVEG-Polibienestar will adapt and adjust the SAVE model according to the evaluation report issued in month 22, which draws conclusions from the previous evaluation results. Evaluation methodologies include:
- Qualitative evaluation, based on questionnaires and interviews with local decision makers, SAVE operators and stakeholders, in line with the evaluation of project implementation in WS3
- Quantitative data, collected by the project partners involved in experimentation through a survey of SAVE operators, measuring outcome and process indicators


Evaluation approach
Qualitative evaluation of project implementation in WS3 (example questions for questionnaires/interviews with SAVE operators):
I. Quality of the capacity building (training)
- Which aspects of the knowledge/training provided do you consider could have the greatest impact on your capacity to detect/prevent/manage child abuse in your daily work?
II. Quality of the model approach
- To what extent do the SAVE models cover the relevant elements in child abuse detection/prevention/management?
- Which additional elements would you incorporate into the model? Why?

Evaluation approach
Qualitative evaluation:
III. Cost-efficiency of the model
- Do you think the resources and time dedicated to adopting the SAVE model could be reduced? How?
- (For those using the ICT component) Have the ICT tools developed under the SAVE model been of interest to you? Why?
- Has it been efficient for you to participate in training using virtual tools?
IV. Transferability
- Which barriers have you encountered in applying the model to your local/regional context?
- Are there specific conditions limiting and/or facilitating the impact of the SAVE model? Could you please describe them?

Evaluation approach
Quantitative data collected by the project partners involved in experimentation through a survey of SAVE operators:
I. Outcome measures
- Number of cases detected (compared to baseline)
- Number of adults abused in childhood asking for case management or any support (even social)
- Knowledge/attitudes on violence prevention, detection and management (content test?)
- Self-efficacy perceived by SAVE operators (and any participant, including teachers, parents and possibly children) to detect, prevent and manage (based on Bandura's self-efficacy theory; existing scales or items could be used/adapted)
- Quality of the social services dedicated to victims in the case-management experimentation (Likert-scale questionnaire, compared to baseline)
- Awareness level of the external audience accessing SAVE resources (Likert scale)
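The baseline comparisons above can be sketched as a minimal aggregation script. This is an illustrative assumption, not part of the SAVE evaluation framework: the function name, indicator fields and figures are hypothetical, chosen only to show how detected cases and Likert-scale quality scores might be summarised against a baseline.

```python
# Hypothetical sketch: summarising SAVE-style outcome indicators against a
# baseline. All names and numbers are illustrative, not project data.
from statistics import mean

def outcome_summary(baseline_cases, detected_cases, likert_quality):
    """Compare detected cases to the baseline and average 1-5 Likert scores."""
    change = detected_cases - baseline_cases
    return {
        "cases_detected": detected_cases,
        "change_vs_baseline": change,
        "pct_change": round(100 * change / baseline_cases, 1),
        "mean_service_quality": round(mean(likert_quality), 2),
    }

summary = outcome_summary(baseline_cases=20, detected_cases=27,
                          likert_quality=[4, 5, 3, 4, 4])
print(summary)
```

In practice each indicator would be computed per partner country and per experimentation phase, so the baseline figures would come from the pre-experimentation measurement.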

Evaluation approach
Quantitative data collected by the project partners involved in experimentation through a survey of SAVE operators:
II. Process measures
Variables describing the characteristics of the experimentation in each country:
- Modules applied (prevention, detection or management)
- Duration of training (hours), participants per group (N) and type (virtual/physical)
- Percentage per type of participant (social services professionals, parents, teachers, psychologists, children)
- Sociodemographic variables of the participants (age, gender, education, etc.)
- Resources invested (cost/time/assets)
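The participant-type percentages listed above could be tabulated from raw survey records along these lines; a hedged sketch, with hypothetical function name and data, not the project's actual instrument.

```python
# Hypothetical sketch: percentage breakdown of participant types in an
# experimentation group, from a flat list of survey records.
from collections import Counter

def participant_breakdown(participants):
    """Return the percentage of each participant type, rounded to 0.1%."""
    counts = Counter(participants)
    total = len(participants)
    return {ptype: round(100 * n / total, 1) for ptype, n in counts.items()}

group = ["teacher", "teacher", "parent", "social worker", "psychologist",
         "teacher", "parent", "parent"]
breakdown = participant_breakdown(group)
print(breakdown)
```

The same per-record list could also carry the sociodemographic variables (age, gender, education), which would be aggregated the same way.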

Evaluation approach
NEXT STEP
Evaluation framework elaboration (questionnaires/items)
EVALUATION TIMELINE
- Pre-experimentation (before SAVE implementation)
- In-experimentation (mid-term evaluation within the implementation process)
- Post-experimentation (just after the end of SAVE implementation)
- Follow-up (outside the project lifespan)

Thank you!