EVALUATION FOR MODEL ADAPTATION WS3

1 EVALUATION FOR MODEL ADAPTATION WS3
1st March 2016, Ljubljana
With financial support from the DAPHNE programme of the European Commission

2 Evaluation approach
Two different evaluation tasks in SAVE need to be coordinated:
1. Evaluation of project implementation in WS3, led by LH9, including the assessment of:
- Quality of the capacity building (training)
- Quality of the model approach
- Cost-efficiency of the model
- Transferability
2. Experimentation assessment, coordinated by LH9 and collected by the partners responsible for experimentation:
- Outcome measures (evaluating the objectives reached by the SAVE model)
- Process measures (evaluating the mechanisms and conditions of success)

3 Evaluation approach
With these results, UVEG-Polibienestar will adapt and adjust the SAVE model according to the evaluation report issued in month 22, which will draw conclusions from the previous evaluation results.
Evaluation methodologies include:
- Qualitative evaluation based on questionnaires and interviews with local decision makers, SAVE operators and stakeholders, according to the evaluation of project implementation in WS3
- Quantitative data collected by the project partners involved in experimentation through a survey of SAVE operators, measuring outcome and process indicators

5 Evaluation approach
Qualitative evaluation of project implementation in WS3 (e.g. questions for questionnaires/interviews with SAVE operators):
I. Quality of the capacity building (training)
- Which aspects of the knowledge/training provided do you consider could have the highest impact on your capacity to detect/prevent/manage child abuse in your daily work?
II. Quality of the model approach
- To what extent does the SAVE model cover the relevant elements of child abuse detection/prevention/management?
- Which additional elements would you incorporate into the model? Why?

6 Evaluation approach
Qualitative evaluation (continued):
III. Cost-efficiency of the model
- Do you think the resources and time dedicated to adopting the SAVE model could be reduced? How?
- (For those using the ICT component) Have the ICT tools developed under the SAVE model been of interest to you? Why?
- Has it been efficient for you to participate in training using virtual tools?
IV. Transferability
- Which barriers have you found in applying the model to your local/regional context?
- Are there specific conditions limiting and/or facilitating the impact of the SAVE model? Could you please describe them?

7 Evaluation approach
Quantitative data collected by the project partners involved in experimentation through a survey of SAVE operators
I. Outcome measures
- Number of cases detected (compared to baseline)
- Number of adults abused in childhood asking for case management or any support (even social)
- Knowledge/attitudes on violence prevention, detection and management (content test?)
- Self-efficacy perceived by SAVE operators (and any participant, including teachers, parents and possibly children) to detect, prevent and manage abuse (based on Bandura's theory; existing scales or items could be used/adapted)
- Quality of the social services dedicated to victims in the case-management experimentation (Likert-scale questionnaire compared to baseline)
- Awareness level of the external audience accessing SAVE resources (Likert scale)
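A minimal sketch of how these outcome indicators could be aggregated from an operator survey, assuming a pandas DataFrame with hypothetical column names (cases_detected, adult_support_cases, knowledge_score, self_efficacy, service_quality, audience_awareness); the names, scale ranges and the pandas-based approach are illustrative assumptions, not part of the SAVE evaluation framework itself.

```python
# Illustrative sketch only: column names and scale ranges are assumptions,
# not prescribed by the SAVE evaluation framework.
import pandas as pd


def outcome_summary(responses: pd.DataFrame, baseline_cases: int) -> dict:
    """Aggregate outcome indicators from a survey of SAVE operators.

    Assumed columns:
      cases_detected      - cases detected during the experimentation
      adult_support_cases - adults abused in childhood asking for support
      knowledge_score     - content-test score (0-100)
      self_efficacy       - self-efficacy item, 1-5 Likert
      service_quality     - perceived quality of social services, 1-5 Likert
      audience_awareness  - awareness of the external audience, 1-5 Likert
    """
    total_cases = int(responses["cases_detected"].sum())
    change_pct = (
        100.0 * (total_cases - baseline_cases) / baseline_cases
        if baseline_cases
        else float("nan")
    )
    return {
        "cases_detected": total_cases,
        "change_vs_baseline_pct": change_pct,
        "adult_support_requests": int(responses["adult_support_cases"].sum()),
        "knowledge_score_mean": float(responses["knowledge_score"].mean()),
        "self_efficacy_mean": float(responses["self_efficacy"].mean()),
        "service_quality_mean": float(responses["service_quality"].mean()),
        "audience_awareness_mean": float(responses["audience_awareness"].mean()),
    }
```

The same summary could be computed per country or per evaluation phase (pre/in/post-experimentation) to support the comparison against baseline mentioned above.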

8 Evaluation approach
Quantitative data collected by the project partners involved in experimentation through a survey of SAVE operators:
II. Process measures
Variables describing the characteristics of the experimentation in each country:
- Modules applied (prevention, detection or management)
- Duration of training (hours), participants per group (N) and type (virtual/physical)
- Percentage of participants per type (social services professionals, parents, teachers, psychologists, children)
- Sociodemographic variables of the participants (age, gender, education, etc.)
- Resources invested (cost/time/assets)
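As a sketch, the process measures above could be captured as one record per country; the dataclass below and its field names are illustrative assumptions only, not a structure defined by the SAVE model.

```python
# Illustrative sketch: field names are assumptions, not prescribed by the SAVE model.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ExperimentationRecord:
    """Process measures describing one country's experimentation."""
    country: str
    modules_applied: List[str]        # e.g. ["prevention", "detection", "management"]
    training_hours: float             # duration of training
    participants_per_group: int       # N per group
    training_type: str                # "virtual" or "physical"
    participant_share: Dict[str, float] = field(default_factory=dict)  # % per participant type
    age_mean: float = 0.0             # sociodemographics (summarised)
    gender_share: Dict[str, float] = field(default_factory=dict)
    cost_invested: float = 0.0        # resources: cost
    staff_time_hours: float = 0.0     # resources: time
```

A list of such records, one per experimentation site, would make the conditions of each implementation directly comparable when interpreting the outcome measures.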

9 Evaluation approach
NEXT STEP
- Elaboration of the evaluation framework (questionnaires/items)
EVALUATION TIMELINE
- Pre-experimentation (before SAVE implementation)
- In-experimentation (mid-term evaluation within the implementation process)
- Post-experimentation (just after the end of SAVE implementation)
- Follow-up (beyond the project lifespan)

10 Thank you!

