CAST Project, funded by the European Commission, Directorate-General Energy & Transport, under the 6th RTD Framework Programme. CAST Plenary Meeting, 30 September 2008, Oslo.

Overview of the presentation
1. Objective of WP2
2. Deliverables of WP2: D2.1 (results), D2.2 (results), D2.3, D2.4
3. Conclusion

1. OBJECTIVES OF WP2
Main research question: how can the effects of campaigns be measured and reported effectively?
Development of two practical tools:
– Evaluation tool
– Reporting tool: guidelines for reporting campaign results in a standardised way

2. DELIVERABLES OF WP2
D2.1 Typology of evaluation methods → finalised
D2.2 Comparison of research designs → finalised
D2.3 Evaluation tool → project month 36
D2.4 Reporting tool → project month 36

D2.1: TYPOLOGY OF EVALUATION METHODS
An investigation of how road safety campaigns (RSC) in Europe and beyond have been implemented and evaluated.
OBJECTIVE
– Typology of road safety campaigns, with respect to theme, target group, objectives, media plan, etc.
– Inventory of evaluation methodologies, with respect to research design, measurement variables, and data collection methods and techniques
RESULT
– Current state of the art regarding RSC and their evaluation
– Strengths and weaknesses of current evaluation reports
– Identification of the relevant attributes of RSC that have significant implications for evaluation

D2.1: TYPOLOGY OF EVALUATION METHODS
RESULT – current state of the art of RSC and their evaluation:
– national campaigns lasting up to one month and forming part of a long-term strategy
– general themes such as speeding, seat-belt use and intoxicated driving
– frequent media channels: television, radio advertising, billboards, free press and the internet
– message appeal described as informative, emotional and confronting
– integrated campaigns
EVALUATION:
– single-group evaluation designs, with either one or multiple measurements
– self-reported or observational data
– frequently measuring both the reach/recall and the effectiveness of the campaign

D2.1: TYPOLOGY OF EVALUATION METHODS
RESULT – strengths and weaknesses of current evaluation reports
STRENGTHS:
– Theme, approach, media channels used and accompanying activities are mentioned
– Detailed description of the target group
– Most campaigns are evaluated in terms of both reach/recall (reception of the campaign) and effectiveness (measurement) variables

D2.1: TYPOLOGY OF EVALUATION METHODS
WEAKNESSES:
– the exact running period/duration of the campaign is missing
– objectives are not clearly defined
– objectives are not always clearly separated between the main groups, such as knowledge, awareness, attitude, behaviour and accident rate
– media production costs and evaluation costs are hardly ever known
– information about the accompanying activities of an integrated campaign is often missing
– measurement variables are often not consistent with the pre-set objectives
– cost-benefit and/or cost-effectiveness analyses are seldom carried out

D2.1: TYPOLOGY OF EVALUATION METHODS
RESULT – identification of attributes with significant implications for evaluation:
– Scope
– Target group
– Objectives
– Accompanying activities
– A-priori knowledge
Each relevant attribute has its own implications for the evaluation; the combination of implications determines the appropriate evaluation methodology for a campaign → Deliverable 2.3.

D2.2: COMPARISON OF RESEARCH DESIGNS
A listing of ALL measurement variables, research designs, and data collection methods and techniques that are at least theoretically usable for a summative evaluation.
AIM: comparison of their merits and weaknesses for measuring the effect of a media or integrated RSC
RESULT: introduction of the appropriate evaluation methodology for different campaign types

D2.2: COMPARISON OF RESEARCH DESIGNS
Classification of evaluation types:
1. Formative evaluation – before implementation
2. Summative evaluation – effect (were the goals reached?) and reach/recall
3. Economic evaluation – cost-benefit analysis (CBA) and cost-effectiveness analysis (CEA); a worked sketch follows below
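To make the CBA/CEA distinction concrete, here is a minimal Python sketch. All figures (costs, prevented accidents, value per accident) are hypothetical illustrations, not CAST data, and the deliverables do not prescribe this code.

    def benefit_cost_ratio(monetised_benefits: float, total_costs: float) -> float:
        """CBA: effects and costs both expressed in money; a ratio above 1 suggests a net benefit."""
        return monetised_benefits / total_costs

    def cost_effectiveness_ratio(total_costs: float, effect_units: float) -> float:
        """CEA: cost per unit of non-monetised effect, e.g. per prevented accident."""
        return total_costs / effect_units

    # Hypothetical campaign: EUR 400,000 total cost, an estimated 12 prevented
    # accidents, each valued at EUR 50,000 in avoided damage.
    costs, prevented, value_each = 400_000.0, 12, 50_000.0
    print(f"BCR: {benefit_cost_ratio(prevented * value_each, costs):.2f}")           # 1.50
    print(f"CEA: EUR {cost_effectiveness_ratio(costs, prevented):,.0f} / accident")  # EUR 33,333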

D2.2: COMPARISON OF RESEARCH DESIGNS
Evaluation methodology:
– Measurement variable(s)
– Research design
– Data collection method(s)
– Data collection techniques
Every component is presented and discussed in detail.

D2.2: COMPARISON OF RESEARCH DESIGNS
Measurement variables: assessment of reach and effectiveness.
Three groups of measurement variables:
– Self-reported measures:
  – reach, recognition, recall, likeability, comprehension
  – social-cognitive variables and behaviour
– Observed behaviour
– Changes in accident statistics
! It is important to assess the right variable in relation to the pre-set objectives of the campaign (see the sketch below).
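As a small illustration of matching variables to pre-set objectives, here is a hypothetical helper; the mapping, names and function are illustrative assumptions, not part of the CAST tool.

    # Hypothetical mapping from a pre-set campaign objective to the matching
    # group of measurement variables from the slide above.
    OBJECTIVE_TO_VARIABLES = {
        "awareness": ["reach", "recognition", "recall", "likeability", "comprehension"],
        "attitude": ["social-cognitive variables (self-reported)"],
        "behaviour": ["self-reported behaviour", "observed behaviour"],
        "accident rate": ["changes in accident statistics"],
    }

    def variables_for(objective: str) -> list[str]:
        """Return the measurement variables matching a pre-set objective, if known."""
        return OBJECTIVE_TO_VARIABLES.get(objective.lower(), [])

    print(variables_for("behaviour"))  # ['self-reported behaviour', 'observed behaviour']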

D2.2: COMPARISON OF RESEARCH DESIGNS
Research designs: experimental and quasi-experimental. Each design is characterised by:
– Aim
– Inputs/outputs
– Threats to validity
– Applicability for evaluation
Selecting a design means balancing:
– rigour against applicability
– costs
– other practical issues
(a before–after sketch follows below)
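A minimal sketch of the most common quasi-experimental case noted in D2.1, a single-group before–after comparison on observed behaviour, using only the Python standard library; all counts are hypothetical.

    from statistics import NormalDist

    # Hypothetical roadside observations of seat-belt use, before and after the campaign.
    before_yes, before_n = 412, 1000
    after_yes, after_n = 468, 1000

    p1, p2 = before_yes / before_n, after_yes / after_n
    pooled = (before_yes + after_yes) / (before_n + after_n)
    se = (pooled * (1 - pooled) * (1 / before_n + 1 / after_n)) ** 0.5
    z = (p2 - p1) / se                            # two-proportion z-test
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

    print(f"before {p1:.1%} -> after {p2:.1%}; z = {z:.2f}, p = {p_value:.4f}")
    # Even a significant change cannot be attributed to the campaign alone:
    # without a comparison group, history and seasonality remain threats to validity.

The closing comment is the rigour/applicability trade-off in code form: a single-group design is cheap and widely applicable, but weak on internal validity.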

D2.2: COMPARISON OF RESEARCH DESIGNS
Data collection methods:
– Asking: questionnaires, interviews, key informants, focus groups
– Observing: on-site observation
– Document analysis: statistics
Each method has its advantages and disadvantages → the methods are complementary.

D2.2: COMPARISON OF RESEARCH DESIGNS
CONCLUSION
– Each RSC is different → each needs its own appropriate evaluation methodology
– All evaluation elements are related:
  – the data collection method defines the data collection techniques and the measurement variables
  – the measurement variables also depend on the aim/objectives of the campaign
– Isolating the effect of an integrated campaign → choose a proper design (a sketch of one approach follows below)
– Comparison between different phases and elements of the campaign:
  – before–after
  – multiple intervention groups/periods
– And in practice?
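One common way to approximate the isolated effect of a campaign is a difference-in-differences comparison against an untreated region. This is a generic sketch with hypothetical proportions, not a method mandated by D2.2.

    # Share of speeding drivers, observed before and after the campaign,
    # in the campaign region and in an untreated comparison region (hypothetical).
    before = {"campaign": 0.41, "comparison": 0.43}
    after = {"campaign": 0.33, "comparison": 0.41}

    change_campaign = after["campaign"] - before["campaign"]
    change_comparison = after["comparison"] - before["comparison"]

    # The comparison-region change absorbs shared trends (weather, enforcement,
    # seasonality); the remainder estimates the campaign's own effect.
    did_estimate = change_campaign - change_comparison
    print(f"estimated campaign effect: {did_estimate:+.2%} points")  # -6.00% points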

D2.3: EVALUATION TOOL
AIM
– A practical tool to help researchers/practitioners evaluate a single campaign
– A best-practice manual tailored to the characteristics of the campaign
RESULT: finalisation and adaptations according to the review comments from the workshops and WP4 will be discussed during the technical WP2 meeting in Oslo (1-2 October 2008).

D2.3: EVALUATION TOOL
Important attributes:
– Scope: the scale on which effectiveness is measured; implications for e.g. the research design
– Objectives: clearly defined objectives can serve as criteria for the campaign's success or failure; implications for the data collection techniques and measurement variables
– Target group: needs clear specification; implications for the choice of research design
– Accompanying activities: raise the issue of how to isolate the effects of the media campaign
– A-priori knowledge: can be used as a before measurement

D2.3: EVALUATION TOOL
Important attributes (continued):
– Theme: influences the evaluation questions, but the objectives are far more determinative
– Message appeal: an important factor in explaining success or failure, but not crucial from the evaluation perspective
– Media coverage: the choice depends on the objectives and the target group; influences the evaluation questions and the choice of measurement variables

D2.3: EVALUATION TOOL
CAST workshops 2008 – sessions on the evaluation tool:
– General remarks regarding campaign evaluation
– Expectations of an evaluation tool
– Possibilities and/or restrictions in implementing the CAST evaluation methodology
– Opposing propositions regarding:
  – general guidelines or a specific standardised questionnaire?
  – what to measure?
  – the feasibility and necessity of isolating the effects of an integrated campaign
  – minimum criteria or best practice?

D2.3: EVALUATION TOOL
Comments from the CAST workshops 2008:
– The tool should contain a minimum tool: minimum standards for evaluation
– Two tools: one to define campaign objectives and one for the evaluation itself
– Simple to use and cost-efficient
– How can you learn from the evaluation results? → Checklist!
– Message: evaluations are important; reasons to convince people…
– Marketing of the tool?

D2.3: EVALUATION TOOL
Comments from the CAST workshops 2008 (continued):
– Ready-to-use questions with specific examples
– Necessity of a profound situational analysis to identify the relevant concepts
– A compulsory and an optional list of measurement variables
– Clear statements on whether measuring the isolated effect of the media campaign itself is feasible!
– What is the effect of enforcement alone? (PEPPER?)
– Several recommendations for several types of campaign (a single best practice is not possible), with plenty of examples!

D2.3: EVALUATION TOOL
Comments from WP4? → Report on Wednesday 22 October 2008

D2.4: REPORTING TOOL
AIM
– Guidelines for fieldworkers and researchers for reporting the effects of a single campaign in a standardised way
– A template for writing down the working method and the results of the evaluation
RESULTS
– Document updated in December 2007
– The structure will be linked to D2.3
– Finalisation and adaptations according to the WP4 review will be discussed during the technical WP2 meeting in Oslo (1-2 October 2008)

3. CONCLUSION
– Comments from the workshops: OK
– Comments from WP4: 22 October 2008
– Mid-December: deadline for the review
→ A lot of planning and work still to do!