CONFERENCE EVALUATION PLANNING Good planning is essential! ("Failing to plan is planning to fail.")

KEY SOURCES OF INFORMATION TO DESIGN A CONFERENCE EVALUATION PLAN
- Conference objectives and expected outcomes
- Conference information available to the public
- Key documents disseminated to a restricted list of people involved in the conference planning & organization (e.g. concept notes, meeting reports, etc.)
- Reports of, and lessons learnt from, previous conference evaluations
- Consultations with as many stakeholders as possible, including committee members, conference organizers, etc.

KEY THINGS TO KEEP IN MIND WHEN DESIGNING A CONFERENCE EVALUATION PLAN
Who? The plan should preferably be designed by the evaluation team leader, in consultation with key stakeholders, and eventually endorsed by the highest conference governing body.
When? The plan should be prepared well in advance and approved at least 3 months before the conference starts, although this depends on the evaluation scope (e.g. some evaluations include an assessment of pre-conference events and processes that can start 6 months before the actual conference).
What? The plan should be a comprehensive document, easy to disseminate by email, and should include the following…

KEY ELEMENTS TO INCLUDE IN A CONFERENCE EVALUATION PLAN
Why? Define the main goal of the evaluation, its objectives (these can be formulated as key questions to be answered through the evaluation) and its focus (processes, outcomes, impacts or all three). Identify who will use the evaluation results and how. This should also include background information on the conference to be evaluated.
How? Describe all the methods/tools that will be used (their selection is guided by the evaluation objectives and depends on what is achievable with the available human resources and within the timeframe). A good mix of quantitative and qualitative methods is needed to ensure data triangulation (see the sketch below). Describe how the data will be analyzed.
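To make triangulation concrete, here is a minimal Python sketch that cross-checks a quantitative measure (mean satisfaction from a delegate survey) against a qualitative one (sentiment codes from interview notes). The file and column names ("survey.csv"/"satisfaction", "interview_codes.csv"/"sentiment") are hypothetical placeholders, not part of any standard evaluation toolkit.

```python
# Minimal triangulation sketch: compare a quantitative survey score with
# qualitative interview coding. File and column names are hypothetical.
import csv

with open("survey.csv", newline="") as f:
    scores = [int(row["satisfaction"]) for row in csv.DictReader(f)]  # 1-5 Likert
survey_mean = sum(scores) / len(scores)

with open("interview_codes.csv", newline="") as f:
    sentiments = [row["sentiment"] for row in csv.DictReader(f)]  # positive/neutral/negative
positive_share = sentiments.count("positive") / len(sentiments)

print(f"Survey mean satisfaction: {survey_mean:.2f} / 5")
print(f"Interviews coded positive: {positive_share:.0%}")
# Divergence between the two sources (e.g. high survey scores but few
# positive interview codes) flags areas the evaluation should probe further.
```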

KEY ELEMENTS TO INCLUDE IN A CONFERENCE EVALUATION PLAN (cont.)
Resources & budget: Describe the human, physical and financial resources required to conduct the evaluation, and specify the main responsibilities of each evaluation team member. The budget should be as detailed as possible.
Timeframe: Gantt chart or timeline with the main deadlines/targets (a sketch follows).
Limitations: Describe the limitations of the evaluation.
Appendix (examples):
- List of persons consulted
- List of indicators and the methods used to measure them
- List of all planned surveys/interviews, specifying for each the main focus, target group and administration period
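For the timeframe element, a Gantt-style chart can be drawn with a few lines of matplotlib. A minimal sketch follows; all task names and dates are invented placeholders to be replaced with the actual evaluation milestones.

```python
# Minimal Gantt-style timeline sketch; tasks and dates are placeholders.
import matplotlib.pyplot as plt
import matplotlib.dates as mdates
from datetime import date

tasks = [  # (task, start, end)
    ("Design evaluation plan",     date(2011, 1, 10), date(2011, 2, 14)),
    ("Plan approval",              date(2011, 2, 14), date(2011, 2, 16)),
    ("Pre-conference surveys",     date(2011, 2, 21), date(2011, 3, 11)),
    ("Conference data collection", date(2011, 3, 16), date(2011, 3, 18)),
    ("Analysis and reporting",     date(2011, 3, 21), date(2011, 5, 2)),
]

fig, ax = plt.subplots()
for i, (name, start, end) in enumerate(tasks):
    ax.barh(i, (end - start).days, left=mdates.date2num(start))
ax.set_yticks(range(len(tasks)))
ax.set_yticklabels([name for name, _, _ in tasks])
ax.invert_yaxis()  # first task at the top, as in a conventional Gantt chart
ax.xaxis.set_major_formatter(mdates.DateFormatter("%d %b"))
fig.tight_layout()
fig.savefig("evaluation_timeline.png")
```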

EXAMPLES OF METHODS TO COLLECT DATA
- Face-to-face or phone individual interviews (structured & semi-structured)
- Focus group interviews
- Online surveys
- Printed surveys
- Structured observations of key sessions and conference areas
- Review of the conference programme and online resources
- Review of statistical data on conference registration, scholarship recipients, abstracts, etc.
- Review of statistical data and evaluation findings from previous conferences to allow comparison over time (see the sketch below)
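Comparison over time can start with simple aggregation of registration statistics by year. The pandas sketch below assumes a hypothetical file "registrations.csv" with columns "year", "region" and "delegates"; the structure of real conference data will differ.

```python
# Minimal sketch comparing registration statistics across conference years.
# "registrations.csv" and its columns are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("registrations.csv")

# Total delegates per year, to track growth or decline between conferences.
print(df.groupby("year")["delegates"].sum())

# Regional breakdown as a share of each year's total, to see whether the
# mix of participants is shifting over time.
shares = df.pivot_table(index="year", columns="region",
                        values="delegates", aggfunc="sum")
print(shares.div(shares.sum(axis=1), axis=0).round(2))
```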

EXAMPLES OF METHODS TO COLLECT DATA (cont.)
- Use of rapporteurs to follow sessions addressing key topics; their feedback can also be used to measure some indicators (e.g. number of sessions presenting new findings).
- Analysis of the conference media coverage.
- Review of posts and comments left by delegates and non-attendees on the conference blog, Facebook page and Twitter.
- "Mystery shopper" approach to test registration or other conference services.
- Instant polling/feedback at the conference using SMS/phones, voting systems and smart devices (a tallying sketch follows).
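Instant polling typically yields a raw stream of votes that must be tallied before it can be reported. A minimal Python sketch is below, assuming a hypothetical file "poll_responses.txt" with one vote (A, B, C, …) per line; many voting systems can export something similar.

```python
# Minimal sketch tallying instant-poll votes; the input file is hypothetical.
from collections import Counter

with open("poll_responses.txt") as f:
    votes = [line.strip().upper() for line in f if line.strip()]

counts = Counter(votes)
total = sum(counts.values())
for option, n in counts.most_common():
    print(f"Option {option}: {n} votes ({n / total:.0%})")
```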