Evaluating Risk Communication
Katherine A. McComas, Ph.D., University of Maryland

What This Tutorial Covers
– What evaluation entails
– Stages of evaluation
– Different types of evaluation
– Formative measures of evaluation
– Summative measures of evaluation
– Types of data obtained through evaluation

What Is Evaluation?
Evaluation measures the effectiveness and impacts of risk communication efforts. It can highlight successes as well as areas needing improvement.
– For example: evaluation can help risk communicators know whether their messages reached the targeted audiences and whether those audiences understood the messages.

Stages of Evaluation
There are five stages of evaluation (Grunig & Hunt, 1984), which resemble the stages of any formal research project:
– Specify objectives
– Determine methods of measurement
– Collect and analyze data
– Report results to decision makers
– Apply results to decisions

Different Types of Evaluation
Evaluation is commonly categorized as formative or summative.
– Formative evaluation is conducted before and during a program. It assists with planning the program and helps determine whether any mid-course corrections are needed during implementation.
– Summative evaluation is conducted at the end of a program. It assesses whether the program was worthwhile according to various indicators of success (e.g., knowledge of the program, attitude change among participants, and so forth).

Formative Measures
When planning or implementing a risk communication program, formative evaluation can help you determine current knowledge about the problem, the current situation, and the constraints and opportunities. For example:
– Conduct a literature review to determine the relevant issues related to your topic.
– Review recent media coverage of your topic to identify prominent themes.
– Conduct some form of audience research (e.g., a public opinion poll or focus group) to determine what people currently know and believe about your topic.

Summative Measures
Some summative measures focus on "short term" or immediate indicators of success or failure. For example:
– Media coverage: Did a press release get placed in a newspaper? How many column inches did a topic receive?
– Attendance: How many people came to the public meeting? Was attendance good?
– Implementation: Was the event well organized? Were there enough handouts? Was the room large enough? Did the event start and end on time?
These variables are relatively easy to measure. Some have called them "process" variables (Chess & Purcell, 1999) or "output" variables (Hon & Grunig, 1999).

Additional Summative Measures
Another possibility is to examine "outcome" variables, which capture the impacts of risk communication on attitudes and behaviors (e.g., see Chess & Purcell, 1999). For example:
– Did the risk information reach the targeted audience?
– Did the targeted audience understand it?
– Did it improve their knowledge of the topic?
– Did people take protective actions or change their behaviors after receiving the information?
These variables are typically more difficult to measure than process variables; however, they may also provide more valuable information to the risk communicator.
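One common way to quantify an outcome variable such as knowledge gain is to compare survey scores from the same audience before and after a campaign. The sketch below is only an illustration of that idea; the scores and function names are invented for demonstration and do not come from the original tutorial.

```python
# Illustrative sketch: estimating a knowledge-change "outcome" variable
# by comparing hypothetical pre- and post-campaign quiz scores.
# All data here are invented for demonstration purposes.

def mean(scores):
    return sum(scores) / len(scores)

def knowledge_change(pre_scores, post_scores):
    """Return the difference in mean quiz scores before and after a campaign."""
    return mean(post_scores) - mean(pre_scores)

pre = [4, 5, 3, 6, 4]   # hypothetical quiz scores (0-10) before the campaign
post = [7, 6, 5, 8, 6]  # scores from the same respondents afterward

print(knowledge_change(pre, post))  # positive value suggests improved knowledge
```

In practice a real study would also need a comparison group and a significance test before attributing any change to the campaign; this sketch only shows the basic bookkeeping.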

Types of Data
Quantitative data provide numerical indicators of a program's success or failure.
– Common methods include: surveys, economic indicators, media content analyses, experiments, website "hits," and calls to a customer service line or hotline.
Qualitative data go beyond the numbers to investigate the underlying motivations and reasons for a program's success or failure.
– Common methods include: focus groups, personal interviews, and field observations.
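Of the quantitative methods listed above, a media content analysis is easy to sketch in code: at its simplest, it counts how often predefined themes appear in a body of coverage. The headlines and theme keywords below are invented for demonstration; a real analysis would use a coded sample of actual coverage and a tested coding scheme.

```python
# Illustrative sketch: a toy media content analysis that counts how often
# predefined themes appear in a set of headlines.
# Headlines and theme keywords are invented for demonstration purposes.

from collections import Counter

def count_themes(headlines, themes):
    """Count how many headlines mention each theme (case-insensitive)."""
    counts = Counter()
    for headline in headlines:
        text = headline.lower()
        for theme in themes:
            if theme in text:
                counts[theme] += 1
    return counts

headlines = [
    "Officials urge flood preparedness ahead of storm season",
    "New study questions flood risk maps",
    "Residents attend public meeting on storm response",
]
themes = ["flood", "storm", "meeting"]

print(count_themes(headlines, themes))
```

Simple keyword counts like this only approximate the frequency of a theme; substantive content analysis also codes tone, prominence, and framing, which the qualitative methods on this slide are better suited to probe.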

References
Chess, C., & Purcell, K. (1999). Public participation and the environment: Do we know what works? Environmental Science & Technology, 33,
Grunig, J., & Hunt, T. (1984). Managing public relations. Holt, Rinehart and Winston.
Hon, L. C., & Grunig, J. E. (1999). Guidelines for measuring relationships in public relations. Gainesville, FL: Institute for Public Relations, Commission on PR Measurement and Evaluation.