Planning a Customer Survey (Part 1 of 3)
Elaine Carlson, Westat
Anne D'Agostino, Compass Evaluation & Research

Purpose of the Webinar Series
• Provide guidance to grantees in planning, designing, and conducting high-quality customer surveys

Components of the 3-Part Webinar Series

Why conduct a customer survey?

Context for the Customer Surveys in Grant Evaluations
• Grant program theory
  – Evaluation plan
    » Evaluation questions
  – Sources of data
    » Plan for a customer survey
      - Survey's purpose, goals, and objectives
      - Sampling plan
      - Survey items
      - Data collection
      - Analyses
      - Data use

Key Decisions About Customer Surveys
• What are the purposes of the survey?
• What is the population of interest?
• What questions do you want to answer about the population?
• What data do you need to answer the questions?
• How often do you need to conduct the survey?
• Is sampling an option for reducing cost and burden?
• What is the best mode of data collection?
• What steps are needed to make the survey accessible?

Should you partner with a third-party evaluator?
• Does your staff have knowledge and skills in
  – survey design,
  – sampling,
  – item development,
  – data collection,
  – data analysis, and
  – reporting?
• Do the staff with the necessary skills have time to plan and conduct a survey, or are they assigned to other activities?
• Do you have support personnel available for
  – conducting phone interviews,
  – stuffing envelopes and tracking mailings,
  – programming web surveys, and
  – following up with non-respondents?
• Does your staff have the objectivity to interpret results, e.g., are they willing to accept negative feedback?
• Will your budget support a third-party evaluator?

Benefits and Limitations of Working with a Third-Party Evaluator

Benefits. Third-party evaluators can:
• Bring technical expertise in research methodology, statistics, or related topics to the evaluation team
• Provide credibility and objectivity by acting as an external "critical friend"
• Take on responsibility for completing some or all of the (formative and summative) evaluation tasks

Limitations. Third-party evaluators may:
• Add unanticipated or additional cost to the evaluation
• Add to monitoring and management tasks focused on the work of contractors
• Not know the background or content area as well as project staff
• Be less available or accessible, as compared to project staff

Should you sample customers and, if so, how?
• What is the population of interest?
• What question(s) are you trying to answer about that population?
• Is sampling an option for reducing cost and burden?

Common Problems with Customer Surveys: Bias
• Undercoverage
• Non-response
• Voluntary response
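Non-response is often the easiest of these to monitor while a survey is in the field. The sketch below is not from the original slides and uses entirely hypothetical data: it computes a response rate and compares respondents with the full sampling frame on a characteristic known for everyone (here, a made-up large-district flag). A sizable gap between the two proportions is one warning sign of non-response bias.

```python
import random

random.seed(2015)

# Hypothetical sampling frame of 400 TA recipients. Large-district staff are
# assumed to respond more often, the kind of pattern that creates bias.
frame = [
    {"large_district": i < 100,                               # 25% of the frame
     "responded": random.random() < (0.85 if i < 100 else 0.55)}
    for i in range(400)
]

respondents = [person for person in frame if person["responded"]]
response_rate = len(respondents) / len(frame)

def share_large(records):
    """Proportion of records that come from large districts."""
    return sum(r["large_district"] for r in records) / len(records)

print(f"Response rate: {response_rate:.0%}")
print(f"Large districts in frame: {share_large(frame):.0%}")
print(f"Large districts among respondents: {share_large(respondents):.0%}")
# A sizable gap between the last two figures suggests the respondents may not
# represent the population; targeted follow-up or weighting may be needed.
```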

Systematic Samples: Subjects are selected to be representative of the entire population

Examples of systematic samples:

Target population: Local district administrators
Possible sampling strategy: Pick a random sample of districts using ED's Common Core of Data, then use the web to get the names and contact information of individuals who fit the role you need, e.g., technology directors.

Target population: Technical assistance recipients
Possible sampling strategy: Generate a running list of TA recipients and select at random from all, or pick from within groups, e.g., those who received targeted or intensive TA.

Target population: Former conference participants
Possible sampling strategy: Use a participant list and select every nth name.
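As a rough illustration of the last two strategies, here is a minimal Python sketch; the recipient and participant lists are hypothetical stand-ins for your own TA logs or registration records. It draws a simple random sample of TA recipients and an every-nth-name systematic sample of former conference participants.

```python
import random

random.seed(2015)  # fixed seed so the draw can be reproduced

# Hypothetical frames; in practice these come from TA logs or registration lists.
ta_recipients = [f"recipient_{i}" for i in range(1, 501)]
participants = [f"participant_{i}" for i in range(1, 1201)]

# Simple random sample of 50 TA recipients.
ta_sample = random.sample(ta_recipients, k=50)

# Systematic sample: every nth name, starting at a random point in the first
# interval so the selection does not always begin with the same person.
n = len(participants) // 100            # interval that yields about 100 names
start = random.randrange(n)
conference_sample = participants[start::n]

print(len(ta_sample), len(conference_sample))   # 50 100
```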

Convenience Samples: Subjects are selected just because they are easiest to recruit for the study

Is it a systematic sample or a convenience sample?

Scenario: To understand your target audience's need for intensive technical assistance, survey individuals who visit your grant's website.
Type of sample: Convenience

Scenario: To describe the knowledge and skills of all state agency personnel, survey those attending a conference session.
Type of sample: Convenience

Scenario: To assess scholars' confidence to carry out future job-related tasks, survey scholars in a college class.
Type of sample: Depends

Scenario: To assess school principals' awareness of your grant's products, survey principals in 50 schools that are selected by chance from a list of 16,000 schools nationwide.
Type of sample: Systematic

How large a sample do you need?
• While sample size depends on analysis plans, one place to start is with a sample size estimator.
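A common starting point, though not necessarily the estimator the slide linked to, is Cochran's formula for a proportion with a finite population correction. The minimal Python sketch below assumes a 95% confidence level, a ±5 percentage-point margin of error, and the most conservative proportion of 0.5.

```python
from math import ceil
from statistics import NormalDist

def sample_size(population, margin=0.05, confidence=0.95, p=0.5):
    """Cochran's sample-size formula with a finite population correction.

    population -- number of people in the group you want to describe
    margin     -- desired margin of error (0.05 means +/- 5 percentage points)
    confidence -- confidence level for the estimate (0.95 -> z of about 1.96)
    p          -- assumed proportion; 0.5 gives the most conservative answer
    """
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)   # two-sided z-score
    n0 = (z ** 2) * p * (1 - p) / margin ** 2            # infinite-population size
    return ceil(n0 / (1 + (n0 - 1) / population))        # finite correction

# For example, surveying principals drawn from 16,000 schools nationwide:
print(sample_size(16_000))   # about 376 completed responses
```

Note that this gives the number of completed responses needed, so the number of people invited has to be inflated to allow for the expected response rate.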

Sample size calculator

How frequently should you collect data?

How often, and possible applications:
• One-time survey: snapshot of population characteristics; initial grant or activity planning; post-event evaluation
• Annual survey: annual planning; assessment of progress toward goals
• Pre/post survey: change in knowledge or skills after an activity or intervention
• Longitudinal survey: long-term retention of knowledge or skills; change in skills or behavior; trends over time

Takeaways
• Plan, plan, plan.
• Position your survey within the framework of a logic model, evaluation plan, and evaluation questions.
• Think about the ultimate uses of the data and let that drive your design.
• A small systematic sample is better than a large convenience sample.
• Consider available fiscal and human resources.

Benefits of a Well-Designed Customer Survey
• Valid, reliable data for planning, fine-tuning, or evaluating your grant activities
• Credibility when reporting to OSEP and other stakeholders

2nd and 3rd Webinars in the Series

Additional Resources
• Guidelines for Working with Third-Party Evaluators, available on cippsite.org.
• Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th ed.).
• Survey Fundamentals: A Guide to Designing and Implementing Surveys, available at vey_Guide.pdf.

Questions? Contact us: Elaine Carlson or Anne D'Agostino