How Can Non-Profit Evaluators Best Support Field Staff in Data Collection?
American Evaluation Association Annual Conference, October 19th, 2013
Kelly Bay-Meyer & Jeff Knudsen

Agenda
– Introductions & Rationale (10:45-10:50)
– Case Study (10:50-11:05)
– Group Discussion (11:05-11:30)
We will report discussion findings back to the group (please sign in!) and post them to the AEA library following the conference.

Rationale
– Only 18% of non-profits have at least one FTE researcher.
– Increasing reliance on field staff for data collection.
– Challenges in supporting field staff in data collection.
– Goals: learn from each other, problem-solve, and compile best practices to share with the larger AEA community.

Case Study
– Internal Research & Evaluation Department at a large education nonprofit
– Supporting AmeriCorps College & Career Coaches delivering pre- and post-assessments to middle school students

Case Study – Evolution of the AmeriCorps College & Career Readiness Program Evaluation ( )
– : Pre & post using Survey Monkey; too few matches
– : Pre & post using Survey Monkey; more training; no quarters assigned to survey versions; more but still too few matches
– : Paper survey option using Remark & label tracking system; more matches (90%), but instrument too long, too much data clean-up, missing labels

Case Study – Best Practices: Successes & Difficulties
1. Engage field staff in the revision of survey items at the end of each program cycle.
Successes:
– Content aligns with the diverse, informal, evolving curriculum being delivered
– Wording matches grade-level comprehension
– Captures indicators of interest
Challenges:
– Lengthy instrument
– Content inconsistencies across buildings and grade levels
– Most volunteers serve for one year, and indicators of interest often change with them.

Case Study – Best Practices: Successes & Difficulties
2. Provide a paper survey option when computer access is not available.
Successes:
– Increased response rate
– Learning to maximize efficiencies of Remark® scanning software
– Labeling system increased matches
Challenges:
– Missing labels
– Lots of data clean-up
– Caseload doubled unexpectedly
– Workflow challenges & backlogging

Case Study – Best Practices: Successes & Difficulties
3. Deliver tailored, descriptive data back to each field staff member as quickly as possible.
Successes:
– Staff see the results of their hard work
– Staff use data to improve their service delivery
– Increased staff buy-in to the evaluation process
Challenges:
– Time-consuming to deliver school-level reports
– Not all schools had post-assessments for the midyear report
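One way to make the school-level reporting above less time-consuming is to script the per-coach descriptive summaries rather than assemble each report by hand. The following is a minimal sketch in Python with pandas; the flat CSV layout, the `school` column, and the `item_*` score columns are assumptions for illustration, not the presenters' actual data structure or workflow.

```python
# Minimal sketch: auto-generate per-school descriptive summaries from a survey export.
# Assumes a CSV with a "school" column and "item_1" ... "item_n" score columns (hypothetical layout).
import pandas as pd

def school_summaries(survey_csv: str, out_dir: str = ".") -> None:
    df = pd.read_csv(survey_csv)
    item_cols = [c for c in df.columns if c.startswith("item_")]
    for school, group in df.groupby("school"):
        # Simple descriptives per item: response count, mean, and spread.
        summary = group[item_cols].agg(["count", "mean", "std"]).round(2)
        # One small file per school, ready to hand back to that school's coach.
        summary.to_csv(f"{out_dir}/{school}_summary.csv")

if __name__ == "__main__":
    school_summaries("midyear_survey.csv")
```

Run once per reporting cycle, a script like this turns report delivery into a quick, repeatable step rather than a manual bottleneck.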

Case Study – Best Practices: Successes & Difficulties
4. Dedicate time for field staff to discuss the meaning of their data and to articulate how these findings will inform their practice.
Successes:
– Staff read the report in real time
– Increased staff understanding of the data
– Increased staff buy-in to the evaluation process
Challenges:
– Ceiling effects
– Continued focus on question content when reviewing the final data analysis undermined some opportunities to learn about student outcomes

Case Study – Current Status of Evaluation
Successes:
– Single-page instrument
– Outcome-related question content
– Student-generated unique code
– Faster implementation
– Enhanced workflow tracking
Challenges:
– Variation in implementation of the curriculum
– Desire to use assessments for dual purposes: output counts of unique students served & student outcomes
– More emphasis on perfunctory survey implementation
– Less emphasis on the relationship among the survey, the intervention & the targeted change
– Less emphasis on data handling & security
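The student-generated unique code listed above is what makes pre/post matching tractable without collecting names. As a rough illustration of how such matching might be done (not the team's actual workflow), the hypothetical pandas sketch below merges pre and post files on a normalized code; the file and column names are assumptions.

```python
# Minimal sketch: match pre- and post-assessments on a student-generated code.
# File names ("pre.csv", "post.csv") and the "code" column are hypothetical.
import pandas as pd

def normalize(code: pd.Series) -> pd.Series:
    # Strip spaces and case differences so "ab 12" and "AB12" still match.
    return code.astype(str).str.replace(r"\s+", "", regex=True).str.upper()

pre = pd.read_csv("pre.csv")
post = pd.read_csv("post.csv")
pre["code"] = normalize(pre["code"])
post["code"] = normalize(post["code"])

matched = pre.merge(post, on="code", suffixes=("_pre", "_post"))
match_rate = len(matched) / len(pre)
print(f"Matched {len(matched)} of {len(pre)} pre-assessments ({match_rate:.0%}).")
```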

Group Discussion
1. Under what circumstances have you relied heavily on field staff for data collection?
2. What challenges were posed by relying on field staff for data collection?
3. How did you or your team address these challenges?
4. What advice would you offer evaluators supporting field staff in data collection?

Responses – Reliance on Field Staff
– Low budgets => reliance on field staff

Responses – Challenges
– Service provider reluctance, because the data reflect on their own performance
– Balancing program staff ownership with evaluation needs
– Many stakeholders have an interest in designing the instrument
– Entering data into the electronic medical record; missing data; data not in on time
– Data quality issues in the non-profit setting

Responses – Strategies
– Benefit of training a point person (separate from the program; outcomes-oriented) vs. training all field staff
– Training small groups
– Coaching staff in the field
– Written protocols
– Availability for questions
– Data entry checks
– Getting buy-in: "What do you wonder about?" (adding survey items)
– Evaluators, along with interns, take care of end-of-program feedback, administering the surveys and focus groups and entering data (not field staff); benefit of confidentiality when responding to questions about program staff
– Debrief program staff afterwards
– Question banks for program staff to design surveys
– Guskey's framework for surveys in training settings
– Attending quarterly meetings; face-to-face contact helps to explain why we are collecting data (e.g., outcomes, funding, etc.)
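The "data entry checks" strategy above can be as lightweight as an automated pass over each incoming file before it enters analysis. The sketch below is a generic, hypothetical example of such checks in Python with pandas; the expected columns and valid response ranges are assumptions for illustration only.

```python
# Minimal sketch: basic data entry checks on a field-staff survey file.
# Column names and valid ranges are hypothetical.
import pandas as pd

REQUIRED = ["code", "school", "grade"]
LIKERT_ITEMS = ["item_1", "item_2", "item_3"]  # expected 1-5 responses

def check_file(path: str) -> list[str]:
    df = pd.read_csv(path)
    problems = []
    for col in REQUIRED:
        if col not in df.columns:
            problems.append(f"missing required column: {col}")
        elif df[col].isna().any():
            problems.append(f"{df[col].isna().sum()} blank values in {col}")
    for col in LIKERT_ITEMS:
        if col in df.columns and not df[col].dropna().between(1, 5).all():
            problems.append(f"out-of-range responses in {col}")
    if "code" in df.columns and df["code"].duplicated().any():
        problems.append("duplicate student codes found")
    return problems

if __name__ == "__main__":
    for issue in check_file("incoming_survey.csv"):
        print("CHECK:", issue)
```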

Responses – Advice
– Ensure that everybody involved in the data collection process understands the overall purpose of the evaluation; ensure that everyone is invested
– Secure program director buy-in and understanding, especially when face-to-face time with the evaluator is not frequent enough
– Make evaluation fun (e.g., small meetings to talk about how evaluation can be used in everyday life, making it exciting and interactive, groups presenting back to staff at staff meetings)
– Get field staff engaged in the analysis ("things that you've noticed"); this helps to capture context
– Talk with others in the organization, or with other evaluators as a "support group," to work through where to draw the line on issues of data quality, etc. Is this enough to inform decision-making (even if not at a level to determine statistical significance)?
– Evaluability assessment: Is the program at a point that warrants an evaluation?
– Balancing Rigor with Resources conference in April
– List the different research design options and the resources each requires; purposefully select an option; be able to justify it to others; carry it forward in your reports.

Thank you!
We will report discussion findings back to the group (please sign in!) and post them to the AEA library following the conference.
Kelly Bay-Meyer, Senior Research & Evaluation Analyst, College Success Foundation
Jeff Knudsen, Director of Research & Evaluation, College Success Foundation