
1 How Can Non-Profit Evaluators Best Support Field Staff in Data Collection?
American Evaluation Association Annual Conference
October 19th, 2013
Kelly Bay-Meyer & Jeff Knudsen

2 Agenda
 Introductions & Rationale (10:45-10:50)
 Case Study (10:50-11:05)
 Group Discussion (11:05-11:30)
 We will email discussion findings to the group (please sign in!) and post them to the AEA library following the conference.

3 Rationale
 Only 18% of non-profits have at least one FTE researcher.
 Increasing reliance on field staff for data collection.
 Challenges in supporting field staff in data collection.
 Goals: learn from each other, problem-solve, and compile best practices to share with the larger AEA community.

4 Case Study
 Internal Research & Evaluation Department at a large education nonprofit
 Supporting AmeriCorps College & Career Coaches delivering pre- and post-assessments to middle school students

5 Case Study
 Evolution of the AmeriCorps College & Career Readiness Program Evaluation (2010-2013)
 2010-11: Pre & post using SurveyMonkey; too few matches
 2011-12: Pre & post using SurveyMonkey; more training; no quarters assigned to survey versions; more, but still too few, matches
 2012-13: Paper survey option using Remark® and a label tracking system; more matches (90%), but the instrument was too long, data clean-up was heavy, and labels went missing (see the match-rate sketch below)
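Since the recurring problem across these cycles was the pre/post match rate, here is a minimal sketch of how such a rate can be computed. The file names and the "tracking_id" column are illustrative assumptions, not details from the presentation.

    # Minimal sketch of computing a pre/post match rate, assuming two CSV
    # exports with a hypothetical "tracking_id" column (names are
    # illustrative, not from the presentation).
    import pandas as pd

    pre = pd.read_csv("pre_survey.csv")    # one row per student, pre-assessment
    post = pd.read_csv("post_survey.csv")  # one row per student, post-assessment

    # An inner join keeps only students who appear in both waves.
    matched = pre.merge(post, on="tracking_id", suffixes=("_pre", "_post"))

    # Match rate relative to the pre-assessment roster.
    match_rate = len(matched) / len(pre)
    print(f"Matched {len(matched)} of {len(pre)} pre-surveys ({match_rate:.0%})")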

6 Case Study
 Best Practices: Successes & Difficulties
1. Engage field staff in the revision of survey items at the end of each program cycle.
Successes:
– Content aligns with the diverse, informal, evolving curriculum being delivered
– Wording matches grade-level comprehension
– Captures indicators of interest
Challenges:
– Lengthy
– Content inconsistencies across buildings and grade levels
– Most volunteers serve for one year, and indicators of interest often change with them.

7 Case Study
 Best Practices: Successes & Difficulties
2. Provide a paper survey option when computer access is not available.
Successes:
– Increased response rate
– Learning to maximize efficiencies of Remark® scanning software
– Labeling system increased matches
Challenges:
– Missing labels
– Lots of data clean-up
– Caseload doubled unexpectedly
– Workflow challenges & backlogging

8 Case Study
 Best Practices: Successes & Difficulties
3. Deliver tailored, descriptive data back to each field staff member as quickly as possible.
Successes:
– Staff see the results of their hard work
– Staff use data to improve their service delivery
– Increased staff buy-in to the evaluation process
Challenges:
– Time-consuming to deliver school-level reports
– Not all schools had post-assessments for the midyear report

9 Case Study
 Best Practices: Successes & Difficulties
4. Dedicate time for field staff to discuss the meaning of their data and to articulate how these findings will inform their practice.
Successes:
– Staff read the report in real time
– Increased staff understanding of the data
– Increased staff buy-in to the evaluation process
Challenges:
– Ceiling effects
– A continued focus on question content when reviewing the final data analysis undermined some opportunities to learn about student outcomes

10 Case Study
Current Status of Evaluation:
 Successes:
– Single page
– Outcome-related question content
– Student-generated unique code (see the sketch below)
– Faster implementation
– Enhanced workflow tracking
 Challenges:
– Variation in implementation of the curriculum
– Desire to use assessments for dual purposes: output counts of unique students served & student outcomes
– More emphasis on perfunctory survey implementation
– Less emphasis on the relationship among survey, intervention & the change targeted
– Less emphasis on data handling & security
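A common way to implement a student-generated unique code is to combine a few stable answers the student can reproduce on both surveys. The sketch below uses hypothetical fields (initials and birth date); the presenters' actual scheme is not specified in the slides.

    # Hedged sketch of a student-generated identification code. The
    # specific fields are illustrative assumptions, not the presenters' scheme.
    def student_code(first_initial: str, last_initial: str,
                     birth_month: int, birth_day: int) -> str:
        # Normalize case and zero-pad dates so pre and post codes match exactly.
        return (f"{first_initial.upper()}{last_initial.upper()}"
                f"{birth_month:02d}{birth_day:02d}")

    # The same inputs on the pre- and post-survey yield the same code,
    # allowing anonymous matching without rosters or printed labels.
    assert student_code("k", "b", 3, 7) == student_code("K", "B", 3, 7) == "KB0307"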

11 Group Discussion
1. Under what circumstances have you relied heavily on field staff for data collection?
2. What challenges were posed by relying on field staff for data collection?
3. How did you or your team address these challenges?
4. What advice would you offer evaluators supporting field staff in data collection?

12 Responses – Reliance on Field Staff
 Low budgets lead to reliance on field staff.

13 Responses – Challenges
 Service provider reluctance, because data collection reflects on their own performance
 Balancing program staff ownership with evaluation needs
 Many stakeholders have an interest in designing the instrument
 Input of data into the electronic medical record; missing data; data not in on time
 Data quality issues in the non-profit setting

14 Responses – Strategies
 Benefit of training a point person (separate from the program; outcomes-oriented) vs. training all field staff
 Training small groups
 Coaching them in the field
 Written protocols
 Availability for questions
 Data entry checks (see the sketch below)
 Getting buy-in: "What do you wonder about?" (adding survey items)
 Evaluators, along with interns, take care of feedback at the end of the program, administer the surveys and focus groups, and enter data (not field staff); benefit of confidentiality when responding to questions about program staff
 Debrief program staff afterwards
 Question banks for program staff to design surveys
 Guskey's framework for surveys in training settings
 Attending quarterly meetings; face-to-face contact helps explain why we are collecting data (e.g., outcomes, funding, etc.)
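To make "data entry checks" concrete, here is a minimal sketch of automated checks on a survey file. The column names, valid ranges, and file name are illustrative assumptions rather than anything described in the session.

    # Hedged sketch of simple data-entry checks; column names and ranges
    # are illustrative assumptions.
    import pandas as pd

    df = pd.read_csv("survey_entries.csv")
    problems = []

    # Flag rows with missing values in required columns.
    for col in ["tracking_id", "q1", "q2"]:
        n_missing = df[col].isna().sum()
        if n_missing:
            problems.append(f"{n_missing} rows missing '{col}'")

    # Flag out-of-range values on a 1-5 Likert item (NaNs also fail this check).
    n_bad = (~df["q1"].between(1, 5)).sum()
    if n_bad:
        problems.append(f"{n_bad} rows with q1 outside 1-5")

    # Flag duplicate tracking IDs within the same survey wave.
    n_dupes = df.duplicated("tracking_id", keep=False).sum()
    if n_dupes:
        problems.append(f"{n_dupes} rows share a tracking_id")

    print("\n".join(problems) or "No data-entry problems found")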

15 Responses – Advice
 Ensure everybody involved in the data collection process understands the overall purpose of the evaluation; ensure that everyone is invested
 Program director buy-in and understanding, especially when face-to-face time with the evaluator is not frequent enough
 Make evaluation fun (e.g., small meetings to talk about how evaluation can be used in everyday life, making it exciting and interactive, groups presenting back to staff at staff meetings)
 Get field staff engaged in the analysis ("things that you've noticed"); this helps to capture context
 Talk to others in the organization, or to other evaluators as a "support group," to talk through where to draw the line on issues of data quality, etc. Is this enough to inform decision-making (even if not at a level to determine statistical significance)?
 Evaluability assessment: Is the program at a point to warrant an evaluation? (Balancing Rigor with Resources conference in April)
 List the different research design options and the resources each requires; purposefully select an option; be able to justify it to others; carry it forward in your reports.

16 Thank you!
 We will email discussion findings to the group (please sign in!) and post them to the AEA library following the conference.
 Kelly Bay-Meyer, Senior Research & Evaluation Analyst, College Success Foundation (kbay@collegesuccessfoundation.org)
 Jeff Knudsen, Director of Research & Evaluation, College Success Foundation (jknudsen@collegesuccessfoundation.org)

