
1 Assessing Evidence and Past Performance 2014 AmeriCorps External Reviewer Training

2 Topics  Evidence  Past Performance 2

3 Preview of Assessment Questions (True/False)  If a study’s methodology is not clear and the applicant provided a citation, the reviewer should read the abstract online.  Quality indicators are assessed for every level of evidence.  A corrective action plan for performance measures demonstrates satisfactory past performance. (Practice Your Understanding)  Complete a portion of the Review Assessment Form 3

4 Evidence  The Notice of Funding Opportunity (NOFO) identifies levels of evidence  Applicants are assigned a level of evidence based on:  Number of studies  Type of studies  Whether evidence is from the applicant’s program or a similar program  Combinations described in NOFO  Additional quality considerations Evidence 4

5 Evaluation Study Designs (comparison group and ability to make statements about causal attribution)  Experimental Design Studies: Randomly Assigned Groups  Quasi-Experimental Design Studies: Statistically Matched Groups  Non-Experimental Design Studies: Not Statistically Matched Groups or Group Compared to Itself Evidence Evaluation Study Designs & Causal Impact 5

6 Levels of Evidence  No Evidence  Pre-Preliminary Evidence  Preliminary Evidence  Moderate Evidence  Strong Evidence Evidence: Levels of Evidence 6

7 No Evidence The applicant did not collect any data or cite at least one non-experimental study from a similar program. Evidence: Levels of Evidence 7

8 Evidence does not support conclusions about contribution to observed outcomes  Applicant has collected quantitative or qualitative data from program staff, program participants, or beneficiaries  Data used for program improvement or performance measurement Evidence: Levels of Evidence Pre-Preliminary Evidence 8

9 Pre-Preliminary Examples  Previous grantee reports that they have met their performance measure targets  Previous grantee reports that they did not meet performance measure targets and have changed program design as a result  Previous grantee recruiting opportunity youth to serve as AmeriCorps members gathers feedback from members at the end of the service year (if consistent with TOC) Evidence: Levels of Evidence 9

10 Evidence supports conclusions about the program’s contribution to observed outcomes Evidence: Levels of Evidence Preliminary Evidence 10

11 At least one non-experimental study conducted on the proposed program or another similar program that uses a comparable intervention  Demonstrates improvement in program participants over time OR  Implementation study (process evaluation) used to learn and improve program operations AND  The applicant was rated satisfactory on all four quality of evidence standards Evidence: Levels of Evidence Preliminary Evidence 11

12 Preliminary Examples  An outcome study that tracks program participants through a service pipeline and measures participants’ responses at the end of the program  Pre- and post-test research that determines whether participants have improved on an intended outcome (Includes National Performance Measures with pre/post test) Evidence: Levels of Evidence 12

13 A reasonably developed evidence base that can support causal conclusions for the specific program proposed by the applicant with moderate confidence. Evidence: Levels of Evidence Moderate Evidence 13

14  One or more quasi-experimental studies conducted on the proposed program or another similar program that uses a comparable intervention with positive findings on one or more intended outcomes OR  2 or more non-experimental studies conducted on the proposed program with positive findings on one or more intended outcomes OR  1 or more experimental studies of another relevant program that uses a similar intervention AND  The applicant was rated satisfactory on all four quality of evidence standards Evidence: Levels of Evidence Moderate Evidence 14

15 Moderate - Examples  Well-designed and well-implemented quasi-experimental studies that compare outcomes between the group receiving the intervention and a statistically matched comparison group (i.e., a similar population that does not receive the intervention) Evidence: Levels of Evidence 15

16 Evidence base supports causal conclusions for the specific program proposed by the applicant with the highest level of confidence  One or more well-designed and well-implemented experimental studies conducted on the proposed program with positive findings on one or more intended outcome AND  The applicant was rated satisfactory on all four quality of evidence standards Evidence: Levels of Evidence Strong Evidence 16

17 Strong - Example  An experimental study that uses random assignment to select which individuals will receive the intervention and which will not (control group) Evidence: Levels of Evidence 17

18 Assessing Evidence Step 1: Number and type(s) of studies Step 2: Quality indicators, if applicable Step 3: Determine level of evidence Rely only on what is in the application – Do not look up additional information Evidence: Quality 18
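To make the three steps concrete, here is a minimal sketch in Python of the decision logic, paraphrasing the level definitions on slides 11, 14, and 16. The function name, the Study structure, and the handling of unsatisfactory quality are illustrative assumptions, not part of the NOFO; the NOFO definitions govern in an actual review.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Study:
    design: str                # "experimental", "quasi-experimental", or "non-experimental"
    on_proposed_program: bool  # True if conducted on the applicant's own program

def level_of_evidence(studies: List[Study], all_quality_satisfactory: bool) -> str:
    """Step 1: count and classify studies; Step 2: quality; Step 3: level (simplified)."""
    if not studies:
        # With no cited study, the level is No Evidence or Pre-Preliminary,
        # depending on whether the applicant collected its own data (slides 7-8).
        return "No Evidence / Pre-Preliminary"
    if not all_quality_satisfactory:
        # Assumption: studies that fail the quality standards do not raise the
        # level above Pre-Preliminary (the slides only state that satisfactory
        # ratings are required for Preliminary, Moderate, and Strong).
        return "Pre-Preliminary"
    exp_own   = sum(s.design == "experimental" and s.on_proposed_program for s in studies)
    exp_other = sum(s.design == "experimental" and not s.on_proposed_program for s in studies)
    quasi     = sum(s.design == "quasi-experimental" for s in studies)
    non_own   = sum(s.design == "non-experimental" and s.on_proposed_program for s in studies)
    if exp_own >= 1:
        return "Strong"        # slide 16
    if quasi >= 1 or non_own >= 2 or exp_other >= 1:
        return "Moderate"      # slide 14
    return "Preliminary"       # slide 11

# Example (mirrors Practice 2 below): one experimental study of a similar
# program, all four quality indicators satisfactory -> Moderate.
print(level_of_evidence([Study("experimental", on_proposed_program=False)], True))
```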

19 Quality Indicators  Quality indicators are assessed only for preliminary, moderate and strong evidence  Quality indicators are not assessed for pre-preliminary or no evidence Evidence: Quality 19

20 Quality Ratings Satisfactory Response meets all or most aspects of the standard. Overall quality of the response is at least satisfactory. Unsatisfactory Response is low-quality and neglects to satisfactorily address more than one aspect of the standard. Overall quality of the response is lacking, leaving key aspects open to assumption. Evidence: Quality 20

21 Alignment of Models  The program model(s) studied is the same or nearly the same as the model the applicant will implement, in a similar context with similar target beneficiaries or entities. Evidence: Quality 21

22 Methodological Quality The study or studies used rigorous and appropriate research methodologies given the design (e.g. non-experimental, quasi-experimental, experimental), using high quality data, sufficient sample size/statistical power, and a representative sample to identify effects. The study or studies exhibited internal validity, i.e. any effects identified can be reasonably attributed to the program model given the methodological limitations. Evidence: Quality 22

23 Recency Studies conducted within the past six years are considered satisfactory. For studies older than six years, a “satisfactory” rating may be given if there is reasonable confidence that the relevant conditions in which the program operated when studied are the same as or similar to the conditions in which the applicant’s program will be operating. For example, for an education program, relevant conditions could include community demographics and educational standards. For an economic opportunity program, relevant conditions could include the economic climate. Evidence: Quality 23

24 Strength of Findings The findings from the study or studies indicate with confidence that the program model under study had at least one positive and significant effect on target beneficiaries or entities. “Significant” means the results were statistically significant, for example at the 95% confidence level (or p<.05). Evidence: Quality 24
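For orientation only (the slide does not prescribe a particular test), the usual convention behind “statistically significant at the 95% confidence level” is a hypothesis test with significance level alpha = 0.05:

```latex
% Generic two-sided test of a program effect (illustrative; not from the slide)
H_0:\ \mu_{\text{treatment}} = \mu_{\text{control}}
\qquad\text{vs.}\qquad
H_1:\ \mu_{\text{treatment}} \neq \mu_{\text{control}}
% The effect is called statistically significant when
p = \Pr\!\left(|T| \ge |t_{\text{obs}}| \,\middle|\, H_0\right) < \alpha = 0.05
```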

25 Practice 1 Volunteers ‘R Us focuses on building the capacity of organizations to engage volunteers. The report “Volunteer Solutions for the Nonprofit Sector” identified eight steps or functions that should be adopted in order to achieve the greatest return on investment from volunteers. AmeriCorps members in our program assist nonprofit organizations to address all eight steps. Members develop a tailored volunteer program plan with their site. As part of this plan, members develop or refine procedures that collectively maximize the impact of volunteers, especially those volunteer management practices that, according to the Urban Institute’s Volunteer Management Capacity Study, correlate most strongly with volunteer retention (i.e., screening and matching volunteers, recognition activities, and training and professional development for volunteers). What is the level of evidence? Evidence: Practice 25

26 Answer Number/Type of Studies: 0 Quality: NA Level of evidence: No Evidence The applicant does not describe collecting any of its own data, and it is not clear that the two reports described are non-experimental studies. The first study mentioned could be a literature review or an article on best practices not based on research. The Urban Institute study is not described in such a way that we can discern what the study measured or how, so we cannot assume it was a non-experimental study. Evidence: Practice 26

27 Practice 2 The ABC AmeriCorps program proposes to replicate the Reading for Success AmeriCorps model. In 2011, Reading for Success tracked students’ reading and literacy fluency through the first semester of the school year. More than 1200 kindergarten students were in the study; half were randomly assigned to receive a tutor and half took part in their school’s usual approach to reading instruction without an AmeriCorps tutor. The study found that there was a statistically significant difference between the treatment and control groups, whereby scores increased 14.9 percent each week for the control group and increased at a rate of 19.4 percent for the treatment group, resulting in a 77.2 percent difference between the treatment and control students by the end of the fall. What is the level of evidence? Evidence: Practice 27
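As a rough check of the arithmetic in Practice 2 (assuming the weekly growth rates compound over roughly a 15-week fall semester; the slide does not state the exact number of weeks):

```latex
% Relative gap after n weeks of compounded weekly growth (assumed model)
\frac{1.194^{\,n}}{1.149^{\,n}} = \left(\frac{1.194}{1.149}\right)^{\!n} \approx 1.039^{\,n},
\qquad 1.039^{15} \approx 1.78
```

That is, about a 78 percent relative gap after 15 weeks, consistent with the 77.2 percent difference reported by the end of the fall.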

28 Answer Number/Type of Studies: 1 experimental study Quality Alignment of models – Satisfactory Methodological Quality – Satisfactory Recency – Satisfactory Strength of Findings – Satisfactory Level of evidence: Moderate The program cites one experimental study from a program that uses a similar intervention and is rated satisfactory on all four quality indicators. Note: If Reading for Success, the program that completed the study, were to submit this evidence, the level would be strong because an experimental study of the proposed program (as opposed to a similar program) constitutes strong evidence. Evidence: Practice 28

29 Practice 3 The model for community resiliency that drives the disaster service activities comes from a 2009 paper published in the Journal of Emergency Management that provides an operational framework for incorporating resiliency into our infrastructure and stresses that resiliency needs to be planned in advance before systems are damaged. Emergency Management AmeriCorps Program surveyed individuals in our community and found that the top two reasons why individuals were unprepared were “just haven’t thought about it” (27.5%) and “don’t think a disaster is likely” (22.8%). Our program intervention is designed to address both of these factors. What is the level of evidence? Evidence: Practice 29

30 Answer Number and Type of Studies: 0 Quality: NA Level of evidence: Pre-preliminary The applicant conducted research (a survey) to develop its program design and has therefore “collected quantitative…data from beneficiaries that have been used for program improvement…” The 2009 paper is not relevant because it is not a research study. Evidence: Practice 30

31 Practice 4 Sports Not Drugs (SND) conducted a 20-month longitudinal study of program participants. By the end of the study, 35% of SND students reported engaging in regular, vigorous activity for an hour or more at least five days a week (an increase of 10%). Two additional internal evaluation studies found that youth who participated in the program showed significantly more knowledge about drugs than youth in the comparison group. One of these studies (using pre and post surveys) further found that over time youth in the program maintained or increased their ability to refuse alcohol, marijuana or cigarettes, while comparison youth showed a decreased ability to refuse these drugs. What is the level of evidence? Evidence: Practice 31

32 Answer Number and Type of Study: 3 non-experimental studies Quality Alignment of models: Satisfactory Methodological Quality: Satisfactory Recency: Satisfactory Strength of Findings: Satisfactory Level of evidence: Moderate The program cites three non-experimental studies conducted on the proposed program and was rated satisfactory on all quality indicators. Evidence: Practice 32

33 Practice 5 In 2009, Fostering Empowerment conducted a survey of 400 of our participants in our foster youth program and compared them to participants in other foster youth programs in our area. We found that participants were 50% more likely to have graduated from high school, independent of the effects of the participants’ age, race or ethnicity. This result was statistically significant at the 95% confidence level. What is the level of evidence? Evidence: Practice 33

34 Answer Number and Type of Study: 1 non-experimental study Quality Alignment of models: Satisfactory Methodological quality: Satisfactory Recency: Satisfactory Strength of findings: Satisfactory Level of Evidence: Preliminary The study is a non-experimental study with a comparison group. No statistical matching appears to have been done, nor was it longitudinal (i.e., participants tracked over time), so it does not qualify as a quasi-experimental design. Evidence: Practice 34

35 Past Performance Reviewers assess the following standard using the five-point rating scale in your handbook: The applicant demonstrates success in solving the identified problem. Past Performance 35

36 How to Assess Past Performance  The NOFO asked applicants to address past performance in the organizational capability section of the application  Past performance may also be addressed in the evidence section of the application (performance measurement or evaluation results) or elsewhere in the program design section  Past performance is linked to the identified problem - It is not enough for the applicant to have met performance measures in past years; those performance measures must be connected to the problem described in the application  For an applicant that has not met its performance measures in past years, a corrective action plan does not demonstrate success in solving the identified problem. Past Performance 36

37 Next Steps To check for understanding and verify that you have completed this orientation session, please reference the remaining slides for the two-part Assessment of this orientation. 37

38 Practice Your Understanding  Answer the Assessment Questions at the following link: https://www.surveymonkey.com/s/evidence_pastperform  Read and assess the norming/sample application and complete the Evidence and Past Performance sections of the review form. Do not include significant strengths, weaknesses, or other narrative sections.  You will receive a review form completed by a CNCS staff member. Compare your assessments to those in the key.  This will complete your verification requirement for this training module. 38 Past Performance: Practice

39 Practice SELF-ASSESSMENT When you receive the key for the practice activity from CNCS, compare your ratings to those of the CNCS standard example. If your rating was more than one rating point higher or lower than the example provided by CNCS, your assessment may not be normed closely enough to CNCS’ expectations for applicants. Past Performance: Practice 39

