Don’t Be Fooled: Assessing the Quality of Evidence


1 Don’t Be Fooled: Assessing the Quality of Evidence
District Review and Selection Process

2 Outcomes
- Considerations for judging the quality of evidence
- Describe the benefits of using an effective innovation review and selection process that includes a review of evidence and other critical factors
- Identify the components of an effective innovation review and selection process
- Review an example
- Demonstrate how the review and selection process can also be used for “de-selection”

3 1.0 Quality “Evidence”: The Good, the Bad, and the Ugly

4 Why Evidence-Based Practice?
- Increases the probability of success for more students
- Central feature in the implementation of MTSS & RtI
- Required by the Every Student Succeeds Act (ESSA); many states require it as well

5 Levels of Evidence Pyramid

6 Major Categories
Filtered information: judges the quality of studies and makes recommendations for practice. Includes the following:
- Systematic reviews
- Critically appraised topics
- Critically appraised individual articles

7 Major Categories (cont.)
Unfiltered information: the primary literature; anyone accessing it takes on the role of making sure the information is reliable and valid. Includes the following:
- Randomized controlled trials
- Cohort studies
- Case-controlled studies; case series / reports

8 1. Filtered Information
Systematic review:
- Asks a specific question
- Conducts a thorough literature review across studies
- Eliminates poorly done studies
- Attempts to make practice recommendations based on well-done studies
Critically appraised topics:
- Evaluate and synthesize multiple research studies
- Result in short, systematic reviews about a particular topic
Critically appraised individual articles:
- Evaluate individual research studies
- Provide a short synopsis of each study

9 2. Unfiltered Information
Randomized controlled trials:
- Random assignment to an intervention group (the thing being studied to determine its effect) or a control group
- Considered superior to other study designs
Cohort studies:
- Observe a group of individuals who meet a set of criteria (selection criteria)
- Longitudinal (the cohort is followed over time)
Case-controlled studies; case series / reports:
- Observational studies; participants are not randomly assigned to the intervention or control group
- Lowest level of rigor among research designs
- Results may be confounded by other variables
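To make the “random assignment” idea on this slide concrete, here is a minimal sketch (not part of the original presentation) that simulates a randomized controlled trial with hypothetical reading scores. All numbers are invented for illustration; the point is that random assignment lets a simple difference in group means recover the intervention’s effect, because the two groups are comparable on average at baseline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical reading scores for 2000 students, each randomly
# assigned to the intervention (1) or control (0) group.
n = 2000
assignment = rng.integers(0, 2, n)
true_effect = 4.0                        # the effect we hope to detect
baseline = rng.normal(70, 10, n)         # pre-existing ability (varies widely)
scores = baseline + true_effect * assignment

# Random assignment balances baseline ability across groups, so the
# simple difference in group means estimates the intervention effect.
effect_estimate = scores[assignment == 1].mean() - scores[assignment == 0].mean()
```

Without randomization, the same difference in means could simply reflect pre-existing differences between the groups, which is the confounding concern the slide raises for observational designs.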

10 Other Research Design Types
Quasi-experimental: looks like an experimental design but lacks the key ingredient of random assignment
- Raises concerns about internal validity, because the intervention and control groups may not be comparable at the onset of the study (baseline)
- The Institute of Education Sciences (IES) has provided guidance on designing quasi-experimental studies

11 Other Research Design Types (cont.)
Regression discontinuity design (RDD): assignment to the intervention or comparison group is based on performance relative to a cutoff value (e.g., a score on a test)
- Analyzed using regression analysis
- Raises concerns about the integrity of the forcing variable, attrition, and strict parameters for modeling and analyses
- IES has provided guidance on designing and evaluating RDD studies ( c_standards_handbook_v4_draft.pdf)
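To illustrate the regression analysis behind RDD, here is a minimal sketch (not from the presentation) using simulated data. The cutoff of 50, the score range, and the effect size are all invented assumptions: students scoring below the cutoff receive a hypothetical intervention, and fitting a regression with the forcing variable centered at the cutoff estimates the “jump” in outcomes at that cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated test scores (the "forcing variable") with a cutoff of 50:
# students scoring below 50 receive the intervention.
n = 1000
score = rng.uniform(0, 100, n)
treated = (score < 50).astype(float)
true_effect = 5.0
outcome = 20 + 0.3 * score + true_effect * treated + rng.normal(0, 2, n)

# Center the forcing variable at the cutoff and fit a linear model:
#   outcome ~ intercept + centered_score + treated
centered = score - 50
X = np.column_stack([np.ones(n), centered, treated])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
effect_estimate = coef[2]   # estimated discontinuity at the cutoff
```

This simplified sketch fits one slope on both sides of the cutoff; the WWC standards the slide points to require more careful modeling (e.g., separate slopes, bandwidth choices, and checks on the forcing variable).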

12 Activity 1.0
- Review the levels of evidence presented in this section, noting critical pieces of information you believe your colleagues would benefit from hearing about after this session
- Your presenters will put the group into a partner activity after reviewing the slides

13 Activity 1.1
- Describe the elements of your district’s process for conducting a thorough review of evidence prior to selection
- If you are unsure about your district’s process, identify some information you want to know after leaving this conference
- Discuss your district’s capacity (people, knowledge/skills, budget) to review the evidence for programs and practices

14 Evidence is Only Part of the Equation
Reviewing the evidence for programs, practices, and other innovations is only part of the equation. Districts need to consider other critical factors to make good decisions about what to select to improve student outcomes.

15 2.0 Benefits of a Review and Selection Process

16 Key Terminology
Effective innovation: a set of defined practices used in schools that has been empirically proven to produce desired results
- To be an effective innovation, the practices, programs, assessments, or initiatives should be proven to be “usable”: teachable, learnable, doable, and readily assessed in practice

17 Benefits
Increased confidence in the following:
- The initiatives joined and the programs, practices, and assessments adopted are the best available
- The district has a full understanding of the resources needed to successfully use the selected effective innovations
- Decisions not to select an effective innovation, or to de-select an existing one, resulted from a thorough analysis of evidence and critical factors

18 3.0 Effective Innovation Review and Selection Process

19 Components of a Review and Selection Process
- Purpose of a review and selection process
- Guidelines for when to use the process
- Decision-making protocol
- Directions for:
  - Completing the review and selection tool
  - Providing supporting documentation for specific items
  - Submission

20 1. Purpose
Include the following:
- A brief summary of the purpose and intended outcome of conducting a thorough review of an effective innovation
- A rationale for why the district expects a thorough review process to be completed before decisions are made to select an effective innovation

21 2. Guidelines for Use
List likely scenarios that would warrant the use of the review process:
- Approached to consider participation in an initiative or “pilot project,” and / or approached to use a new assessment or data system
- Considering purchasing new curriculum resource materials
- Considering purchasing new assessments, data systems, or educational software
- Considering whether to continue using effective innovations that overlap or appear redundant with other effective innovations (de-selection)

22 3. Decision-Making Protocol
- List the people with the highest level of decision-making authority to determine whether the review process will result in a new selection or de-selection
- Include statements about the conditions that would warrant involvement from other groups / teams (e.g., board of education, curriculum council)
- Provide parameters for timelines to make decisions

23 4. Directions
Outline the steps for:
- Initiating a review / selection process (e.g., who can do this and what needs to occur before it is started)
- Identifying the people who need to be involved in the process: specific items in the review and selection tools are to be completed by pre-determined designees
- Setting parameters for seeking consultation from program or assessment developers, or certified individuals, to adequately represent the effective innovation
- Submitting the tool with the appropriate documentation

24 Selection Tools
Two tools:
- Program, Practice, or Initiative Review Tool
- Assessment Review Tool
Each tool is framed around six critical factors that need to be considered during a high-quality review and selection process

25 Six Critical Factors
- Effective innovation overview (e.g., name / title, description)
- Need for the innovation
- Fit with district, state, or national priorities or existing efforts
- Resources necessary to support people’s successful use
- Evidence to demonstrate a positive effect on outcomes
- Readiness for the effective innovation’s use in a typical school / classroom setting (Is the effective innovation mature enough to use in a typical classroom setting, and do staff meet the qualifications to successfully use it?)

26 Hexagon Tool (Blasé, K., Kiser, L., Van Dyke, M., 2013)

27 Activity 3.0
Given the components of a review and selection process (slide 19) and the six critical factors (slide 25), outline where you believe your district’s process overlaps with them and identify opportunities to strengthen your existing process

28 4.0 Review and Selection Template

29 Activity 4.0
- Independently read page 1 of the document, “Effective Innovation Review and Selection Process”
- Jigsaw activity:
  - Partner 1: Read the directions and steps for the Program, Practice, or Initiative Review process (pp. 2-7)
  - Partner 2: Read the directions and steps for the Assessment Review and Selection Tool (pp. 8-14)
- Each partner will develop talking points outlining the important sections, the reasons for their importance, and critical areas included in the tool that could be overlooked

30 5.0 Additional Resources and Examples

31 Additional Resources
- Core Reading Curriculum Analysis Process: used before the district completes the Effective Innovation Review and Selection Tool (handout)
- Example of a district’s Effective Innovation Review and Selection Process (electronic access)
- Example of how a district engaged in a de-selection process to make room for a new effective innovation (electronic access)
- List of information on evidence-based programs and practices (handout)

32 Activity 5.0
- With your partner, review the additional resources
- Given the context of your district, what information do you want to share with your colleagues?

33 Thank You!
Brad Niebling, Ph.D., Iowa Dept. of Education, Bureau Chief
Kim St. Martin, Ph.D., MIBLSI Assistant Director

