How to Tell if Your Cheesesteak is Real: Differentiating Evidence-Based Practice from Well-Advertised Practice Dan Hyson University of Wisconsin-La Crosse.

Presentation transcript:

1 How to Tell if Your Cheesesteak is Real: Differentiating Evidence-Based Practice from Well-Advertised Practice Dan Hyson University of Wisconsin-La Crosse 2016 MSPA Midwinter Conference January 28, 2016

2 A Real Cheesesteak

3 Real cheesesteaks or impostors?
“Research-based interventions”
“RtI/MTSS systems”
“Professional Learning Communities”
Others?

4 How can you tell the difference?
1. Clearly define your terms
2. Identify non-negotiable components
3. Look for practices that don’t just “work,” but “work best”

5 1. Clearly define your terms
Cheesesteaks: “Cheesesteak” vs. “Philly cheesesteak” or “Philly cheesesteak sandwich”
Evidence-based practices:
“Intervention” vs. “modification” vs. “accommodation”
“Intervention curriculum/program” vs. “intervention method/strategy”
“Scientifically based research” vs. “peer-reviewed research” vs. “research-based” vs. “evidence-based”

6 Intervention vs. modification vs. accommodation
Intervention: programs or strategies used to teach a new skill, build fluency or proficiency in a skill, encourage the application of an existing skill to a new situation, and/or provide students with the incentive to demonstrate a skill. Expected to be research-based.
Modification: change to instruction or assessment that reduces expectations for the student. May or may not be research-based.
Accommodation: change to instruction or assessment that allows the student an alternative way to achieve the same goal and provides the teacher with more accurate information about student needs. May or may not be research-based.

7 Intervention curriculum/program vs. intervention method/strategy
Curricula/programs:
What Works Clearinghouse - ies.ed.gov/ncee/wwc/findwhatworks.aspx
Best Evidence Encyclopedia - www.bestevidence.org
Methods/strategies:
Intervention Central - interventioncentral.org
Evidence-Based Intervention Network - ebi.missouri.edu

8 Scientifically based research (SBR) vs. peer-reviewed research (PRR) vs. research-based (RB) vs. evidence-based (EB)
Evidence-based
Research-based
Peer-reviewed research
Scientifically based research

9 Scientifically based research (SBR) vs. peer-reviewed research (PRR) vs. research-based (RB) vs. evidence-based (EB)
SBR = experimental or quasi-experimental research, preferably with random assignment (the more rigorous standard)
PRR = overlaps with SBR, but is not a subset: many PRR studies are not experimental or quasi-experimental, and blind review is typical of PRR but not necessarily of SBR
RB = intervention designed to be consistent with relevant research findings, but not necessarily evaluated in SBR or PRR
EB = intervention linked to student performance data and/or evaluated locally to assess its effectiveness, but not yet clearly linked to research findings and not yet evaluated in SBR or PRR (the less rigorous standard)

10 2. Identify non-negotiable components
Cheesesteaks: crusty Italian hoagie roll, grilled rib-eye steak, Cheez Whiz
Evidence-based practices: critical components of research-based interventions and of initiatives like RtI/MTSS and PLCs

11 One traditional model for measuring fidelity of implementation: a minimum percentage (e.g., 80%) of components implemented
Real Cheesesteak model: identify non-negotiable components, 100% of which must be implemented

12 What are the critical components of RtI/MTSS? (Reschly & Bergstrom, 2009)
1. Multi-tiered system of intervention
2. Expectations based on established standards
3. All students screened to assess the effectiveness of curriculum and instruction and the needs of individual students in comparison to standards
4. Needs identified as gaps between current and expected performance on standards
5. Research-based interventions implemented as designed
6. Frequent progress monitoring used to adjust interventions or expectations as needed
7. Comprehensive model addresses student needs in both general and special education
8. Data used to evaluate individual, curricular, and instructional needs

13 Recent controversy regarding the effectiveness of RtI/MTSS
ORIGINAL ARTICLE (Sparks, 2015) - http://www.edweek.org/ew/articles/2015/11/11/study-rti-practice-falls-short-of-promise.html
RESPONSE (VanDerHeyden et al., 2016) - http://www.edweek.org/ew/articles/2016/01/06/four-steps-to-implement-rti-correctly.html

14 ORIGINAL ARTICLE (Sparks, 2015)
A majority of schools in the 13-state reference sample (56 percent) reported full implementation of the RtI framework, while a higher proportion of impact sample schools (86 percent) in those states reported full implementation.
Schools in the impact sample adjusted reading services to provide more support to students reading below grade-level standards than to those at or above the standards.
For those students just below the school-determined eligibility cut point in Grade 1, assignment to receive reading interventions did not improve reading outcomes; it produced negative impacts.

15 ORIGINAL ARTICLE (Sparks, 2015) "We're looking at this framework that has developed over the years and how it has really played out in classrooms... We weren't expecting to see this pattern," said Fred Doolittle, a study co-author and a vice president of MDRC. "We don't want to have people say that these findings say these schools aren't doing RTI right; this turns out to be what RTI looks like when it plays out in daily life."

16 RESPONSE (VanDerHeyden et al., 2016): Four steps to implement RtI correctly
1. Smarter screening
2. Focus on core instruction
3. Match intervention systems to student need
4. Ensure an increase in intervention intensity doesn’t just mean “longer and louder”

17 What are the critical components of PLCs?
1. Ensuring that students learn
2. A culture of collaboration
3. A focus on results

18 Ensuring that students learn
If we truly believe all students can learn, we need to ask four questions:
1. What do we want each student to learn?
2. How will we know when each student has learned it?
3. How will we respond when a student experiences difficulty in learning?
4. How will we deepen the learning for students who have already mastered essential knowledge and skills?

19 A culture of collaboration and a focus on results
Find time during the school day for:
Teachers to engage in focused collaboration
Students who are not learning to receive supplemental support
Create effective teams:
Horizontal (grade level, content area) and vertical
Committed to norms
Focused on learning, not teaching
Data-driven
Believing that “we all own all kids”

20 Real cheesesteak or impostor? Uncle Franky’s 10160 6th Ave N Plymouth, MN 55441 Uncle Franky’s certainly does its Philly Cheesesteak sandwich its way. For a more ‘rough around the edges’ version with bursting flavor, try Franky’s version of the cheesesteak sammie. Cooks cram shaved, grilled beef into a flaky baked roll, then it’s loaded with – wait for it – Cheese Whiz, onions and giardiniera (aka pickled-veggies). Don’t worry, if you’re not too keen on the Whiz you can substitute it for mozzarella or American cheese; just don’t ever ask for Swiss on this cheesesteak. There are three locations of Uncle Franky’s in Dinkytown, Plymouth and Northeast Minneapolis. If you’re looking for the Philly, be sure to go to the Plymouth location; the other two don’t have it on their menu.

21 Real cheesesteaks or impostors in your schools? Research-based interventions? RtI/ MTSS? PLCs? Others?

22 3. Look for practices that don’t just “work,” but “work best” Does your research-based intervention or initiative result in meaningful, not just statistically significant, differences?

23 Visible Learning (Hattie, 2008)
Meta-analysis of meta-analyses: 800 meta-analyses, 52,637 studies, 146,142 effect sizes, approximately 236 million students
Effect sizes:
How to calculate (see next slide for review)
Importance: the difference between a statistical and a meaningful difference
Interpretation: compared to what? The bar is .40, not 0; almost everything works, so ask what works BEST
Within context: small effects can be important if associated with important outcomes

24 Calculating Effect Size (ES)
ES = (M_treatment − M_control) / SD
ES = (M_post − M_pre) / SD
Cohen: d = .20 (small), .50 (medium), .80 (large)
Hattie: d = .20 (small), .40 (medium), .60 (large)
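The formulas above can be sketched in code. A minimal Python sketch of the two-group case, assuming the pooled standard deviation is used for the SD term (one common choice; the slide does not specify which SD), with Hattie's benchmarks for interpretation:

```python
from statistics import mean, stdev

def effect_size(treatment, control):
    """ES = (M_treatment - M_control) / SD, here using the pooled SD."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

def interpret(d):
    """Hattie's benchmarks: .20 small, .40 medium (the 'hinge point'), .60 large."""
    if d >= 0.60:
        return "large"
    if d >= 0.40:
        return "medium"
    if d >= 0.20:
        return "small"
    return "below small"
```

For example, treatment scores [5, 6, 7] against control scores [3, 4, 5] give an effect size of 2.0: the means differ by 2 and the pooled SD is 1.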

25 [Figure 2.2 from Visible Learning: distribution of effect sizes across the reviewed meta-analyses]

26 6 implications of Figure 2.2
1. The effect sizes of the meta-analyses reviewed in Visible Learning form a normal distribution; the mean ES is a good indicator of all influences on achievement, as many influences fall above and below the mean
2. Almost everything works
3. Setting the bar at zero doesn’t make sense
4. Instead, set the bar at the average ES (.40)
5. To innovate and achieve .40 takes more than typical teaching (.20-.40)
6. Some influences are universal, but some (e.g., homework) vary based on a moderator (e.g., age)

27 Teaching v. Working Conditions Which matters more? Teaching? OR Working conditions?

28 Teaching vs. Working Conditions
Teaching: Quality of Teaching, Reciprocal Teaching, Teacher-student Relationships, Providing Feedback, Teaching student self-verbalization, Meta-cognition strategies, Direct Instruction, Mastery Learning
Working Conditions: Within-class Grouping, Adding More Finances, Reducing Class Size, Ability Grouping, Multi-grade/Age Classes, Open vs. Traditional Classes, Summer Vacation Classes, Retention

29 Teaching vs. Working Conditions
Teaching (ES = .68): Quality of Teaching .77, Reciprocal Teaching .74, Teacher-student Relationships .72, Providing Feedback .72, Teaching student self-verbalization .67, Meta-cognition strategies .67, Direct Instruction .59, Mastery Learning .57
Working Conditions (ES = .08): Within-class Grouping .28, Adding More Finances .23, Reducing Class Size .21, Ability Grouping .11, Multi-grade/Age Classes .04, Open vs. Traditional Classes .01, Summer Vacation Classes -.09, Retention -.16
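The headline figures on this slide appear to be simple averages of the listed influences, which can be checked directly. The effect sizes below are copied from the slide; treating the summary ES as the unweighted mean is my assumption about how those figures were derived:

```python
# Effect sizes copied from the slide above
teaching = [0.77, 0.74, 0.72, 0.72, 0.67, 0.67, 0.59, 0.57]
working_conditions = [0.28, 0.23, 0.21, 0.11, 0.04, 0.01, -0.09, -0.16]

# Unweighted means of each category
mean_teaching = sum(teaching) / len(teaching)
mean_working = sum(working_conditions) / len(working_conditions)

print(round(mean_teaching, 2))  # 0.68, matching the slide's Teaching ES
print(round(mean_working, 2))   # 0.08, matching the slide's Working Conditions ES
```

The same check reproduces the .60 and .17 summary figures on the Activator vs. Facilitator slide, which supports reading all four headline values as category averages.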

30 Teaching Styles
An active teacher, passionate about their subject and about learning, a change agent
OR
A facilitative, inquiry- or discovery-based provider of engaging activities

31 Activator vs. Facilitator
Activator: Reciprocal Teaching, Feedback, Teaching students self-verbalization, Meta-cognition strategies, Direct Instruction, Mastery Learning, Challenging goals, Frequent effects of testing, Behavioral Organizers
Facilitator: Simulations & gaming, Inquiry-based teaching, Smaller class sizes, Individualized instruction, Problem-based learning, Different teaching for boys/girls, Web-based learning, Whole language reading, Inductive teaching

32 Activator vs. Facilitator (Hattie, 2009)
Activator (ES = .60): Reciprocal Teaching .74, Feedback .72, Teaching students self-verbalization .67, Meta-cognition strategies .67, Direct Instruction .59, Mastery Learning .57, Challenging goals .56, Frequent effects of testing .46, Behavioral Organizers .41
Facilitator (ES = .17): Simulations & gaming .32, Inquiry-based teaching .31, Smaller class sizes .21, Individualized instruction .20, Problem-based learning .15, Different teaching for boys/girls .12, Web-based learning .09, Whole language reading .06, Inductive teaching .06

33 Questions?

34 Contact information Dan Hyson Assistant Professor, School Psychology Program University of Wisconsin-La Crosse dhyson@uwlax.edu 608-785-8444

