
Teaching, Learning, & Transfer of Experimental Procedures in Elementary School Science
David Klahr
Department of Psychology, Pittsburgh Science of Learning Center (PSLC), Program in Interdisciplinary Education Research (PIER), Carnegie Mellon University
Society for Research on Educational Effectiveness, First Annual Conference, December 2006

Topic: Assessing different methods for teaching experimental procedures to elementary school children.
More specifically: teaching "CVS"
- In the lab
- In both "easy" & "challenging" classrooms
- To students of widely varying abilities

What is CVS? CVS: the Control of Variables Strategy
- A simple procedure for designing unconfounded experiments: vary one thing at a time (VOTAT).
- The conceptual basis for making valid inferences from data: isolation of the causal path.

Why study CVS?
Theoretical issues:
- Surface vs. deep mapping during transfer of procedures and concepts at different transfer "distances".
Practical importance:
- Topic: a core topic in early science instruction.
- Assessment: state standards; high-stakes assessments; NCLB to start testing science.
- Best instructional approach for teaching CVS? Heated controversy in the profession; legislative battles (e.g., CA and "hands-on" science).

Chen & Klahr (1999), Child Development. Between-subjects design.
Goal: Compare different types of instruction for teaching CVS.
Participants: 60 2nd-4th graders.
Assessment: Measure learning & transfer at different "distances" from initial instruction.
Materials: 3 different physical domains:
- Springs
- Ramps
- Sinking objects

Springs domain
Which attributes determine how far a spring will stretch?
Materials: 8 springs (2 lengths x 2 widths x 2 wire sizes) & 2 pairs of weights.
Execution:
1. Select two springs.
2. Select two weights.
3. Hang springs on rack hooks.
4. Hang weights on springs.
5. Compare amount of stretching.

An unconfounded test
Question: Does the length of a spring make a difference in how far it stretches?

          Spring A   Spring B
Length:   short      long
Width:    wide       wide
Wire:     thin       thin
Weight:   light      light
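To make the VOTAT rule concrete, here is a minimal sketch (not from the original deck) of the check a CVS design must pass: two setups form an unconfounded contrast exactly when they differ on the focal variable and on nothing else. The function name and dictionary encoding are our own; the attribute values mirror the springs example above.

```python
# Check whether a two-setup comparison is an unconfounded (VOTAT) test:
# the setups must differ on the focal attribute and agree on all others.

def is_unconfounded(setup_a, setup_b, focal):
    """True if the comparison isolates `focal` as the only varying attribute."""
    differing = [k for k in setup_a if setup_a[k] != setup_b[k]]
    return differing == [focal]

# The springs contrast from the slide above: only length varies.
spring_a = {"length": "short", "width": "wide", "wire": "thin", "weight": "light"}
spring_b = {"length": "long", "width": "wide", "wire": "thin", "weight": "light"}

print(is_unconfounded(spring_a, spring_b, "length"))  # True: a valid CVS test
print(is_unconfounded(spring_a, spring_b, "width"))   # False: width does not vary
```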

Two types of instruction (between subjects)
Exploratory:
- Hands on: work with physical materials
- Goal provided: "find out if x makes a difference"
Explicit = Exploratory plus:
- Training: explicit, good and bad examples
- Training: reasons why, focus on deep structure
- Probe questions: "Can you tell for sure? Why?"

Different transfer "distances"
Near transfer (within domain):
- CVS "tests" in same domain as training, but on a different dimension.
- Time: minutes after training.
- Location, context, etc.: same as training.
Far transfer (between domain):
- CVS tests in a different domain from training.
- Time: a few days after training.
- Location, context, etc.: same as training.
Remote transfer: more later.

[Figure: study phases. Day 1: Exploration (pre-test), Training Manipulation, Near Transfer; Day 2: Far Transfer. Y-axis: 0%-70%.]

[Figure: % of unconfounded experiments (y-axis 0%-70%) across phases (Exploration pre-test, Near Transfer, Far Transfer on Day 2) for the Exploratory and Explicit conditions; 4 experiments per child in each phase. Explicit is immediately better than Exploration and remains so.]

[Figure: CVS mastery by individual children: % of children becoming Masters (at least 3 out of 4 unconfounded experiments), Explicit vs. Exploratory, y-axis 0-100%.]
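The mastery criterion is a simple count, so a short sketch may help; the per-child outcome lists below are invented for illustration and are not the study's data.

```python
# Classify children as CVS "Masters" under the deck's criterion:
# at least 3 of 4 experiments designed unconfounded.

def is_master(outcomes, threshold=3):
    """outcomes: list of booleans, True = child designed an unconfounded experiment."""
    return sum(outcomes) >= threshold

children = {  # hypothetical outcome records, one boolean per experiment
    "child_1": [True, True, True, False],
    "child_2": [True, False, False, False],
    "child_3": [True, True, True, True],
}
masters = {name: is_master(o) for name, o in children.items()}
rate = 100 * sum(masters.values()) / len(masters)
print(masters)                 # {'child_1': True, 'child_2': False, 'child_3': True}
print(f"{rate:.0f}% Masters")  # 67% Masters
```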

Extensions
1. Initial transfer measures are very close to training objectives.
2. Need a more "distant" ("authentic"?) assessment of children's understanding.
3. Will training effects remain with such extended assessments?
Procedure: Create a more "authentic" assessment:
- Ask children to judge science fair posters.
- Score their comments and suggestions.

CVS Training and Science Fair Assessments (Klahr & Nigam, 2004)
1. Participants: 112 3rd & 4th graders.
2. Train on CVS via Explicit or Exploration method.
3. Assess effectiveness of CVS skill.
4. Present poster evaluation task.
5. Look at how CVS skill and training condition affect poster evaluation performance.

[Figure: study design. Day 1: Exploration, Training Manipulation, Near Transfer; 1 week later: Far Transfer and Poster Evaluation. Y-axis: 0%-70%.]

Scoring Rubric for Children's Poster Critiques
1. Adequacy of research design
2. Theoretical explanation
3. Controlling for confounds in: subjects/materials, treatment, experimenter bias, etc.
4. Measurement: reliability/variability, error, data representation
5. Statistical inferences: sample size/population, effect size
6. Completeness of conclusion: supported by data, related to hypothesis

Poster Score = sum of all valid, non-redundant critiques about a poster
Grand Poster Score = (Pingpong Poster score) + (Memory Poster score)
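The rubric's arithmetic is a deduplicated count summed over two posters; the sketch below shows that computation. The critique labels and the set of valid categories are illustrative stand-ins, not the study's actual coding scheme.

```python
# Poster Score: count of valid, non-redundant critiques of one poster.
# Grand Poster Score: sum of the two poster scores.

VALID_CATEGORIES = {
    "design", "theory", "confounds", "measurement", "statistics", "conclusion",
}

def poster_score(critiques):
    """Count distinct critiques that fall into a rubric category."""
    return len({c for c in critiques if c in VALID_CATEGORIES})

pingpong = ["confounds", "confounds", "measurement"]    # duplicate counted once
memory = ["design", "statistics", "off-topic remark"]   # invalid critique ignored
grand = poster_score(pingpong) + poster_score(memory)
print(grand)  # 2 + 2 = 4
```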

Possible subtle effects of type of instruction
Do the few kids who master CVS in the Exploratory condition do better on poster evaluation than the many who master CVS in the Explicit Instruction condition?
More specifically: What is the relation between Poster Scores and path to CVS mastery?
Method: Secondary analysis based on "learning paths".

Different "paths" to mastery or non-mastery of CVS
How do children following these different paths perform on poster evaluations?
Note: the following is based on combining results from two studies: the original Klahr & Nigam plus a replication.

[Figure: standardized Poster Assessment Scores for five groups: Experts, Explicit Masters, Exploratory Masters, Explicit non-Masters, Exploratory non-Masters (group sizes n = 59, 25, 15, 66, 19); Masters vs. non-Masters p < .001, remaining contrasts n.s.]
- CVS mastery is associated with high poster scores.
- Non-mastery is associated with low poster scores.
- The path to mastery, or non-mastery, is irrelevant.
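The deck reports only the significance pattern, but the contrast it describes is a standard two-group comparison; here is a hedged sketch using synthetic standardized scores (the data, group sizes, and effect size below are invented, not the study's).

```python
# Two-sample t-test comparing standardized poster scores of CVS
# Masters vs. non-Masters, on synthetic stand-in data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
masters = rng.normal(loc=0.5, scale=1.0, size=84)       # hypothetical scores
non_masters = rng.normal(loc=-0.5, scale=1.0, size=85)  # hypothetical scores

t, p = stats.ttest_ind(masters, non_masters)
print(f"t = {t:.2f}, p = {p:.4g}")
```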

Question for cognitive research: Why does training on CVS (narrow) lead to better poster evaluations (broad)?
- Decomposition (attention to detail)
- Focused search for causal paths
- Nature of science
- Rhetorical stance: science as argument
Stay tuned ...

Question for applied research: Can CVS be taught in a normal classroom setting? (Toth, Klahr, & Chen, 2000)
Procedure (in a nutshell):
- Translate the experiment "script" into a teacher lesson plan.
- Teach in "normal" science classes (in high-SES schools).

Participants in Classroom Study
- 77 4th graders from 4 classrooms in two different private schools
- 2 different science teachers
- Neither school had participated in "lab" studies

What to hold and what to fold?
Keep:
- Pedagogy: goal (teach CVS); type of teaching (Explicit instruction)
- Assessment: same as laboratory, plus some new assessments in classroom
Change & adjust:
- Context: lesson plan, not "script"; teacher, not researcher; scheduling; student/teacher ratio; group work; record keeping; error and multiple trials
These are issues of "engineering design".

Results of Classroom Implementation
[Figure: % unconfounded designs, pretest vs. posttest. Individual students classified as "Experts" (8 of 9 correct): pretest 5%, posttest 91%.]

What about more challenging classrooms? ("Lesson Planning Project", with Junlei Li, Stephanie Siler, Mandy Jabbour)
One facet of the Lesson Planning Project:
- Two classrooms (5th and 6th graders) in an urban school, 90% eligible for free lunch.
- Teacher is researcher (Junlei Li).

Teaching & Assessment of CVS with Urban 5th and 6th Graders (n = 42) (Klahr & Li, 2005)
[Figure: timeline with results (% correct, 0%-100%) on our CVS tests and standardized test items. Phase 1: 2-day classroom replication of CVS training (domain: ramps; dyads; student design; mastery-based formative assessment). Phase 2: 2-day CVS transfer & retraining (domain: pendulum; dyads; focused analogical mapping). Phase 3: after a 2-week delay, transfer to "real world", "high-stakes" items: local (CTBS), national (NAEP), and international (TIMSS) standardized test items.]

[Figure: % correct for various groups on a TIMSS CVS item.]

Typical TIMSS CVS item
"He wants to test this idea: the heavier a cart is, the greater its speed at the bottom of a ramp. Which three trials should he compare?"
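The item asks for the trial subset that varies only cart weight; the sketch below spells out that selection logic. The trial pool is invented for illustration (the actual item's answer options are not reproduced here).

```python
# Find the three-trial subsets that isolate cart weight: weight must
# differ across all chosen trials while every other attribute is constant.
from itertools import combinations

trials = [  # hypothetical trials
    {"weight": 1, "height": "high", "surface": "smooth"},
    {"weight": 2, "height": "high", "surface": "smooth"},
    {"weight": 3, "height": "high", "surface": "smooth"},
    {"weight": 1, "height": "low", "surface": "rough"},
    {"weight": 2, "height": "low", "surface": "smooth"},
]

def isolates(subset, focal="weight"):
    """True if `focal` takes distinct values while all other attributes are held constant."""
    others_constant = all(
        len({t[k] for t in subset}) == 1 for k in subset[0] if k != focal
    )
    focal_varies = len({t[focal] for t in subset}) == len(subset)
    return others_constant and focal_varies

for combo in combinations(trials, 3):
    if isolates(combo):
        print([t["weight"] for t in combo])  # -> [1, 2, 3]: the valid comparison
```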

Significance
Brief, theoretically grounded, focused instruction:
- Is highly effective for middle-class students
- In the short run & over longer durations
- On "far transfer" assessments
Path independence: "what" matters more than "how".
BIG differences in effectiveness with different student populations. Thus, the current approach requires adaptation, modification, & individualization.

Questions to pursue (next steps)
NCLB in "the small":
- Goal: no child who can't understand & execute CVS.
- Method: develop an "intelligent tutor" that can adapt to wide variability in children's learning.

Wide variety of individual learning patterns (from Chen & Klahr, 1999)

Type       Fast gain   Up-down-up   Gradual gain   High constant   Up & down   Steady decline   Low constant
Explicit   31%         10%          7%             7%              7%          0%               37%
Socratic   0%          0%           14%            0%              18%         7%               60%

Design a Tutor for Experimental Design (with Mari Strand Cary, Stephanie Siler, Junlei Li)

Recent & current collaborators
Thanks to Zhe Chen, Eva Toth, Junlei Li, Mari Strand Cary, Stephanie Siler, Milena Nigam, Amy Masnick, Lara Triona.
Funding $ources: McDonnell Foundation, NICHD, NSF, IES.

END

Extras

Remote transfer items
Why "remote"?
- Temporal: training-test interval of 7 months.
- Domain: physical to biological, et al.
- Format: physical materials vs. paper-and-pencil test booklet.
- Context: one-on-one with experimenter vs. whole-class test taking.
[Figure: a page from the 15-item test booklet: "Does the amount of water affect plant growth?" with "Good Test" / "Bad Test" options.]

[Figure: Remote Transfer Results: mean % correct on the 15-item far transfer test, trained vs. untrained 3rd and 4th graders ("Good Test"/"Bad Test" items, e.g., "Does the amount of water affect plant growth?").]

Ramps domain
A completely confounded test
Question: Does the surface of a ramp make a difference in how far a ball rolls?

             Ramp A    Ramp B
Surface:     smooth    rough
Run:         short     long
Steepness:   high      low
Ball:        golf      rubber