
1 Training in Experimental Design (TED): Developing scalable and adaptive computer-based science instruction. Mari Strand Cary, David Klahr, Stephanie Siler, Cressida Magaro, Junlei Li. Carnegie Mellon University & University of Pittsburgh.

2 Overview of the TED project
- Curriculum: experimental design, evaluation, and interpretation
- Age: 5th–8th grade students
- Schools: 6 inner-city
  – 4 low-SES with challenging classroom environments
  – 2 mid-high SES
- End goal: a computer-based adaptive tutor
  – 1 student : 1 computer in the classroom environment
  – Provides individualized, adaptive instruction
  – Supplements (does not replace!) the teacher

3 What do we mean by “Experimental design?”
CVS: Control of Variables Strategy
1. A simple procedure for designing unconfounded experiments (vary one thing at a time)
2. A conceptual basis for making valid inferences from data (isolating the causal path)

4 CVS and Ramps
Test whether the ramp surface affects the distance that a ball travels.

Variable       Confounded (Ramp 1 / Ramp 2)   Unconfounded (Ramp 1 / Ramp 2)
Surface        Smooth / Rough                 Smooth / Rough
Track length   Short / Long                   Short / Short
Height         High / Low                     High / High
Ball           Golf / Rubber                  Golf / Golf
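To make the fair-test logic concrete, here is a minimal, hypothetical Python sketch (not part of the TED materials) that checks both designs from the table: a design is unconfounded exactly when the tested variable is the only one whose values differ between the two ramps.

```python
# Hypothetical sketch: checking the slide's two ramp designs for confounding.
# A design is a fair test iff the tested (target) variable is the ONLY
# variable whose values differ between the two setups.

def confounds(ramp1, ramp2, target):
    """Return the non-target variables that differ between the setups."""
    return [v for v in ramp1 if v != target and ramp1[v] != ramp2[v]]

target = "surface"

designs = {
    "confounded": (
        {"surface": "smooth", "track length": "short", "height": "high", "ball": "golf"},
        {"surface": "rough",  "track length": "long",  "height": "low",  "ball": "rubber"},
    ),
    "unconfounded": (
        {"surface": "smooth", "track length": "short", "height": "high", "ball": "golf"},
        {"surface": "rough",  "track length": "short", "height": "high", "ball": "golf"},
    ),
}

for name, (r1, r2) in designs.items():
    extra = confounds(r1, r2, target)
    print(name, "->", "fair test" if not extra else f"confounded by {extra}")
# confounded -> confounded by ['track length', 'height', 'ball']
# unconfounded -> fair test
```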

5 Why do we need to teach CVS?
- Core topic in science instruction
  – State standards
  – High-stakes assessments
  – Science component of NCLB
- Has real-world applications
  – Essential to evaluating product claims and news reports
- Students do not always learn CVS “on their own” (low-SES students in particular)

6 What do students do wrong?
Common errors:
- Vary everything
- Hold the target variable constant and vary the other variables
- Partially confounded designs
- Nothing varied (identical setups)
Their justifications:
- “I don’t know”
- “You told me to test x!”
- Describing their set-up
- Wanting to see if x happens
- Wanting to see if this setup is better than that setup
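These categories can be detected mechanically from a design. A minimal, hypothetical sketch (not the TED tutor’s actual logic) that maps a student’s two-setup design onto the error categories above:

```python
# Hypothetical sketch: classifying a student's two-setup design into the
# common error categories listed on this slide.

def classify_design(setup1, setup2, target):
    varied = {v for v in setup1 if setup1[v] != setup2[v]}
    if not varied:
        return "nothing varied (identical setups)"
    if varied == set(setup1):
        return "vary everything"
    if target not in varied:
        return "target held constant, other variables varied"
    if varied != {target}:
        return "partially confounded"
    return "valid CVS design: only the target varies"

# Example: a student varies surface AND ball while testing surface.
print(classify_design(
    {"surface": "smooth", "height": "high", "ball": "golf"},
    {"surface": "rough",  "height": "high", "ball": "rubber"},
    target="surface"))   # -> partially confounded
```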

7 Why do they take these approaches?
- By accident
  – misread the question
  – working carelessly
- Led astray
  – by the salience of the physical apparatus (e.g., ramps)
  – by not understanding written representations (e.g., tables)
- On purpose
  – different goals (e.g., “engineering” goals)
  – misconceptions about experimental logic
  – thinking the other variable(s) don’t matter
- Just guessing

8 What’s the best way to teach CVS?
As a society (educators, researchers, and legislators), we don’t know.
Our research team knows of one effective way…

9 Our basic CVS instruction:
- Students design experiments
- Students answer questions
- Instructor provides explicit instruction about CVS
- One domain
- Short instructional period

10 Effective in the lab and in classrooms with high SES and achievement levels
- One-on-one: Chen & Klahr (1999); Klahr & Nigam (2004); Strand Cary & Klahr (in preparation)
- Full class: Toth, Klahr & Chen (2000)
- Physical and virtual materials: Triona & Klahr (2003)

11 Would it work for lower-achieving students in low-SES schools?

12 Effective in low-achievement classrooms (Li, Klahr & Jabbour, 2006)
- Raises item scores above national norms
- Enables students to “catch up” with untrained peers from high-SES schools
- BUT repeated and varied forms of instruction are required for generalized CVS understanding
  – Many days
  – Multiple domains

13 Thus, our starting point:
Brief, focused CVS instruction varies in efficiency and effectiveness across student populations, settings, and transfer tasks. We want to reach ALL students! To improve our instruction for the entire student population, we must modify and individualize it.

14 A computer tutor could facilitate differentiated instruction
- Computer-based instruction
  – Individualized & self-paced
  – Provides instruction, practice, and feedback
- Teacher freed to provide coaching as needed

15 How are we building our tutor? 4 development phases & an iterative design process

16 4 development phases:
1. Information gathering: What novice models do students hold, and how can we address them?
2. Refining the basic instruction and “going virtual”
3. Building a computer tutor with a few “paths”
4. Building an adaptive computer tutor with a “web” of paths

17 An evolving CVS computer tutor
- Version 1: class instruction (teacher); inflexible; physical apparatus & overhead transparencies; procedural & conceptual instruction (Ramps); feedback via discussion
- Version 2: class instruction (teacher); limited flexibility (differentiation points); simulations & computer interface; prerequisite skills (Auto sales), procedural (Study habits), and conceptual (Ramps) instruction; feedback via discussion, paper exchange, and researchers
- Version 3: individual instruction (computer); flexible (multiple paths); simulations & computer interface; instructional components TBD; feedback via discussion, computer, and researchers
- Version 4: individual instruction (computer); adaptive (“web” of paths); simulations & computer interface; instructional components TBD; feedback TBD

18 Our iterative design process (for each Version n):
- One-on-one human tutoring
- Pilot testing
- Classroom validation study (+ pre, post, and formative assessments)
- Delayed post-assessment
Findings improve the current version, inform the next version, and are compared against the previous version.

19 What are we learning from each version that will help us design the final, adaptive tutor?
VERSION 1 (completed)
- Database of student biases, misconceptions, errors & areas of difficulty
- Inventory of successful tutoring approaches
  – familiar domains
  – instruction in prerequisite skills
  – step-by-step approach
- Student-friendly terminology, definitions, and phrasing
- Requiring explicit articulation by the student

20 What are we learning from each version that will help us design the final, adaptive tutor?
VERSION 2 (ongoing): information regarding
- classwide implementation of successful tutoring approaches
- feasibility of multiple domains
- effect of emphasizing domain-generality
- interface usability
- worksheet usability

21 What are we learning from each version that will help us design the final, adaptive tutor?
VERSION 3 (being developed): information regarding
- individual tutor usability and pitfalls
- comparative efficacy of set learning paths
- efficacy of immediate computer feedback

22 The adaptive tutor will include:
- Pre-testing and ongoing monitoring of student knowledge
- Self-paced instruction
- Diverse topics matching students’ interests
- An interactive and engaging interface
- Teacher-controlled and/or computer-controlled levels of difficulty
- Levels of scaffolding, feedback, and help aligned with the student’s needs
- Computerized assessments
- Logging capability
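As a rough illustration of how these features might fit together, here is a hypothetical sketch; every name, threshold, and update rule below is our own illustrative assumption, not the TED tutor’s actual design.

```python
# Hypothetical sketch of an adaptive-tutor loop combining the features
# listed on this slide; all names and thresholds are illustrative.
import json
import time
from dataclasses import dataclass, field

@dataclass
class StudentModel:
    mastery: float = 0.2          # estimated CVS mastery, seeded by a pre-test
    interests: list = field(default_factory=list)
    log: list = field(default_factory=list)

    def record(self, item, correct):
        # Ongoing monitoring: nudge the mastery estimate and log the event.
        self.mastery += 0.1 * ((1.0 if correct else 0.0) - self.mastery)
        self.log.append({"t": time.time(), "item": item, "correct": correct})

def next_item(model, teacher_difficulty=None):
    """Pick difficulty (teacher- or computer-controlled) and scaffolding level."""
    difficulty = teacher_difficulty or ("hard" if model.mastery > 0.7 else
                                        "medium" if model.mastery > 0.4 else "easy")
    scaffolding = "high" if model.mastery < 0.4 else "low"
    domain = model.interests[0] if model.interests else "ramps"
    return {"domain": domain, "difficulty": difficulty, "scaffolding": scaffolding}

model = StudentModel(interests=["sports drinks"])
model.record("pretest-1", correct=False)
print(next_item(model))          # an easy, heavily scaffolded item
print(json.dumps(model.log))     # logging capability
```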

23 Beyond our classroom instruction…
- Where on the contextual/abstract continuum should this type of instruction be focused?
- When?
- Single vs. multiple domains?
- Static pictures vs. simulations vs. tabular representations?
- Best mix of explicit instruction, exploration, help, feedback, etc.?

24 Questions? Comments? MariStrandCary@cmu.edu, Klahr@cmu.edu. Many thanks to the Institute of Education Sciences for supporting our work.


26 V1 learning examples:
- Student biases, misconceptions & areas of difficulty: ignore the data or are biased by expectations; create the “best” outcome or the most dramatic difference; try to learn about all variables at once
- Familiar domains: pets, sports drinks, cars, study habits, running races
- Instruction in prerequisite skills: variable vs. value; experiment; result vs. conclusion
- Step-by-step approach: read carefully, identify the question, identify the variables…
- Student-friendly terminology, definitions, and phrasing: good vs. fair vs. informative vs. true; “variable” = something that can change; table format
- Requiring explicit articulation of understanding and reasoning: remembering the target variable; drawing conclusions based on the experiment

27 What IS an “intelligent tutor?”
- A computer-based instructional system
- Contains an artificial intelligence component that:
  – encodes the cognitive objectives of the instruction
  – tracks students’ state of knowledge
  – compares student performance to expert performance
  – tailors multiple features of instruction to the student
(Anderson, Boyle, Corbett, & Lewis, 1990; Anderson, Conrad, & Corbett, 1989; Corbett & Anderson, 1995; Greeno, 1976; Klahr & Carver, 1988)
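Among the works cited above, Corbett & Anderson (1995) introduced Bayesian knowledge tracing, a standard way for such a tutor to track a student’s state of knowledge. A minimal sketch, with illustrative parameter values rather than anything TED-specific:

```python
# Minimal Bayesian knowledge tracing sketch (Corbett & Anderson, 1995):
# one way a tutor can track the probability a student has mastered a skill.
# Parameter values here are illustrative, not TED's.

def bkt_update(p_know, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    # Posterior probability that the skill was known, given the response.
    if correct:
        evidence = p_know * (1 - p_slip) + (1 - p_know) * p_guess
        posterior = p_know * (1 - p_slip) / evidence
    else:
        evidence = p_know * p_slip + (1 - p_know) * (1 - p_guess)
        posterior = p_know * p_slip / evidence
    # The student may also learn from the practice opportunity itself.
    return posterior + (1 - posterior) * p_learn

p = 0.3  # prior mastery of "control of variables"
for correct in [False, True, True]:
    p = bkt_update(p, correct)
print(f"estimated mastery: {p:.2f}")  # rises with the two correct responses
```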

28 Ramp apparatus [photo]

29 CVS and Ramps
A completely confounded test for determining the effect of ramp surface on the distance that a ball travels.

Variable       Ramp 1   Ramp 2
Surface        Smooth   Rough
Track length   Short    Long
Height         High     Low
Ball           Golf     Rubber

30 Classroom CVS with urban 5th & 6th graders (Klahr, Li & Jabbour, 2006)
[Chart: % correct (0–100%) after CVS training (Ramps, 2 days) and after CVS probe-based retraining (Pendulum, 2 days)]

31 “Low” training vs. “high” comparison group
[Chart comparing the training group (5th/6th grade, low-achieving school) with the comparison group (5th–8th grade, high-achieving school)]

32 Every version includes:
- A stand-alone, detailed lesson plan with visual aids
- Examples of experimental designs (good and bad)
- Assessments (formative and summative)
- Students designing experiments
- Asking students to explain, justify, and infer
- Feedback

33 Increasing complexity and adaptiveness
- Physical apparatus → Virtual simulations
- Full class → Full class & individual computer use
- Inflexible → Individually adaptive & self-paced
- One domain → Multiple domains

34 Why SES differences?
- We found them in our previous studies
- Classroom environment
- Reading comprehension
- Experience with this type of thinking (expectations, appropriate challenge and/or scaffolding, amount of practice)

35 What if later versions are less effective than earlier versions?
- “Stop the presses!”
- Look for obvious reasons
- Examine lesson components individually
- Consider what is missing

36 “Prerequisites”
- “Science mindset”
- Problem decomposition
  – Vocabulary!
  – Identify and understand the question
  – Identify the key variables
  – Notice and complete the component steps
- Analogical reasoning
- Reading & listening carefully

37 “Procedures”
Test one variable at a time:
1. Make the values for the variable you’re testing DIFFERENT across groups.
2. Make the values for the variables you’re not testing the SAME across groups.

38 “Concepts”
- You need to use different values for the variable you’re testing in order to know what effect those different values have.
- You need to use the same value for all the other variables (hold all the other variables constant; “control” the other variables) so that they can’t cause a difference in the outcome.
- If you use CVS, you can know that only the variable you’re testing is causing the outcome/result/effect.

