The Behavior of Tutoring Systems


The Behavior of Tutoring Systems Kurt VanLehn University of Pittsburgh

Underlying Claim
The behaviors of tutoring systems are quite similar, despite differences in:
- Task domain
- User interface
- Software structure
- Knowledge base
- etc.

6 Illustrative Tutoring Systems
- Algebra Cognitive Tutor
- Andes
- AutoTutor
- Sherlock
- SQL-Tutor
- Steve

Learning Events & Knowledge Components
- Empirical techniques for establishing KCs: modeling transfer; predicting learning curves; determining the grain size of KCs
- The instructional objectives of the designers: "the designers' decisions about what learning events should be detected and advised by the tutor determine what knowledge components are included in the tutor's representation of knowledge"

Two Loops
- Outer loop: select a task that will help the student learn; obtain a task set to select from
- Inner loop: select the right step within a task and help the student learn from it
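The two-loop control structure can be sketched as follows. This is a minimal illustration, not code from the paper; the representation of tasks as lists of steps and the `tutor_step` handler are hypothetical placeholders.

```python
def run_tutor(tasks, tutor_step):
    """Outer loop over tasks; inner loop over the steps of each task.

    tasks: a list of tasks, each a list of steps (hypothetical encoding).
    tutor_step: callback that tutors one step (feedback, hints, etc.)
    and returns a log entry describing what happened.
    """
    log = []
    for task in tasks:        # outer loop: one iteration per task
        for step in task:     # inner loop: one iteration per step
            log.append(tutor_step(step))
    return log
```

In a real system the outer loop would call a task-selection policy rather than iterate in order, and the inner loop would wait for student input at each step.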

I like this: "a tutoring system does not have to replace a teacher or run an after-school remediation session. Its role in the student's learning ecology can be anything—a smart piece of paper; an encouraging coach; a stimulating peer, etc." (pp. 6-7)

The Outer Loop: Task Selection
- Student control: homework helper, e.g. Andes
- Fixed sequence (or random sequence): e.g. AutoTutor, Assistments; a linear-order curriculum
- Mastery learning: self-paced, e.g. Sherlock
- Macro-adaptation: knowledge component (KC) based, e.g. Cognitive Tutor
  - Some systems also model incorrect KCs (e.g.??)
  - Some systems also represent learning styles and preferences (e.g. different pedagogical agents??)
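A macro-adaptive outer loop might select tasks from KC mastery estimates along these lines. This is a sketch under assumed representations (task-to-KC maps and per-KC mastery probabilities as dicts), not the policy of any particular system named above.

```python
def select_task(tasks, mastery, threshold=0.95):
    """Pick the task exercising the most unmastered knowledge components.

    tasks: dict mapping task name -> set of KC names it exercises.
    mastery: dict mapping KC name -> estimated probability of mastery.
    Returns None when every KC is above threshold (curriculum mastered).
    """
    def unmastered(kcs):
        return sum(1 for kc in kcs if mastery.get(kc, 0.0) < threshold)

    best = max(tasks, key=lambda name: unmastered(tasks[name]))
    return best if unmastered(tasks[best]) > 0 else None
```

Mastery learning falls out as a special case: with a fixed task order, the policy simply repeats tasks for a KC until its estimate crosses the threshold.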

The Inner Loop
The goal is to help the student learn from a task. Common services:
- Minimal feedback
- Error-specific feedback
- Hints on the next step
- Assessment of knowledge
- Review of the solution

Minimal Feedback
- Categories: correct, incorrect, non-optimal, unrecognizable
- When to give: immediate; delayed; on demand
- The feedback policy can be a function of student competence (Razzaq & Heffernan, 2007): delayed feedback for high-performing students, immediate feedback for less competent ones
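The competence-dependent timing policy reduces to a simple rule. The sketch below only illustrates the direction reported by Razzaq & Heffernan (2007); the competence scale and the 0.7 cutoff are hypothetical.

```python
def feedback_timing(competence, cutoff=0.7):
    """Choose feedback timing from an estimated competence in [0, 1].

    Delayed feedback for stronger students, immediate for weaker ones
    (the direction reported by Razzaq & Heffernan, 2007). The cutoff
    value is an illustrative assumption, not from the study.
    """
    return "delayed" if competence >= cutoff else "immediate"
```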

Next-Step Hints
Purpose: to avoid repeated guessing and frustration.
- When: based on student data (DT Tutor); hint on demand (Andes, Sherlock); watch for hint abuse/refusal; "proactive" help (DT Tutor, Steve)
- Which: following the student's plan
- How: for a single KC, a point-teach-bottom-out sequence; for multiple KCs, bottom-out or refer to history; scaffolding?
- Tutoring contingent on the student's recent performance
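The point-teach-bottom-out escalation can be modeled as indexing into an ordered hint sequence by the number of requests on the same step. The geometry hint texts below are invented examples, not from any of the cited tutors.

```python
def next_hint(hint_seq, requests):
    """Return the hint for the Nth request on the same step, escalating
    through the sequence and repeating the last (bottom-out) hint."""
    return hint_seq[min(requests, len(hint_seq) - 1)]

# Illustrative point -> teach -> bottom-out sequence (hypothetical step).
POINT_TEACH_BOTTOM = [
    "Look at the angle at vertex B.",                # point at the feature
    "Vertical angles are equal; apply that here.",   # teach the KC
    "Enter 40 degrees for angle B.",                 # bottom-out: the answer
]
```

Counting requests per step also gives the tutor the signal it needs to detect hint abuse (racing to the bottom-out hint).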

Error-Specific Feedback
- Purpose: analyze an incorrect step to determine the incorrect learning event and prevent future occurrences; to help students debug their own thinking
- How: minimal feedback plus a sequence of hints
- When: immediately; when the same error happens again; on demand; etc.
- [Presenter comment] The step analyzer seems like magic, a black box, to me. It seems impossible, or at least very hard, to detect which KC a student applied when the answer is incorrect.
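One common way a step analyzer attributes an incorrect step is to match it against a library of known buggy rules (mal-rules). The sketch below shows that idea in miniature; the step encoding and the subtraction-for-addition bug are invented for illustration.

```python
def diagnose_error(step, bug_library):
    """Match an incorrect step against a library of known buggy rules.

    bug_library: list of (predicate, feedback) pairs. Each predicate
    inspects the step and returns True if that bug explains the answer.
    Returns the feedback for the first matching bug, or None when the
    error is unrecognized (only minimal feedback can then be given).
    """
    for matches, feedback in bug_library:
        if matches(step):
            return feedback
    return None

# Illustrative bug library for a + b problems (hypothetical).
ADDITION_BUGS = [
    (lambda s: s["answer"] == s["a"] - s["b"],
     "It looks like you subtracted; this problem asks you to add."),
]
```

The presenter's worry above is exactly the `None` case: when no buggy rule matches, the analyzer cannot say which KC was misapplied.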

Assessing Knowledge
- What kind of assessment is required? Assessments are decision aids.
- Coarse-grained assessment: a combination of different measures
- Fine-grained assessment: based on learning events on steps
- "Mastery" really means the probability that a knowledge component will be applied when it should be applied.
- Assignment of credit and blame: counting learning events; counting failures
- Other issues
- [Presenter comments] Here comes the step analyzer again: given a correct student step, it returns the set of knowledge components used to derive the step. It is easy to understand that "blame" is a difficult problem, but why is credit difficult too? Is it possible for the tutor to credit different KCs for the same correct step? E.g., given the measures of an exterior and an interior angle, asking for another one.
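Treating mastery as a probability invites a probabilistic update per observed step. One widely used scheme is Bayesian Knowledge Tracing (Corbett & Anderson, 1995), sketched below; the guess, slip, and learn parameter values are illustrative defaults, not fitted values.

```python
def bkt_update(p_know, correct, guess=0.2, slip=0.1, learn=0.15):
    """One Bayesian Knowledge Tracing update for a single KC.

    p_know: prior probability the student has mastered the KC.
    correct: whether the observed step applying this KC was correct.
    guess/slip: chance of a correct step without mastery / an
    incorrect step despite mastery. learn: chance of acquiring the
    KC at this opportunity. Returns the updated mastery estimate.
    """
    if correct:
        post = p_know * (1 - slip) / (p_know * (1 - slip) + (1 - p_know) * guess)
    else:
        post = p_know * slip / (p_know * slip + (1 - p_know) * (1 - guess))
    return post + (1 - post) * learn  # learning transition after the step
```

Note this updates a single KC; when a step exercises several KCs, the credit/blame question raised above is exactly the question of how to distribute this update among them.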

Reviewing the Whole Solution
- Real-time vs. non-real-time task domains
- Pedagogical reason: delayed feedback
- Plans and policies
- Learn from human tutors: bottom-out hints during performance, reasoning during review
- Other issues

Final Remarks The dream of having an ITS for every student in every course is within our grasp. 

Questions
- Is Assistments single-loop or double-loop?
- Knowledge components can be incorrect (p. 3)? Does this mean the KC itself is wrong, or that a correct KC is not applied at the proper step?
- The number of steps per task is not fixed; it varies by student. Steps correspond to learning events (x = 18.46).
- Mastery learning vs. macro-adaptation?
- Student model: a pointer to the next task; attribute-value pairs; student data such as standardized test scores, major, etc.
- Step analyzer and generator