
Case Study – San Pedro Week 1, Video 6

Case Study of Classification • San Pedro, M.O.Z., Baker, R.S.J.d., Bowers, A.J., Heffernan, N.T. (2013) Predicting College Enrollment from Student Interaction with an Intelligent Tutoring System in Middle School. Proceedings of the 6th International Conference on Educational Data Mining.

Research Goal • Can we predict student college attendance? • Based on student engagement and learning in middle school mathematics • Using fine-grained indicators distilled from interactions with educational software in middle school (~5 years earlier)

Why? • We can infer engagement and learning in middle school, which supports • Automated intervention • Providing actionable information to teachers and school leaders • But which indicators of engagement and learning really matter? • Can we find indicators that a student is at risk, which we can act on before the problem becomes critical?

ASSISTments

Log Data • 3,747 students • In 3 school districts in Massachusetts (1 urban, 2 suburban) • Completed 494,150 math problems • Working approximately 1 class period a week for the entire year • Making 2,107,108 problem-solving attempts or hint requests in ASSISTments

Data set • Records of whether each student eventually attended college • 58% of students in the sample attended college

Automated Detectors • A number of automated detectors were applied to the data from ASSISTments • These detectors had themselves been previously developed using prediction modeling and published in earlier papers, including (Pardos et al., 2013) • Building a detector and then using it in another analysis is called discovery with models (sketched below)
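
To make the discovery-with-models workflow concrete, here is a minimal Python sketch of applying a previously trained detector to new log data and aggregating its output per student. The file names, feature columns, and detector format are all hypothetical; the actual detectors in the study were built and validated in prior published work.

```python
# A minimal "discovery with models" sketch: apply a previously trained
# detector to new log data, then aggregate its predictions per student.
# File names, column names, and the detector's format are hypothetical.
import joblib
import pandas as pd

logs = pd.read_csv("assistments_logs.csv")      # hypothetical log export
detector = joblib.load("gaming_detector.pkl")   # previously built classifier

# Assumed per-action features the detector was trained on
feature_cols = ["response_time", "hint_count", "attempt_count"]
logs["p_gaming"] = detector.predict_proba(logs[feature_cols])[:, 1]

# Each student's year-long average becomes a feature in the new analysis
student_gaming = logs.groupby("student_id")["p_gaming"].mean()
print(student_gaming.head())
```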

Automated Detectors • Learning • Bayesian Knowledge Tracing; we'll discuss this later in the course
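
Bayesian Knowledge Tracing is covered in detail later in the course; for orientation, here is a minimal sketch of the classic BKT posterior update. The parameter values are purely illustrative.

```python
# A minimal sketch of the classic Bayesian Knowledge Tracing update.
# Parameter values are illustrative, not fitted to any data set.
def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_transit=0.15):
    """Update P(skill known) after observing one response."""
    if correct:
        evidence = p_known * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_known) * p_guess)
    else:
        evidence = p_known * p_slip
        posterior = evidence / (evidence + (1 - p_known) * (1 - p_guess))
    # Account for the chance the student learned the skill on this step
    return posterior + (1 - posterior) * p_transit

p = 0.3  # illustrative P(L0): prior probability the skill is known
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
    print(round(p, 3))
```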

Disengagement Detectors (No sensors! Just log files!) • Gaming the System • Intentional misuse of educational software • Systematic guessing or rapid hint requests • Off-Task Behavior • Stopping work in educational software to do an unrelated task • Does not include talking to the teacher or another student about math; these can be distinguished by behavior before and after a pause • Carelessness • Making errors despite knowing the skill
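
The study's detectors are machine-learned models validated against human labels; the toy heuristic below only illustrates the kind of log-file evidence (action timing, hint use) such detectors draw on. Column names and thresholds are assumptions, not the published detectors.

```python
# Toy heuristic only: the study's detectors are machine-learned models,
# not hand-set thresholds. Column names and cutoffs are assumptions.
import pandas as pd

logs = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2],
    "action":     ["attempt", "hint", "hint", "attempt", "attempt"],
    "duration_s": [2.1, 1.0, 0.8, 35.0, 280.0],
})

# Very fast repeated hint requests or guesses can suggest gaming the system
logs["fast_action"] = logs["duration_s"] < 3.0
# Long pauses within the software can suggest off-task behavior
logs["long_pause"] = logs["duration_s"] > 180.0

# Per-student rates of each flag
print(logs.groupby("student_id")[["fast_action", "long_pause"]].mean())
```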

Affect Detectors (No sensors! Just log files!) • Boredom • Frustration • Confusion • Engaged Concentration

College Attendance Model • Predict whether a student attended college from the student's year-long averages according to the detectors • Logistic Regression Classifier (binary data) • Cross-validated at the student level • We'll discuss this next week
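
A minimal sketch of this modeling step using scikit-learn, with made-up data. Because each row is one student's year-long detector averages, ordinary k-fold cross-validation here already splits at the student level. Sample size, features, and labels are placeholders.

```python
# A sketch of the modeling step with made-up data. Each row is one
# student's year-long detector averages, so plain k-fold CV here is
# already cross-validation at the student level.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_students = 200                      # illustrative; the study had 3,747
X = rng.random((n_students, 6))       # knowledge, correctness, boredom, ...
y = rng.integers(0, 2, n_students)    # 1 = attended college, 0 = did not

model = LogisticRegression()
auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
print("Cross-validated A' (AUC): %.3f" % auc.mean())
```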

Individual Feature Predictiveness • t-tests comparing each feature's year-long average between students who did not (NO) and did (YES) attend college • Student Knowledge: significant (p<0.01) • Correctness: significant (p<0.01) • Boredom: significant (p<0.01) • Engaged Concentration: significant (p<0.01) • Confusion: significant (p<0.01)

Individual Feature Predictiveness (continued) • Off-Task: not significant (p=0.237) • Gaming: significant (p<0.01) • Carelessness: significant (p<0.01) • Number of First Actions (proxy for attendance): significant (p<0.01)
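
Comparisons like these can be run as a two-sample t-test on each feature's year-long average, split by eventual college attendance. A sketch for one feature, with made-up numbers:

```python
# Two-sample t-test sketch for one feature (boredom), with made-up data.
# The real analysis compared year-long averages for each detector.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
boredom_no_college = rng.normal(0.25, 0.08, 500)   # illustrative values
boredom_college = rng.normal(0.22, 0.08, 700)      # illustrative values

t, p = stats.ttest_ind(boredom_no_college, boredom_college)
print(f"t = {t:.2f}, p = {p:.4g}")
```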

Full Model • A' = 0.686, Kappa = … • χ²(df = 6, N = 3747) = …, p < … (computed for a non-cross-validated model) • R²(Cox & Snell) = 0.098, R²(Nagelkerke) = … • Overall accuracy = 64.6%; Precision = 66.4%; Recall = 78.3%
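
These are all standard metrics that can be computed directly from a model's predictions; a sketch with placeholder labels and probabilities follows, with A' computed as the area under the ROC curve (its usual operationalization).

```python
# Placeholder labels and predicted probabilities; A' is computed as the
# area under the ROC curve.
from sklearn.metrics import (accuracy_score, cohen_kappa_score,
                             precision_score, recall_score, roc_auc_score)

y_true = [1, 0, 1, 1, 0, 1, 0, 1]                      # attended college?
y_prob = [0.8, 0.4, 0.6, 0.7, 0.55, 0.9, 0.2, 0.65]    # model outputs
y_pred = [int(p >= 0.5) for p in y_prob]               # 0.5 threshold

print("A' (AUC): ", roc_auc_score(y_true, y_prob))
print("Kappa:    ", cohen_kappa_score(y_true, y_pred))
print("Accuracy: ", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall:   ", recall_score(y_true, y_pred))
```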

Final Model (Logistic Regression) • CollegeEnrollment = w₁(StudentKnowledge) + w₂(Correctness) + w₃(NumFirstActions) – w₄(Carelessness) + w₅(Confusion) + w₆(Boredom), where the numeric weights wᵢ are omitted here

Flipped Signs • Some coefficients take counterintuitive signs once the other predictors are taken into account (see Implications below) • CollegeEnrollment = w₁(StudentKnowledge) + w₂(Correctness) + w₃(NumFirstActions) – w₄(Carelessness) + w₅(Confusion) + w₆(Boredom)
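
For readers unfamiliar with logistic regression, the weighted sum above is passed through the logistic (sigmoid) function to yield a probability of enrollment. The weights and intercept in this sketch are placeholders chosen only to match the general shape of the slide's equation, not the study's actual coefficients.

```python
# How a fitted logistic regression maps the weighted sum to a probability
# of college enrollment. All weights (and the intercept) are placeholders,
# NOT the study's actual coefficients.
import math

def p_enroll(knowledge, correctness, n_first_actions,
             carelessness, confusion, boredom):
    z = (-0.5                      # placeholder intercept
         + 1.2 * knowledge
         - 0.4 * correctness
         + 0.001 * n_first_actions
         - 2.0 * carelessness
         + 0.3 * confusion
         + 0.4 * boredom)
    return 1 / (1 + math.exp(-z))  # logistic (sigmoid) link

print(p_enroll(knowledge=0.7, correctness=0.6, n_first_actions=900,
               carelessness=0.1, confusion=0.15, boredom=0.2))
```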

Implications • Carelessness is bad… once we take knowledge into account • Boredom is not a major problem… among knowledgeable students • When unsuccessful bored students are removed, all that may remain are those who become bored because the material may be too easy • Does not mean boredom is a good thing!

Implications • Gaming the System drops out of the model • Probably because gaming substantially hurts learning • But just because Gaming → Dropout is likely mediated by learning doesn't mean gaming doesn't matter! • A 0.34σ effect

Implications • Off-Task Behavior is not such a big deal • How much effort goes into stopping it? • Past meta-analyses find a small significant effect on short-term measures of learning • But not when collaborative learning is occurring?

Implications • In-the-moment interventions provided by software (or suggested by software to teachers) may have unexpectedly large effects if they address boredom, confusion, carelessness, and gaming the system

Interventions • Just getting started • First pilot tried in June 2013 • Got feedback from the teacher • The intervention is being redesigned for a second pilot

Week One Complete!

Week Two • How do we know if a prediction model is any good? • Goodness Metrics • Model Validation