QUANTITATIVE EVIDENCE FOR THE USE OF SIMULATION AND RANDOMIZATION IN THE INTRODUCTORY STATISTICS COURSE Nathan Tintle Associate Professor of Statistics Dordt College, Sioux Center, Iowa

Broader Context
- Randomization and simulation: what is it? (a minimal sketch follows below)
  - Simulation of null distributions
  - Bootstrapping
  - Permutation tests
- An incomplete recent history in the algebra-based introductory statistics course (Stat 101; AP Statistics equivalent)
  - Technological changes: implications for practice and teaching
  - Cobb (2005, 2007): renewed interest and a catalyst
  - Deeper understanding of the logic and scope of inference
  - Modules and full-length texts
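A minimal, self-contained sketch of the resampling ideas named above: a permutation (randomization) test that simulates a null distribution for a difference in means, and a percentile bootstrap interval. This is illustrative only, not code from the talk or the ISI materials; the data values, group names, and number of repetitions are hypothetical.

```python
import random

random.seed(1)

# Hypothetical scores for two groups (e.g., treatment vs. control).
treatment = [12.1, 9.8, 11.4, 13.0, 10.7, 12.5]
control = [9.2, 10.1, 8.7, 11.0, 9.5, 8.9]

def mean(xs):
    return sum(xs) / len(xs)

observed_diff = mean(treatment) - mean(control)

# Permutation test: re-randomize the group labels many times to simulate the
# null distribution of the difference in means, then compute a two-sided
# p-value as the proportion of simulated differences at least as extreme.
pooled = treatment + control
n_treat = len(treatment)
reps = 10_000
null_diffs = []
for _ in range(reps):
    random.shuffle(pooled)
    null_diffs.append(mean(pooled[:n_treat]) - mean(pooled[n_treat:]))
p_value = sum(abs(d) >= abs(observed_diff) for d in null_diffs) / reps

# Bootstrap: resample each group with replacement to approximate the sampling
# distribution of the difference in means; a simple percentile interval follows.
boot_diffs = []
for _ in range(reps):
    t = [random.choice(treatment) for _ in treatment]
    c = [random.choice(control) for _ in control]
    boot_diffs.append(mean(t) - mean(c))
boot_diffs.sort()
ci_95 = (boot_diffs[int(0.025 * reps)], boot_diffs[int(0.975 * reps)])

print(f"observed difference:  {observed_diff:.2f}")
print(f"permutation p-value:  {p_value:.4f}")
print(f"95% bootstrap CI:     ({ci_95[0]:.2f}, {ci_95[1]:.2f})")
```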

Does it work?
- Anecdotal evidence: excitement, momentum, discussions, panels
- Quantitative evidence: deeper understanding of the logic and scope of inference?
  - Holcomb et al., ICOTS-8 (Slovenia, 2010)
  - Tintle et al., Journal of Statistics Education (2011)
  - Tintle et al., Statistics Education Research Journal (2012)
  - Tintle, Joint Statistical Meetings panelist, assessment results (2013)

Holcomb et al. (2010a, 2010b)
- Methods
  - Modules introduced into the course
- Key findings
  - Not much improvement
- Limitations
  - Not a full curriculum implementation
  - Only one institution

Tintle et al  Methods  One institution before and after switch  Full course redesign  Similar instructors before and after switch  Standardized assessment (CAOS; delMas et al. 2007)  Key findings  Overall improved post-course performance  Areas with largest improvement in design and inference  ‘No harm’ in other areas  Limitations  Conflation of design and pedagogy with ability to pinpoint reasons for improvement

Tintle et al  Methods  One institution before and after switch  Full course redesign  Similar instructors before and after switch  Standardized assessment (CAOS; delMas et al. 2007)  Sub-sample measured 4 months post-course  Key findings  Overall improved retention  Areas with largest improvement in retention were in in design and inference  ‘No harm’ in other areas  Limitations  Conflation of design and pedagogy with ability to pinpoint reasons for improved retention

Tintle (2013)
- Methods
  - New assessment instrument (modified CAOS, ~30 questions)
  - Multiple participating institutions
- Key findings
  - Overall, similar results at other institutions
- Limitations
  - Not always a 'before the change' comparison at each institution
  - Different institutions, pedagogies, and uses of the materials; a large number of potential confounding variables

New results
- Another before-and-after story (Dordt College)
- Transferability: 2013/2014 results
- What about low performers?

Dordt's before-and-after story
- Methods
  - Traditional curriculum (Moore 2010): 94 students, spring 2011
  - New curriculum (ISI, 2011 version): 155 students, fall 2011 and spring 2012
  - All students completed the 40-question CAOS test during the first week of the semester and again during the last week. Students received course credit for completing the assessment, but not for their performance, and the test was administered electronically outside of class.
  - Two instructors taught the course each semester; one instructor was the same every semester, and the other differed between spring 2011 and fall 2011/spring 2012.

Dordt's before-and-after story
- Overall performance: very similar to the Tintle et al. (2011) results at another institution
- Approximately twice the gains using the new curriculum compared to the traditional curriculum (11.6% vs. 5.6%; p < 0.001)

Dordt's before-and-after story

| Subscale                   | Cohort     | Pretest | Posttest | Diff. | Paired t-test p-value | Cohort p-value | 95% CI for cohort |
|----------------------------|------------|---------|----------|-------|-----------------------|----------------|-------------------|
| Data Collection and Design | Random.    | 34.8%   | 53.1%    | 18.2% |                       | <0.001         | (9.2%, 23.9%)     |
|                            | Tradition. | 34.9%   | 36.5%    | 1.6%  |                       |                |                   |
| Descript. Statistics       | Random.    | 55.1%   | 61.1%    | 6.0%  |                       |                | (-2.1%, -18.1%)   |
|                            | Tradition. | 53.5%   | 69.6%    | 16.1% |                       |                |                   |
| Graphical Representations  | Random.    | 55.8%   | 64.4%    | 8.6%  |                       |                | (0.6%, 11.4%)     |
|                            | Tradition. | 58.5%   | 60.9%    | 2.4%  |                       |                |                   |
| Boxplots                   | Random.    | 35.0%   | 41.6%    | 6.6%  |                       |                | (-2.3%, 12.3%)    |
|                            | Tradition. | 32.4%   | 34.1%    | 1.6%  |                       |                |                   |
| Bivariate Data             | Random.    | 58.1%   | 60.7%    | 2.6%  |                       |                | (-13.3%, 1.6%)    |
|                            | Tradition. | 56.4%   | 64.8%    | 8.4%  |                       |                |                   |

Dordt's before-and-after story: averages by topic

| Subscale      | Cohort     | Pretest | Posttest | Diff. | Paired t-test p-value | Cohort p-value | 95% CI for cohort |
|---------------|------------|---------|----------|-------|-----------------------|----------------|-------------------|
| Prob.         | Random.    | 31.9%   | 56.5%    | 24.5% |                       | <0.001         | (10.8%, 32.7%)    |
|               | Tradition. | 32.4%   | 35.2%    | 2.7%  |                       |                |                   |
| Samp. Var.    | Random.    | 36.7%   | 39.4%    | 2.7%  |                       |                | (-9.4%, 5.2%)     |
|               | Tradition. | 38.7%   | 43.5%    | 4.8%  |                       |                |                   |
| CIs           | Random.    | 37.9%   | 51.8%    | 13.9% |                       |                | (1.1%, 16.7%)     |
|               | Tradition. | 42.9%   | 47.8%    | 4.9%  |                       |                |                   |
| Tests of Sig. | Random.    | 46.1%   | 70.0%    | 23.9% |                       | <0.001         | (6.6%, 19.9%)     |
|               | Tradition. | 50.0%   | 60.6%    | 10.6% |                       |                |                   |
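One plausible reading of the analyses behind the table columns above ("Paired t-test p-value", "Cohort p-value", "95% CI for cohort") is a paired t-test on each cohort's pre/post scores plus a two-sample comparison of per-student gains between cohorts. The sketch below implements that reading; the slides do not spell out the exact model, and the per-student scores here are simulated placeholders, so this is an assumption-laden illustration rather than the authors' analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-student pre/post scores (percent correct) for two cohorts;
# the real CAOS data from the study are not reproduced here.
pre_rand = rng.normal(35, 10, 155)
post_rand = pre_rand + rng.normal(18, 12, 155)
pre_trad = rng.normal(35, 10, 94)
post_trad = pre_trad + rng.normal(2, 12, 94)

# "Paired t-test p-value": did each cohort improve from pretest to posttest?
p_paired_rand = stats.ttest_rel(post_rand, pre_rand).pvalue
p_paired_trad = stats.ttest_rel(post_trad, pre_trad).pvalue

# "Cohort p-value" and "95% CI for cohort": compare per-student gains between
# cohorts with a Welch two-sample t-test and a Welch-style confidence interval.
gain_rand = post_rand - pre_rand
gain_trad = post_trad - pre_trad
welch = stats.ttest_ind(gain_rand, gain_trad, equal_var=False)

diff = gain_rand.mean() - gain_trad.mean()
var_r = gain_rand.var(ddof=1) / len(gain_rand)
var_t = gain_trad.var(ddof=1) / len(gain_trad)
se = np.sqrt(var_r + var_t)
# Welch-Satterthwaite degrees of freedom for the interval.
df = (var_r + var_t) ** 2 / (var_r ** 2 / (len(gain_rand) - 1) +
                             var_t ** 2 / (len(gain_trad) - 1))
t_crit = stats.t.ppf(0.975, df)
ci = (diff - t_crit * se, diff + t_crit * se)

print(f"paired p, randomization cohort: {p_paired_rand:.3g}")
print(f"paired p, traditional cohort:   {p_paired_trad:.3g}")
print(f"cohort comparison p-value:      {welch.pvalue:.3g}")
print(f"95% CI for difference in gains: ({ci[0]:.1f}, {ci[1]:.1f}) percentage points")
```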

Transferability
- Fall 2013 and spring 2014
- 22 different instructor-semesters
- 17 different instructors
- 12 different institutions
- N = 725; pre- and post-test on the 30-question ISI assessment (adapted from CAOS)
- Many different instructional styles (traditional classroom, active-learning pedagogy, computer lab, flipped classroom)
- Many different institutions (high school, community college, large university, mid-sized university, small liberal arts college)

Transferability: overall
- Similar findings to the authors' institutions; significantly better overall post-course performance

Transferability: by subscale

| Subscale                   | Pretest | Posttest | Diff. | Paired t-test p-value |
|----------------------------|---------|----------|-------|-----------------------|
| Overall                    | 48.7%   | 57.8%    | 9.1%  | <0.001                |
| Data Collection and Design | 64.7%   | 67.2%    | 2.4%  | 0.03                  |
| Descript. Statistics       | 36.8%   | 44.5%    | 7.7%  | <0.001                |
| Graphical Representations  | 50.9%   | 59.0%    | 8.1%  | <0.001                |
| Probability                | 35.8%   | 47.2%    | 11.4% | <0.001                |
| Sampling Variability       | 20.9%   | 24.8%    | 4.0%  | 0.001                 |
| CIs                        | 52.7%   | 64.2%    | 11.5% | <0.001                |
| Tests of Sig.              | 58.7%   | 70.5%    | 11.8% | <0.001                |

Low performers: overall
- Not leaving weak students behind; results are similar to the traditional curriculum

Discussion  What we know  Anecdotal evidence growing; more and more people jumping on the bandwagon; sustained discussion, development of materials over the last decade  The ISI version of the curriculum (early, middle and current versions) have demonstrated  Improved learning gains in logic and scope of inference compared to traditional curriculum at same institutions  These results appear to translate reasonably well to other institutions---even those without direct comparison data  Improved retention of these same key areas  ‘Do no harm’ in descriptive statistics and other areas  Attitudes; conceptual/attitudes (Talk this afternoon; 1F1; Swanson)

Discussion  What we don’t know  Pedagogy? Content? Spiraling?  Conflated!  What you should ‘take’ and what you can ‘leave’; student learning trajectories  Key instructor/institutional requirements for success  How the approach can be improved even further for greater success

Our plans…
- Assessment initiative
  - Do you want to participate?
  - Pre- and post-course concepts and attitudes; common exam questions
  - 'Non-users' are especially needed!
  - Goal: what works, what doesn't; comparisons by institution, instructor, style, etc.; individualized instructor reports so you can learn about your own students' outcomes
- Dissemination of materials (preliminary edition; other talks); continued refinement of materials; training on implementing randomization/simulation (workshop Saturday; JSM; more coming)
- Continued conversation
  - Online community, fall 2014

Other talks (among others)
- Swanson and VanderStoep
  - Attitudes; this afternoon, 1:45 PM; 1F1
- Chance and McGaughey
  - More conceptual results on specific areas
  - 6B1 (Thursday)
- Roy et al.
  - Overview of introducing the p-value in week 1
  - 4A2 (Tuesday)

…but more is needed
- Randomized experiments with targeted interventions to assess:
  - particular student learning outcomes
  - effective pedagogical strategies
  - and to develop a clearer understanding of student learning trajectories

Concluding analogy
- Goal: give students a 360-degree view of statistical reasoning; a comprehensive understanding of description and inference; what statistics can and can't tell us
- Are we there yet?

Option #1: Made it!
- We've blazed a trail to the top of the mountain; randomization/simulation gives students the 360-degree view we want

Option #2: False summit
- We thought we were almost to the top, but we're not. We're on the right route and climbing the right mountain, but not there yet. More work to do.

Option #3: Wrong mountain
- The only way to get higher is to go down and climb a different mountain (randomization → Bayesian? EDA?)

Option #4: Wrong continent? Flagstaff ICOTS 10?

Even if we have made it…
- We're only halfway (we still have to get down!)
- Once we're down, we've got to figure out how to build a 4-lane highway to the top so we can bring the rest of the statistics education community with us

Acknowledgments  Acknowledgments: ISI Team, other curriculum developers  Funding: NSF (DUE and DUE ), Wiley, other funding agencies (HHMI; Teagle Foundation, etc.)  Slides available at  (main textbook website)

References  Chance and McGaughey (2014). Impact of a simulation/randomization-based curriculum on student understanding of p-values and confidence intervals. To be presented at ICOTS-9.  Cobb, G. (2007). The Introductory Statistics Course: A Ptolemaic Curriculum? Technology Innovations in Statistics Education, 1(1),  delMas, R., Garfield, J., Ooms, A., and Chance, B., (2007). Assessing Students’ Conceptual Understanding after a First Course in Statistics, Statistics Education Research Journal, 6(2),  Holcomb, J., Chance, B. Rossman, A., & Cobb, G. (2010a). Assessing Student Learning About Statistical Inference, Proceedings of the 8 th International Conference on Teaching Statistics.  Holcomb, J., Chance, B. Rossman, A., Tietjen, E., & Cobb, G. (2010b), Introducing Concepts of Statistical Inference via Randomization Tests, Proceedings of the 8 th International Conference on Teaching Statistics.  Lock, R. H., Lock, P. F., Lock Morgan, K., Lock, E. F., & Lock, D. F. (2013). Statistics: Unlocking the Power of Data. Hoboken, NJ: John Wiley and Sons.  Roy, S., Rossman, A., & Chance, B. (2014). Using Simulation/Randomization to Introduce P-Value in Week 1. To be presented at ICOTS-9.  Schau, C. (2003). Survey of Attitudes Toward Statistics (SATS-36).  Swanson, T., VanderStoep, J., & Tintle, N. (2014). Student Attitudes Toward Statistics from a Randomization-Based Curriculum. To be presented at ICOTS-9.  Tintle, N., Chance, B., Cobb, G., Rossman, A., Roy, S., Swanson, T., & VanderStoep, J (2016). Introduction to Statistical Investigations. Hoboken, NJ: John Wiley and Sons.  Tintle, N., VanderStoep, J., Holmes, V-L., Quisenberry, B., & Swanson, T. (2011). Development and assessment of a preliminary randomization-based introductory statistics curriculum. Journal of Statistics Education, 19(1).  Tintle, N., Topliff, K., VanderStoep, J., Holmes, V-L., & Swanson, T. (2012). Retention of Statistical Concepts in a Preliminary Randomization-Based Introductory Statistics Curriculum. Statistics Education Research Journal, 11(1).