
Training Teachers to Use Authentic Discovery Learning Projects in Statistics AMTE January 30, 2010 Robb Sinn Dianna Spence Department of Mathematics & Computer Science North Georgia College & State University Dahlonega, Georgia

Agenda Overview of Project Scope and Tasks – Dianna Teaching Model and Sample Workshop Activities – Robb Research Design and Initial Findings – Dianna Directions and Discussion – All of Us

NSF Grant Project Overview
NSF CCLI Phase I Grant: “Authentic, Career-Specific Discovery Learning Projects in Introductory Statistics”
Goals: Increase students’…
- knowledge & comprehension of statistics
- perceived usefulness of statistics
- self-beliefs about ability to use and understand statistics
Tasks:
- Develop instructional materials for projects
- Develop instruments
- Train instructors to use materials
- Measure effectiveness

Student Projects
Linear regression
- Variables student selects, often survey-based constructs
- Survey design
- Sampling
- Regression analysis
t-tests
- Variables student selects
- Designs: independent samples, dependent samples
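The regression portion of these projects comes down to a least-squares fit of one survey variable on another. A minimal pure-Python sketch with invented data (the variable names and all numbers are hypothetical, chosen only to illustrate the analysis a student project would run):

```python
import math

# Hypothetical survey data for a student project:
# hours studied last week vs. quiz score. All numbers are invented.
hours  = [1, 2, 2, 3, 4, 5, 6, 7]
scores = [55, 60, 58, 65, 70, 72, 78, 85]

n = len(hours)
mean_x = sum(hours) / n
mean_y = sum(scores) / n

# Least-squares slope and intercept from the usual sums of squares.
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(hours, scores))
sxx = sum((x - mean_x) ** 2 for x in hours)
syy = sum((y - mean_y) ** 2 for y in scores)

slope = sxy / sxx
intercept = mean_y - slope * mean_x
r = sxy / math.sqrt(sxx * syy)   # correlation coefficient

print(f"score = {intercept:.1f} + {slope:.2f} * hours, r = {r:.3f}")
```

In the workshops this step was done in Excel; the formulas are the same either way.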

Materials Developed (Web-Based)
Instructor Guide
- Project overview: timelines, best practices
- Student handouts
- Evaluation rubrics
Student Guide
- Project Guide: help for each project phase
- Technology Guide
- Variables and Constructs

Teacher Training – Pilot Instructors
Took place before pilot of materials
- Half-day training
- Follow-up meetings
- Work sessions
- Individual mentoring

Teacher Training Workshop for Secondary Teachers
- 1-day workshop
- Follow-up online assignments
- PLU credit available
- “Make It Real”

Make It Real Training for Inservice Teachers of AP Statistics

Workshop Goals
Participants created surveys:
- Developed quality research ideas
- Designed their variables and constructs
- Practiced writing good questions
A team of students worked during the lunch break with the combined surveys:
- Administered surveys in 6 NGCSU classrooms
- Entered and compiled all data
Participants returned after lunch to analyze their research findings

Capstone Experience
Team presentations
- Occurred in late afternoon session
- Presented findings and their own learning outcomes
A final session reviewed the day’s experience and asked participants to critique the training. They reported:
- Creating their own surveys was both fun and empowering
- More than 75% felt sure they could adapt the discovery learning projects to their own classroom needs

Points of Learning
Scientific Method
- Where survey-based research fits
- Students become researchers
Technology – Excel
Statistics
- Regression analyses and analyzing relationships
- Presenting t-test findings within context of discovery learning
Brainstorming sessions on:
- Collaborative groups
- Assignment sheets, timelines, grading rubrics

Learning to Discover?
We did “make it real”
- Hands-on experiences
- Simulated student projects
Discovery is often messy
- We learn by watching discovery happen
- We learn by watching experienced users of discovery learning facilitate
- We will NOT learn from a lecture
So why are you lecturing to me?

Question 1
How much K-12 teaching experience do you have?
A. Less than 2 years
B. Between 2 and 5 years
C. Between 5 and 10 years
D. Between 10 and 20 years
E. More than 20 years

Question 2
How much experience do you have teaching classes for inservice and preservice teachers?
A. Just starting
B. Taught between 3 and 5 courses / sections
C. Taught between 5 and 10 courses / sections
D. Taught more than 10 courses / sections

Activity 1
Consider the following survey-study variable idea:
1. How much did you study last week _____ ?
2. How many hours did you study last night?   0   1 – 2   3 – 4   5 – 6   7 –
What are some flaws?
Design your own “study” variable.
- Write a terse, clear question
- Suggest answer format: closed vs. open; if closed, give categories

Variable Constructs
Our NSF grant supported the development of a student help guide for variables and constructs
Depression example (answer choice format: Rarely / Often / Always):
1. I do not get much pleasure or joy out of life.
2. Sometimes I feel sad, blue, or unhappy.
3. I often find it difficult to get out of bed in the morning.
4. Sometimes I feel like life is not going my way.
5. Sometimes I feel like crying.
6. I am not sure my life will improve in the future.
7. I often feel like my life really doesn’t matter.
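Once such a construct is on a survey, each respondent’s item answers are combined into a single score. A sketch of that scoring step, with an assumed numeric coding of the answer choices (the coding and the sample respondent are invented; the slides do not show the guide’s actual scale values):

```python
# Assumed coding of the answer choices; the guide's actual values may differ.
CHOICES = {"Rarely": 1, "Often": 2, "Always": 3}

def construct_score(answers):
    """Average one respondent's coded answers across all items of a construct."""
    values = [CHOICES[a] for a in answers]
    return sum(values) / len(values)

# One hypothetical respondent answering the 7 depression items above.
respondent = ["Rarely", "Often", "Rarely", "Often", "Always", "Rarely", "Rarely"]
print(round(construct_score(respondent), 2))
```

Averaging (rather than summing) keeps scores comparable when respondents skip items or constructs differ in length.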

Interesting Variable Ideas
- Number of text messages sent during class
- Age when you had your first real kiss
- Number of songs on your iPod / MP3 player
- Minutes spent getting ready each morning
- Number of “years old” for the car you drive most often
  - Appears to measure SES
  - Used in “Rich Kids” study ideas

Activity 2
Develop a t-test study idea
- Brainstorm a variable you think will be different for two groups of students (at your school)
- Be ready to explain why you expect to find differences
We give our students (and the workshop participants) these “rules of brainstorming”:
- Lots of talking must occur
- Throw out 5 or 6 ideas: “popcorn”
- Choose a couple good ideas and revise
You have about 2 minutes

Next Step
Turning students’ research ideas into high-quality surveys
- We have found that teaching others to facilitate this portion of discovery is both the most difficult and the most important task
- We are both adept at operationalizing opinions, activities, obsessions, and preferences
High-quality surveys:
- Multiple drafts
- Tested with a few peers
- Critiqued at least twice by an instructor

Activity 3
For the chosen topic, try operationalizing the variable idea
- Talk with 2 – 3 folks nearby
- Be clear and terse
- Suggest an appropriate answer format
You have about 2 minutes

Research Instrumentation Data Collection Initial Results

Instruments Developed: Content Knowledge
Instrument
- 21 multiple choice items
- KR-20 analysis: score = 0.63
Exploratory Results
- treatment group significantly higher (p < .0001)
- effect size = 0.59
Instrument shortened to 18 items for pilot
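KR-20 is the standard internal-consistency estimate for instruments with dichotomously scored (right/wrong) items. A minimal sketch of the computation with made-up response data (the actual 21-item responses are not in the slides):

```python
def kr20(responses):
    """Kuder-Richardson 20 reliability.

    responses: list of examinee rows, each entry 1 (correct) or 0 (incorrect).
    """
    n = len(responses)      # number of examinees
    k = len(responses[0])   # number of items

    # Sum of p_j * q_j, where p_j is the proportion correct on item j.
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in responses) / n
        pq += p * (1 - p)

    # Population variance of the total scores.
    totals = [sum(row) for row in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n

    return (k / (k - 1)) * (1 - pq / var)

# Made-up data: 5 examinees x 5 items, fairly consistent ordering.
data = [
    [1, 1, 1, 1, 1],
    [1, 1, 1, 1, 0],
    [1, 1, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
print(round(kr20(data), 3))
```

Values near 1 indicate the items hang together; the 0.63 reported above is modest, which is one reason the instrument was revised for the pilot.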

Instruments Developed: Perceived Usefulness of Statistics
Instrument
- 12-item Likert style survey; 6-point scale
- Cronbach alpha = 0.93
Exploratory Results
- treatment group significantly higher (p < .01)
- effect size =
Instrument unchanged for pilot
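Cronbach’s alpha generalizes KR-20 to polytomous (e.g., Likert) items: it compares the summed item variances to the variance of total scores. A sketch with hypothetical 6-point-scale responses:

```python
def cronbach_alpha(responses):
    """responses: list of respondent rows, one Likert value per item."""
    k = len(responses[0])   # number of items

    def pvar(xs):           # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(pvar([row[j] for row in responses]) for j in range(k))
    total_var = pvar([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical responses: 5 respondents x 4 items on a 6-point scale.
data = [
    [6, 5, 6, 5],
    [5, 5, 4, 5],
    [3, 2, 3, 3],
    [2, 2, 1, 2],
    [4, 4, 4, 3],
]
print(round(cronbach_alpha(data), 3))
```

The 0.93 and 0.95 alphas reported for the two attitude instruments indicate strong internal consistency.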

Instruments Developed: Statistics Self-Efficacy
Beliefs in ability to use and understand statistics
Instrument
- 15-item Likert style survey; 6-point scale
- Cronbach alpha = 0.95
Exploratory Results
- gains realized, but not significant (1-tailed p = .1045)
- effect size = 0.15
Instrument unchanged for pilot

Phase I Data Collection: Pilot of Developed Materials
3 institutions
- university (3 instructors)
- 2-year college (1 instructor)
- high school (1 instructor)
Quasi-Experimental Design
- Spring 2008: Begin instructor “control” groups
- Fall 08 – Fall 09: “Experimental” groups

Results: t-Tests
Perceived Usefulness
- Pretest: 50.42
- Posttest:
- Significance: p =
Self-Efficacy for Statistics
- Pretest: 59.64
- Posttest:
- Significance: p = 0.032**
Content Knowledge
- Pretest: 6.78
- Posttest: 7.21
- Significance: p = 0.088*
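The pre/post comparisons above are paired-samples t-tests. A pure-Python sketch of the t statistic and a paired effect size, using invented scores (only the means shown on the slide are real data):

```python
import math

# Invented pre/post scores for one outcome scale, for illustration only.
pre  = [48, 52, 45, 50, 55, 47, 51, 49]
post = [53, 55, 50, 54, 58, 49, 56, 52]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))

t_stat = mean_d / (sd_d / math.sqrt(n))   # paired-samples t statistic, df = n - 1
cohens_d = mean_d / sd_d                  # effect size for the paired design

print(f"t({n - 1}) = {t_stat:.2f}, d = {cohens_d:.2f}")
```

The t statistic would then be compared to a t distribution with n - 1 degrees of freedom to obtain the p-values reported above.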

Subscales: Statistics Self-Efficacy
Strong Gains
- SE for Regression Techniques (p = )
- SE for General Statistical Tasks (p = )
Little or No Improvement
- SE for t-test Techniques (p = )

Subscales: Content Knowledge
Regression Techniques
- Moderate Gains (p = )
t-test Usage
- Moderate Gains (p = )
t-test Inference
- No Gain

Multivariate Analysis: Content Knowledge

Multivariate Analysis: Statistics Self-Efficacy

Multivariate Analysis: Perceived Usefulness of Statistics

Future Directions
NSF CCLI Type II Grant Proposal Submitted January 2010
Goals Include:
- Nationwide pilot
- Vertical integration to early secondary
- Revisions to materials: increased flexibility; accommodate early high school grades
- Qualitative component: more insight into instructor impact
- Advisory panel of statisticians & educators

Up For Discussion
Because results vary by instructor, we’d like to focus on improving…
- Instructor Preparation
- Instructor Assessment
- Curriculum Materials

Up For Discussion Instructor Preparation What could be included in teacher workshops to foster instructor success in implementing these projects?

Up For Discussion Instructor Assessment In what ways can instructors be assessed…  to gauge their propensity for success with these projects?  to identify areas in which their skills could be refined?

Up For Discussion
Curriculum Materials
What should be included in the instructional materials?
- Aspects to consider: content, organization, style
- Features for which stakeholders? Instructors; students

For more information
Project Website
- northgeorgia.edu/~djspence/nsf/
Instructional Materials Home
Contact Us
- Robb:
- Dianna:
- Brad: