Authentic Discovery Projects in Statistics GAMTE Annual Conference October 14, 2009 Dianna Spence Robb Sinn NGCSU Math/CS Dept, Dahlonega, GA.

Slides:



Advertisements
Similar presentations
PD Plan Agenda August 26, 2008 PBTE Indicators Track
Advertisements

Bridging Research, Information and Culture An Initiative of the Research and Planning Group for California Community Colleges Your Name Your Institution.
Increasing your confidence that you really found what you think you found. Reliability and Validity.
Wynne HARLEN Susana BORDA CARULLA Fibonacci European Training Session 5, March 21 st to 23 rd, 2012.
Algebra I End-of-Course (Criterion-referenced) (published by ETS)
STUDENT LEARNING OUTCOMES ASSESSMENT. Cycle of Assessment Course Goals/ Intended Outcomes Means Of Assessment And Criteria For Success Summary of Data.
Simple Survey Resources: Templates, Tabulation & Impact Jeff Buckley, Jenna Daniel, & Casey Mull.
Developing a Statistics Teaching and Beliefs Survey Jiyoon Park Audbjorg Bjornsdottir Department of Educational Psychology The National Statistics Teaching.
Using Rubrics for Evaluating Student Learning. Purpose To review the development of rubrics for the purpose of assessment To share an example of how a.
Learning Objectives, Performance Tasks and Rubrics: Demonstrating Understanding and Defining What Good Is Brenda Lyseng Minnesota State Colleges.
By : Zohreh Saadati Background and Purpose.
Faculty Learning Communities’ Impacts: Results of a National Survey Faculty Learning Communities’ Impacts: Results of a National Survey Andrea L. Beach,
Using Rubrics for Evaluating Student Learning Office of Assessment and Accreditation Indiana State University.
Problem Identification
Home Economics Teachers’ Readiness for Teaching STEM
Project Design and Data Collection Methods: A quick and dirty introduction to classroom research Margaret Waterman September 21, 2005 SoTL Fellows
Quantitative Research
Robert delMas (Univ. of Minnesota, USA) Ann Ooms (Kingston College, UK) Joan Garfield (Univ. of Minnesota, USA) Beth Chance (Cal Poly State Univ., USA)
Techniques for Improving Student Learning Outcomes Lynn M. Forsythe Ida M. Jones Deborah J. Kemp Craig School of Business California State University,
1 Classroom-Based Research: How to Be a Researcher in Your Classroom Basic Skills Initiative Teaching and Learning Workshop October 2009 Darla M. Cooper.
PhD Research Seminar Series: Writing the Method Section Dr. K. A. Korb University of Jos.
Training Teachers to Use Authentic Discovery Learning Projects in Statistics AMTE January 30, 2010 Robb Sinn Dianna Spence Department of Mathematics &
The Effect of Quality Matters™ on Faculty’s Online Self-efficacy DLA Conference 2010 Jim Wright, Ed.S. June 9, 2010.
Journal of Statistics Education Webinar Series February 18, 2014 This work supported by NSF grants DUE and DUE Brad Bailey Dianna Spence.
AFT 7/12/04 Marywood University Using Data for Decision Support and Planning.
Chemistry B.S. Degree Program Assessment Plan Dr. Glenn Cunningham Professor and Chair University of Central Florida April 21, 2004.
Using Authentic Discovery Projects to Improve Student Outcomes in Statistics Joint Mathematics Meetings January 16, 2010 Dianna Spence Brad Bailey Robb.
LEARNING PRIORITY OF TECHNOLOGY PROCESS SKILLS AT ELEMENTARY LEVEL Hung-Jen Yang & Miao-Kuei Ho DEPARTMENT OF INDUSTRIAL TECHNOLOGY EDUCATION THE NATIONAL.
EVALUATION REPORT Derek R. Lane, Ph.D. Department of Communication University of Kentucky.
Measuring Changes in Teachers’ Mathematics Content Knowledge Dr. Amy Germuth Compass Consulting Group, LLC.
Measuring Changes in Teachers’ Science Content Knowledge Dr. Anne D’Agostino Compass Consulting Group, LLC.
Service-Learning and Grant Writing Workshop Tennessee Technological University February 23, 2010 Presented by: Shelley Brown Department of Sociology and.
CAUSE Teaching and Learning Webinar December 14, 2010 Dianna Spence and Brad Bailey North Georgia College & State University This work supported by grants.
Student Projects in Statistics GCTM Conference October 14, 2010 Dianna Spence NGCSU Math/CS Dept, Dahlonega, GA.
Development and results of an older adult health communication program using the Theory of Planned Behavior Virginia Brown, DrPH; Lisa McCoy, MS The National.
The Required Library Component: Assessing First-Year Teaching in the Small Academic Library Susan von Daum Tholl, PhD, Director Diane Zydlewski, Head of.
University of Arkansas Faculty Senate Task Force on Grades Preliminary Report April 19, 2005.
Dissertation Theme “The incidence of using WebQuests on the teaching-learning process of English Foreign Language (EFL) for students attending the seventh.
Orientation and Induction of Traditionally and Alternatively Educated New Teachers Jennifer Conkin October, 2012.
EDU 385 Education Assessment in the Classroom
Laying the Groundwork for the New Teacher Professional Growth and Effectiveness System TPGES.
Evaluating a Research Report
Assessment in General Education: A Case Study in Scientific and Quantitative Reasoning B.J. Miller & Donna L. Sundre Center for Assessment and Research.
Authentic Discovery Learning Projects in Statistics NCTM Conference April 23, 2010 Dianna Spence Robb Sinn Department of Mathematics & Computer Science.
Assessment of an Arts-Based Education Program: Strategies and Considerations Noelle C. Griffin Loyola Marymount University and CRESST CRESST Annual Conference.
Biomedical Component 2014 Student & Teacher Summer Institute Results.
Quantitative SOTL Research Methods Krista Trinder, College of Medicine Brad Wuetherick, GMCTE October 28, 2010.
EDU 385 CLASSROOM ASSESSMENT Week 1 Introduction and Syllabus.
+ General Education Assessment Spring 2014 Quantitative Literacy.
California Educational Research Association Annual Meeting Rancho Mirage, CA – December 5, 2008 Hoky Min, Gregory K. W. K. Chung, Rebecca Buschang, Lianna.
Educational Research: Competencies for Analysis and Application, 9 th edition. Gay, Mills, & Airasian © 2009 Pearson Education, Inc. All rights reserved.
Second Language Classroom Research (Nunan, D. 1990) Assoc. Prof. Dr. Sehnaz Sahinkarakas.
Evaluating Impacts of MSP Grants Ellen Bobronnikov Hilary Rhodes January 11, 2010 Common Issues and Recommendations.
Authentic Discovery Projects in Statistics GCTM Conference October 16, 2009 Dianna Spence NGCSU Math/CS Dept, Dahlonega, GA.
REGIONAL EDUCATIONAL LAB ~ APPALACHIA The Effects of Hybrid Secondary School Courses in Algebra 1 on Teaching Practices, Classroom Quality and Adolescent.
A Pilot Study of a Multimedia Instructional Program for Teaching of ESL Grammar with Embedded Tracking.
Challenges of Quantitative Reasoning Assessment Donna L. Sundre Center for Assessment and Research Studies
California Assessment of Student Performance and Progress CAASPP Insert Your School Logo.
Information Seeking Behavior and Information Literacy Among Business Majors Casey Long Business Liaison Librarian University Library Georgia State University,
Evaluation Structure. 2| Evaluation – A Multi-layered Approach All AHEC awardees are conducting pre and post tests as well as focus groups with an external.
Models and Instruments for CDIO Assessment Content Validity – mapping the CDIO syllabus to questionnaire items  The role of specificity: Difficulty, context,
Patsy Kraj Spring 2011 University of West Georgia.
“Excuse Me Sir, Here’s Your Change”
Individualized research consultations in academic libraries: Useful or useless? Let the evidence speak for itself Karine Fournier Lindsey Sikora Health.
Discovery Learning Projects in Introductory Statistics
Update from ECO: Possible Approaches to Measuring Outcomes
Chapter 8 VALIDITY AND RELIABILITY
CLASSROOM ENVIRONMENT AND THE STRATIFICATION OF SENIOR HIGH SCHOOL STUDENT’S MATHEMATICS ABILITY PERCEPTIONS Nelda a. nacion 5th international scholars’
Instructional Plan and Presentation Cindy Douglas Cur/516: Curriculum Theory and Instructional Design November 7, 2016 Professor Gary Weiss.
Presentation transcript:

Authentic Discovery Projects in Statistics GAMTE Annual Conference October 14, 2009 Dianna Spence Robb Sinn NGCSU Math/CS Dept, Dahlonega, GA

Agenda
• Overview of Project Scope and Tasks – Dianna
• Teaching Model & Discovery Materials Developed – Robb
• Instructor Observations During Pilot – Dianna
• Findings (so far) – Robb

NSF Grant Project Overview
Grant Title: “Authentic, Career-Specific Discovery Learning Projects in Introductory Statistics”
Project Goals: Increase students’...
• knowledge & comprehension of statistics
• perceived usefulness of statistics
• self-beliefs about ability to use and understand statistics
Tasks:
• Develop instruments
• Develop research constructs and projects
• Develop materials and train instructors
• Measure effectiveness

Interdisciplinary Team
Disciplines Represented
• Biology/Ecology
• Criminal Justice
• Psychology
• Sociology
• Nursing
• Physical Therapy
• Education
• Business
Tasks of Team Members
• Identify authentic research constructs
• Define instrument/measurement of construct
• Suggest simple statistical research projects

Exploratory Study – Fall 2007
Instrument validation and concept “trial run,” based on 10 sections of Introductory Statistics
4 experimental sections
• Used authentic discovery projects
• n = 113 participants out of 128 students (88% participation rate)
6 control sections
• Did not use authentic discovery projects
• n = 164 participants out of 192 students (85% participation rate)

Exploratory Results: Content Knowledge
Instrument
• 21 multiple-choice items
• KR-20 reliability: 0.63
Results
• control mean: 8.87; experimental mean 9 percentage points higher
• experimental group significantly higher (p < .0001)
• effect size = 0.59
Instrument shortened to 18 items for the full study
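The KR-20 reliability reported above can be computed directly from dichotomous item scores. A minimal sketch in Python (the function name and sample data are illustrative, not from the project):

```python
from statistics import pvariance

def kr20(responses):
    """Kuder-Richardson 20 reliability for dichotomously scored items.

    `responses` is a list of examinees, each a list of 0/1 item scores.
    """
    n = len(responses)                              # number of examinees
    k = len(responses[0])                           # number of items
    totals = [sum(person) for person in responses]  # total score per examinee
    total_var = pvariance(totals)                   # variance of total scores
    # Sum of p*q over items, where p = proportion answering item i correctly
    pq_sum = 0.0
    for i in range(k):
        p = sum(person[i] for person in responses) / n
        pq_sum += p * (1 - p)
    return (k / (k - 1)) * (1 - pq_sum / total_var)
```

Higher values indicate more internally consistent items; perfectly consistent response patterns yield 1.0.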

Exploratory Results: Perceived Usefulness of Statistics
Instrument
• 12-item Likert-style survey; 6-point scale
• 5 items reverse scored
• score is the average (1–6) of all items
• Cronbach alpha = 0.93
Results
• control mean: 4.24; experimental mean: 4.51
• experimental group significantly higher (p < .01)
• effect size =
Instrument unchanged for the full study
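Reverse scoring and the Cronbach alpha reported above can be checked with a short script. A sketch under the slide’s assumptions (6-point scale, score is the item average); the function names and data are illustrative:

```python
from statistics import pvariance

def reverse_score(value, low=1, high=6):
    """Flip a reverse-worded Likert item (e.g., 6 -> 1 on a 1-6 scale)."""
    return low + high - value

def cronbach_alpha(responses):
    """Cronbach's alpha for internal consistency.

    `responses` is a list of respondents, each a list of item scores
    (already reverse-scored where needed).
    """
    k = len(responses[0])
    item_vars = [pvariance([r[i] for r in responses]) for i in range(k)]
    total_var = pvariance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

Items that move in lockstep across respondents give alpha near 1.0, matching the high reliabilities reported for these surveys.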

Exploratory Results: Statistics Self-Beliefs
Beliefs in ability to use and understand statistics
Instrument
• 15-item Likert-style survey; 6-point scale
• score is the average (1–6) of all items
• Cronbach alpha = 0.95
Results
• control mean: 4.70; experimental mean: 4.82
• difference not significant (1-tailed p = .1045)
• effect size = 0.15
Instrument unchanged for the full study

Full Study: Pilot of Developed Materials
3 institutions, 7 instructors
• 1 university (6 undergraduate sections)
• 1 two-year college (2 sections)
• 1 high school (3 sections)
Quasi-Experimental Design
• Spring 2008: begin instructor “control” groups
• Fall 2008 – Fall 2009: “experimental” groups

Teacher Training – Pilot Instructors
Took place before the pilot of materials
• Half-day training
• Follow-up meetings
• Work sessions
• Individual mentoring

Teacher Training Workshop for Secondary Teachers
• 1-day workshop
• Follow-up online assignments
• PLU credit available
Units covered:
1. Designing Quality Variables and Constructs
2. Hands-on Survey Design Session
3. Project Organization, Phases, Assessment, and Rubrics
4. Best Practices and Avoiding Pitfalls (Panel Discussion)
5. Technology Tools and Hands-On Data Analysis
6. Team Presentations (Participants share their work product)
7. Instructor Observations from First Implementations

What we discovered… …about discovery learning

Instructional Model: Discovery Learning in Statistics
Authentic Research Projects
• Experiencing the scientific method
• Discovering statistical methods in context
Project phases:
• Design of research question
• Definition of variables
• Demographic data
• Representative sampling issues
• Data collection
• Appropriate analyses of data
• Interpretation of analyses

Project Format
Linear regression
• Variables: student-selected, often survey-based constructs
• Survey design
• Sampling
• Regression analysis
t-tests
• Variables: may use data previously collected
• Designs: independent samples, dependent samples
• Hypotheses
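Both project formats reduce to calculations students can verify by hand. A minimal sketch of the two analyses (a least-squares line fit and a pooled independent-samples t statistic); function names and data are illustrative, not from the project materials:

```python
from math import sqrt
from statistics import mean, variance

def fit_line(x, y):
    """Least-squares slope and intercept for simple linear regression."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def t_independent(a, b):
    """Pooled t statistic and degrees of freedom for two independent samples."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * variance(a)
                  + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / sqrt(pooled_var * (1 / na + 1 / nb))
    return t, na + nb - 2
```

For a dependent-samples design, the same idea applies to the per-subject differences rather than the two raw samples.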

Materials Developed (Web-Based)
Instructor Guide
• Project overview: timelines, implementation tips, best practices
• Handouts for different project phases
• Evaluation rubrics
• Links to student resources

Materials Developed (Web-Based)
Student Guide
• Overall project guide, with help for each project phase
• Technology guide
• Variables and constructs

Critical Issue: Defining Variables
We advise that our students:
• Try to measure interests, obsessions, or priorities
• Narrow their focus
Some great variable ideas we’ve seen include:
• Number of text messages sent/received during class
• Pairs of shoes owned
• Minutes spent getting ready this morning
• Allowance money per week/month
• The car you drive regularly is ______ years old (better than a “rich parents” variable for correlations)
• Age you were when you had your first real kiss
• Interest in an MRS degree (scale of 1 to 10)


Critical Challenge
Good mathematics teachers without experience doing quantitative research struggled with “numericizing” variables.
Workaround
• Give teachers hands-on experience
• Workshop: “Make it Real”
  - Teachers developed their own survey questions
  - Math majors worked during the workshop: they copied and distributed the surveys to classes near the Math office and entered the data sets into an Excel spreadsheet
  - Teachers analyzed the data and presented findings about their research question by the end of the day

Instructor Experiences and Observations

Importance of Project Structure
• Intermediate goals
• Defined deliverables and project phases
• Student accountability at each phase
• Class time needed for project guidance
• Requirements for the final report: outline, template, prior work samples

Instructor Experiences and Observations
Setting Student Expectations
• Students underestimate the time and effort required
• Students are often unclear on exactly what to do once they have collected the data
• Students should be prepared for results that may be weak, non-significant, etc.
  - promotes a realistic view of statistics
  - avoids too much disappointment

Instructor Experiences and Observations
Student Expectations – Quotes Shared by Instructors

“The main thing that we have learned is that statistics take time. They cannot be conjured up by a few formulas in a few minutes. The time and effort that is put into a small research project such as this is significant. On a large scale, one can quickly understand the kind of commitment of money and time that is required just to obtain reasonable data.”

“While our results did not meet our initial expectations, this is not an utter disappointment. Before this project, statistics looked simple enough for anyone to sit down and do, but now it is evident that it requires more creativity and critical thinking than initially expected. Overall, it was an edifying experience.”

Instructor Experiences and Observations
Resolving Team Issues
• Set guidelines for team communication
• Assign roles for different team members
• Individual accountability for the group result
• Independent project options for some projects
• Assigned teams vs. student-selected teams

Our Results

Instrument Results
Perceived Usefulness
• Pretest:
• Posttest:
• Significance: p =
Self-Efficacy for Statistics
• Pretest:
• Posttest:
• Significance: p = 0.032**
Content Knowledge
• Pretest: 6.78
• Posttest: 7.21
• Significance: p = 0.088*
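Pretest/posttest comparisons like those above are typically analyzed as paired data, since each student contributes both scores. A minimal sketch of the paired t statistic (the data shown are illustrative):

```python
from math import sqrt
from statistics import mean, stdev

def t_paired(pre, post):
    """Paired-samples t statistic and degrees of freedom.

    `pre` and `post` are scores for the same subjects, in the same order.
    """
    diffs = [b - a for a, b in zip(pre, post)]  # per-subject change
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))  # mean change / its std. error
    return t, n - 1
```

The resulting t and degrees of freedom are then compared against the t distribution to obtain a p-value.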

Self-Beliefs
Statistics Self-Efficacy
• Self-efficacy improved significantly overall
• Strong gains:
  - SE for regression techniques (p = )
  - SE for general statistical tasks (p = )
• Little or no improvement:
  - SE for t-test techniques (p = )
Perceived Utility of Statistics
• Students’ perceptions of the usefulness of statistics improved slightly but not significantly (p = )

Performance Gains
Concept Knowledge: 3 Components
• Regression techniques: moderately significant gain (p = )
• t-test usage: moderately significant gain (p = )
• t-test inference: no gain
Instrumentation timeline
• Difficult to “squeeze in” time for the instrumentation and all t-test topics for first-time instructors

Summing Up: Impact on Students
• Students experienced strong gains in their confidence for regression and data analysis tasks
• Students experienced moderate gains in their content knowledge

Impact on Teachers of Mathematics
Training Teachers
• Experiential learning: instructors needed authentic experiences before they felt comfortable guiding student efforts
• Mentoring: providing quality feedback to students during the study design phase was the most critical feature in helping teams produce high-quality projects
Future Challenges
• Discovery learning: student freedom in choosing topics and variables requires a great deal of instructor creativity

For More Information
• Project Website
• Instructional Materials Home
Contact Us
• Robb:
• Dianna: