
An Evaluation of the Effectiveness of Digital Tool Use in Undergraduate History Classes

Scott Elliot, SEG Measurement
Gerry Bogatz, MarketingWorks
Cathy Mikulas, SEG Measurement

Paper presented at the E-Learn 2016 World Conference, Washington, D.C., November 14, 2016

Overview of Presentation
- Background and overview of the effectiveness study
- Results of the effectiveness study
- Instructor and student feedback

Use of Digital Tools in Higher Education
- Used in online, blended, and face-to-face courses
- Support teaching and learning
- Accessed on multiple devices, 24/7
- General findings support that digital tools can increase engagement, increase learning, and provide new models for education
- Allow for reduced costs and more efficient use of instructional time
- Allow for increased access to educational resources

Effectiveness Study

About the Treatment: MindTap®
- Online learning solution developed by Cengage Learning:
  - Course management system
  - Digital textbook
  - Interactive chapter assignments/problem sets
  - Multimedia
  - Assessments
  - Gradebook
  - Performance monitoring
- Subject for this study: History
- Instructors used MindTap regularly for at least 10 weeks of a typical 15-week semester

Research Questions
1. Do students in history classes that use MindTap earn higher grades than students in comparable history classes that do not use MindTap?
2. Is MindTap differentially effective among students of different genders and ethnicities?
3. Is MindTap differentially effective between face-to-face and online courses?

Study Design Overview
- Quasi-experimental (depicted on next slide)
- Both treatment and control groups had access to online tools
- Students were all in history courses that used the same textbook(s)
- Analyzed comparability of pretest performance
- Monitored fidelity of MindTap use throughout the semester
- End-of-course grades served as the outcome
- Collected evaluations of MindTap from instructors and students

Study Design by Study Group
- Treatment Group: critical thinking pretest -> use of textbook and MindTap -> student final course grades
- Control Group: critical thinking pretest -> use of textbook and other online tools (no MindTap) -> student final course grades

Profile of Participating Instructors
- 13 instructors teaching 21 classes in 12 institutions
- Equal representation of 2- and 4-year institutions
- 50% suburban, 33% rural, 17% urban locations
- 62% male instructors, 38% female
- 46% more than 10 years of teaching experience, 46% between 4 and 10 years, remainder less than 4 years
- 38% online courses, 52% face-to-face

Student Demographics

Variable                     Treatment   Control
Gender
  Female                     46%         51%
  Male                       48%         -
  Not Reported               6%          3%
Ethnicity
  Caucasian                  57%         60%
  African American           21%         17%
  Hispanic                   8%          10%
  Other Races or Mixed Race  7%          9%
  Not Reported               -           4%

Use of Digital Tools Beyond MindTap

Digital Tool (Number of Instructors)   Treatment   Control
Lecture capture system                 2           1
Student response system                -           -
Social networking                      3           -
Google apps for education              -           -
Video activities or assessment         5           -
Web collaboration tools                -           -
YouTube                                -           -
Khan Academy                           -           -
Other                                  -           -

Analysis
- Comparison of starting ability
- Analysis of covariance (ANCOVA): compared treatment and control groups' grades while controlling for initial ability (a code sketch follows below)
- Evaluated interaction effects between study group and gender
- Evaluated interaction effects between study group and ethnicity
- Evaluated interaction effects between study group and course type
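As a rough illustration of the analysis described above, the sketch below runs an ANCOVA with statsmodels on a synthetic dataset; the column names (group, pretest, grade, course_type) are hypothetical stand-ins, not the study's actual variables or code.

```python
# Minimal ANCOVA sketch (not the authors' code): grades compared across
# study groups while controlling for pretest ability, plus one example
# interaction test. All data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "group": rng.choice(["treatment", "control"], n),
    "course_type": rng.choice(["face_to_face", "online"], n),
    "pretest": rng.normal(14.5, 4.8, n),  # critical-thinking pretest scale
})
# Synthetic outcome: grades track pretest ability plus a treatment bump.
df["grade"] = (70 + 0.8 * df["pretest"]
               + 3.0 * (df["group"] == "treatment")
               + rng.normal(0, 5, n))

# ANCOVA: group effect on grades, controlling for initial ability.
ancova = smf.ols("grade ~ C(group) + pretest", data=df).fit()
print(sm.stats.anova_lm(ancova, typ=2))  # F and p for the group effect

# Interaction test, e.g., study group x course type.
interaction = smf.ols("grade ~ C(group) * C(course_type) + pretest",
                      data=df).fit()
print(sm.stats.anova_lm(interaction, typ=2))
```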

Initial Comparability of Study Groups

Study Group       Critical Thinking Test Mean   Standard Deviation
Treatment Group   14.61                         4.63
Control Group     14.42                         5.06

There were no statistically significant differences between the means (F = 0.126, p = .723), sketched in code below.
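For illustration, a comparability check like the one above can be run as a one-way ANOVA on the two groups' pretest scores; the sketch below uses scipy with synthetic arrays, not the study data.

```python
# Sketch of the pretest comparability check as a one-way ANOVA
# (equivalently, an F-test with two groups). Arrays are synthetic
# stand-ins for the two groups' critical-thinking pretest scores.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
treatment_pretest = rng.normal(14.61, 4.63, 250)
control_pretest = rng.normal(14.42, 5.06, 250)

f_stat, p_value = f_oneway(treatment_pretest, control_pretest)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")  # large p => comparable groups
```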

Results
Students using MindTap outperformed students in the comparison classes (F = 13.302, p < .001). Effect size = .31.

Results by Subgroup
- Gender: Equally effective for males and females (F = 2.4, p = .122)
- Ethnicity: While ethnicity alone was not a significant factor, there was an interaction effect between study group and ethnicity (F = 5.654, p < .001). There were too few students to fully compare study-group differences across all ethnicities; Caucasian students who used the digital tool earned higher grades than Caucasian students who did not.
- Course type: Equally effective for face-to-face and online courses (F = .132, p = .717)

Effect Size
A common metric for comparing the amount of growth across studies, even when different measures are used. The effect size for the use of MindTap was .31 (similar to moving from the 50th to the 62nd percentile; see the sketch below for how an effect size maps to a percentile shift).
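Assuming the reported effect size is a standardized mean difference (Cohen's d) and scores are roughly normal, the percentile interpretation can be checked directly:

```python
# How a d = .31 effect size translates to percentile gain, assuming a
# standardized mean difference and normal score distributions
# (an interpretation sketch, not the authors' computation).
from scipy.stats import norm

d = 0.31  # reported effect size
percentile = norm.cdf(d)  # where the average treated student falls
                          # in the control-group distribution
print(f"{percentile:.0%}")  # ~62%, i.e., moving from the 50th to 62nd percentile
```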

Instructor and Student Feedback

Instructors’ Evaluation of the Effectiveness of MindTap
- History instructors reported that MindTap improves both teaching and learning
- All rated their overall experience an A (excellent) or B (very good)
- In addition, all instructors gave the program a rating of A or B for:
  - Meeting the needs of a diverse group of students
  - Making teaching easier and better
  - Impacting student engagement
  - Contributing to student learning

Instructors’ Evaluation of the Effectiveness of MindTap: Influencing Instruction
History instructors reported that MindTap greatly or somewhat influenced the way they taught by:
- Providing useful tools and simulations that encourage deeper analysis of the issues
- Helping students analyze information more completely through the variety of assignments MindTap provides
- Making instructors’ work easier by providing well-thought-out simulations that help introduce and analyze issues
- Giving instructors more confidence in asking students challenging questions
- Giving students a foundation on which to build critical thinking
- Giving students practice restructuring their knowledge, reading primary sources, considering cause and effect, and seeing the patterns of history

Instructors’ Evaluation of the Effectiveness of MindTap: Improving Students’ Abilities
History instructors reported that MindTap greatly or somewhat contributed to the improvement of students’ ability to:
- Analyze information and arguments
- Evaluate evidence
- Draw conclusions and make deductions
- Separate facts from opinions, values, beliefs, and attitudes
- Interpret data
- Make inferences
- Understand causality

Sampling of Instructors’ Comments
- "MindTap is a remarkable product. As structured, it engages the students in meaningful reading, quizzes, and real learning."
- "With MindTap my students are learning far more. It will be a central and important part of my courses from now on."
- "I have seen a marked improvement in the quality of student responses to analytical questions, demonstrating an advancement of their skills as they increasingly engage with the MindTap materials."

Students’ Evaluation of MindTap
MindTap helps history students learn the following aspects of critical thinking (N = 263):

Aspect of critical thinking                                    % saying a great deal or somewhat
Critical thinking skills overall                               71
Analyzing information and arguments                            62
Interpreting data                                              62
Separating facts from opinions, values, beliefs, & attitudes   61
Evaluating evidence                                            61
Drawing conclusions and making deductions                      61
Making inferences                                              60
Writing up information and data                                59
Understanding causality                                        58
Understanding experimentation and the scientific process       58
Analyzing the credibility of sources                           56

Summary
- Students who used MindTap in history classes earned higher grades than students who used the same textbook without MindTap (effect size = .31)
- Faculty found MindTap to be effective and useful
- Students indicated that MindTap helped them learn important skills required for success in their history classes

Questions? Please contact Scott Elliot at selliot@segmeasurement.com