Welcome.  Dr. Donald Spicer, USM  The problem ◦ Large enrollment courses are almost universal and are problematic for students, faculty, and institutions.

Slides:



Advertisements
Similar presentations
REDESIGNING GENERAL PSYCHOLOGY at Frostburg State University Dr
Advertisements

FINAL PROPOSALS Due July 15, 2007 Application Narrative Select a redesign model: how you will embody the Five Principles Modularization: greater flexibility.
WHAT IS THE MOST IMPORTANT THING THAT WE HAVE LEARNED ABOUT QUALITY AND COST? The factors that lead to increased student learning and increased student.
R EDESIGN OF G ENERAL P SYCHOLOGY Getting Started on Course Redesign NOVA, 10/21/11 Dr. Megan E. Bradley, Professor of Frostburg State University.
READINESS CRITERIA What does it mean to be ready to do a major course redesign? Is your institution ready? Which courses are readyi.e., are good candidates.
Principles of Chemistry I University of Maryland Eastern Shore Goals of the Maryland Course Redesign Initiative (MCRI, ) – Adopt new methods to.
University of Maryland Baltimore County Department of Psychology Eileen OBrien, Ph.D. Introductory Psychology University of Maryland System Course Redesign.
READINESS CRITERIA What does it mean to be ready to do a major course redesign? Is your institution ready? Which courses are readyi.e., are good candidates.
Raouf Boules, Ph.D. Redesign Colloquium, USF December 9, 2011.
REDESIGNING OTHER DISCIPLINES NCATS Six Models. HUMANITIES British Literature Communication Studies Developmental Reading Developmental Writing English.
Supplemental Instruction in Precalculus
Just what you need to know and do NOW!
Maryland Course Redesign Initiative University of Maryland Eastern Shore Pilot Assessment Report: Principles of Chemistry I Jennifer L. Hearne, Ph.D. May.
Welcome.  Dr. Donald Spicer, USM  The problem ◦ Large enrollment courses are almost universal and are problematic for students, faculty, and institutions.
M = ∑ T JD Math is something positive at JDCC JDCC Quality Enhancement Plan Improving Student Performance in High Risk Math Courses with COURSE REDESIGN.
GETTING STARTED ON COURSE REDESIGN. TODAY’S DISCUSSION  Introduction to Course Redesign  Proven Model for Successful Redesign: Developmental Math at.
FULLY ONLINE MODEL Moves all or most of the learning environment online Provides access to anyone, anywhere, anytime – on demand Allows international groups.
What Worked and What Didn’t MOMATYC Spring Meeting April 2, 2011.
REDESIGNING STUDENT LEARNING ENVIRONMENTS. TODAY’S DISCUSSION  Overview of the Methodology and Findings of the Successful Redesign Projects  Proven.
Peer-Led Team Learning: A Model for Enhancing Student Learning Claire Berardini & Glenn Miller Third Annual Faculty Institute Pace University.
R EDESIGNING G ENERAL P SYCHOLOGY USING U NDERGRADUATE L EARNING A SSISTANTS AS P EER M ENTORS Increasing Student Success in Social Sciences Conference.
State Assessment Meeting Thursday, June 20, 2013 Get Ready for College – Math: A MOOC Designed for Remediation.
Tools of the Trade: Using Technology in Your Course Tools of the Trade: Using Technology in Your Course 1 Ms. Darla Runyon Assistant Director/Curriculum.
Blended Courses: How to have the best of both worlds in higher education By Susan C. Slowey.
Techniques for Improving Student Learning Outcomes Lynn M. Forsythe Ida M. Jones Deborah J. Kemp Craig School of Business California State University,
Design and Development Awards Spring 2015 TLOS Networked Learning Design and Strategies (NLDS)
Temple University Russell Conwell Learning Center Office of Senior Vice Provost for Undergraduate Studies GETTING INVOLVED IN RESEARCH AT TEMPLE UNIVERSITY.
Michael J. Badolato, EdD, Senior Academic Technology Officer Middlesex Community College | Bedford and Lowell MA.
How to Use Data to Improve Student Learning Training Conducted at Campus-Based SLO Summit Spring 2014 How to Use Data to Improve Student Learning.
PRESENTATION TO THE DIVISION OF INSTRUCTIONAL INNOVATION AND ASSESSMENT, THE UNIVERSITY OF TEXAS AT AUSTIN Musings on Course Redesign.
Redesign of Beginning and Intermediate Algebra using ALEKS Lessons Learned Cheryl J. McAllister Laurie W. Overmann Southeast Missouri State University.
Eileen O’Brien, Ph.D. Department of Psychology Tampa, Fl December, 2011.
Tammy Muhs General Education Program Mathematics Coordinator University of Central Florida NCAT Redesign Scholar Course Redesign: A Way To Improve Student.
Tammy Muhs Assistant Chair, MALL Director University of Central Florida NCAT Redesign Scholar Course Redesign: A Way To Increase Student Success.
Evidence Based Teaching Strategy: Applied to Student Orientation
Raouf Boules, Ph.D. January 17, DVMT 101- Developmental Mathematics (4 contact hours) DVMT Intermediate Algebra (3 contact hours)
University of Maryland Baltimore County Department of Psychology Eileen O’Brien, PhD, Linda Baker, PhD, Laura Stapleton, PhD, Adia Garrett, PhD, Karen.
A Supplemental Instruction Model for Precalculus Gabriela Schwab El Paso Community College Helmut Knaust Emil Schwab The University of Texas at El Paso.
Planning and Implementation Workshop 1/19/2012.  Lumina Course Redesign Awardees ◦ 2 Community Colleges ◦ 1 Non-USM Public ◦ 1 Independent Institution.
Academic Transformation/Course Redesign Vincent J. Granito, Ph.D. Chair, Center for Teaching Excellence Assistant Professor, Psychology.
1 Restructure of the Developmental Mathematics Courses.
R EDESIGNING G ENERAL P SYCHOLOGY Redesign Alliance 4 th Annual Conference; March, 2010 Presented by Dr. Megan E. Bradley
Tammy Muhs, Ph.D. Assistant Chair, Mathematics Department University of Central Florida NCAT Redesign Scholar Getting Started with Course Redesign.
Using Technology to Enhance Instruction. Educational Technologies Blackboard, Content- Based Tools Distribution Tools Communicatio n Tools Presentatio.
University of Maryland Baltimore County Department of Psychology Psyc100: Introductory Psychology Eileen O’Brien, Ph.D.
Glen Hatton Introduction to Financial Accounting TURNING THE ACCOUNTING CLASSROOM UPSIDE DOWN Randy Hoffma n Introduction to Managerial Accounting PHASE.
Redesign of Precalculus Mathematics THE UNIVERSITY OF ALABAMA College of Arts and Sciences Course Redesign Workshop October 21, 2006.
MML R2R LSU Precalculus Redesign October 2003 – May 2006 Phoebe Rouse.
Getting Started with Course Redesign Lead Speaker: Carolyn Jarmon, Vice President, National Center for Academic Transformation Participants: Beverlee Drucker,
The Redesigned Elements of Statistics Course University of West Florida March 2008.
Presented at the MCRI Workshop May 30, 2008 Redesigning General Frostburg State University MCRI Workshop May 30, 2008 Primary FSU NCAT Team:
Course Redesign Perspectives Course Redesign Perspectives University – Ron Henry University – Ron Henry College - Jerry Hogle College - Jerry Hogle.
1. Housekeeping Items June 8 th and 9 th put on calendar for 2 nd round of Iowa Core ***Shenandoah participants*** Module 6 training on March 24 th will.
Redesign of Biology 101 at Salisbury University Maryland Course Redesign Workshop 29 May 2009.
Redesign of Biology 101 at Salisbury University Maryland Course Redesign Workshop 30 May 2008.
A Redesign of Intermediate Algebra using the Hawkes Learning System Dr. Latonya Garner March 29, 2010 Mississippi Valley State University Department of.
CCHE690 MEDIA REVIEW Having taught for Yavapai college in Prescott for many years after having taught at Universal Technical Institute in Phoenix. I am.
Redesign of Precalculus Mathematics Joe Benson College of Arts and Sciences The University of Alabama.
Presented at the MCRI Workshop May 2009 by Dr. Megan E. Bradley Full Implementation Results for General Frostburg State University MCRI Workshop.
Redesign of Intermediate Algebra THE UNIVERSITY OF ALABAMA College of Arts and Sciences Department of Mathematics NCAT Redesign Alliance Conference March.
Interactivity and Intervention An Overview of Calculus Redesign at Missouri S&T.
MAP the Way to Success in Math: A Hybridization of Tutoring and SI Support Evin Deschamps Northern Arizona University Student Learning Centers.
University of Maryland Eastern Shore Assessment Report: Principles of Chemistry I Jennifer L. Hearne, Ph.D. May 29, 2009 Baltimore MD.
Redesign of Developmental Mathematics THE UNIVERSITY OF ALABAMA College of Arts and Sciences Department of Mathematics NCAT Redesign Workshop March 17,
REDESIGNING STUDENT LEARNING ENVIRONMENTS: Getting Started.
Who are we???  Four Year Comprehensive College of the SUNY system  604 acre campus located on Long Island about 20 miles east of NYC  Multicultural.
College Credit Plus Welcome Students and Parents to: Information Session.
Student Success in Mathematics: Guiding Principles Shahla Peterman University of Missouri-St. Louis Math Technology Learning Center.
Getting Started on Course Redesign Jennifer L. Hearne, Ph. D
The Heart of Student Success
Presentation transcript:

Welcome

 Dr. Donald Spicer, USM

 The problem ◦ Large enrollment courses are almost universal and are problematic for students, faculty, and institutions  Long recognized as an issue, but no fully acceptable solutions were available  The Pew Charitable Trusts funded a study in 1999 of models/strategies to address this ◦ Premise: improving learning outcomes and saving resources need not be mutually exclusive ◦ Appropriate use of technology may be the needed perturbation of the system

 Initial program was to fund 10 institutions per year for 3 years to undertake model redesigns ◦ All segments of higher education, public and private ◦ A wide range of disciplines  Outcome ◦ Most achieved improved learning outcomes ◦ All saved money in offering the course  Subsequent programs focused on scaling up and/or addressing specific issues (e.g., developmental courses)

 Five general models were used  Several common characteristics of successful redesigns were identified  Several hundred courses have since been redesigned across the country (the five models have since grown to six)

 Regents’ Effectiveness and Efficiency (E&E) Initiative  In 2006, the initiative turned to E&E in the academic sphere  USM became the first university system to adopt Course Redesign as a system  10 projects funded for 3 years across the USM ◦ Biology, Chemistry, English, Mathematics, Psychology ◦ Developmental through 300-level courses ◦ Each solving a specific ‘problem’

 All showed improved learning outcomes  Most showed savings in staffing the course  Some one-time capital investments were required  Generally improved student satisfaction  Generally improved faculty satisfaction  Generally departments could reuse savings in supporting other courses  Plaudits from Chancellor, BOR  Notice in Annapolis that USM is taking initiative

 Needs to be faculty driven with administrative support  These are team efforts; it is not enough to have one really committed faculty member  Projects are most successful if there is team enthusiasm about solving one or more real academic or financial problems

 Chancellor and USM raised money to extend these successes  This is one strategy in a broader discussion of academic transformation

 USM institutions only  One of 3 initiatives funded by the 2009 Carnegie Corporation Academic Leadership Award to Chancellor Kirwan, plus additional fund raising  Goal is to scale up Course Redesign based on lessons learned from the first round  Focus on core curriculum and gateway courses  Up to $20,000 from this USM fund, to be matched by the institution ◦ Many projects will not need this much funding

 Three cohorts with solicitation in the fall (2010, 2011, 2012) ◦ 12 awards last year ◦ 10+ this year ◦ More about RFP process this afternoon  Preliminary award in December  “How To” workshop in January  Planning during Spring Semester  Implementation during summer  Pilot in Fall  Full implementation in Spring (or next time course offered)

 Dr. Nancy Shapiro, USM  Erin Knepler, USM

 Part of the Lumina Foundation's “productivity grants” to increase the percentage of Americans with high-quality degrees and credentials to 60% by 2025  States receiving grants include Arizona, Indiana, Maryland, Montana, Ohio, Tennessee, and Texas  Lumina’s overarching goals through this initiative are to: ◦ Increase and reward completion ◦ Generate and reinvest savings ◦ Educate and train in affordable ways

 “Growing by Degrees” is a statewide partnership between: ◦ Maryland Governor's Office ◦ Maryland Department of Legislative Services ◦ Maryland Association for Community Colleges ◦ Maryland Higher Education Commission ◦ Maryland Independent College and University Association ◦ University System of Maryland  Our state’s agenda through this initiative is to: ◦ Engage the Governor’s P-20 Leadership Council ◦ Promote cross-institutional collaboration in effectiveness and efficiency (E&E) efforts, both academic and administrative ◦ Support faculty and institutional efforts to redesign bottleneck undergraduate courses

 Two rounds of funding: fall 2010 and fall 2011  Nine projects were funded in fall 2010  projects total over two cohorts  $20,000 in direct support with $20,000 institutional match  Funding for USM institutions will focus on developmental courses (pilots at University of Baltimore and Frostburg)  Funding for community colleges, independent institutions, and other public four-year institutions will consider bottleneck courses from any discipline (including developmental courses)  Competitive RFP process will be described in the afternoon: ◦ October 25: Concept Paper ◦ December 2: Full Proposal ◦ Cohort 2 selections made by the end of December 2011

 The Maryland Higher Education Commission (MHEC) was recently awarded a grant from Complete College America (CCA) and a portion of the grant will specifically fund more course redesign projects.  Maryland won a $1 million Completion Innovation Challenge Grant to redesign an additional 21 developmental math courses at community colleges and historically black institutions (HBIs) across the state.  This news means that in addition to Lumina funds, Maryland community colleges and HBIs are also eligible to apply for funds from the CCA grant to redesign developmental math courses.

 Faculty experienced in the methodologies of course redesign who will support other faculty course redesign efforts statewide through: ◦ Providing direct consultation to participating institutions ◦ Developing and delivering workshops for faculty cohorts working toward new redesigns ◦ Participating in the evaluation of proposals for course redesign grants ◦ Assisting in the development and content management of a course redesign Web site hosted by USM

Raouf Boules, Towson University (MCRI Project: Developmental Mathematics & Intermediate Algebra)
Megan Bradley, Frostburg State University (MCRI Project: General Psychology)
Ronald Gutberlet, Salisbury University (MCRI Project: Fundamentals of Biology)
Jennifer Hearne, University of Maryland Eastern Shore (MCRI Project: Principles of Chemistry I)
Eileen O’Brien, University of Maryland Baltimore County (MCRI Project: Introductory Psychology)

 Dr. Ronald Gutberlet, Salisbury University

 Redesign the Entire Course  Encourage Active Learning  Provide Students with Individualized Assistance  Build in Ongoing Assessment and Prompt (Automated) Feedback  Ensure Sufficient Time on Task and Monitor Student Progress

 Benefits ◦ Opportunity to evaluate and focus course goals ◦ Counteract course drift ◦ Capture the best that each instructor has to offer ◦ Reduce duplication of effort ◦ Meaningful and interesting faculty interaction

 Challenges ◦ Compromise and consensus building ◦ Instructor buy-in across all sections ◦ Time needed for team-building ◦ Can’t be forced ◦ Variation in faculty attitudes toward technology ◦ The more support, the better!  Dept Chair and entire department  Dean; Provost’s Office  Instructional Design

How effective is the traditional lecture? – Is anyone listening? – Is everyone listening?

Does information transfer still require a lecture in 2010? What are the course goals and does the traditional lecture serve them? Need not take an all-or-nothing approach – How to select an effective combination of lecture and active learning? Can we do a better job of “priming” students for a lecture? – Is anyone reading the textbook?

 Benefits ◦ Multiple ways for students to engage with the course material ◦ Students decide when to work on the course material, within the framework of scheduled deadlines ◦ Students held accountable for assigned work ◦ Student engagement and satisfaction ◦ Faculty engagement and satisfaction

 “…easy to want to come to class every time and not fall asleep.”  “This was my favorite course this semester.”  “I like the way this class is conducted better than my friend’s bio classes.”

 Challenges ◦ What to do with that uncomfortable feeling that you are failing to complete a critical transaction if you do not say something out loud to a room full of students?  Is anyone listening?  Is everyone listening? ◦ Significant paradigm shift, natural skepticism

Online learning tools Staffed computer labs Peer learning assistants Small group meetings – Discussion sections – Laboratories Supplementary instruction programs Online discussion tools

 Online quizzes  Sophisticated learning software with problem sets, instant feedback, and assistance  Clickers in the classroom

Track the time that students spend online doing coursework (Blackboard, publisher software) Incorporate periodic assessments to encourage completion of online work – Weekly online quizzes in Biol 101 Design an intervention strategy for students who are performing poorly on assessments or who are not spending sufficient time on task – A job for peer learning assistants?
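The monitoring-and-intervention idea above can be sketched as a simple filter over activity data. This is a hypothetical illustration, not part of the presentation: the thresholds, record layout, and names are all assumptions.

```python
# Hypothetical sketch: flag students whose online time-on-task or quiz
# performance falls below thresholds, so peer learning assistants can
# follow up. Thresholds and record format are invented for illustration.

def flag_for_intervention(records, min_minutes=120, min_quiz_avg=70.0):
    """records: list of dicts with 'name', 'minutes_online', 'quiz_avg'."""
    flagged = []
    for r in records:
        reasons = []
        if r["minutes_online"] < min_minutes:
            reasons.append("low time on task")
        if r["quiz_avg"] < min_quiz_avg:
            reasons.append("low quiz average")
        if reasons:
            flagged.append((r["name"], reasons))
    return flagged

roster = [
    {"name": "A. Student", "minutes_online": 45, "quiz_avg": 88.0},
    {"name": "B. Student", "minutes_online": 200, "quiz_avg": 55.0},
    {"name": "C. Student", "minutes_online": 180, "quiz_avg": 91.0},
]
print(flag_for_intervention(roster))
```

In practice the activity data would come from Blackboard or publisher-software exports rather than a hand-built list.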

 Dr. Raouf Boules, Towson University

 Retains the basic course structure, particularly the number of class meetings  Supplements lectures and textbooks with technology-based, out-of-class activities  Encourages greater student engagement with course content  Ensures that students are prepared when they come to class

 Replaces (rather than supplements) in-class time with online, interactive learning activities  Carefully considers why (and how often) classes need to meet face-to-face  Assumes that certain activities can be better accomplished online - individually or in groups  May keep remaining in-class activities the same or may make significant changes  May schedule out-of-class activities in computer labs or totally online so that students can participate anytime, anywhere  Examples: ◦ Towson University: Developmental Mathematics ◦ UMES: Principles of Chemistry

 Moves all classes to a lab setting  Multiple sections combined into one large section  Depends heavily on instructional software including interactive tutorials  Allows students to work as long as they need to master the content  Permits the use of multiple kinds of personnel  Requires a significant commitment of space and equipment  Can teach more than one course in the lab, thus leveraging the initial investment

Traditional (linear algebra)
 38 sections (~40 students each)
 10 tenured faculty, 13 instructors, 15 GTAs
 2 hours per week
 $91 cost per student
Redesign
 1 section (~1,520 students)
 1 instructor, grad & undergrad TAs + 2 tech support staff
 24/7 access in open lab
 $21 cost per student
Replicated at U of Alabama, U of Idaho, LSU, Wayne State, U Missouri-St. Louis, Seton Hall
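As a back-of-the-envelope check on the slide's figures (38 sections of ~40 students is about 1,520 students, matching the single redesigned section), the per-student costs imply a yearly saving of roughly $106,000:

```python
# Rough savings estimate from the per-student costs on the slide.
# All figures are approximate; enrollment is inferred from 38 x ~40.

traditional_cost_per_student = 91
redesign_cost_per_student = 21
students = 38 * 40  # ~1,520 students

annual_savings = (traditional_cost_per_student - redesign_cost_per_student) * students
print(annual_savings)  # → 106400
```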

 Eliminates all in-class meetings and moves all learning experiences online.  Adopts successful design elements of the Supplemental, Replacement and Emporium models, including Web-based multi-media resources, commercial software, automatically evaluated assessments with guided feedback, links to additional resources, and alternative staffing models.  Examples: ◦ RIO SALADO COLLEGE: Pre-Calculus Mathematics ◦ U. OF S. MISSISSIPPI: World Literature

 Takes into account each student’s knowledge/skill level and preferred learning style  Provides an array of learning opportunities including: lectures, outside reading material, labs, small group study sessions, videos, oral and written presentations, homework assignments, individual and group projects, etc.  Started by Ohio State University for Introductory Statistics ◦ Initially, students are made familiar with the various options through examples made available for them to experience ◦ Then the student selects what suits him/her and signs a contract detailing the choices and what needs to be accomplished

 Retains the basic structure of the college-level/core course, particularly the number of class meetings  Replaces the remedial/developmental course with just-in-time “workshops”  Workshops are designed to remove deficiencies in core course competencies (for a particular course)  Students are individually assigned software modules based on results of diagnostic assessments/placement  Workshops consist of computer-based instruction, small-group activities and test reviews to provide additional instruction on key concepts

 Workshops are facilitated by students who have previously excelled in the core course and are trained and supervised by core course faculty  Workshop activities are just-in-time—i.e., designed so that students use the concepts during the next core course class session, which in turn helps them see the value of the workshops and motivates them to do the workshop activities.

 Supplemental – Add to the current structure and/or change the content  Replacement – Blend face-to-face with online activities  Emporium – Move all classes to a lab setting  Fully online – Conduct all (most) learning activities online  Buffet – Mix and match according to student preferences  Linked Workshop - Replaces the remedial/developmental course with just-in-time “workshops”

Jennifer L. Hearne, Ph.D. September 27 & 28, 2010

 Does your institution want to control or reduce costs?  Does your institution want to increase productivity?  Academic productivity of students (retention and graduation)  Academic course offerings / program offerings  Scholarly activity of faculty

 Is the institution committed to providing the needed support for the redesign project?  Short term AND long term commitment  Scheduling flexibility  Staffing  Technology and support  Professional development

 Which courses are candidates for a redesign?  Will changes in the course have a high impact on the curriculum?  What course administrative offering issues should be considered in the selection process?

 High drop-failure-withdrawal rates  Student performance in subsequent courses  Students on waiting lists creating a bottleneck  Student complaints  Departmental complaints  Course drift -> Inconsistent learning outcomes  Difficulty finding personnel

 Large enrollment  Multiple sections  Gatekeeper/Developmental course  Resources  Personnel  Supplies  Classroom space

 How are decisions about curriculum made?  Collective vs Individual  Have the course’s expected learning outcomes and a system for measuring their achievement been identified?

 Determine project participants’ skills  Learning Theory  Technology  Assessment  Identify activities, other than teaching, that will assist in the measurement of faculty efficiency  Cost analysis  Institutional cost  Student cost

 Academic Issues – student related  Administrative Issues – everything else

What evidence will demonstrate that your redesign made a difference? How do you know that you met the goals of the redesign? Dr. Eileen O’Brien, UMBC Dr. Megan E. Bradley, Frostburg State University

 DFW rates in the course over the last few years  Attendance rates/participation rates  Enrollment rates and wait lists: student demand met or unmet

 Know class GPA  Design Pretest/Posttest  Identify level of students in class  Design a method to gather attendance/participation rates  Identify number of students repeating the course  Assess Faculty workload in course ◦ NCAT worksheets  Quantify resources needed ◦ Personnel ◦ Space  Agree on common test questions or exams
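Several of the baseline measures above (DFW rates, repeat counts) reduce to simple tallies over registrar grade records. A minimal sketch, with a made-up grade list; the grade codes and record format are assumptions:

```python
from collections import Counter

# Minimal sketch of a baseline metric: the DFW rate is the fraction of
# enrollments ending in D, F, or W. The sample grades are invented.

def dfw_rate(grades):
    """Fraction of enrollments ending in D, F, or W."""
    if not grades:
        return 0.0
    counts = Counter(g.upper() for g in grades)
    return (counts["D"] + counts["F"] + counts["W"]) / len(grades)

grades = ["A", "B", "C", "D", "F", "W", "B", "C"]
print(round(dfw_rate(grades), 3))  # → 0.375
```

Computing the same number for several prior semesters, before the pilot begins, gives the baseline the presenters recommend.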

 Focus groups of students ◦ Midterm and Final  Faculty evaluation of change ◦ In the course ◦ In subsequent courses

 ABC rates/DFW rates  Student learning outcomes ◦ Choose targeted activity ◦ Evaluate competence  Pretest in subsequent courses  Retention rates in subsequent semesters

 Classroom space  Faculty time reallocated  Workload efficiency  Student course evaluations  Student qualitative data  Faculty development  NCAT cost data

 Build relationships within the team  Use empirically supported models  Gather data for assessment BEFORE the pilot semester  Ensure a schedule of data collection prior to the redesign semesters  Consult an evaluator  Look for indicators of success ◦ Students ◦ Faculty ◦ Resources  Look downstream ◦ Subsequent courses ◦ Retention rates

 In your groups, discuss the following case study. Answer these questions: ◦ What are the issues with how they assessed their pilot semester? ◦ What could they have done differently?  In your answer, indicate some timeline  If you think they should have gathered a certain type of data, indicate when (i.e., Semester prior to Pilot)

Maryland University engaged in their pilot Spring  Half the sections were taught traditionally and the other half used the redesign. ◦ In the traditional sections, faculty members used different books and tests. They didn’t want to use common exams or questions because they believed it would have ruined their course. ◦ In the redesign sections, the same book and tests were used.  Students’ view of the course was assessed via the university’s course evaluation and anecdotes from student volunteers.  Results showed that students had higher pass rates in the redesign sections than the traditional sections for Spring ◦ Student course evaluation scores increased. ◦ Redesign faculty noticed more students attending than in the past, and the students seemed happier.


 Issues:  What they could have done differently:

 Dr. Ronald Gutberlet, Salisbury University

USM Course Redesign Orientation Workshop September 27 – 28, 2011

 Clem Counts  Mark Frana  Sam Geleta  Ron Gutberlet  Mark Holland  Wanda Kelly  Joan Maloof  Claudia Morrison-Parker  Wanda Perkins  Betty Lou Smith  Bob Tardiff  Melissa Thomas  McGraw-Hill (and now W.W. Norton)  Enhancement of online learning in Biology 210 ◦ Kim Hunter, Richard Hunter  Dr. Les Erickson, learning technology guru

 Three hours of lecture and two hours of lab per week  Approximately 1000 students per year  Gen Ed course for lab science requirement  “Large” lecture sections (72-96 students)  Small lab sections (24 students)  Common lab syllabus  Course drift and duplication of effort in lecture  Engagement issues (faculty and students)

 One hour of lecture and two hours of lab per week; 2 hours of lecture replaced with online work  Larger lecture sections (120 students)  Fewer lecture instructors needed  Instructors available to help in other courses  No pedagogical difference between 72, 96, and 120?  Small lab sections (24 students)  Shared online component

 Use of Blackboard to deliver online content that partially replaces traditional lectures ◦ Weekly instructions ◦ Study guide ◦ Online animations, activities, and narrated instruction ◦ Online quiz  Maximized use of lab time for activities, discussion, team contests ◦ Lecture instructor teaches all 5 labs from his/her lecture section (greater integration)  Use of clickers to engage more students, to initiate discussions, to automate some grading, to do team competitions, and to check class comprehension instantly

 Students spent an average of 4 hours per week on course material outside of class  100% of students agreed (29%) or strongly agreed (71%) that the study guides helped them understand the course material  95% of students agreed (70%) or strongly agreed (25%) that they understood the course material  Students performed as well as or better than students in traditional sections on embedded exam questions.

 “I really like the mix between online work and class time.”  “It is new and a little hard to get used to, but I like it!”  “I never really liked bio until now.”  “I like the online material…it makes class easier to attend.”  “The breakdown of DNA and protein synthesis is interesting and never taught in my high school.”

Raouf Boules, Ph.D. September 27-28, 2011

DVMT 101 - Developmental Mathematics (4 contact hours) DVMT 110 - Intermediate Algebra (3 contact hours)

 Students with SAT mathematics scores less than 500 and weak placement test scores  Typical academic year enrollment data  DVMT 101: 20 course sections enrolling 500 students  DVMT 110: 15 course sections enrolling 350 students  Total: 35 sections with close to 850 students  Relative size: 8% of the Department’s academic year operation

 Lecture format  DVMT 101: 4 hours  DVMT 110: 3 hours  Taught mainly by adjunct faculty  Use common exams with pass/fail grades  Challenges  Students enter with varying background experience and skill levels  Students may move at varying paces  May even enroll in DVMT 101 and finish both  Some need more individualized attention than others

 Uses a replacement model where one hour in each of the 2 classes is replaced by at least one mandated hour in an open computer lab  Lab uses interactive learning software with thousands of practice problems and tutorials  Self-paced learning environment with immediate feedback and tutorials  Lab is mainly staffed by Undergraduate Learning Assistants (ULAs), some graduate Teaching Assistants (TAs), and some instructors  Lab focus: providing individualized, on-demand guidance and attention

 Resistance to change  Large number of adjunct faculty involved  Isolated bad technology experiences  Initial lack of space for an open computer lab

 Time to completion  Pass rates (fall 06 to fall 09 change):  DVMT 101: 77% → 85%  DVMT 110: 62% → 65%  Positive student experience (from course evaluations)  Increased faculty enthusiasm  Some realizable cost savings: 18% of total cost of $150K/year

 Dr. Jennifer Hearne, University of Maryland Eastern Shore

University of Maryland Eastern Shore Jennifer L. Hearne, Ph.D. September 27 & 28, 2011

• University of Maryland Eastern Shore
  ◦ Thelma B. Thompson, Ph.D., President / Mortimer Neufville, Ph.D.
  ◦ Charles Williams, Ph.D., Vice President for Academic Affairs
• Goals of the Maryland Course Redesign Initiative (MCRI, )
  ◦ Adopt new methods to improve student learning outcomes
  ◦ Reduce institutional costs
  ◦ Release instructional resources for other purposes
• MCRI Team at UMES
  ◦ Joseph M. Okoh, Ph.D., Yan Y. Waguespack, Ph.D., Gladys G. Shelton, Ph.D., Charles Williams, Ph.D., Amelia G. Potter, James R. Hayes

• Population
  ◦ Caters to science and health professions students (20% of the freshman class)
  ◦ 73% freshmen
• Goals
  ◦ Basic atomic and molecular theory
  ◦ Nomenclature
  ◦ Reaction stoichiometry
  ◦ Gas laws
• Academic issues
  ◦ Inconsistent knowledge of incoming students
  ◦ 55% student retention rate
  ◦ Lack of coordination among the professors teaching the sections, leading to course drift and inconsistent learning outcomes

Course                 | Section Size | Meetings                                          | Sections / Professors per academic year | Learning Assistant
Traditional Chemistry  |              | MWF 50 min                                        | 7/6                                     | No
Pilot Chemistry 111E   | Up to 80     | M 75 min + 2h in computer lab; recitation offered | 1                                       | Yes
Chemistry 111E         | Up to 114    | MW 50 min + 1h in computer lab                    | 3/2                                     | Yes

• Technology Components
  ◦ Blackboard
  ◦ CengageNOW
  ◦ Computer laboratory
• New and Mixed Staffing
  ◦ Undergraduate Learning Assistant (ULA) and Learning Assistant (LA)
• Individualized, Active Assistance
  ◦ On-demand assistance
  ◦ Cumulative grade posted every Monday
  ◦ CengageNOW grade available at all times

54.5% earned A-C in the Traditional course; 65.7% earned A-C in the Pilot course.
Traditional: 3 50-minute classes
Pilot: 75-minute classes, 2h in designated lab, 1 full-time LA

Traditional: 3 50-minute classes
Pilot: 75-minute classes, 2h in designated lab, 1 full-time LA
Full Implementation F08: 2 50-minute classes, 2h in campus lab, 1 full-time LA + 1 ULA
Full Implementation S09: 2 50-minute classes, 1h in designated lab, 1 full-time LA
Full Implementation F09: 2 50-minute classes, 1h in designated lab, 7 tutors, 2 ULAs, 1 TA

• 54.5% of students in the Traditional course earned A-C
• Of those students, 61.1% enrolled in Principles of Chemistry II
• Of those who enrolled in Principles of Chemistry II, 54.5% earned A-C

• 65.7% of students in the Pilot course earned A-C
• Of those students, 61.8% enrolled in Principles of Chemistry II
• Of those who enrolled in Principles of Chemistry II, 61.9% earned A-C

In comparison to students enrolled in the Traditional course, Pilot course students were:
• 7.4 percentage points more likely to earn A-C in Principles of Chemistry II (61.9% vs. 54.5%)
• More likely to earn A or B grades in Principles of Chemistry II
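The Traditional vs. Pilot comparison above is a three-stage funnel. A hedged Python sketch (the `funnel` helper is my own illustration; the rates come from the slides) traces how many of 100 starting students survive each stage:

```python
def funnel(pass_chem1, enroll_chem2, pass_chem2, n_start=100.0):
    """Return students (per n_start) surviving each stage of the pipeline."""
    earned_ac = n_start * pass_chem1      # earned A-C in Principles of Chemistry I
    continued = earned_ac * enroll_chem2  # went on to Principles of Chemistry II
    succeeded = continued * pass_chem2    # earned A-C in Principles of Chemistry II
    return earned_ac, continued, succeeded

# Rates reported on the slides.
trad = funnel(0.545, 0.611, 0.545)
pilot = funnel(0.657, 0.618, 0.619)

print(f"Traditional students earning A-C in Chem II, per 100 starters: {trad[2]:.1f}")
print(f"Pilot students earning A-C in Chem II, per 100 starters: {pilot[2]:.1f}")
```

Note the 7.4-point gap (61.9% vs. 54.5%) is conditional on enrolling in Chemistry II; compounding all three stages, roughly 25 of 100 Pilot starters finish Chemistry II with A-C versus about 18 of 100 Traditional starters.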

• Faculty buy-in
• Computer glitches
• Funding for a new computer lab
• Tech literacy
• Funding for Learning Assistants
• Growth of MS/PhD Toxicology Program

• Student attitude improvement
• Continuous course improvement
• Professional development opportunities
• Publications
• Presentations
• Positive publicity for the university
• Networking
• Research
• Subsequent course redesign projects

 Dr. Eileen O’Brien, University of Maryland, Baltimore County

Redesign of Introductory Psychology University of Maryland System Course Redesign

• General education requirement course
• 180+ students per section; 7 sections per year
• 4-credit course: 3½ clock hours of instruction each week for 15 weeks
• Large lecture halls
• 4 unit multiple-choice exams
• Lecture style
• 2 FT graduate TAs (GTAs) managing exams, recording grades, and handling administrative tasks

• Failure rate peaked at 15% (grade of F); withdrawal rate as high as 10%
• 60% class attendance rate
• Lack of preparation for class, leading to limited discussion
• Overwhelming amount of course content
• Faculty not interested in teaching this course
• Poor performance on multiple-choice exams (e.g., class mean of 62% on the unit 1 exam)
• Poor student evaluations regarding content, learning, and environment

• Faculty teaching the course* (lecturers and adjuncts)
• Provost
• Dean of Arts and Sciences
• Department Chair*
• Faculty Development Director*
• Student Learning Resource Center Director*
• Blackboard Administrator*
• Evaluator*
• Graduate TAs*
• Undergraduate Student*
*Review team for timeline tasks

• Maintained 1,000 students per year; increased section size to 200
• Decreased from 7 to 5 course sections per year
• Applied 3 credit hrs/wk to course meetings; 1 credit hr/wk for online labs
• Created online labs: simulations, practice quizzes, and self-paced online chapter tests

• Decreased the amount of content in class sessions; shifted toward discussion
• Integrated CPS questions to increase interactivity and attendance
• Created common multiple-choice exams across sections
• Added weekly small-group activities to increase interactivity
• Assigned 1½ GTAs for database management and student support
• Sequenced content for a more engaging start
• Created Peer Mentors for class activities, tutoring, and exam prep

Redesign sections, compared with the Traditional section, show proportionately fewer Cs, Ds, and Fs and more As and Bs, χ²(N = 768, df = 4) = 44.2, p < .001.
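For readers unfamiliar with the reported statistic, a small Python sketch shows how a Pearson chi-square statistic is formed from a grade-distribution contingency table. The A/B/C/D/F counts below are HYPOTHETICAL (the actual counts are not on the slide, only that N = 768); only the shape of the calculation mirrors the reported test:

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n  # expected under independence
            stat += (observed - expected) ** 2 / expected
    return stat

# HYPOTHETICAL A/B/C/D/F counts for Traditional vs. Redesign sections
# (chosen only so the totals sum to the reported N = 768).
traditional = [30, 45, 50, 30, 25]
redesign = [150, 180, 140, 70, 48]
stat = chi_square_stat([traditional, redesign])
print(f"chi-square = {stat:.1f}")  # df = (2 - 1) * (5 - 1) = 4
```

With 2 section types and 5 grade categories, the degrees of freedom are (2-1)(5-1) = 4, matching the df = 4 in the slide's report.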

Course Withdrawal Rates: Fall 2008 had the lowest documented withdrawal rate (3.2%); the range has been 4.1% to 10.3%.

Grade Distributions for Psyc

• What was good about the redesign:
  ◦ “the ability to do labs on-line when ready or to review information online,”
  ◦ “small group work and discussion in class,” and
  ◦ “in-class movies and videos.”
• What needs to be improved with the redesign:
  ◦ “review questions students got wrong on tests; discuss questions that lots of students got wrong,”
  ◦ “more clicker questions,”
  ◦ “make PowerPoints available on-line.”
*focus group data

• What was good about the redesign:
  ◦ “removed pressure to teach every topic in the book,”
  ◦ “freed up time for scholarly activities, less need for office hours or lecturing an additional hour a week.”
• What needs to be improved with the redesign:
  ◦ “faculty need training in technology, use of platforms, and software,”
  ◦ “accuracy of online test banks.”
*faculty interview data

• Decreased the number of sections required each semester and increased class size
• Decreased withdrawal rates, retaining students
• Decreased the need for two faculty each year, freeing them to offer an additional upper-level course each semester
• Freed up classroom space for the University to teach other courses on campus
• Decreased the need for graduate teaching assistants from 2 to 1.5 grad students
• Leveraged existing resources to fund undergraduate Peer Mentors

 Dr. Megan Bradley, Frostburg State University

USM Course Redesign 2 Workshop 2011 Dr. Megan E. Bradley, Professor, Frostburg State University

• Psyc150: General Psychology
  ◦ Course characteristics
    - Annual enrollment: about 900
    - Mostly traditional and 1st-year students
    - Required course for psychology majors and 5 other majors
  ◦ Academic issues
    - Course drift and inefficiency
    - Financial difficulties at the university level

• Primary team (who were also tech savvy):
  ◦ Coordinator: FT faculty member who did not teach PSYC 150 at FSU
  ◦ 1 FT faculty member who regularly taught PSYC 150
  ◦ 1 adjunct instructor: instructional designer who regularly taught PSYC 150
• Administration:
  ◦ Dean
  ◦ Associate Provost
• Publisher:
  ◦ Worth

• Chose the Replacement Model
• Pilot semester
  ◦ Comparison: 2 traditional sections (N=42) vs. 2 redesign sections (N=99)
  ◦ ULAs worked with redesign instructors via independent study
• Full implementation
  ◦ Tripled capacity (N = 150)
  ◦ Began ULA course
  ◦ 2nd semester: began Netbook Lab

Pilot Comparisons
Section     | N compared to prior # | Admit GPA* | FSU GPA*
Traditional | 16% smaller           |            |
Redesign    | 100% larger           |            |
*Significantly different; p = .005 (eta² = .027), p = .000 (eta² = .075)

• Course drift eliminated
  ◦ Standard course, syllabus, schedule, grading, etc.

Impact on Student Learning
• Success in the course

Mean Test Scores
*A one-way ANOVA of section on final-exam percentage grades was significant, F = , p = .000, eta² = .090. Also significant with GPA as a covariate: F = , p = .000, eta² = .11.
*Instructors blind to exam content

Section               | Admit GPA | FSU GPA | Mean % on Final Exam
Section 1 Traditional |           |         | %
Section 2 Traditional |           |         | %
Section 3 Redesign    |           |         | %
Section 4 Redesign    |           |         | %

• Final exam scores positively correlated with average scores on Mastery Quizzes
  ◦ r = .523, p = .000

• Comprehensive final exam was too much
  ◦ Replaced with 3 unit exams
  ◦ Reduced overall coverage
• “Deadline Disorder”
  ◦ 5 different activities reduced to the 3 types that helped students
  ◦ 2 weeks to complete
• Students not ready for a blended design
  ◦ Required computer lab once per week

Full Implementation Results: 43 Common Questions (3 exams)*
*A one-way ANOVA of section (3 total) on common-question percentage was significant, F = , p = .000, eta² = .825.
Mean Test Scores
*Instructors blind to exam content

• Mode
  ◦ Pre-essay = 0 or 1
  ◦ Post-essay = 0, 1, 2, 3, or 4
*A one-way repeated-measures ANOVA on essay grades was significant, F = , p = .000, eta² = .420.
Mean Test Scores

Impact on Retention
• DWF rate

• Previous average: 12.5%
  ◦ 18% prior to pilot
• Pilot semester
  ◦ Traditional sections: 4%
  ◦ Redesign sections: 22%
• Full implementation (Fall): 12.8%

Other Impact on Students
• ULAs

• “Field Experience” course for top students
• Leadership in Psychology Certificate Program
• Supplemental Instructor (SI)
  ◦ Receives additional training based on the national SI program
• Interning as a ULA
  ◦ Research experience included

• Future opportunities
  ◦ Commencement speaker: Spring 2010 and Fall 2010 speakers were ULAs
  ◦ Graduate school: teaching or research assistantship
  ◦ Prestigious internships: most recent, Sloop Institute for Excellence in Leadership

Impact on Cost Savings
• Formula rating
• Departmental implication

• NCAT cost-per-student formula: $89 down to $26
• Departmental implication:
  ◦ 1 FT faculty position
• Biggest cost effectiveness:
  ◦ Tripling class capacity
  ◦ Staffing

• Use of savings
  ◦ Realized:
    - Communication Response System (“clickers”) implementation
    - Netbook lab
    - Yearly support for Tamarin Colony
    - Support for Coordinator
    - Payment to lab assistants, SI, and intern

 Stan Jakubik, University System of Maryland

Stan Jakubik, Asst Vice Chancellor, USM

• 3 Course Redesign Faculty cohorts over 4 years
• Cohort 1 began with a pilot in Fall 2011; full course implementation Spring 2012
• Cohort 2 begins Fall 2011, with a target pilot in Fall 2012; full course implementation Spring 2013
• Cohort 3 follows the same pattern

• Cohort 1: funding awarded to 12 courses
• Cohort 2: at least 10
• Cohort 3: at least 10
• Total number of courses may be expanded in each cohort, dependent on the ability to raise additional funding

• RFP handout outlines all requirements
• Short application required by Thanksgiving break
• Decisions on awards by the 2nd week of December
• Decisions made by USM and CR Faculty Fellows

• September 27/28, 2011: Introductory Workshop for Cohort II
• Thanksgiving Break, 2011: Concept papers due
• 2nd week of December 2011: Participant selections announced
• 2nd week of January 2012: Planning and Implementation Workshop
• Spring 2012: Full proposal due (date TBD)
• Fall 2012: Implementation of project pilot
• October 2012: Introductory Workshop for Cohort III
• January 2013: Combined workshop for Cohorts II and III
• Spring 2013: Full course implementation
• August 1, 2013: Final report due