
1 Welcome

2  Dr. Donald Spicer, USM

3  The problem ◦ Large enrollment courses are almost universal and are problematic for students, faculty, and institutions  Long recognized as an issue, but no fully acceptable solutions were available  The Pew Charitable Trusts funded a study in 1999 of models/strategies to address this ◦ Premise: improving learning outcomes and saving resources need not be mutually exclusive ◦ Appropriate use of technology may be the needed perturbation of the system

4  Initial program funded 10 institutions per year for 3 years to undertake model redesigns ◦ All segments of higher education, public and private ◦ A wide range of disciplines  Outcome ◦ Most achieved improved learning outcomes ◦ All saved money in offering the course  Subsequent programs focused on scaling up and/or addressing specific issues (e.g., developmental courses)

5  Five general models were used initially (there are now six; see slide 47)  Several common characteristics of successful redesigns were identified  Several hundred courses have subsequently been redesigned across the country

6  Regents’ Effectiveness and Efficiency (E&E) Initiative  In 2006, the initiative turned to E&E in the academic sphere  USM became the first system to adopt course redesign system-wide  10 projects funded for 3 years across the USM ◦ Biology, Chemistry, English, Mathematics, Psychology ◦ Developmental to 300-level courses ◦ Each solving a specific ‘problem’

7  All showed improved learning outcomes  Most showed savings in staffing the course  Some one-time capital investments were required  Generally improved student satisfaction  Generally improved faculty satisfaction  Departments could generally reinvest savings in supporting other courses  Plaudits from Chancellor, BOR  Notice in Annapolis that USM is taking the initiative

8  Needs to be faculty driven with administrative support  These are team efforts; it is not enough to have one really committed faculty member  Projects are most successful if there is team enthusiasm about solving one or more real academic or financial problems

9  Chancellor and USM raised money to extend these successes  This is one strategy in a broader discussion of academic transformation

10  USM institutions only  One of 3 initiatives funded by the 2009 Carnegie Corporation Academic Leadership Award to Chancellor Kirwan, plus additional fundraising  Goal is to scale up Course Redesign based on lessons learned from the first round  Focus on core curriculum and gateway courses  Up to $20,000 from this USM fund, to be matched by the institution ◦ Many projects will not need this much funding

11  Three cohorts with solicitation in the fall (2010, 2011, 2012) ◦ 12 awards last year ◦ 10+ this year ◦ More about RFP process this afternoon  Preliminary award in December  “How To” workshop in January  Planning during Spring Semester  Implementation during summer  Pilot in Fall  Full implementation in Spring (or next time course offered)

12  Dr. Nancy Shapiro, USM  Erin Knepler, USM

13  Part of the Lumina Foundation's “productivity grants” to increase the percentage of Americans with high-quality degrees and credentials to 60% by 2025  States receiving grants (2009-2013) include Arizona, Indiana, Maryland, Montana, Ohio, Tennessee, and Texas  Lumina’s overarching goals through this initiative are to: ◦ Increase and reward completion ◦ Generate and reinvest savings ◦ Educate and train in affordable ways

14  “Growing by Degrees” is a statewide partnership between: ◦ Maryland Governor's Office ◦ Maryland Department of Legislative Services ◦ Maryland Association for Community Colleges ◦ Maryland Higher Education Commission ◦ Maryland Independent College and University Association ◦ University System of Maryland  Our state’s agenda through this initiative is to: ◦ Engage the Governor’s P-20 Leadership Council ◦ Promote cross-institutional collaboration in effectiveness and efficiency (E&E) efforts, both academic and administrative ◦ Support faculty and institutional efforts to redesign bottleneck undergraduate courses

15  Two rounds of funding: fall 2010 and fall 2011  Nine projects were funded in fall 2010  20-24 projects total over two cohorts  $20,000 in direct support with $20,000 institutional match  Funding for USM institutions will focus on developmental courses (pilots at University of Baltimore and Frostburg)  Funding for community colleges, independent institutions, and other public four-year institutions will consider bottleneck courses from any discipline (including developmental courses)  Competitive RFP process will be described in the afternoon: ◦ October 25: Concept Paper ◦ December 2: Full Proposal ◦ Cohort 2 selections made by the end of December 2011

16  The Maryland Higher Education Commission (MHEC) was recently awarded a grant from Complete College America (CCA) and a portion of the grant will specifically fund more course redesign projects.  Maryland won a $1 million Completion Innovation Challenge Grant to redesign an additional 21 developmental math courses at community colleges and historically black institutions (HBIs) across the state.  This news means that in addition to Lumina funds, Maryland community colleges and HBIs are also eligible to apply for funds from the CCA grant to redesign developmental math courses.

17  Faculty experienced in the methodologies of course redesign who will support other faculty course redesign efforts statewide through: ◦ Providing direct consultation to participating institutions ◦ Developing and delivering workshops for faculty cohorts working toward new redesigns ◦ Participating in the evaluation of proposals for course redesign grants ◦ Assisting in the development and content management of a course redesign Web site hosted by USM

18  Raouf Boules, Towson University ◦ MCRI Project: Developmental Mathematics & Intermediate Algebra  Megan Bradley, Frostburg State University ◦ MCRI Project: General Psychology  Ronald Gutberlet, Salisbury University ◦ MCRI Project: Fundamentals of Biology  Jennifer Hearne, University of Maryland Eastern Shore ◦ MCRI Project: Principles of Chemistry I  Eileen O’Brien, University of Maryland Baltimore County ◦ MCRI Project: Introductory Psychology

19  Dr. Ronald Gutberlet, Salisbury University

20  Redesign the Entire Course  Encourage Active Learning  Provide Students with Individualized Assistance  Build in Ongoing Assessment and Prompt (Automated) Feedback  Ensure Sufficient Time on Task and Monitor Student Progress

21  Benefits ◦ Opportunity to evaluate and focus course goals ◦ Counteract course drift ◦ Capture the best that each instructor has to offer ◦ Reduce duplication of effort ◦ Meaningful and interesting faculty interaction

22  Challenges ◦ Compromise and consensus building ◦ Instructor buy-in across all sections ◦ Time needed for team-building ◦ Can’t be forced ◦ Variation in faculty attitudes toward technology ◦ The more support, the better!  Dept Chair and entire department  Dean; Provost’s Office  Instructional Design

23 How effective is the traditional lecture? – Is anyone listening? – Is everyone listening?

24 Does information transfer still require a lecture in 2010? What are the course goals and does the traditional lecture serve them? Need not take an all-or-nothing approach – How to select an effective combination of lecture and active learning? Can we do a better job of “priming” students for a lecture? – Is anyone reading the textbook?

25  Benefits ◦ Multiple ways for students to engage with the course material ◦ Students decide when to work on the course material, within the framework of scheduled deadlines ◦ Students held accountable for assigned work ◦ Student engagement and satisfaction ◦ Faculty engagement and satisfaction

26  “…easy to want to come to class every time and not fall asleep.”  “This was my favorite course this semester.”  “I like the way this class is conducted better than my friend’s bio classes.”

27  Challenges ◦ What to do with that uncomfortable feeling that you are failing to complete a critical transaction if you do not say something out loud to a room full of students?  Is anyone listening?  Is everyone listening? ◦ Significant paradigm shift, natural skepticism

28

29  Online learning tools  Staffed computer labs  Peer learning assistants  Small group meetings ◦ Discussion sections ◦ Laboratories  Supplementary instruction programs  Online discussion tools

30  Online quizzes  Sophisticated learning software with problem sets, instant feedback, and assistance  Clickers in the classroom

31  Track the time that students spend online doing coursework (Blackboard, publisher software)  Incorporate periodic assessments to encourage completion of online work ◦ Weekly online quizzes in Biol 101  Design an intervention strategy for students who are performing poorly on assessments or who are not spending sufficient time on task ◦ A job for peer learning assistants? (a sketch of such a flagging step follows)
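As an illustration of this monitoring loop, here is a minimal sketch in Python; the data, field names, and thresholds are hypothetical stand-ins, not Blackboard's actual export format.

```python
# Hypothetical weekly LMS export: minutes online and quiz average per student.
rows = [
    {"student": "A. Jones", "minutes_online": 45,  "quiz_avg": 58.0},
    {"student": "B. Smith", "minutes_online": 210, "quiz_avg": 91.0},
    {"student": "C. Davis", "minutes_online": 150, "quiz_avg": 64.0},
]

# Assumed cutoffs; a real redesign team would set these from course data.
MIN_MINUTES_PER_WEEK = 120
MIN_QUIZ_AVERAGE = 70.0

def flag_at_risk(records):
    """Return students below either the time-on-task or quiz threshold."""
    return [
        r for r in records
        if r["minutes_online"] < MIN_MINUTES_PER_WEEK
        or r["quiz_avg"] < MIN_QUIZ_AVERAGE
    ]

for r in flag_at_risk(rows):
    print(f"{r['student']}: {r['minutes_online']} min online, "
          f"quiz avg {r['quiz_avg']:.0f}% -> refer to a peer learning assistant")
```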

32

33

34  Dr. Raouf Boules, Towson University

35  Retains the basic course structure, particularly the number of class meetings  Supplements lectures and textbooks with technology-based, out-of-class activities  Encourages greater student engagement with course content  Ensures that students are prepared when they come to class

36  Replaces (rather than supplements) in-class time with online, interactive learning activities  Carefully considers why (and how often) classes need to meet face-to-face  Assumes that certain activities can be better accomplished online - individually or in groups  May keep remaining in-class activities the same or may make significant changes  May schedule out-of-class activities in computer labs or totally online so that students can participate anytime, anywhere  Examples: ◦ Towson University: Developmental Mathematics ◦ UMES: Principles of Chemistry

37  Moves all classes to a lab setting  Multiple sections combined into one large section  Depends heavily on instructional software including interactive tutorials  Allows students to work as long as they need to master the content  Permits the use of multiple kinds of personnel  Requires a significant commitment of space and equipment  Can teach more than one course in the lab, thus leveraging the initial investment

38 Traditional (linear algebra)  38 sections (~40 students each)  10 tenured faculty, 13 instructors, 15 GTAs  2 hours per week  $91 cost-per-student  Redesign  1 section (~1,520 students)  1 instructor, grad & undergrad TAs + 2 tech support staff  24/7 access in open lab  $21 cost-per-student  Replicated at U of Alabama, U of Idaho, LSU, Wayne State, U Missouri-St. Louis, Seton Hall
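The per-student figures are simply total instructional cost divided by total enrollment, in the spirit of NCAT's course planning worksheets. A minimal sketch; the dollar totals below are hypothetical, chosen only so that the division reproduces the slide's $91 and $21.

```python
def cost_per_student(total_instructional_cost, enrollment):
    """NCAT-style unit cost: cost of offering the course / students served."""
    return total_instructional_cost / enrollment

ENROLLMENT = 38 * 40  # ~1,520 students (38 sections of ~40, or 1 large section)

traditional = cost_per_student(138_320, ENROLLMENT)  # hypothetical total, 38 staffed sections
redesigned = cost_per_student(31_920, ENROLLMENT)    # hypothetical total, 1 section + lab staff

print(f"traditional: ${traditional:.0f} per student")  # $91
print(f"redesigned:  ${redesigned:.0f} per student")   # $21
```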

39

40

41

42

43  Eliminates all in-class meetings and moves all learning experiences online  Adopts successful design elements of the Supplemental, Replacement, and Emporium models, including Web-based multimedia resources, commercial software, automatically evaluated assessments with guided feedback, links to additional resources, and alternative staffing models  Examples: ◦ Rio Salado College: Pre-Calculus Mathematics ◦ U. of S. Mississippi: World Literature

44  Takes into account each student’s knowledge/skill level and preferred learning style  Provides an array of learning opportunities, including lectures, outside reading material, labs, small group study sessions, videos, oral and written presentations, homework assignments, individual and group projects, etc.  Started by Ohio State University for Introductory Statistics ◦ Initially, students are made familiar with the various options through examples made available for them to experience ◦ Then each student selects what suits him/her and signs a contract detailing the choices and what needs to be accomplished

45  Retains the basic structure of the college-level/core course, particularly the number of class meetings  Replaces the remedial/developmental course with just-in-time “workshops”  Workshops are designed to remove deficiencies in core course competencies (for a particular course)  Students are individually assigned software modules based on results of diagnostic assessments/placement  Workshops consist of computer-based instruction, small-group activities and test reviews to provide additional instruction on key concepts

46  Workshops are facilitated by students who have previously excelled in the core course and are trained and supervised by core course faculty  Workshop activities are just-in-time—i.e., designed so that students use the concepts during the next core course class session, which in turn helps them see the value of the workshops and motivates them to do the workshop activities.

47  Supplemental – Add to the current structure and/or change the content  Replacement – Blend face-to-face with online activities  Emporium – Move all classes to a lab setting  Fully online – Conduct all (most) learning activities online  Buffet – Mix and match according to student preferences  Linked Workshop - Replaces the remedial/developmental course with just-in-time “workshops”

48 Jennifer L. Hearne, Ph.D. September 27 & 28, 2010

49  Does your institution want to control or reduce costs?  Does your institution want to increase productivity?  Academic productivity of students (retention and graduation)  Academic course offerings / program offerings  Scholarly activity of faculty

50  Is the institution committed to providing the needed support for the redesign project?  Short term AND long term commitment  Scheduling flexibility  Staffing  Technology and support  Professional development

51  Which courses are candidates for a redesign?  Will changes in the course have a high impact on the curriculum?  What administrative issues in offering the course should be considered in the selection process?

52  High drop-failure-withdrawal rates  Student performance in subsequent courses  Students on waiting lists creating a bottleneck  Student complaints  Departmental complaints  Course drift → inconsistent learning outcomes  Difficulty finding personnel

53  Large enrollment  Multiple sections  Gatekeeper/developmental course  Resources ◦ Personnel ◦ Supplies ◦ Classroom space

54  How are decisions about curriculum made?  Collective vs Individual  Have the course’s expected learning outcomes and a system for measuring their achievement been identified?

55  Determine project participants’ skills  Learning Theory  Technology  Assessment  Identify activities, other than teaching, that will assist in the measurement of faculty efficiency  Cost analysis  Institutional cost  Student cost

56  Academic Issues – student related  Administrative Issues – everything else

57 What evidence will demonstrate that your redesign made a difference? How do you know that you met the goals of the redesign? Dr. Eileen O’Brien, UMBC Dr. Megan E. Bradley, Frostburg State University

58  DFW rates in the course over the last few years  Attendance rates/participation rates  Enrollment rates, wait lists, student demand met/unmet

59  Know class GPA  Design Pretest/Posttest  Identify level of students in class  Design a method to gather attendance/participation rates  Identify number of students repeating the course  Assess Faculty workload in course ◦ NCAT worksheets  Quantify resources needed ◦ Personnel ◦ Space  Agree on common test questions or exams
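For the first item (and the DFW rates from slide 58), a small sketch of how the baseline numbers might be computed from a final-grade roster; the roster and the D/F/W convention shown here are illustrative assumptions.

```python
# Hypothetical final grades for one pre-redesign section.
roster = ["A", "B", "C", "F", "W", "B", "D", "A", "C", "W", "F", "B"]

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

# DFW rate: share of enrolled students earning D or F, or withdrawing.
dfw_rate = sum(g in ("D", "F", "W") for g in roster) / len(roster)

# Class GPA over graded (non-W) students.
points = [GRADE_POINTS[g] for g in roster if g != "W"]
class_gpa = sum(points) / len(points)

print(f"DFW rate: {dfw_rate:.1%}")    # 41.7% on this sample roster
print(f"Class GPA: {class_gpa:.2f}")  # 2.20
```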

60  Focus groups of students ◦ Midterm and Final  Faculty evaluation of change ◦ In the course ◦ In subsequent courses

61  ABC rates/DFW rates  Student learning outcomes ◦ Choose targeted activity ◦ Evaluate competence  Pretest in subsequent courses  Retention rates in subsequent semesters

62  Classroom space  Faculty time reallocated  Workload efficiency  Student course evaluations  Students qualitative data  Faculty development  NCAT cost data

63  Build relationships within the team  Use empirically supported models  Gather data for assessment BEFORE the pilot semester  Ensure a schedule of data collection prior to redesign semesters  Consult an evaluator  Look for indicators of success ◦ Students ◦ Faculty ◦ Resources  Look downstream ◦ Subsequent courses ◦ Retention rates

64  In your groups, discuss the following case study. Answer these questions: ◦ What are the issues with how they assessed their pilot semester? ◦ What could they have done differently?  In your answer, indicate some timeline  If you think they should have gathered a certain type of data, indicate when (i.e., semester prior to pilot)

65 Maryland University ran its pilot in Spring 2011.  Half the sections were taught traditionally and the other half used the redesign. ◦ In the traditional sections, faculty members used different books and tests. They didn’t want to use common exams or questions because they believed it would have ruined their courses. ◦ In the redesign sections, the same book and tests were used.  Students’ view of the course was assessed via the university’s course evaluation and anecdotes from student volunteers.  Results showed that students had higher pass rates in the redesign sections than in the traditional sections for Spring 2011. ◦ Student course evaluation scores increased. ◦ Redesign faculty noticed more students attending than in the past, and the students seemed happier.

66 (Case study repeated from slide 65.)  Answer these questions: ◦ What are the issues with how they assessed their pilot semester? ◦ What could they have done differently?  In your answer, indicate some timeline  If you think they should have gathered a certain type of data, indicate when (i.e., semester prior to pilot)

67  Issues:  What could have been done:

68  Dr. Ronald Gutberlet, Salisbury University

69 USM Course Redesign Orientation Workshop September 27 – 28, 2011

70  Clem Counts  Mark Frana  Sam Geleta  Ron Gutberlet  Mark Holland  Wanda Kelly  Joan Maloof  Claudia Morrison-Parker  Wanda Perkins  Betty Lou Smith  Bob Tardiff  Melissa Thomas  McGraw-Hill (and now W.W. Norton)  Enhancement of online learning in Biology 210 ◦ Kim Hunter, Richard Hunter  Dr. Les Erickson, learning technology guru

71  Three hours of lecture and two hours of lab per week  Approximately 1000 students per year  Gen Ed course for lab science requirement  “Large” lecture sections (72-96 students)  Small lab sections (24 students)  Common lab syllabus  Course drift and duplication of effort in lecture  Engagement issues (faculty and students)

72  One hour of lecture and two hours of lab per week; 2 hours of lecture replaced with online work  Larger lecture sections (120 students)  Fewer lecture instructors needed  Instructors available to help in other courses  No pedagogical difference between 72, 96, and 120?  Small lab sections (24 students)  Shared online component

73  Use of Blackboard to deliver online content that partially replaces traditional lectures ◦ Weekly instructions ◦ Study guide ◦ Online animations, activities, and narrated instruction ◦ Online quiz  Maximized use of lab time for activities, discussion, team contests ◦ Lecture instructor teaches all 5 labs from his/her lecture section (greater integration)  Use of clickers to engage more students, to initiate discussions, to automate some grading, to do team competitions, and to check class comprehension instantly

74  Students spent an average of 4 hours per week on course material outside of class  100% of students agreed (29%) or strongly agreed (71%) that the study guides helped them understand the course material  95% of students agreed (70%) or strongly agreed (25%) that they understood the course material  Students performed as well or better than students in traditional sections on embedded exam questions.

75  “I really like the mix between online work and class time.”  “It is new and a little hard to get used to, but I like it!”  “I never really liked bio until now.”  “I like the online material…it makes class easier to attend.”  “The breakdown of DNA and protein synthesis is interesting and never taught in my high school.”

76 Raouf Boules, Ph.D. September 27-28, 2011

77  DVMT 101 – Developmental Mathematics (4 contact hours)  DVMT 110 – Intermediate Algebra (3 contact hours)

78  Students with SAT mathematics scores less than 500 and weak placement test scores  Typical academic year enrollment data ◦ DVMT 101: 20 course sections enrolling 500 students ◦ DVMT 110: 15 course sections enrolling 350 students ◦ Total: 35 sections with close to 850 students  Relative size: 8% of the Department’s academic year operation

79  Lecture format ◦ DVMT 101: 4 hours ◦ DVMT 110: 3 hours  Taught mainly by adjunct faculty  Use common exams with pass/fail grades  Challenges ◦ Students enter with varying background experience and skill levels ◦ Students may move at varying paces; some even enroll in DVMT 101 and finish both ◦ Some need more individualized attention than others

80  Uses a replacement model where one hour in each of the 2 classes is replaced by at least one mandated hour in an open computer lab  Lab uses interactive learning software with thousands of practice problems and tutorials  Self-paced learning environment with immediate feedback and tutorials  Lab is staffed mainly by Undergraduate Learning Assistants (ULAs), some graduate Teaching Assistants (TAs), and some instructors  Lab focus: providing individualized, on-demand guidance and attention

81  Resistance to change  Large number of adjunct faculty involved  Isolated bad technology experiences  Initial lack of space for an open computer lab

82  Time to completion  Pass rates (fall 06 to fall 09 change): ◦ DVMT 101: 77% → 85% ◦ DVMT 110: 62% → 65%  Positive student experience (from course evaluations)  Increased faculty enthusiasm  Some realizable cost savings: ~18% of a total cost of $150K/year (about $27K/year)

83  Dr. Jennifer Hearne, University of Maryland Eastern Shore

84 University of Maryland Eastern Shore Jennifer L. Hearne, Ph.D. September 27 & 28, 2011

85  University of Maryland Eastern Shore ◦ Thelma B. Thompson, Ph.D., President / Mortimer Neufville, Ph.D. ◦ Charles Williams, Ph.D., Vice President for Academic Affairs  Goals of the Maryland Course Redesign Initiative (MCRI, 2006-2009) ◦ Adopt new methods to improve student learning outcomes ◦ Reduce institutional costs ◦ Release instructional resources for other purposes  MCRI Team at UMES ◦ Joseph M. Okoh, Ph.D., Yan Y. Waguespack, Ph.D., Gladys G. Shelton, Ph.D., Charles Williams, Ph.D., Amelia G. Potter, James R. Hayes

86  Population ◦ Caters to science and health professions students  20% of freshman class ◦ 73% Freshman  Goals ◦ Basic atomic and molecular theory ◦ Nomenclature ◦ Reaction stoichiometry ◦ Gas laws  Academic Issues ◦ inconsistent knowledge of incoming students ◦ 55% student retention rate ◦ lack of coordination among the professors teaching the sections of the course leading to course drift and inconsistent learning outcomes

87 Course | Section Size | Meetings | Sections/Professors per academic year | Learning Assistant
Traditional Chemistry 111 | 30-40 | MWF, 50 min | 7/6 | No
Pilot Chemistry 111E | Up to 80 | M, 75 min + 2h in computer lab; recitation offered | 1 | Yes
Chemistry 111E | Up to 114 | MW, 50 min + 1h in computer lab | 3/2 | Yes

88  Technology Components ◦ Blackboard ◦ CengageNOW (http://login.cengage.com/sso/) ◦ Computer laboratory  New and Mixed Staffing ◦ Undergraduate Learning Assistant (ULA) and Learning Assistant (LA)  Individualized, Active Assistance ◦ On-demand assistance ◦ Cumulative grade posted every Monday ◦ CengageNOW grade available at all times

89  54.5% earned A-C in the Traditional course (3 × 50-minute classes)  65.7% earned A-C in the Pilot course (75-minute class + 2h in designated lab, 1 full-time LA)

90  Traditional: 3 × 50-minute classes  Pilot: 75-minute class + 2h in designated lab, 1 full-time LA  Full Implementation F08: 2 × 50-minute classes + 2h in campus lab, 1 full-time LA + 1 ULA

91  Traditional: 3 × 50-minute classes  Pilot: 75-minute class + 2h in designated lab, 1 full-time LA  Full Implementation F08: 2 × 50-minute classes + 2h in campus lab, 1 full-time LA + 1 ULA  Full Implementation S09: 2 × 50-minute classes + 1h in designated lab, 1 full-time LA  Full Implementation F09: 2 × 50-minute classes + 1h in designated lab, 7 tutors, 2 ULAs, 1 TA

92 54.5% of students in the Traditional course earned A-C Of those students, 61.1% enrolled in Principles of Chemistry II Of those who enrolled in Principles of Chemistry II, 54.5% earned A-C

93 65.7% of students in the Pilot course earned A-C Of those students, 61.8% enrolled in Principles of Chemistry II Of those who enrolled in Principles of Chemistry II, 61.9% earned A-C

94 In comparison to students enrolled in the Traditional course, Pilot course students were:  7.4 percentage points more likely to earn A-C in Principles of Chemistry II (61.9% vs. 54.5%)  More likely to earn A or B grades in Principles of Chemistry II
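A short sketch making the arithmetic of slides 92–94 explicit; the rates are taken from those slides, and the 7.4-point figure is the gap between the two downstream A–C rates.

```python
# Conditional rates from slides 92-93 (Traditional vs. Pilot Chemistry I).
pipelines = {
    "Traditional": {"chem1_ac": 0.545, "enroll_chem2": 0.611, "chem2_ac": 0.545},
    "Pilot":       {"chem1_ac": 0.657, "enroll_chem2": 0.618, "chem2_ac": 0.619},
}

for name, p in pipelines.items():
    # Share of the original cohort that reaches Chem II and earns A-C there.
    through = p["chem1_ac"] * p["enroll_chem2"] * p["chem2_ac"]
    print(f"{name}: {through:.1%} of the starting cohort earns A-C in Chem II")

gap = pipelines["Pilot"]["chem2_ac"] - pipelines["Traditional"]["chem2_ac"]
print(f"Pilot advantage in the Chem II A-C rate: {gap:.1%}")  # 7.4 points
```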

95

96  Faculty Buy-in  Computer glitches  Funding for a new computer lab  Tech literacy  Funding for Learning Assistant  Growth of MS/PhD Toxicology Program

97  Student attitude improvement  Continuous course improvement  Professional development opportunities  Publications  Presentations  Positive publicity for university  Networking  Research  Subsequent course redesign projects

98  Dr. Eileen O’Brien, University of Maryland, Baltimore County

99 Redesign of Introductory Psychology University System of Maryland Course Redesign

100  General education requirement course  180+ students per section  7 sections per year  4-credit course  3½ clock hours each week for instruction  15 weeks of class  Large lecture halls  4 unit multiple-choice exams  Lecture style  2 FT grad TAs (GTAs) managing exams, recording grades, and administrative tasks

101  Failure rate peaked at 15% (Grade of F); withdrawal rate as high as 10%  60% class attendance rate  Lack of preparation for class–limited discussion  Overwhelming amount of course content  Faculty are not interested in teaching this course  Poor performance on multiple choice exams – (e.g., class exam mean of 62% on unit 1 exam)  Poor student evaluations re: content, learning, environment

102  Faculty teaching the course* ◦ Lecturers and Adjuncts  Provost  Dean of Arts and Sciences  Department Chair*  Faculty Development Director*  Student Learning Resource Center Director*  Blackboard Administrator*  Evaluator*  Graduate TAs*  Undergraduate Student* * Review team for timeline tasks

103  Maintained 1,000 students per year; increased section size to 200  Decreased from 7 to 5 course sections per year  Applied 3 credit hrs/wk to course meetings; 1 credit hr/wk for online labs  Created online labs – simulations, practice quizzes, online chapter tests (self-paced)

104  Decreased the amount of content in class sessions; shifted to discussion  Integrated CPS (clicker) questions to increase interactivity and attendance  Created common multiple-choice exams across sections  Added weekly small group activities to increase interactivity  Assigned 1½ GTAs for database management and student support  Sequenced content for a more engaging start  Created Peer Mentors for class activities, tutoring, and exam prep

105 Redesign sections, compared to the Traditional section, show proportionately fewer Cs, Ds, and Fs and more As and Bs: χ²(N = 768, df = 4) = 44.2, p < .001.
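The reported statistic is a chi-square test of independence on the grade distribution (2 section types × 5 letter grades gives df = 4, matching the slide). A sketch of the same kind of test with hypothetical counts, since the slide reports only the summary statistic:

```python
from scipy.stats import chi2_contingency

# Hypothetical A/B/C/D/F counts summing to N = 768; only the shape of the
# test (2 x 5 table, df = 4) mirrors the slide's reported analysis.
observed = [
    [150, 180, 120, 40, 26],  # Redesign sections: more As and Bs
    [ 45,  60,  80, 35, 32],  # Traditional section: more Cs, Ds, Fs
]

chi2, p, df, expected = chi2_contingency(observed)
print(f"chi2(df={df}) = {chi2:.1f}, p = {p:.3g}")
```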

106 Course Withdrawal Rates  Fall 2008 had the lowest withdrawal rate (3.2%) documented since 2000; the range has been 4.1% to 10.3%.

107 Grade Distributions for Psyc100 2009-2011

108  What was good about the redesign: ◦ “the ability to do labs on-line when ready or to review information online,” ◦ “small group work and discussion in class,” and ◦ “in-class movies and videos.”  What needs to be improved with the redesign: ◦ “review questions students got wrong on tests; discuss questions that lots of students got wrong,” ◦ “more clicker questions,” ◦ “make PowerPoints available on-line.” *focus group data

109  What was good about the redesign: ◦ “removed pressure to teach every topic in the book,” ◦ “freed up time for scholarly activities, less need for office hours or lecturing an additional hour a week.”  What needs to be improved with the redesign: ◦ “faculty need training in technology, use of platforms, and software,” ◦ “accuracy of online test banks.” *faculty interview data

110  Decreased the number of sections required each semester and increased class size.  Decreased withdrawal rates; retaining students.  Decreased the need for two faculty each year; free to offer an additional upper level course each semester.  Freed up classroom space for the University to teach other courses on campus.  Decreased the need for graduate teaching assistants from 2 grad students to 1.5 grad students.  Leveraged existing resources to fund undergraduate Peer Mentors.

111  Dr. Megan Bradley, Frostburg State University

112 USM Course Redesign 2 Workshop 2011 Dr. Megan E. Bradley, Professor of Psychology @ Frostburg State University mbradley@frostburg.edu

113  Psyc150: General Psychology ◦ Course characteristics  Annual enrollment: about 900  Mostly traditional students and 1st-year students  Required course for Psychology majors and 5 other majors ◦ Academic issues  Course drift & inefficiency  Financial difficulties at the University level

114  Primary Team (who were also tech savvy): ◦ Coordinator  FT Faculty member who did not teach PSYC 150 at FSU ◦ 1 FT faculty member  Regularly taught PSYC 150 ◦ 1 Adjunct instructor  Instructional Designer & regularly taught PSYC 150  Administration: ◦ Dean ◦ Associate Provost  Publisher: ◦ Worth

115  Chose Replacement Model  Pilot semester ◦ Comparison:  2 traditional sections (N=42)  2 redesign sections (N=99) ◦ ULAs worked with Redesign Instructors via Independent Study  Full Implementation ◦ Tripled capacity (N = 150) ◦ Began ULA course ◦ 2nd semester: Began Netbook Lab

116 Pilot Comparisons
Group | N compared to prior # | Admit GPA* | FSU GPA*
Traditional | 16% smaller | 3.11 | 2.76
Redesign | 100% larger | 2.92 | 2.17
*Significantly different: p = .005 (η² = .027), p = .000 (η² = .075)

117  Course drift eliminated ◦ Standard course, syllabus, schedule, grading, etc.

118 Impact on Student Learning  Success in the course

119 Mean Test Scores (final exam)  A one-way ANOVA of section on final exam percentage grades was significant, F = 23.251, p = .000, η² = .090. Also significant with GPA as covariate: F = 29.192, p = .000, η² = .11.  Instructors were blind to exam content.

120 Section | Admit GPA | FSU GPA | Mean % on Final Exam
Section 1 (Traditional) | 3.02 | 2.72 | 67.5%
Section 2 (Traditional) | 3.21 | 2.81 | 69%
Section 3 (Redesign) | 2.95 | 2.46 | 75.3%
Section 4 (Redesign) | 2.89 | 2.08 | 75%
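A sketch of the one-way ANOVA reported on slide 119, using scipy with simulated per-student scores centered on the section means in the table above (the individual scores, spreads, and section sizes are assumptions; the GPA-covariate analysis is not reproduced here).

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Simulated final-exam percentages per section; means taken from the table.
sections = [
    rng.normal(67.5, 10.0, 40),  # Section 1, Traditional
    rng.normal(69.0, 10.0, 40),  # Section 2, Traditional
    rng.normal(75.3, 10.0, 50),  # Section 3, Redesign
    rng.normal(75.0, 10.0, 50),  # Section 4, Redesign
]

f_stat, p = f_oneway(*sections)
print(f"F = {f_stat:.2f}, p = {p:.3g}")
```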

121  Final exam scores positively correlated with average scores on Mastery Quizzes ◦ r = .523, p = .000

122  Comprehensive final exam too much ◦ 3 unit exams ◦ Reduced overall coverage  “Deadline Disorder” ◦ 5 different activities reduced to the 3 types that helped students ◦ 2 weeks to complete  Students not ready for blended design ◦ Required computer lab 1x/wk

123 Full Implementation Results: Mean Test Scores on 43 Common Questions (3 exams)  A one-way ANOVA of section (3 total) on common question percentage was significant, F = 25.852, p = .000, η² = .825.  Instructors were blind to exam content.

124 Mean Test Scores (essays)  Mode ◦ Pre-Essay = 0 or 1 ◦ Post-Essay = 0, 1, 2, 3, 4  A one-way repeated measures ANOVA on essay grades was significant, F = 230.71, p = .000, η² = .420.

125 Impact on Retention  DWF rate

126  Previous average: 12.5% ◦ 18% prior to pilot  Pilot Semester ◦ Traditional sections: 4% ◦ Redesign sections: 22%  Full Implementation - Fall ◦ 12.8%

127 Other Impact on Students  ULAs

128  “Field Experience” course for top students  Leadership in Psychology Certificate Program  Supplemental Instructor (SI) ◦ Receive additional training based on national SI program  Interning as a ULA ◦ Research experience included

129  Future opportunities ◦ Commencement Speaker  Spring 2010 & Fall 2010 speakers were ULAs ◦ Graduate School  Teaching or Research Assistantship ◦ Prestigious Internships  Most recent: Sloop Institute for Excellence in Leadership

130 Impact on Cost Savings  Formula rating  Departmental implication

131  NCAT formula: cost-per-student dropped from $89 to $26  Departmental implication: ◦ 1 FT faculty position  Biggest cost effectiveness: ◦ Tripling capacity in class ◦ Staffing

132  Use of savings ◦ Realized:  Communication Response System Implementation (“clickers”)  Netbook lab  Yearly support for Tamarin Colony  Support for Coordinator  Payment to lab assistants, SI, intern

133  Stan Jakubik, University System of Maryland

134 Stan Jakubik, Asst Vice Chancellor, USM

135  3 Course Redesign Faculty cohorts over 4 years  Cohort 1 began this Fall with a pilot in Fall 2011; full course implementation Spring 2012  Cohort 2 begins Fall 2011 with target pilot Fall 2012; full course implementation Spring 2013  Cohort 3 – same pattern

136  Cohort 1 awarded funding to 12 courses  Cohort 2: at least 10  Cohort 3: at least 10  Total number of courses might be expanded in each cohort, dependent on the ability to raise additional funding

137  RFP handout outlines all requirements  Short application required by Thanksgiving break  Decisions on awards by 2nd week of December  Decisions made by USM and CR Faculty Fellows

138  September 27/28, 2011: Introductory Workshop for Cohort II  Thanksgiving Break, 2011: Concept Papers Due  2nd Week of December 2011: Participant Selections Announced  2nd Week of January 2012: Planning and Implementation Workshop  Spring 2012: Full Proposal Due (date TBD)  Fall 2012: Implementation of Project Pilot  October 2012: Introductory Workshop for Cohort III  January 2013: Combined Workshop for Cohorts II and III  Spring 2013: Full Course Implementation  August 1, 2013: Final Report Due

