
1 Welcome

2  Dr. Donald Spicer, USM

3  The problem ◦ Large enrollment courses are almost universal and are problematic for students, faculty, and institutions  Long recognized, but no fully acceptable solutions were available  The Pew Charitable Trusts funded a study in 1999 of models/strategies to address this ◦ Premise: improving learning outcomes and saving resources need not be mutually exclusive ◦ Appropriate use of technology may be the needed perturbation of the system

4  Initial program was to fund 10 institutions per year for 3 years to undertake model redesigns ◦ All segments of higher education, public and private ◦ A wide range of disciplines  Outcome ◦ Most achieved improved learning outcomes ◦ All saved money in offering the course  Subsequent programs focused on scaling up and/or addressing specific issues (e.g., developmental courses)

5  Five general models were used  Several common characteristics of successful redesigns were identified  Several hundred courses have been redesigned across the country subsequently

6  Regents’ Effectiveness and Efficiency (E&E) Initiative  In 2006, the initiative turned to E&E in the academic sphere  USM became the first university system to adopt Course Redesign system-wide  10 projects funded for 3 years across the USM ◦ Biology, Chemistry, English, Mathematics, Psychology ◦ Developmental to 300-level courses ◦ Each solving a specific ‘problem’

7  All showed improved learning outcomes  Most showed savings in staffing the course  Some one-time capital investments were required  Generally improved student satisfaction  Generally improved faculty satisfaction  Generally departments could reuse savings in supporting other courses  Plaudits from Chancellor, BOR  Notice in Annapolis that USM is taking initiative

8  Needs to be faculty-driven, with administrative support  These are team efforts; it is not enough to have one really committed faculty member  Projects are most successful if there is team enthusiasm about solving one or more real academic or financial problems

9  Chancellor and USM raised money to extend these successes  This is one strategy in a broader discussion of academic transformation

10  One of 3 initiatives funded by the 2009 Carnegie Corporation Academic Leadership Award to Chancellor Kirwan and by additional fundraising  Goal is to scale up Course Redesign based on lessons learned from the first round  Focus on core curriculum and gateway courses  Up to $20,000 from this USM fund, to be matched by the institution ◦ Many projects will not need this much funding

11  Jennifer Frank, USM

12  Part of the Lumina Foundation's “productivity grants” to increase the percentage of Americans with high-quality degrees and credentials to 60% by 2025  States receiving grants (2009-2013) include Arizona, Indiana, Maryland, Montana, Ohio, Tennessee, and Texas  Lumina’s overarching goals through this initiative are to: ◦ Increase and reward completion ◦ Generate and reinvest savings ◦ Educate and train in affordable ways

13  “Growing by Degrees” is a statewide partnership between: ◦ Maryland Governor's Office ◦ Maryland Department of Legislative Services ◦ Maryland Association for Community Colleges ◦ Maryland Higher Education Commission ◦ Maryland Independent College and University Association ◦ University System of Maryland  Our state’s agenda through this initiative is to: ◦ Engage the Governor’s P-20 Leadership Council ◦ Promote cross-institutional collaboration in effectiveness and efficiency (E&E) efforts, both academic and administrative ◦ Support faculty and institutional efforts to redesign bottleneck undergraduate courses

14  Two rounds of funding: Fall 2010 and Fall 2011  20-24 projects total over two cohorts  $20,000 in direct support with $20,000 institutional match  Funding for USM institutions will focus on developmental courses (pilots at University of Baltimore and Frostburg)  Funding for community colleges, independent institutions, and other public four-year institutions will consider bottleneck courses from any discipline (including developmental courses)  Competitive RFP process will be described in the afternoon: ◦ October 25: Concept Paper ◦ December 1: Full Proposal ◦ Cohort 1 selections made in December 2010

15  Faculty experienced in the methodologies of course redesign who will support other faculty course redesign efforts statewide through: ◦ Providing direct consultation to participating institutions ◦ Developing and delivering workshops for faculty cohorts working toward new redesigns ◦ Participating in the evaluation of proposals for course redesign grants ◦ Assisting in the development and content management of a course redesign Web site hosted by USM

16  Raouf Boules, Towson University – MCRI Project: Developmental Mathematics & Intermediate Algebra  Megan Bradley, Frostburg State University – MCRI Project: General Psychology  Ronald Gutberlet, Salisbury University – MCRI Project: Fundamentals of Biology  Jennifer Hearne, University of Maryland Eastern Shore – MCRI Project: Principles of Chemistry I  Eileen O’Brien, University of Maryland Baltimore County – MCRI Project: Introductory Psychology

17  Dr. Ronald Gutberlet, Salisbury University

18 (as developed by NCAT)

19  Redesign the Entire Course  Encourage Active Learning  Provide Students with Individualized Assistance  Build in Ongoing Assessment and Prompt (Automated) Feedback  Ensure Sufficient Time on Task and Monitor Student Progress

20  Benefits ◦ Opportunity to evaluate and focus course goals ◦ Counteract course drift ◦ Capture the best that each instructor has to offer ◦ Reduce duplication of effort ◦ Meaningful and interesting faculty interaction

21  Challenges ◦ Compromise and consensus building ◦ Instructor buy-in across all sections ◦ Time needed for team-building ◦ Can’t be forced ◦ Variation in faculty attitudes toward technology ◦ The more support, the better!  Dept Chair and entire department  Dean; Provost’s Office  Instructional Design

22 How effective is the traditional lecture? – Is anyone listening? – Is everyone listening?

23 Does information transfer still require a lecture in 2010? What are the course goals and does the traditional lecture serve them? Need not take an all-or-nothing approach – How to select an effective combination of lecture and active learning? Can we do a better job of “priming” students for a lecture? – Is anyone reading the textbook?

24  Benefits ◦ Multiple ways for students to engage with the course material ◦ Students decide when to work on the course material, within the framework of scheduled deadlines ◦ Students held accountable for assigned work ◦ Student engagement and satisfaction ◦ Faculty engagement and satisfaction

25  “…easy to want to come to class every time and not fall asleep.”  “This was my favorite course this semester.”  “I like the way this class is conducted better than my friend’s bio classes.”

26  Challenges ◦ What to do with that uncomfortable feeling that you are failing to complete a critical transaction if you do not say something out loud to a room full of students?  Is anyone listening?  Is everyone listening? ◦ Significant paradigm shift, natural skepticism

27

28 Online learning tools Staffed computer labs Peer learning assistants Small group meetings – Discussion sections – Laboratories Supplementary instruction programs Online discussion tools

29  Online quizzes  Sophisticated learning software with problem sets, instant feedback, and assistance  Clickers in the classroom

30 Track the time that students spend online doing coursework (Blackboard, publisher software) Incorporate periodic assessments to encourage completion of online work – Weekly online quizzes in Biol 101 Design an intervention strategy for students who are performing poorly on assessments or who are not spending sufficient time on task – A job for peer learning assistants?
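The time-on-task monitoring and intervention strategy described on the slide above can be sketched as a short script. This is a hypothetical illustration only, not part of the redesign projects: the CSV export, its column names (student, minutes_online, quiz_average), and the flagging thresholds are assumptions rather than features of Blackboard or any publisher software.

import csv

# Hypothetical thresholds; an instructor would tune these per course (assumed values).
MIN_MINUTES_PER_WEEK = 90     # minimum expected weekly time on task
MIN_QUIZ_AVERAGE = 70.0       # minimum expected average on weekly online quizzes

def flag_students(csv_path):
    """Return students whose time on task or quiz average falls below the thresholds."""
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            minutes = float(row["minutes_online"])
            quiz_avg = float(row["quiz_average"])
            if minutes < MIN_MINUTES_PER_WEEK or quiz_avg < MIN_QUIZ_AVERAGE:
                flagged.append((row["student"], minutes, quiz_avg))
    return flagged

if __name__ == "__main__":
    # Each flagged student could be handed to a peer learning assistant for follow-up.
    for student, minutes, quiz_avg in flag_students("biol101_week5_activity.csv"):
        print(f"{student}: {minutes:.0f} min online, quiz average {quiz_avg:.0f}%")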

31

32

33  Dr. Raouf Boules, Towson University

34  Retains the basic course structure, particularly the number of class meetings  Supplements lectures and textbooks with technology-based, out-of-class activities  Encourages greater student engagement with course content  Ensures that students are prepared when they come to class

35  Replaces (rather than supplements) in-class time with online, interactive learning activities  Carefully considers why (and how often) classes need to meet face-to-face  Assumes that certain activities can be better accomplished online - individually or in groups  May keep remaining in-class activities the same or may make significant changes  May schedule out-of-class activities in computer labs or totally online  Examples: ◦ Towson University: Developmental Mathematics ◦ UMES: Principles of Chemistry

36  Moves all classes to a lab setting  Multiple sections combined into one large section  Depends heavily on instructional software including interactive tutorials  Allows students to work as long as they need to master the content  Permits the use of multiple kinds of personnel  Requires a significant commitment of space and equipment  Can teach more than one course in the lab, thus leveraging the initial investment

37 Traditional (linear algebra)  38 sections (~40 students each)  10 tenured faculty, 13 instructors, 15 GTAs  2 hours per week  $91 cost per student Redesign  1 section (~1,520 students)  1 instructor, grad & undergrad TAs + 2 tech support staff  24/7 in open lab  $21 cost per student Replicated at U of Alabama, U of Idaho, LSU, Wayne State, U Missouri-St. Louis, Seton Hall

38

39

40

41

42  Eliminates all in-class meetings and moves all learning experiences online.  Adopts successful design elements of the Supplemental, Replacement, and Emporium models, including Web-based multimedia resources, commercial software, automatically evaluated assessments with guided feedback, links to additional resources, and alternative staffing models.  Examples: ◦ Rio Salado College: Pre-Calculus Mathematics ◦ U. of S. Mississippi: World Literature

43  Takes into account each student’s knowledge/skill level and preferred learning style  Provides an array of learning opportunities including: lectures, outside reading material, labs, small group study sessions, videos, oral and written presentations, homework assignments, individual and group projects, etc.  Started by Ohio State University for Introductory Statistics ◦ Initially, students are made familiar with the various options through examples made available for them to experience ◦ Then the student selects what suits him/her and signs a contract detailing the choices and what needs to be accomplished

44  Retains the basic structure of the college-level/core course, particularly the number of class meetings  Replaces the remedial/developmental course with just-in-time “workshops”  Workshops are designed to remove deficiencies in core course competencies (for a particular course)  Students are individually assigned software modules based on results of diagnostic assessments/placement  Workshops consist of computer-based instruction, small-group activities and test reviews to provide additional instruction on key concepts

45  Workshops are facilitated by students who have previously excelled in the core course and are trained and supervised by core course faculty  Workshop activities are just-in-time—i.e., designed so that students use the concepts during the next core course class session, which in turn helps them see the value of the workshops and motivates them to do the workshop activities.

46  Supplemental – Add to the current structure and/or change the content  Replacement – Blend face-to-face with online activities  Emporium – Move all classes to a lab setting  Fully online – Conduct all (most) learning activities online  Buffet – Mix and match according to student preferences  Linked Workshop - Replaces the remedial/developmental course with just-in-time “workshops”

47  Dr. Jennifer Hearne, UMES

48 Jennifer L. Hearne, Ph.D. October 7-8, 2010

49  Does your institution want to control or reduce costs?  Does your institution want to increase productivity?  Academic productivity of students  Academic course offerings  Scholarly activity of faculty

50  Is the institution committed to providing the needed support for the redesign project?  Short term AND long term commitment  Scheduling flexibility  Staffing  Technology and support

51  Which courses are candidates for a redesign?  What should be considered in selecting courses?  Will changes in the course have a high impact on the curriculum?

52  High drop-failure-withdrawal rates  Student performance in subsequent courses  Students on waiting lists creating a bottleneck  Student complaints  Departmental complaints  Course drift -> Inconsistent learning outcomes  Difficulty finding qualified adjuncts

53  Are decisions about curriculum made collectively?  Faculty member  Committee  Department  School  Are the faculty able and willing to incorporate existing curricular materials or commercially available materials in order to focus work on redesign issues rather than materials creation?

54  Have the course’s expected learning outcomes and a system for measuring their achievement been identified?  Has the cost of offering the course been examined critically?  Institutional cost  Student cost  Have activities other than teaching been identified that will assist in the measurement of faculty efficiency?  Do the project participants have the requisite skills?  Learning Theory  Technology  Assessment

55  Requisite skills  High impact on student learning, cost savings and faculty efficiency  Willingness to incorporate existing materials  Assessment tools developed  Learning outcomes  Cost savings  Faculty efficiency

56  Dr. Ronald Gutberlet, Salisbury University

57 USM Course Redesign Orientation Workshop 7–8 October 2010

58  Clem Counts, Mark Frana, Sam Geleta, Ron Gutberlet, Mark Holland, Wanda Kelly, Joan Maloof, Claudia Morrison-Parker, Wanda Perkins, Betty Lou Smith, Bob Tardiff, Melissa Thomas  McGraw-Hill (and now W.W. Norton)  Enhancement of online learning in Biology 210 ◦ Kim Hunter, Richard Hunter  Dr. Les Erickson, learning technology guru

59  Three hours of lecture and two hours of lab per week  Approximately 1000 students per year  Gen Ed course for lab science requirement  “Large” lecture sections (72-96 students)  Small lab sections (24 students)  Common lab syllabus  Course drift and duplication of effort in lecture  Engagement issues (faculty and students)

60  One hour of lecture and two hours of lab per week; 2 hours of lecture replaced with online work  Larger lecture sections (120 students)  Fewer lecture instructors needed  Instructors available to help in other courses  No pedagogical difference between 72, 96, and 120?  Small lab sections (24 students)  Shared online component

61  Use of Blackboard to deliver online content that partially replaces traditional lectures ◦ Weekly instructions ◦ Study guide ◦ Online animations, activities, and narrated instruction ◦ Online quiz  Maximized use of lab time for activities, discussion, team contests ◦ Lecture instructor teaches all 5 labs from his/her lecture section (greater integration)  Use of clickers to engage more students, to initiate discussions, to automate some grading, to do team competitions, and to check class comprehension instantly

62  “I really like the mix between online work and class time.”  “It is new and a little hard to get used to, but I like it!”  “I never really liked bio until now.”  “I like the online material…it makes class easier to attend.”  “The breakdown of DNA and protein synthesis is interesting and never taught in my high school.”

63  Dr. Raouf Boules, Towson University

64 Raouf Boules, Ph.D. October 7 – 8, 2010

65 DVMT 101 – Developmental Mathematics (4 contact hours)  DVMT 110 – Intermediate Algebra (3 contact hours)

66  Students with SAT mathematics scores less than 500 and weak placement test scores  Typical academic year enrollment data  DVMT 101: 25 course sections enrolling 650 students  DVMT 110: 17 course sections enrolling 350 students  Total: 42 sections with close to 1000 students  Relative size: 10% of the Department’s academic-year operation

67  Lecture format  DVMT 101: 4 hours  DVMT 110: 3 hours  Taught mainly by part-time faculty  Use common exams with pass/fail grades  Challenges  Students enter with varying background experience and skill levels  Students may move at varying paces  Some may even enroll in DVMT 101 and finish both courses  Some need more individualized attention than others

68  Uses a replacement model where one hour in each of the 2 classes is replaced by at least one mandated hour in an open computer lab  Lab uses interactive learning software with thousands of practice problems and tutorials  Self-paced learning environment with immediate feedback and tutorials  Lab is mainly staffed by Undergraduate Learning Assistants (ULAs), with some graduate Teaching Assistants (TAs) and instructors  Lab focus: providing individualized, on-demand guidance and attention

69  Resistance to change  Large number of part-time instructors involved with long experience in traditional modes of teaching  Isolated bad technology experiences  Initial lack of space for an open computer lab

70  Time to completion  Pass rates (fall 09 data):  DVMT 101: 77% → 85%  DVMT 110: 65% → 74%  Positive student experience (from course evaluations)  Increased faculty enthusiasm  Some cost savings: 18% of the total cost of $150K/year (roughly $27K/year)

71  Dr. Jennifer Hearne, University of Maryland Eastern Shore

72 University of Maryland Eastern Shore Jennifer L. Hearne, Ph.D. October 7-8, 2010

73  University of Maryland Eastern Shore ◦ Thelma B. Thompson, Ph.D., President ◦ Charles Williams, Ph.D., Vice President for Academic Affairs  Goals of the Maryland Course Redesign Initiative ◦ Adopt new methods to improve student learning outcomes ◦ Reduce institutional costs ◦ Release instructional resources for other purposes  MCRI Team at UMES ◦ Joseph M. Okoh, Ph.D., Yan Y. Waguespack, Ph.D., Gladys G. Shelton, Ph.D., Charles Williams, Ph.D., Amelia G. Potter, James R. Hayes

74  Population ◦ Caters to science and health professions students  20% of freshman class ◦ 73% freshmen  Goals ◦ Basic atomic and molecular theory ◦ Nomenclature ◦ Reaction stoichiometry ◦ Gas laws  Academic Issues ◦ Inconsistent knowledge of incoming students ◦ 55% student retention rate ◦ Lack of coordination among the professors teaching the sections of the course, leading to course drift and inconsistent learning outcomes

75 Course | Section Size | Meetings | Sections / Professors per Academic Year | Learning Assistant
Traditional Chemistry 111 | 30–40 | MWF, 50 min | 7 / 6 | No
Pilot Chemistry 111E | Up to 80 | M, 75 min + 2 h in computer lab (recitation offered) | 1 | Yes
Chemistry 111E | Up to 114 | MW, 50 min + 1 h in computer lab | 3 / 2 | Yes

76  Technology Components ◦ Blackboard ◦ CengageNOW (http://login.cengage.com/sso//) ◦ Computer Laboratory  Integrated Staffing ◦ Undergraduate Learning Assistant (ULA) and Learning Assistant (LA)  Individualized, Active Assistance ◦ On-demand assistance ◦ Cumulative grade posted every Monday ◦ CengageNOW grade available at all times

77  Traditional course: 54.5% earned A–C ◦ Of those students, 61.1% enrolled in Principles of Chemistry II ◦ Of those who enrolled in Principles of Chemistry II, 54.5% earned A–C or S  Pilot course: 65.7% earned A–C ◦ Of those students, 61.8% enrolled in Principles of Chemistry II ◦ Of those who enrolled in Principles of Chemistry II, 61.9% earned A–C  In comparison to students enrolled in the Traditional course, Pilot course students were: ◦ 7.4% more likely to earn A–C in Principles of Chemistry II (61.9% vs. 54.5%, a difference of 7.4 percentage points) ◦ More likely to earn A or B grades in Principles of Chemistry II

78

79  Dr. Eileen O’Brien, UMBC

80 University of Maryland Baltimore County Department of Psychology Eileen O’Brien, PhD Redesign of Introductory Psychology University of Maryland System Course Redesign

81 Traditional Introductory Psychology Course  General education requirement course  180+ students per section  7 sections per year  4-credit course  3½ clock hours each week for instruction  15 weeks of class  Large lecture halls  4 unit multiple-choice exams  Lecture style  2 FT Grad TAs (GTAs) managing exams, recording grades, and administrative tasks

82 Traditional Introductory Psychology Course  Failure rate peaked at 15% (grade of F); withdrawal rate as high as 10%  60% class attendance rate  Lack of preparation for class – limited discussion  Overwhelming amount of course content  Faculty are not interested in teaching this course  Poor performance on multiple-choice exams (e.g., class exam mean of 62% on unit 1 exam)  Poor student evaluations re: content, learning, environment

83 Course Redesign Team  Faculty teaching the course* ◦ Lecturers and Adjuncts  Provost  Dean of Arts and Sciences  Department Chair*  Faculty Development Director*  Student Learning Resource Center Director*  Blackboard Administrator*  Evaluator*  Graduate TAs*  Undergraduate Student* (* Review team for timeline tasks)

84 Redesigned Introductory Psychology Course  Decreased the amount of content in class sessions; shift to discussion  Integrated CPS questions to increase interactivity and attendance  Created common multiple-choice exams across sections  Added weekly small group (dyad) activities to increase interactivity  Assigned 1½ GTAs for database management and student support  Sequenced content for a more engaging start  Created Peer Mentors for class activities, tutoring, and exam prep

85  Class GPA for each section  Common exams  Same content sequence  Student evaluations  Faculty evaluation  Peer Mentor utilization

86 Compared to the Traditional section, the Redesign sections showed proportionately fewer Cs, Ds, and Fs and more As and Bs, χ²(4, N = 768) = 44.2, p < .001.

87 Impact on Learning  Redesigned-course data differed from the historical data, with proportionately fewer Cs, Ds, and Fs and a greater percentage of As and Bs, χ²(4, N = 7,766) = 240.6, p < .001.
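A note on the statistic reported on slides 86 and 87: with five letter-grade categories and two groups being compared, df = 4 is consistent with a standard chi-square test of independence on a 2 × 5 table. The slides do not show the underlying grade counts, so the general form is given here only as a reminder:

\[
\chi^2 = \sum_{i,j} \frac{(O_{ij} - E_{ij})^2}{E_{ij}}, \qquad df = (r-1)(c-1) = (2-1)(5-1) = 4,
\]

where the O_{ij} are the observed grade counts for each group and the E_{ij} are the counts expected if letter grade were independent of group.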

88 Course Withdrawal Rates  Fall 2008 had the lowest withdrawal rate (3.2%) documented since 2000. The range has been 4.1% to 10.3%.

89  What was good about the redesign: ◦ “the ability to do labs on-line when ready or to review information online,” ◦ “small group work and discussion in class,” and ◦ “in-class movies and videos.”  What needs to be improved with the redesign: ◦ “review questions students got wrong on tests; discuss questions that lots of students got wrong,” ◦ “more clicker questions,” ◦ “make PowerPoints available on-line.” *focus group data

90  What was good about the redesign: ◦ “removed pressure to teach every topic in the book,” ◦ “freed up time for scholarly activities, less need for office hours or lecturing an additional hour a week.”  What needs to be improved with the redesign: ◦ “faculty need training in technology, use of platforms, and software,” ◦ “accuracy of online test banks.” *faculty interview data

91  Decreased the number of sections required each semester and increased class size.  Decreased withdrawal rates; retaining students.  Decreased the need for two faculty each year; free to offer an additional upper level course each semester.  Freed up classroom space for the University to teach other courses on campus.  Decreased the need for graduate teaching assistants from 2 grad students to 1.5 grad students.  Leveraged existing resources to fund undergraduate Peer Mentors.

92  Dr. Megan Bradley, Frostburg State

93 USM Course Redesign 2 Workshop October 7/8, 2010 Dr. Megan E. Bradley, Associate Professor of Psychology @ Frostburg State University mbradley@frostburg.edu

94  Psyc 150: General Psychology ◦ Course characteristics  Annual enrollment: about 900  Mostly traditional students and 1st-year students  Required course for Psychology majors and 5 other majors ◦ Academic Issues  Course drift & inefficiency  Financial difficulties at the University level

95  Primary Team (who were also tech savvy): ◦ Coordinator  FT Faculty member who did not teach PSYC 150 at FSU ◦ 1 FT faculty member  Regularly taught PSYC 150 ◦ 1 Adjunct instructor  Instructional Designer & regularly taught PSYC 150  Administration: ◦ Dean ◦ Associate Provost  Publisher: ◦ Worth

96  Team-approach  Invited publishers to present products  Support ◦ Department: Off-campus retreat before pilot ◦ Campus-wide support: held 3 Course Redesign Workshops  Created online instructor’s manual ◦ Active learning, clicker questions  Created ULA Course & Certificate Program ◦ Put through at same time as Pilot semester

97  Chose Replacement Model  Pilot semester ◦ Comparison:  2 traditional sections (N = 42)  2 redesign sections (N = 99) ◦ ULAs worked with Redesign Instructors via Independent Study  Full Implementation ◦ Tripled capacity (N = 150) ◦ Began ULA course ◦ 2nd semester: Began Netbook Lab

98  Course drift eliminated  Financial difficulties improved  Demonstrated redesign pedagogy worked  Brought back DWF rate to previous average, stopping trend of increasing failure  Created unique learning experience for ULAs

99 Impact on Student Learning  Success in the course

100 Mean Test Scores  *A one-way ANOVA of section on final exam percentage grades was significant, F = 23.251, p = .000, η² = .090. Also significant with GPA as covariate: F = 29.192, p = .000, η² = .11.  *Instructors blind to exam content
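The effect size reported here and on several of the following slides, eta squared, is the standard proportion-of-variance measure from a one-way ANOVA:

\[
\eta^2 = \frac{SS_{\text{between}}}{SS_{\text{total}}},
\]

so η² = .090 means that section accounted for roughly 9% of the variance in final-exam percentage.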

101 Section | Admit GPA | FSU GPA | Mean % on Final Exam
Section 1 (Traditional) | 3.02 | 2.72 | 67.5%
Section 2 (Traditional) | 3.21 | 2.81 | 69%
Section 3 (Redesign) | 2.95 | 2.46 | 75.3%
Section 4 (Redesign) | 2.89 | 2.08 | 75%

102  Final exam scores positively correlated with average scores on MQs ◦ r = .523, p = .000

103 Mean Test Scores  Mode ◦ Traditional = 0 or 1 ◦ Redesign = 1, 2, 2, 3  *A one-way ANOVA of section on essay grades was significant, F = 6.787, p = .000, η² = .153.

104  Essay score positively correlated with:  Grade on semester-long prejudice project (r = .328, p = .000)  Grade across all online discussions (r = .244, p = .005)

105  Comprehensive final exam too much ◦ 3 unit exams ◦ Reduced overall coverage  “Deadline Disorder” ◦ Reduced to: MQs, Discussions, Prejudice Activity ◦ 2 weeks to complete  Students not ready for blended design ◦ Required computer lab 1x/wk

106  Students needed more in-class assistance ◦ Updated instructor’s manual to include brief direct instruction  Need for more campus-wide support ◦ Held 3 workshops on redesign ◦ Implemented student support services programs  Tutoring, Supplemental Instruction  Wellness initiative

107 Mean Test Scores  Full Implementation Results: 43 Common Questions (3 exams)*  *A one-way ANOVA of section (3 total) on common-question percentage was significant, F = 25.852, p = .000, η² = .825.  *Instructors blind to exam content

108 Mean Test Scores  Factually-Based versus Conceptually-Based Questions  *Factual: F = 18.480, p = .000, η² = .771  Conceptual: F = 23.941, p = .000, η² = .813

109 Mean Test Scores  Mode ◦ Pre-Essay = 0 or 1 ◦ Post-Essay = 0, 1, 2, 3, 4  *A one-way repeated measures ANOVA on essay grades was significant, F = 230.71, p = .000, η² = .420.

110 Impact on Retention  DWF rate

111  Previous average: 12.5% ◦ 18% prior to pilot  Pilot Semester ◦ Traditional sections: 4% ◦ Redesign sections: 22%  Full Implementation - Fall ◦ 12.8%

112 Other Impact on Students  ULAs

113  “Field Experience” course for top students  Leadership in Psychology Certificate Program  Supplemental Instructor (SI) ◦ Receive additional training based on national SI program  Interning as a ULA ◦ Research experience included

114  Future opportunities ◦ Commencement Speaker  Spring 2010 & Fall 2010 speakers were ULAs ◦ Graduate School  Teaching or Research Assistantship ◦ Prestigious Internships  Most recent: Sloop Leadership Institute

115 Impact on Cost Savings  Formula rating  Departmental implication

116  NCAT cost-per-student formula: $89 reduced to $26  Departmental implication: ◦ 1 FT faculty position  Biggest cost effectiveness: ◦ Tripling capacity in class ◦ Staffing
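The NCAT cost-per-student figure is, in essence, the annual cost of offering the course divided by annual enrollment. As a rough, back-of-the-envelope illustration only (the slides report per-student figures, not totals), combining the $89 → $26 change with the roughly 900 students per year cited earlier for this course gives:

\[
900 \times \$89 \approx \$80{,}100 \qquad\longrightarrow\qquad 900 \times \$26 \approx \$23{,}400,
\]

an implied saving on the order of $57,000 per year; the actual totals depend on how NCAT's formula allocates salaries and other costs.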

117  Use of savings ◦ Realized:  Communication Response System Implementation (“clickers”)  Netbook lab  Yearly support for Tamarin Colony  Support for Coordinator  Payment to lab assistants, SI, intern

118  Stan Jakubik, University System of Maryland

119 Stan Jakubik, Asst Vice Chancellor, USM

120  3 Course Redesign Faculty cohorts over 4 years  Cohort 1 begins this Fall with target for pilot in Fall 2011; full course implementation Spring 2012  Cohort 2 begins Fall 2011 with target pilot Fall 2012; full course implementation Spring 2013  Cohort 3 – same pattern

121  Cohort 1 at least 10  Cohort 2 at least 5 – 10  Cohort 3 at least 5 – 10  Total number of courses might be expanded in each cohort dependent on ability to raise additional funding  Why not expand first cohort and do 30 now?

122  RFP handout outlines all requirements  Short application required by December 1  Decisions on awards by end of December  Decisions made by USM and CR Faculty Fellows

123  October 7 & 8, 2010: Introductory Workshop for Cohort I  December 1, 2010: Concept Papers Due  End of December 2010: Participant Selections Announced  2nd Week of January 2011: Planning and Implementation Workshop  Spring 2011: Full Proposal Due (date TBD)  Fall 2011: Implementation of Project Pilot  October 2011: Introductory Workshop for Cohort II  January 2012: Combined Workshop for Cohorts I and II  Spring 2012: Full Course Implementation  August 1, 2012: Final Report Due

