Evaluating Blended Learning in a Large Introductory Psychology Course

1 Evaluating Blended Learning in a Large Introductory Psychology Course
Talk outline:
- My courses: face-to-face elements and online elements
- The evaluation of these courses: tools/techniques and findings/results
- The latest at SDSU and take-home messages

Mark A. Laumakis, Ph.D.
San Diego State University
Lecturer, Department of Psychology
Faculty in Residence, Instructional Technology Services

2 What I Teach: Mega Courses
- Two 500-student sections of Psychology 101 (Introductory Psychology):
  - One fully face-to-face (traditional)
  - One in a blended learning format (45% online)
- Give a psychologist two groups and he/she is bound to do something to one and not the other.

3 Setting the Stage
- Spent Summer 2006 redesigning Psych 101 for a blended learning format
- The Sloan-C Workshop on Blended Learning (2005) defined blended learning as "courses that integrate online with face-to-face class activities in a planned, pedagogically valuable manner; and where a portion (institutionally defined) of face-to-face time is replaced by online activity" (Laster, Otte, Picciano, and Sorg)
- Utilized fundamental principles of instructional design
- Employed a scholarship-of-teaching approach, focused on inquiry and gathering data

4 Face-to-Face Classes
- Extensive use of CPS clickers:
  - ConceptCheck questions
  - Attendance
  - Demonstrations
  - Anonymous polling
  - Predicting outcomes
  - Peer instruction (Mazur)
- Extensive use of multimedia: videos, demonstrations, and simulations from the text and the web

5 Clicker ConceptCheck Question

6 Clicker Results Chart

7 Clicker Data: Spring 2008
Percentage of students who agree or strongly agree with each statement:
- Class clicker usage makes me more likely to attend class: 92%
- Class clicker usage helps me to feel more involved in class: 84%
- Class clicker usage makes it more likely for me to respond to a question from the professor.
- I understand why my professor is using clickers in this course: 94%
- My professor asks clicker questions which are important to my learning.

8 Online Sessions
- Delivered via Wimba Live Classroom
- Live sessions were archived for later viewing
- Sessions included mini-lectures, demonstrations, and polling questions
- Feedback gathered at the end of each session via polling questions

9 Wimba Classroom Interface

10 Polling Question in Wimba Classroom

11 Review of Key Tools
- Face-to-face classes: PowerPoint, CPS clickers, Tablet PC
- Online sessions: Wimba Live Classroom

12 Evaluating Blended Learning
Some background information:
- Evaluation led by Marcie Bober, Ph.D. (Educational Technology)
- Efforts supported by Academic Affairs, Instructional Technology Services, and the College of Sciences
- The initial evaluation is part of an ongoing evaluation process: course (re)design is an iterative process, with a focus on continuous improvement

13 Evaluation Tools and Strategies
A multimethod approach included the following:
1. Week 7 "How's It Going?" online survey
2. In-class observations
3. IDEA Diagnostic Survey
4. Student focus groups
5. Departmental course evaluations
6. Course grades
The goal is to complement traditional performance measures with student perceptions (via surveys and focus groups) and observations (both in the classroom and online). This talk covers the IDEA survey, the departmental course evaluations, and the course grades (items 3, 5, and 6), not the "How's It Going?" survey, the in-class observations, or the student focus groups (items 1, 2, and 4).

14 Evaluation Findings: IDEA Diagnostic Survey
- Normed, standardized instrument
- Focuses on student perceptions of progress on learning outcomes identified by the instructor
- Provides comprehensive, comparative, and longitudinal reporting

15 Evaluation Findings: IDEA Diagnostic Survey
Converted scores for the four sections (Fall 2006 Blended, Fall 2006 Traditional, Spring 2007 Blended, Spring 2007 Traditional), compared to Psychology classes in the IDEA national database:
- Progress on objectives: 70, 73, 77
- Excellent teacher: 65, 68, 69
- Excellent course: 62, 72, 71
IDEA score bands (see the classification sketch below):
- Top 10% = 63 or more
- Next 20% = 56-62
- Middle 40% = 45-55
- Next 20% = 38-44
- Lowest 10% = 37 or less
Clearly outstanding results (note: top 10% = 63 or more).
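As a rough illustration of how a converted IDEA score maps onto the bands listed above, here is a minimal sketch in Python; the function name idea_percentile_band is my own, not part of the IDEA instrument.

```python
def idea_percentile_band(score: float) -> str:
    """Classify an IDEA converted score into the bands listed on the slide."""
    if score >= 63:
        return "Top 10% (63 or more)"
    if score >= 56:
        return "Next 20% (56-62)"
    if score >= 45:
        return "Middle 40% (45-55)"
    if score >= 38:
        return "Next 20% (38-44)"
    return "Lowest 10% (37 or less)"

# Example: the "Progress on objectives" score of 70 reported on the slide
print(idea_percentile_band(70))  # -> Top 10% (63 or more)
```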

16 Evaluation Findings: Departmental Course Evaluations
- Differences from Fall 2006 largely leveled out in Spring 2007
- Positive items: presentation style, responsive/helpful, stimulating interest, and the summary instructor rating
- Negative item: testing/assessment

17 Evaluation Findings: Fall 2006 Course Grades
- In Fall 2006, students in the traditional section significantly outperformed students in the blended section
- 50-point SAT difference in favor of the traditional section (nonequivalent groups)
- Grades were out of 700 points (see the sketch below):
  - 4 tests at 120 points each (480 points)
  - 12 online quizzes (homework) = 120 points
  - Clicker points = 100 (attendance = 40, participation = 60)
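To make the arithmetic of the grading scheme concrete, here is a minimal sketch; the percent_grade helper and the example point totals are hypothetical, not taken from the course.

```python
# Point breakdown from the slide: 4 tests x 120, quizzes worth 120, clickers worth 100
TEST_POINTS = 4 * 120        # 480 points
QUIZ_POINTS = 120            # 12 online quizzes (homework)
CLICKER_POINTS = 40 + 60     # attendance (40) + participation (60)
TOTAL_POINTS = TEST_POINTS + QUIZ_POINTS + CLICKER_POINTS
assert TOTAL_POINTS == 700

def percent_grade(tests: float, quizzes: float, clickers: float) -> float:
    """Hypothetical helper: convert earned points into a course percentage."""
    return 100 * (tests + quizzes + clickers) / TOTAL_POINTS

# Example: a student earning 400 test points, 100 quiz points, and 90 clicker points
print(round(percent_grade(400, 100, 90), 1))  # 84.3
```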

18 Evaluation Findings: Spring 2007 Course Grades
Differences were negligible by Spring 2007

19 Evaluation Findings: Course Grades Fall/Spring Combined

20 Evaluation Findings: Fall 2007 Course Grades

21 Evaluation Findings: Spring 2008 Course Grades

22 Summary of Course Grade Data

23 The Learning Continuum
[Figure: a continuum running from conventional face-to-face classes to entirely online classes, with the axis labeled "Percentage and/or quality of online learning activities" and marked at 20%, 40%, 60%, and 80%.]

24 Blended Learning = “The Sweet Spot”
[Figure: the same continuum from conventional face-to-face classes to entirely online classes (20% to 80%), with blended learning positioned between the two extremes as the "sweet spot".]
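For illustration only, here is a sketch of how a course might be placed on this continuum by its share of online activity; the classify_course function and the exact 20%/80% cutoffs are assumptions read off the slide's axis, not definitions given in the talk.

```python
def classify_course(percent_online: float) -> str:
    """Rough placement on the learning continuum shown on the slides.
    The 20% and 80% cutoffs are assumed from the axis labels."""
    if percent_online <= 20:
        return "Conventional face-to-face class"
    if percent_online < 80:
        return "Blended learning (the 'sweet spot')"
    return "Entirely online class"

# Example: the blended Psych 101 section described earlier is about 45% online
print(classify_course(45))  # -> Blended learning (the 'sweet spot')
```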

25 What's the Latest?
- Introduction of more blended learning courses at SDSU
- Students now seek out the blended learning section
- Continued evolution of the online sessions: less lecture; more demonstrations, simulations, and polling questions
- Fully online Psych 101 course in Summer 2008
- Course enrollment of 66 students vs. an average of 46 over the previous 5 years (traditional face-to-face course)
- D/F rate dropped from 14.1% to 11.0%
- Spring 2008 blended section filled more quickly than the traditional section (first time ever)
- Performance has been indistinguishable for Fall 2007 and Spring 2008 (Fall 2006 appears to have been an aberration)

26 Lessons Learned
- Yes, you can do blended learning in a mega course!
- Course redesign takes time and effort; support is key
- Moving to a blended learning format does NOT mean simply moving your face-to-face course online: you must change the way you teach
- Provide a rationale to students for why you're doing what you're doing
- Predict problems with the technology

