
1 Jim Julius SDSU Course Design Institute May 27, 2009

2 Guiding Questions Why collect formative feedback on course design? How should one decide what kind of feedback to seek? What tools are available to collect feedback? What do I do with the data?

3 What (and why) are you measuring? Summative Assessment of Student Learning Formative Assessment of (for) Student Learning Formative Evaluation of Course Design

4 What (and why) are you measuring? Outcomes (tell you what you got, not how or why), inputs, processes. Seeking continuous improvement; approaching course design from an inquiry mindset.

5 Outcomes Satisfaction Retention Success Achievement External proficiencies Real-world performance

6 Inputs Learner characteristics Context Design Learning resources Faculty development

7 Processes Pedagogies Presentation media Assignments/assessments Student use of technologies Community of Inquiry model (social, cognitive, teaching presence) Interactions (content, peers, instructor, technology itself)

8 Community of Inquiry Model

9 CoI - Interactions

10 Narrowing Your Inquiry Do you want to evaluate your course according to best practices, i.e. standard course design quality criteria? Do you want to know more about your learners in general: needs, preferences, motivation, satisfaction? Do you want to focus on student achievement? Do you want feedback on your facilitation of learning? Do you want feedback on specific course elements and/or technologies?

11 Course Design Quality Criteria Chico rubric; Quality Matters; criteria related to Chickering and Gamson's 7 Principles for Good Practice in Undergraduate Education, from Indiana University (2001) and from VCU (2009); paid tool: Flashlight

12 Learning about Learners
Direct: learning styles surveys; parallel faculty-student surveys (ELI student and faculty surveys; SDSU's LRS faculty and student surveys, adapted from LITRE at NC State; Distance Education Learning Environment faculty and student surveys)
Indirect: national and institutional data (aggregate); institutional data (for your learners); LMS data

13 Student Achievement
Direct: low-stakes checks (muddiest point, minute papers, clickers, discussion boards); pre- and post-tests
Indirect: grade data; attendance/participation; outcome comparisons (different technology/pedagogy with the same outcome, or the same technology/pedagogy with different outcomes)

14 Teacher Behaviors/Overall
Direct: Community of Inquiry Survey; Small Group Analysis; mid-semester surveys; end-of-course evaluations; assessing online facilitation; paid: IDEA survey of student ratings of instruction
Indirect: observation protocols

15 Course Elements
Direct: Student Assessment of Learning Gains (SALG); clicker opinions survey
Indirect: examine usage data from Blackboard (a sketch follows below)
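The slides do not spell out how Blackboard usage data would be examined. As one possible approach, here is a minimal sketch that summarizes how often each course element is accessed, assuming a hypothetical CSV export named bb_activity.csv with student_id, item, and clicks columns (the file name and columns are illustrative, not an actual Blackboard format):

```python
# Minimal sketch: summarize a hypothetical LMS usage export.
# Assumes a CSV "bb_activity.csv" with columns: student_id, item, clicks.
# These names are illustrative placeholders, not a real Blackboard schema.
import csv
from collections import defaultdict

clicks_per_item = defaultdict(int)
students_per_item = defaultdict(set)

with open("bb_activity.csv", newline="") as f:
    for row in csv.DictReader(f):
        item = row["item"]
        clicks_per_item[item] += int(row["clicks"])
        students_per_item[item].add(row["student_id"])

# Report the most-used course elements first.
for item, total in sorted(clicks_per_item.items(), key=lambda kv: -kv[1]):
    print(f"{item}: {total} clicks by {len(students_per_item[item])} students")
```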

16 Data from M. Laumakis, pICT fellow in 2005. Began teaching parallel 500-student sections of PSYCH 101 in 2006, one traditional and one hybrid. First fully online PSYCH 101 offered in Summer 2008.

17 Evaluating the Face-to-Face Class Evaluated Fall 2005 innovations via the Student Assessment of Learning Gains (SALG): "How much did the following aspects of the class help your learning?" Rated from 1 (no help) to 5 (great help).

18 Evaluating the Face-to-Face Class What did the data show? (mean ratings, 1–5)
ConceptCheck Questions: 4.1
Discussion Boards: 2.9 (MWF section), 3.1 (TTh section)
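A SALG item score like the ones above is simply the mean of students' 1–5 ratings for that item within a section. A minimal sketch of that calculation, assuming a hypothetical list of (section, item, rating) responses rather than the actual survey data:

```python
# Minimal sketch: mean SALG rating per item per section.
# The sample responses are illustrative, not the real Fall 2005 data.
from collections import defaultdict

responses = [
    ("MWF", "ConceptCheck Questions", 4), ("MWF", "ConceptCheck Questions", 5),
    ("MWF", "Discussion Boards", 3), ("TTh", "Discussion Boards", 3),
    ("TTh", "Discussion Boards", 4),
]

ratings_by_cell = defaultdict(list)
for section, item, rating in responses:
    ratings_by_cell[(section, item)].append(rating)

for (section, item), ratings in sorted(ratings_by_cell.items()):
    print(f"{item} ({section}): {sum(ratings) / len(ratings):.1f}")
```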

19 Evaluation Findings: IDEA Diagnostic Survey

20 Evaluation Findings: IDEA Diagnostic Survey
Criterion | Fall 2006 Blended | Fall 2006 Traditional | Spring 2007 Blended | Spring 2007 Traditional
Progress on objectives | 70 | 73 | 77 |
Excellent teacher | 65 | 68 | 69 | 68
Excellent course | 62 | 72 | 73 | 71
Note: Top 10% = 63 or more

21 Evaluation Findings: Departmental Course Evaluations

22 Evaluation Findings: Course Grades, Fall 2007

23 Clicker Data: Spring 2007 (% agree or strongly agree)
Class clicker usage makes me more likely to attend class: 93%
Class clicker usage helps me to feel more involved in class: 84%
Class clicker usage makes it more likely for me to respond to a question from the professor: 91%
I understand why my professor is using clickers in this course: 90%
My professor asks clicker questions which are important to my learning: 90%
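The percentages above are the share of respondents choosing "agree" or "strongly agree" on a 5-point scale. A minimal sketch of that tally, assuming hypothetical Likert responses coded 1–5 (4 = agree, 5 = strongly agree), not the actual Spring 2007 data:

```python
# Minimal sketch: percent "agree or strongly agree" per clicker statement.
# The response lists are illustrative placeholders.
likert = {
    "Clicker usage makes me more likely to attend class": [5, 4, 4, 5, 3, 4],
    "Clicker usage helps me feel more involved in class": [4, 3, 5, 4, 2, 5],
}

for statement, ratings in likert.items():
    agree = sum(1 for r in ratings if r >= 4)  # 4 or 5 counts as agreement
    print(f"{statement}: {100 * agree / len(ratings):.0f}% agree or strongly agree")
```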

24 Summer 2008 Fully Online: SALG Data "How much did the following aspects of the class help your learning?" Rated from 1 (no help) to 5 (great help).

25 Summer 2008 Fully Online: SALG Data (Summer 2008 Online section)
Taking the test online: 4.27
Discussion Forums: 3.00
Introduction e-mail that explained the basics of the course: 4.50

26 SALG Data over time
Question | Fall 2007 Blended | Fall 2007 F2F | Spring 2008 Blended | Spring 2008 F2F | Summer 2008 Online
Questions, answers, and discussions in class | 3.96 | 4.04 | 4.10 | 4.01 | 4.36
Live online class sessions | 3.39 | | 4.20 | | 4.15
Archives of live online class sessions | 4.15 | | 4.50 | | 4.44
Quality of contact with the teacher | 3.41 | 3.48 | 3.94 | 3.90 | 4.26
Working with peers outside of class/online | 3.12 | 3.22 | 3.31 | 3.39 | 3.82

27 Summer 2008: Community of Inquiry Survey Statements rated from 1 (strongly disagree) to 5 (strongly agree). Based on the Community of Inquiry framework's three elements: 1. Social Presence 2. Cognitive Presence 3. Teaching Presence

28 Summer 2008: Community of Inquiry Survey (student ratings)
Social Presence: 3.94 (Affective Expression 3.56, Open Communication 4.29, Group Cohesion 3.97)
Cognitive Presence: 3.96 (Triggering Event 3.91, Exploration 3.73, Integration 4.09, Resolution 4.10)
Teaching Presence: 4.38 (Design and Organization 4.50, Facilitation 4.38, Direct Instruction 4.23)
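Scores like these come from averaging item-level responses within each sub-scale and then rolling sub-scales up into the three presences. A minimal sketch of that aggregation, assuming a hypothetical mapping of item IDs to sub-scales and made-up responses (the published CoI instrument has its own item numbering, which is not reproduced here):

```python
# Minimal sketch: roll item-level CoI responses up into sub-scale and presence means.
# Item IDs, the item-to-subscale mapping, and responses are illustrative only.
from statistics import mean

subscales = {
    "Teaching Presence": {"Design and Organization": ["q1", "q2"],
                          "Facilitation": ["q3", "q4"],
                          "Direct Instruction": ["q5"]},
    "Social Presence": {"Affective Expression": ["q6"],
                        "Open Communication": ["q7"],
                        "Group Cohesion": ["q8"]},
}

# One dict per respondent: item id -> rating on the 1-5 agreement scale.
responses = [
    {"q1": 5, "q2": 4, "q3": 4, "q4": 5, "q5": 4, "q6": 3, "q7": 4, "q8": 4},
    {"q1": 4, "q2": 5, "q3": 4, "q4": 4, "q5": 5, "q6": 4, "q7": 5, "q8": 3},
]

for presence, groups in subscales.items():
    sub_means = {name: mean(r[i] for r in responses for i in items)
                 for name, items in groups.items()}
    # Presence score here is the mean of its sub-scale means (one possible convention).
    print(f"{presence}: {mean(sub_means.values()):.2f}")
    for name, m in sub_means.items():
        print(f"  {name}: {m:.2f}")
```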

29 So, what would you like to further explore?

