
STUDENT AND FACULTY PERCEPTIONS OF ONLINE COURSE QUALITY
Robert C. Sipes, EdD, ATC, CSCS
Director, Athletic Training; UWO Kinesiology Department; UW System Teaching Fellow

INTRODUCTION
Since 2003, enrollment in online courses and programs has grown 358% and now includes more than 6 million students enrolled in at least one online course.1 Over the same period, the proportion of faculty who "fully accept" online learning has remained relatively constant (~30-32%) despite the growth in offerings and enrollment, and one third of academic leaders believe online instruction to be inferior to face-to-face classrooms.1 Faculty perceptions of online instruction have been investigated by many researchers, although not usually with respect to course quality.2-4 Several authors have developed models to assess the quality of online learning, including peer review/observation,5,6 the Community of Inquiry model,7 and the META model;8 however, the research-based Quality Matters Program rubric appears to be the current standard in higher education.9 Quality Matters (QM) assessment is "based on recognized best practices, built on the expertise of instructional designers and experienced online teachers, and supported by distance education literature and research."9

PURPOSE
The purpose of this project is to assess the state of online learning on our campus and to evaluate faculty and student perceptions of the quality of online courses.

METHODS
Subjects
- 2 experienced online instructors at UWO (1 male, 1 female)
- 12 online learners who completed valid surveys (4 males, 8 females), ranging from novice to experienced

Research Design
The design is both descriptive and correlational. First, the researcher reviewed each online course against the Quality Matters scoring rubric to produce a score. Then the faculty member and students associated with each course rated each area of the course on a survey designed for this project and aligned to the eight 2011-2013 QM standards. Upon completion of all courses, data will be reported to describe online course quality according to the QM rubric and then correlated to describe relationships between faculty and student perceptions and the QM score.

RESULTS
To date, 2 courses have been completed, evaluated, and surveyed. The two courses vary on the QM rubric: one received 81 of 95 possible points (85%) and the other received 68 of 95 (72%). The response rate for student perceptions has been low (14/44 = 32%), and 2 surveys were removed for invalid responses. Demographic statistics for the student respondents are shown in Table 1.

While there is not enough data yet to run statistical analysis, the first two courses show a trend toward faculty and student perceptions aligning with the QM rubric scores. Because the faculty, student, and QM rubric instruments do not use the same point scales, all scores are reported as a proportion of possible points: Table 2 shows the faculty, student, and Quality Matters scores for Course 1, and Table 3 shows the same for Course 2. A sketch of the planned correlation calculation follows the tables.

Table 1. Demographic statistics of the student respondents in 2 courses
Characteristic               Course 1 (n=8)    Course 2 (n=4)
Gender                       4F / 4M           4F / 0M
Age (years)                  33 ± 7.35         36.5 ± 1.73
First online course?         3 yes / 5 no      2 yes / 2 no
# of online courses          3.88 ± 4.76       4.75 ± 8.22
Range of # online courses    0 - 12            0 - 17

Table 2. Scores by QM standard for Course 1 (student average, faculty self-rating, and QM rubric), as a proportion of possible points
QM Standard              Student   Faculty   QM Rubric
1. Overview/Intro.        .85       .75       .93
2. Objectives             .80       .75       .80
3. Assessment             .89       .75       .77
4. Course Materials       .81       .75       .83
5. Interaction            .86       .88       .73
6. Course Technology      .84       .83      1.00
7. Learner Support        .83       .88      1.00
8. Accessibility          .95       .75       .78
Total Score               .89       .79       .85

Table 3. Scores by QM standard for Course 2 (student average, faculty self-rating, and QM rubric), as a proportion of possible points
QM Standard              Student   Faculty   QM Rubric
1. Overview/Intro.        .79       .71       .36
2. Objectives             .77       .50       .60
3. Assessment             .81       .75      1.00
4. Course Materials       .81       .75       .83
5. Interaction            .69       .50       .45
6. Course Technology      .75       .58       .67
7. Learner Support        .81       .75      1.00
8. Accessibility          .78       .75      1.00
Total Score               .81       .65       .72
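The correlational step of the design has not been run yet, since only two courses are complete, but the intended calculation is simple to sketch. The snippet below is a minimal illustration rather than the study's analysis code: it assumes Python with SciPy is available, shows the proportion-of-possible-points normalization, and computes Pearson correlations between the perception ratings and the QM rubric scores, using the Course 1 values from Table 2 as example input. The function name proportion_of_points and the choice of Pearson's r are illustrative assumptions, not details specified in the poster.

# Minimal sketch of the planned descriptive/correlational analysis.
# Assumptions: Python 3 with SciPy installed; the variable and function
# names are illustrative and not taken from the study itself.
from scipy.stats import pearsonr

# Per-standard scores for Course 1 as a proportion of possible points
# (values copied from Table 2; list order follows QM standards 1-8).
student   = [0.85, 0.80, 0.89, 0.81, 0.86, 0.84, 0.83, 0.95]
faculty   = [0.75, 0.75, 0.75, 0.75, 0.88, 0.83, 0.88, 0.75]
qm_rubric = [0.93, 0.80, 0.77, 0.83, 0.73, 1.00, 1.00, 0.78]

def proportion_of_points(points_earned: float, points_possible: float) -> float:
    """Normalize a raw point total so scores on different scales are comparable."""
    return points_earned / points_possible

# Total QM rubric scores reported in the poster: 81/95 (~85%) and 68/95 (~72%).
print(f"Course 1 QM total: {proportion_of_points(81, 95):.0%}")
print(f"Course 2 QM total: {proportion_of_points(68, 95):.0%}")

# Pearson correlations between perception scores and the QM rubric scores,
# computed across the eight QM standards of a single course.
r_student, p_student = pearsonr(student, qm_rubric)
r_faculty, p_faculty = pearsonr(faculty, qm_rubric)
print(f"Student perceptions vs. QM rubric: r = {r_student:.2f}, p = {p_student:.2f}")
print(f"Faculty self-rating vs. QM rubric: r = {r_faculty:.2f}, p = {p_faculty:.2f}")

With only eight standards per course, any such correlation would be tentative, which is consistent with the note above that more data are needed before statistical analysis; a rank-based measure such as Spearman's rho (scipy.stats.spearmanr) could be substituted without changing the rest of the workflow.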
DISCUSSION
In this small sample, students rated each course higher than its QM score, while the instructors rated themselves lower than the QM score in both courses. Based on self-assessment (with no statistical analysis yet), faculty appear to know which standards their courses are weaker on. The two courses span a fairly wide range of scores on the QM rubric, demonstrating variation among online courses at UWO. Preliminary results are promising that students and faculty may give honest and accurate responses when surveyed about the quality of online courses.

REFERENCES
1. Allen, I. E., & Seaman, J. (2011). Going the distance: Online education in the United States, 2011. Wellesley, MA: Babson Survey Research Group.
2. Bolliger, D. U., & Wasilik, O. (2009). Factors influencing faculty satisfaction with online teaching and learning in higher education. Distance Education, 30(1), 103-116.
3. Ward, M. E., Peters, G., & Shelley, K. (2010). Student and faculty perceptions of the quality of online learning experiences. International Review of Research in Open and Distance Learning, 11(3), 57-77.
4. Mancuso, J. (2009). Perceptions of distance education among nursing faculty members in North America. Nursing & Health Sciences, 11(2), 194-205. doi:10.1111/j.1442-2018.2009.00456.x
5. Swinglehurst, D., Russell, J., & Greenhalgh, T. (2008). Peer observation of teaching in the online environment: An action research approach. Journal of Computer Assisted Learning, 24(5), 383-393.
6. Wood, D., & Friedel, M. (2009). Peer review of online learning and teaching: Harnessing collective intelligence to address emerging challenges. Australasian Journal of Educational Technology, 25(1), 60-79.
7. Shea, P., & Bidjerano, T. (2008). Measures of quality in online education: An investigation of the Community of Inquiry model and the net generation. Journal of Educational Computing Research, 39(4), 339-361.
8. Dittmar, E., & McCracken, H. (2012). Promoting continuous quality improvement in online teaching: The META model. Journal of Asynchronous Learning Networks, 16(2), 163-175.
9. Quality Matters. (2013). Introduction to the Quality Matters Program. Retrieved April 1, 2014, from https://www.qualitymatters.org/sites/default/files/Introduction%20to%20the%20Quality%20Matters%20Program%20HyperlinkedFinal2014.pdf

ACKNOWLEDGMENTS
Special thanks to OPID and UW Oshkosh for providing funding throughout the year in support of this Teaching Fellow opportunity.

