Developing a Pedagogical Model for Evaluating Learning Objects
Dr. Robin Kay and Dr. Liesel Knaack
University of Ontario Institute of Technology, Oshawa, Ontario
University of Ontario Institute of Technology
- Opened Sept.
- An hour east of Toronto
- Focus on math & science
- Faculty of Education: 70 students (Year 1), 90 students (Years 2 & 3), 225 students (Year 4)
- 3000 students in total
- Ubiquitous computing
Overview
- Background
- Defining Learning Objects
- Previous Research
- Our Approach
- Our Scale
- Sample & Procedure
- Results
- Conclusions
- Future Research
Background
- A review of 58 articles: 11 studies focussed on evaluation, but only 2 evaluated learning
- The “learning object” revolution will never take place unless instructional use and pedagogy are explored and evaluated (Muzio, Heins & Mundell, 2002; Richards, 2002; Wiley, 2000)
Defining Learning Objects
- The majority of researchers have emphasized technological issues: accessibility, adaptability, the effective use of metadata, reusability, and standardization
- A second definitional pathway is emerging, based on learning
- A question of values among learning object developers and designers, programmers, and educators
Our Definition (Values)
For our study, learning objects were defined as:
“interactive web-based tools that support the learning of specific concepts by enhancing, amplifying, and guiding the cognitive processes of learners”
Previous Research – 6 features
1. Description of Learning Object: clear, often provide links; varied (assessment, tutorials, interactivity)
2. Multiple Sources of Data: surveys, interviews, e-mail, tracking use, think-aloud protocols
3. Focus on One Learning Object
Previous Research – 6 features (continued)
4. Sample Size: small, limited description, exclusively higher education
5. Reliability & Validity: none
6. Formal Statistics: largely absent; emphasis placed on qualitative data
Our Approach
- A large, diverse sample of secondary school students
- Reliability and validity estimates calculated
- Formal statistics were used where applicable
- Specific learning object features based on instructional design research were examined
- A range of learning objects was tested
- Evaluation criteria focussed on the learner, not the technology
Our Scale – Reliability = .87
7-point Likert scale:
1. The learning object has some benefit in terms of providing me with another learning strategy/another tool.
2. I feel the learning object did benefit my understanding of the subject matter’s concept/principle.
3. I did not benefit from using the learning object.
4. I am interested in using the learning object again.
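As an aside, note that item 3 is negatively worded, so a total score would normally reverse-code it before averaging. A minimal sketch of that scoring step (hypothetical data and function name; the slides do not specify the scoring procedure):

```python
# Illustrative sketch, NOT the study's actual scoring code.
# Scores the 4-item, 7-point Likert scale. Item 3 ("I did not
# benefit...") is negatively worded, so it is reverse-coded
# (8 - response on a 1-7 scale) before averaging.

def score_response(items):
    """items: list of four 1-7 ratings, in question order 1-4."""
    q1, q2, q3, q4 = items
    q3_rev = 8 - q3  # reverse-code the negatively worded item
    return (q1 + q2 + q3_rev + q4) / 4.0

student = [6, 5, 2, 6]  # hypothetical ratings for items 1-4
print(score_response(student))  # -> 5.75
```

A higher average then consistently means a more positive evaluation across all four items.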
Our Scale – Part 2
Quality – Content Analysis (ID)
5. You used a digital learning object on the computer. Tell me about this experience when you used the object.
   a) What did you like? (found helpful, liked working with, what worked well for you)
   b) What didn’t you like? (found confusing, or didn’t like, or didn’t understand)
Perceived Benefit – Content Analysis
6. Do you think you benefited from using this particular learning object? Do you think you learned the concept better? Do you think it helped you review a concept you just learned? Why? Why not?
Sample and Procedure
- 211 students, grades 9-12, 12 different high schools
- 30 teachers: 21 pre-service, 9 experienced
- Each teacher used one of 5 learning objects: Biology, Computer Science, Chemistry, Math, Physics
- Used the learning object in a 70-minute period
Results – Scales
- Scale was reliable (r = .87)
- Correlation between quantitative scale and qualitative results (r = .64, p < .001) – criterion validity
- Rating of qualitative data: 100% agreement on quality and benefits content analysis
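The two statistics reported here are an internal-consistency coefficient for the scale and a Pearson correlation between the quantitative scale and the coded qualitative ratings. A stdlib-only sketch of how such values are computed (the data and function names below are hypothetical, and the slides do not state which reliability formula was used; Cronbach's alpha is the usual choice for Likert scales):

```python
# Illustrative sketch with hypothetical data -- not the study's code.
from statistics import mean, pvariance

def cronbach_alpha(item_scores):
    """Internal consistency. item_scores: one list of responses per item."""
    k = len(item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]  # per-student totals
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical: 4 items x 5 students, plus coded qualitative ratings.
items = [[5, 6, 3, 7, 4], [5, 5, 2, 7, 4], [6, 5, 3, 6, 5], [4, 6, 2, 7, 3]]
qual = [3, 4, 1, 5, 2]
scale_totals = [sum(resp) for resp in zip(*items)]
print(cronbach_alpha(items), pearson_r(scale_totals, qual))
```

Criterion validity here is simply the correlation between the scale totals and an independent rating of the same construct, which is what `pearson_r` reports.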
Results – LO Quality
[Table: Category / n / Mean / Std. Deviation; values not recoverable from this text]
Categories: Animations; Interactivity; Useful; Assessment; Graphics; Theme/Motivation; Organization; Learner Control; Help Functions; Clear Instructions; Difficulty; Information Correct; Audio
Results – LO Benefits
[Table: Reason / n / Mean / Std. Deviation; values not recoverable from this text]
Reasons: Fun/Interesting; Visual Learners; Interactive; Learning Related; Good Review; Computer Based; Compare to Another Method; Timing; Clarity; Not Good at Subject
Results – Comparing LOs
[Table: LO / Quality (Q5) Mean, S.E. / Benefit (Scale) Mean, S.E. / Benefit (Q6) Mean, S.E.; values not recoverable from this text]
Rows: Biology; Chemistry; Computer Science; Math; Physics
Results – Focus Group
- Biology, Chemistry, Computer Science: majority of suggestions were cosmetic
- Math & Physics: suggestions for change based on learning
Conclusions
- Formative analysis is just the beginning
- Data collection instruments were reliable and valid
- LO qualities: research on instructional design categories provided a good base
Conclusions (continued)
- LO benefits: graphics and interactivity; learning is important to students
- Comparing LOs: tools sensitive to differences among LOs
Future Research
- Developed a new scale based on qualitative content analysis and further review of the instructional design literature
- Recently tested on grade 7 & 8 math students
- Being tested in 48 classes, 1200 students, grades 5-12
- Further tests with 24 teachers, focussing on credit recovery and special needs students
Contact Information
Robin Kay
Liesel Knaack