Developing a Pedagogical Model for Evaluating Learning Objects Dr. Robin Kay and Dr. Liesel Knaack University of Ontario Institute of Technology Oshawa, Ontario

University of Ontario Institute of Technology
Opened September 2003
One hour east of Toronto
Focus on math & science
Faculty of Education: 70 students (Year 1), 90 students (Years 2 & 3), 225 students (Year 4)
3000 students in total
Ubiquitous computing

Overview
Background
Defining Learning Objects
Previous Research
Our Approach
Our Scale
Sample & Procedure
Results
Conclusions
Future Research

Background
A review of 58 articles: 11 studies focused on evaluation, but only 2 evaluated learning
The "learning object" revolution will never take place unless instructional use and pedagogy are explored and evaluated (Muzio, Heins & Mundell, 2002; Richards, 2002; Wiley, 2000)

Defining Learning Objects
Majority of researchers have emphasized technological issues: accessibility, adaptability, the effective use of metadata, reusability, and standardization
A second definitional pathway is emerging, based on learning
A question of values: learning object developers and designers, programmers, educators

Our Definition (Values)
For our study, learning objects were defined as:
"interactive web-based tools that support the learning of specific concepts by enhancing, amplifying, and guiding the cognitive processes of learners"

Previous Research – 6 Features
1. Description of Learning Object: clear, often provide links; varied (assessment, tutorials, interactivity)
2. Multiple Sources of Data: surveys, interviews, e-mail, tracking use, think-aloud protocols
3. Focus on One Learning Object

Previous Research – 6 Features (continued)
4. Sample Size: small, limited description, exclusively higher education
5. Reliability & Validity: none
6. Formal Statistics: largely absent; emphasis placed on qualitative data

Our Approach
A large, diverse sample of secondary school students
Reliability and validity estimates calculated
Formal statistics used where applicable
Specific learning object features based on instructional design research were examined
A range of learning objects was tested
Evaluation criteria focused on the learner, not the technology

Our Scale – Reliability = .87
7-point Likert scale
1. The learning object has some benefit in terms of providing me with another learning strategy/another tool.
2. I feel the learning object did benefit my understanding of the subject matter's concept/principle.
3. I did not benefit from using the learning object.
4. I am interested in using the learning object again.
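The reliability of .87 is an internal-consistency estimate. Below is a minimal sketch of how one standard such coefficient (Cronbach's alpha) is computed; the response data are hypothetical, and reverse-coding the negatively worded item 3 is our assumption about scoring, not a detail stated on the slide.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-point Likert responses to the four items above (one row
# per student).  Item 3 is negatively worded, so it is reverse-coded
# (8 - score) before the reliability is computed.
responses = np.array([
    [6, 5, 2, 6],
    [7, 6, 1, 7],
    [4, 4, 4, 3],
    [5, 6, 2, 5],
    [3, 3, 5, 2],
], dtype=float)
responses[:, 2] = 8 - responses[:, 2]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```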

Our Scale – Part 2
Quality – Content Analysis (ID)
5. You used a digital learning object on the computer. Tell me about this experience when you used the object.
a) What did you like? (found helpful, liked working with, what worked well for you)
b) What didn't you like? (found confusing, or didn't like, or didn't understand)
Perceived Benefit – Content Analysis
6. Do you think you benefited from using this particular learning object? Do you think you learned the concept better? Do you think it helped you review a concept you just learned? Why? Why not?

Sample and Procedure
211 students, grades 9-12, 12 different high schools
30 teachers: 21 pre-service, 9 experienced
Each teacher used one of 5 learning objects: Biology, Chemistry, Computer Science, Math, Physics
Learning object used in a 70-minute period

Results – Scales
Scale was reliable (r = .87)
Correlation between quantitative scale and qualitative results (r = .64, p < .001) – criterion validity
Rating of qualitative data – 100% agreement on quality and benefits content analysis
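A minimal sketch of how the two checks reported on this slide (a Pearson correlation for criterion validity, and percent agreement between raters for the content analysis) can be computed; all data, codes, and variable names below are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired measures: each student's total score on the
# quantitative scale and a numeric rating derived from the qualitative
# content analysis of the same student's responses.
scale_scores = np.array([24, 19, 27, 15, 22, 26, 18, 21])
qualitative_ratings = np.array([5, 4, 6, 3, 4, 6, 3, 5])

r, p = pearsonr(scale_scores, qualitative_ratings)
print(f"criterion validity: r = {r:.2f}, p = {p:.3f}")

# Percent agreement between two raters assigning content-analysis codes
# to the same set of open-ended responses.
rater_a = ["graphics", "interactivity", "theme", "graphics", "audio"]
rater_b = ["graphics", "interactivity", "audio", "graphics", "audio"]
agreement = np.mean([a == b for a, b in zip(rater_a, rater_b)])
print(f"inter-rater agreement = {agreement:.0%}")
```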

Results – LO Quality
[Table: n, mean, and standard deviation for each quality category: Animations, Interactivity, Useful, Assessment, Graphics, Theme/Motivation, Organization, Learner Control, Help Functions, Clear Instructions, Difficulty, Information Correct, Audio]

Results – LO Benefits
[Table: n, mean, and standard deviation for each benefit reason: Fun/Interesting, Visual Learners, Interactive, Learning Related, Good Review, Computer Based, Compare to Another Method, Timing, Clarity, Not Good at Subject]
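Tables like the two above start from frequency counts of coded open-ended responses. A minimal sketch, with hypothetical codes, of how the per-category n values can be tallied:

```python
from collections import Counter

# Hypothetical content-analysis codes assigned to students' open-ended
# answers to question 6; counting them yields the per-category n values.
coded_reasons = ["fun/interesting", "visual", "interactive",
                 "fun/interesting", "good review", "interactive",
                 "fun/interesting"]
for reason, n in Counter(coded_reasons).most_common():
    print(f"{reason}: n = {n}")
```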

Results – Comparing LOs
[Table: mean and standard error of Quality (Q5), Benefit (Scale), and Benefit (Q6) for each learning object: Biology, Chemistry, Computer Science, Math, Physics]
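A minimal sketch of how per-object means and standard errors like those in this table can be produced with pandas; the scores and group assignments below are hypothetical.

```python
import pandas as pd

# Hypothetical per-student benefit scores, tagged by the learning object
# each class used; grouping gives the per-object mean and standard error.
df = pd.DataFrame({
    "learning_object": ["Biology", "Biology", "Chemistry", "Math",
                        "Math", "Physics", "Physics"],
    "benefit": [5.5, 6.0, 5.0, 4.0, 4.5, 5.0, 5.5],
})

summary = df.groupby("learning_object")["benefit"].agg(["mean", "sem"])
print(summary)
```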

Results – Focus Group
Biology, Chemistry, Computer Science: majority of suggestions were cosmetic
Math & Physics: suggestions for change based on learning

Conclusions
Formative analysis is just the beginning
Data collection instruments were reliable and valid
LO qualities: research on instructional design categories provided a good base

Conclusions (continued)
LO benefits: graphics and interactivity; learning is important to students
Comparing LOs: tools sensitive to differences among LOs

Future Research
Developed a new scale based on qualitative content analysis and further review of the instructional design literature
Recently tested on grade 7 & 8 math students
Being tested on 48 classes, 1200 students, grades 5-12
Further tests on 24 teachers – focusing on credit recovery and special needs students

Contact Information
Robin Kay
Liesel Knaack