Developing a Metric for Evaluating Discussion Boards Dr. Robin Kay University of Ontario Institute of Technology 2 November 2004.

Overview
- Background
- Data Collected
- 12 Areas of Evaluation
- Sample Results
- Summary

Background
- Discussion board use has grown extensively (e.g., Cooper, 2001)
- Some say the tool is revolutionary (e.g., Hara et al., 1998; Li, 2003)
- Others say our understanding of discussion boards is minimal (e.g., Blignaut & Trollip, 2003)
- One problem: the metrics used to evaluate the effectiveness of discussion boards

Previous Metrics
- Most studies examine only one or two aspects of online discussion
- A few researchers have attempted more complete analyses (e.g., Hara et al., 1998; Zhu, 1998)
- Metrics are rarely theory-driven

Metric Needs
- Comprehensive
- Theory-Based
- Consistent

Data Collected (1 of 3)

Data Collected (2 of 3)

Data Collected (3 of 3)
- Subject Pool: secondary, higher education
- Purpose of Discussion Board: debate, posting resources, solving problems
- Individual Differences: gender, typing speed, access
- Type of Course: online only vs. mixed

12 Areas of Evaluation
- Social Learning
- Cognitive Processing
- Discussion Quality
- Initial Question
- Role of Educator
- Navigation
- Challenges
- Types of Users
- Attitudes Toward DB
- Response Time
- Time of Learning
- Performance

Social Learning
Research
- Grounded in Vygotsky (1978) and Slavin (1995)
- A number of researchers have reported that true social interaction is rare in DBs
Data Used
- Length of discussion thread
- Number of messages read
- Primary focus of message
- Participating 2 or more times in the same thread
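The social-learning measures listed above can be computed directly from a message log. A minimal sketch, assuming each record carries a thread id, author, and read count (all field names are hypothetical, not from the study):

```python
from collections import Counter, defaultdict

def social_learning_measures(messages):
    """Derive basic social-learning indicators from a list of message
    dicts with hypothetical 'thread', 'author', and 'reads' keys."""
    threads = defaultdict(list)
    for m in messages:
        threads[m["thread"]].append(m)
    thread_lengths = [len(msgs) for msgs in threads.values()]
    # Repeat participation: any author posting 2+ times in the same thread
    repeaters = sum(
        1 for msgs in threads.values()
        if any(c >= 2 for c in Counter(m["author"] for m in msgs).values())
    )
    return {
        "avg_thread_length": sum(thread_lengths) / len(thread_lengths),
        "avg_reads": sum(m["reads"] for m in messages) / len(messages),
        "pct_threads_with_repeat_posters": repeaters / len(threads),
    }

# Toy log: one 3-message thread (author A posts twice) and one singleton
log = [
    {"thread": 1, "author": "A", "reads": 10},
    {"thread": 1, "author": "B", "reads": 8},
    {"thread": 1, "author": "A", "reads": 6},
    {"thread": 2, "author": "C", "reads": 4},
]
stats = social_learning_measures(log)
```

The same log-driven approach extends to the other message-level measures in the metric.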

Cognitive Processing
Research
- Rare to see a theoretical taxonomy
- Revised Bloom's Taxonomy (Anderson & Krathwohl, 2001)
Data Used
- Knowledge type
- Processing level

Discussion Quality
Research
- Researchers have looked at tone, reasoning, degree of controversy, and content
Data Used
- Message clarity and quality
- New knowledge added
- Reference to course knowledge
- External resources used

Initial Question
Research
- Some research supports clear, provocative questions that promote higher-level thinking
- Other research notes that it is hard to find clear patterns
Data Used
- Clarity, quality, knowledge and processing type
- Number of times question was read
- Length of discussion

Role of Educator
Research
- Some researchers say the instructor's role is critical for raising the level of discussion
- Others claim students need to construct their own knowledge; the instructor stifles discussion
Data Used
- Student vs. instructor
- Number of times message was read
- Length of message
- Response time

Navigation
Research
- Problems reported with respect to message length, number of entries, unclear subject lines, and lack of organization
Data Used
- Subject line clarity and location
- How often message was read
- Response time
- Interview data

Challenges for Participants
Research
- Ability to participate, slower pace, time taken to participate, being graded
Data Used
- Interview data
- Open-ended question about use

Types of Users
Research
- Participants assume different roles based on participation, degree of reflection, and mediation skills
Data Used
- Average number of messages read
- Average response time
- Number of words
- Message quality
- Number of messages posted

Attitudes Toward Discussion Board
Research
- Little systematic research done in this area
Data Used
- Interviews
- Perceived usefulness (consumer and provider)

Response Time
Research
- No systematic research, but speculation that delays in response time could decrease the value of discussion
Data Used
- Message location and response time
- Response time correlated with how often messages are read
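The correlation named above (response time vs. how often a message is read) is an ordinary Pearson r between each message's reply latency and its read count. A minimal sketch on hypothetical data (not the study's actual values):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical pairs: hours until first reply vs. times the message was read
response_hours = [1, 5, 12, 24, 48]
times_read = [30, 22, 18, 10, 5]
r = pearson_r(response_hours, times_read)  # negative: slower replies, fewer reads
```

The study's reported value for this relationship was r = -.254 (p < .01); the toy data above merely illustrates the computation.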

Time of Learning
Research
- No research on how much discussion goes on outside school
Data Used
- Inside vs. outside school
- Number of messages posted
- Clarity and quality

Learning Performance
Research
- Has yet to be formally tested: do discussion boards improve learning?
Data Used
- Final test and project grade correlated with:
  - number of visits
  - number of days visited
  - number of messages posted

Sample Results
- Social Learning
- Cognitive Processing
- Discussion Quality
- Initial Question
- Role of Educator
- Navigation
- Challenges
- Types of Users
- Attitudes Toward DB
- Response Time
- Time of Learning
- Performance

- 45 secondary students (2 classes)
- years old
- Introductory computer science course
- HTML and Programming
- Private school
- All boys
- HTML (24 days)
- Programming (36 days)

Overview of Results (Measure: Average)
- Length of thread: 3.5 msg. (1 to 11)
- Words: 48.3 (1 to 263)
- Clear subject line: 1.68 (0 to 3)
- Message quality: 2.3 (0 to 4)
- Number of times read: 11.3 (2 to 77)
- Response time: 2.5 days (1 min. to 34 days)
- Content: 86% course or beyond
- Non-academic: 6%

Social Learning
- Threads with 5 or more messages: 47%
- Average number of times a message was read: 29.5
- Ask questions: 66%
- Participating twice in same discussion: 37%

Cognitive Processing
- Procedural knowledge: 57%
- Conceptual knowledge: 21%
- Understanding: 35%
- Remembering: 27%
- Applying: 22%

Discussion Quality
- Clear messages: 67%
- Message quality good: 41%
- New knowledge added: 67%
- Content (course or more): 86%
- Non-academic: 6%

Effect of Initial Question
- Whether easily answered, subject line clarity, message quality, knowledge type, and processing level: no impact on the number of times a message was read or the length of discussion

Role of Educator
- Student vs. teacher messages: no difference with respect to number of times read, length of message, or response time

Navigation
- Correlation between clear subject line and number of times read: not significant
- Correlation between clear subject line and response time: not significant
- Correlation between message number and number of times read: r = .26, p < .001
- Correlation between message number and response time: not significant
- Navigation a problem: 54% of the time

Challenges
- Too slow: 38%
- Technical or software problems: 25%
- Trusting peers' answers: 22%
- Difference in learning style: 12%
- Lack of ability: 12%
- Inhibited by grades: 11%

Types of Users (5+ messages)
- Average messages read: p < .001
- Average response time: p < .001
- Number of words used: p < .001
- Message quality: p < .001
- Number of messages posted: 1 to 17
- Subject line clarity, question difficulty, knowledge and processing type: no significant difference
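The user-type comparison above splits participants by posting volume and contrasts the groups on each measure. A minimal sketch of that grouping step (field names and data are hypothetical; the study additionally ran significance tests, while this sketch only compares group means):

```python
from statistics import mean

def compare_user_types(users, threshold=5):
    """Split users into heavy (threshold+ posts) and light posters,
    then compare group means on each per-user measure."""
    heavy = [u for u in users if u["posts"] >= threshold]
    light = [u for u in users if u["posts"] < threshold]
    measures = ["avg_reads", "avg_response_h", "words"]
    return {
        m: {"heavy": mean(u[m] for u in heavy),
            "light": mean(u[m] for u in light)}
        for m in measures
    }

# Hypothetical per-user aggregates: two heavy posters, two light posters
users = [
    {"posts": 9, "avg_reads": 30, "avg_response_h": 12, "words": 60},
    {"posts": 7, "avg_reads": 26, "avg_response_h": 18, "words": 50},
    {"posts": 2, "avg_reads": 10, "avg_response_h": 40, "words": 20},
    {"posts": 1, "avg_reads": 8,  "avg_response_h": 52, "words": 15},
]
diffs = compare_user_types(users)
```

A t-test or ANOVA over these groups would yield the p-values reported in the table.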

Attitude Toward Discussion Board
- Effective learning tool: 37%
- Used frequently: 38%
- Received useful information: 65%
- Provided helpful information: 39%

Response Time
- Average response time: 25.3 hours
- Jump from 3rd to 4th message: 19.7 to 31 hours
- Correlation between response time and number of times message read: r = -.254, p < .01

Time of Learning
- Messages posted outside of school hours: 55%
- Difficulty of question: significant difference outside class (p < .05)
- Subject line, quality of message, response time, number of words: no significant difference between inside and outside class

Performance

Summary
- Previous research uses a limited metric
- The 12 areas covered offer a relatively rich analysis of discussion
- If we want to understand the use of discussion boards, we need a metric that is:
  - Comprehensive (details are everything)
  - Theory-Based

Contact Information
- Robin Kay
- Website for Paper & Presentation: faculty.uoit.ca/kay/elearn2004/elearn2004.htm