The value of e-assessment in interprofessional education and large student numbers Melissa Owens* John Dermo* Fiona MacVane Phipps * Presenters.

Similar presentations
An Introduction to Test Construction

Auditing Subject Knowledge in Initial Teacher Training using Online Methods.
What is Assess2Know ® ? Assess2Know is an assessment tool that enables districts to create high-quality reading and math benchmark assessments for grades.
Contextualised teacher judgments in a high-stakes assessment system John A Pitman Queensland, Australia.
Wynne Harlen. What do you mean by assessment? Is there assessment when: 1. A teacher asks pupils questions to find out what ideas they have about a topic.
Laura Snow SD Department of Education Office of Assessment and Technology Systems.
Iain Weir, Rhys Gwynllyw & Karen Henderson CETL-MSOR 2014
Some (Simplified) Steps for Creating a Personality Questionnaire Generate an item pool Administer the items to a sample of people Assess the uni-dimensionality.
Progress Monitoring project DATA Assessment Module.
Seminar Topic: Questionnaire Presented by : Rekha HR.
Learning and Teaching Conference 2012 Skill integration for students through in-class feedback and continuous assessment. Konstantinos Dimopoulos City.
Learning & Teaching Conference 2012 E-learning for interdisciplinary enterprise education: Making Ideas Happen Monday 9 January
On-line assessment. ‘If lower-order learning is an unintended educational consequence of on-line assessment, then any perceived or real gains made in.
Peer assessment of group work using WebPA Neil Gordon Symposium on the Benefits of eLearning Technologies University of Manchester, in conjunction with.
Making Sense of Assessments in HE Modules (Demystifying Module Specification) Jan Anderson University Teaching Fellow L&T Coordinator SSSL
Dr. Maria Limniou The School of Mechanical, Aerospace and Civil Engineering The University of Manchester Pedagogical beliefs of engineering teachers and.
Analysis of usage statistics in the Virtual Learning Environment Moodle shows that provision of learning resources significantly improves student grades.
Objective Examination Dr. Niraj Pandit MD Department of Community Medicine SBKS MIRC.
M.Sc Projects David Wilson M.Sc Projects Coordinator for Computing & Information Systems.
© AJC /18 Extended Matching Sets Questions for Numeracy Assessments: A Case Study Alan J. Cann Department of Microbiology & Immunology University.
Creating Effective Classroom Tests by Christine Coombe and Nancy Hubley 1.
Miguel Martins, Senior Lecturer – Marketing Department eLearning Celebration 22 July 2007 Improving performance on a Marketing module through the use of.
Replacing “Traditional Lectures” with Face-to-Face Directed Problem Solving Sessions and On-Line Content Delivery David G. Meyer Electrical & Computer.
Assessment in Higher Education Linda Carey Centre for Educational Development Queen’s University Belfast.
The Research Skills exam: The four horsemen of the apocalypse: pestilence, war, famine and the RS1 exam.
Item Analysis: Classical and Beyond SCROLLA Symposium Measurement Theory and Item Analysis Modified for EPE/EDP 711 by Kelly Bradley on January 8, 2013.
Assessment Activities
METHODS Study Population Study Population: 224 students enrolled in a 3-credit hour, undergraduate, clinical pharmacology course in Fall 2005 and Spring.
ANGELA SHORT SCHOOL OF BUSINESS AND HUMANITIES KEVIN STARRS, SCHOOL OF ENGINEERING(RETIRED!) Designing and Delivering an online module.
Challenge the future Delft University of Technology Digital testing: What do students think of it? E-merge, November 2014 Ir. Meta Keijzer-de.
Khan Academy Implementation Models Making the Best Use of Khan Academy with Your Students 1.
Dr Elena Luchinskaya, Lancaster University/ Leeds Metropolitan University, UK.
Indicators of Family Engagement Melanie Lemoine and Monica Ballay Louisiana State Improvement Grant/SPDG.
Analyzing Reliability and Validity in Outcomes Assessment (Part 1) Robert W. Lingard and Deborah K. van Alphen California State University, Northridge.
Online assessment: the use of web based self-assessment materials to support self-directed learning Online assessment: the use of web based self-assessment.
SBAC Preparation SBAC Preparation California Assessment of Student Performance and Progress (CAASPP)
Nancy Severe-Barnett Program Coordinator, SCIS
Evaluation of Respondus assessment tool as a means of delivering summative assessments A KU SADRAS Project Student Academic Development Research Associate.
Flexiblelearning.net.au/innovations E-learning Innovations Australian Flexible Learning Framework Project 09/73 – Spatial Information Skills for Professionals.
Wiley eGrade. What is eGrade? Web-based software that enables instructors to automate the process of assigning and grading homework and quiz assignments.
E- ASSESSMENT Why use it.. whatever it is? Phil Davies School of Computing University of Glamorgan.
Franklin Consulting E-assessment: What is it? Why does it matter? Tom Franklin Franklin Consulting
EAssessment Colin Milligan Heriot-Watt University.
Using Learning Technology to Design and Administer Student Assessments Catherine Kane Centre For Learning Technology, CAPSL Ref:
1 The Design of Multimedia Assessment Objects Gavin Sim, Stephanie Strong and Phil Holifield.
BSc Final Year Projects in Computing Computer Science, Creative Computing, Games Programming, Business Computing Dr Rodger Kibble.
Assessed: 2007, 2010, 2011,  PHIL 101 (Introduction to Philosophy: Ethics)  GE elective choice  BA 300 (Ethical Decision Making in Business)
Educator’s view of the assessment tool. Contents Getting started Getting around – creating assessments – assigning assessments – marking assessments Interpreting.
1 © 2005 Cisco Systems, Inc. All rights reserved. 9429_03_2004_c1 Using Cisco Network Simulator.
1 Question types handled by OMR 1 ‣ Multiple choice (the more options the better, statistically) ‣ True/False ‣ Assertion/reason (combining MCQ and T/F)
Nicola Billam, Guinevere Glasfurd-Brown & Martin Sellens CAA Conference 2006 Engaging Students in Formative Assessment: Strategies and Outcomes.
Item Analysis: Classical and Beyond SCROLLA Symposium Measurement Theory and Item Analysis Heriot Watt University 12th February 2003.
e-marking in large-scale, high stakes assessments conference themes :  role of technology in assessments and teacher education  use of assessments for.
Self-assessing with adaptive exercises Chye-Foong Yong & Colin A Higgins University of Nottingham.
ASSESSMENT TO IMPROVE SELF REGULATED LEARNING 5 th July 2006, 10 th CAA conference, Poppy Pickard Assessment to improve Self Regulated Learning.
 The introduction of the new assessment framework in line with the new curriculum now that levels have gone.  Help parents understand how their children.
E-Assessment: Removing the Boundaries of C.A.A. Phil Davies School of Computing University of Glamorgan South Wales UK.
M253 Students Study Guide Mrs. Fatheya Al Mubarak – AOU Dammam.
Assessment and the Institutional Environment Context Institutiona l Mission vision and values Intended learning and Educational Experiences Impact Educational.
Development of Assessments Laura Mason Consultant.
Blackboard Assignments & Feedback Rubrics
Developing CPBL Courses
The One-Two-Three Feedback Cycle
How can Blackboard assist in Assessment and Facilitation of Knowledge Exchange? Anne Nortcliffe.
UQ Course Site Design Guidelines
Emma Senior & Mark Telford.
An Introduction to e-Assessment
Interim Assessment Training NEISD Testing Services
Challenges (and Triumphs) of Module Development
UQ Course Site Design Guidelines
Presentation transcript:

The value of e-assessment in interprofessional education and large student numbers Melissa Owens* John Dermo* Fiona MacVane Phipps * Presenters

The Challenge
Large cohort of approximately 350 students
– Recognition of the value of student-centred assessment
– Linking the process of interprofessional learning to the outcome through assessment (WHO 2009)
Desire to test student engagement as well as knowledge

The Module
A 10-credit, year one, level one module
One week of face-to-face contact involving lectures, small group work and on-line activities using problem-based learning (PBL) and problem-based e-learning (PBeL)
8 weeks of on-line activities in small discussion groups using PBeL
Final assessment by MCQ and peer assessment, both delivered electronically

Attributes of the Electronic Multiple Choice Questionnaire (eMCQ)
Objective, rather than subjective
Can be used to test analytical skills
An efficient and valid method of assessing large numbers of students in a short space of time
Machine-graded, thus expediting results

MCQ Assessment with QMP
Item bank with 21 topics, reflecting the content of the module and the learning outcomes

Using Questionmark Perception (QMP) allowed items to be selected at random from the different topics in the item bank (35 questions drawn from the 21 topics)
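QMP performs this selection internally; purely to illustrate the idea of drawing a paper that covers every topic, a minimal Python sketch (the topic labels and per-topic counts are assumptions, not the real bank):

```python
import random

# Illustrative item bank: 21 topics with four items each (84 items in total).
# The topic labels and equal per-topic counts are assumptions for this sketch.
item_bank = {f"topic_{t:02d}": [f"topic_{t:02d}_q{q}" for q in range(1, 5)]
             for t in range(1, 22)}

def draw_paper(bank, n_items=35, seed=None):
    """Pick one item from every topic, then top up at random, so each
    student's 35-question paper samples all 21 topics."""
    rng = random.Random(seed)
    paper = [rng.choice(items) for items in bank.values()]        # one per topic
    remaining = [q for items in bank.values() for q in items if q not in paper]
    paper += rng.sample(remaining, n_items - len(paper))          # top up to 35
    return paper

print(len(draw_paper(item_bank, seed=1)))   # -> 35
```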

In accordance with university regulations for e-assessment, students receive a mock practice test via the VLE before the real test: same format and layout, just shorter

MCQ Assessment Report
Questions are marked automatically, then a report is generated in the form of a spreadsheet
Gives the raw score and percentage; also picks up the student's name and ID from the login
Could give more detail, e.g. time taken and computer ID
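QMP produces this spreadsheet automatically; as a rough illustration of what such a report contains, a minimal sketch that marks responses against a key and writes the raw score, percentage, name and ID to a CSV (the answer key, field names and sample submission are assumptions):

```python
import csv

# Illustrative answer key; the real report comes straight out of QMP.
answer_key = {"q1": "B", "q2": "D", "q3": "A"}

def mark(answers):
    """Raw score and percentage for one student's responses."""
    raw = sum(answers.get(q) == correct for q, correct in answer_key.items())
    return raw, round(100 * raw / len(answer_key), 1)

submissions = [
    {"id": "UB1234", "name": "A. Student",
     "answers": {"q1": "B", "q2": "C", "q3": "A"}},
]

with open("mcq_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["student_id", "name", "raw_score", "percentage"])
    for s in submissions:
        raw, pct = mark(s["answers"])
        writer.writerow([s["id"], s["name"], raw, pct])
```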

Delivery of Exam
3 sittings of one hour each for both components
Computer suites used
Allow for 10% computer failure
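The cohort size, number of sittings and 10% allowance come from the slides above; the capacity figure below is just the resulting arithmetic, assuming students split roughly evenly across sittings:

```python
import math

cohort, sittings, failure_allowance = 350, 3, 0.10
per_sitting = math.ceil(cohort / sittings)                   # ~117 students per sitting
machines = math.ceil(per_sitting * (1 + failure_allowance))  # ~129 working PCs per sitting
print(per_sitting, machines)
```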

Attributes of Electronic Peer Assessment
Readily deliverable, and the results can be speedily collated
Enables students to better judge the quality of their own work and that of others
A skill that students develop over time
What is being valued – tutor or student perceptions?

Answer 3 questions per group member, then click submit

The peer assessment is linked from the webpage where the online exam was delivered

Choose the correct group from the list

Analysis of Peer Assessment
QMP generates a report in the form of a spreadsheet
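As an illustration of how such a spreadsheet can be collapsed into one mark per student, a minimal sketch (the column names, rating scale and averaging rule are assumptions rather than the module's actual scheme):

```python
from collections import defaultdict
from statistics import mean

# Illustrative rows from the exported spreadsheet: one rating per rater,
# ratee and question; a 1-5 scale is assumed.
ratings = [
    {"rater": "UB111", "ratee": "UB222", "question": 1, "score": 4},
    {"rater": "UB111", "ratee": "UB222", "question": 2, "score": 5},
    {"rater": "UB333", "ratee": "UB222", "question": 1, "score": 3},
]

received = defaultdict(list)
for row in ratings:
    received[row["ratee"]].append(row["score"])

# One possible rule: a student's peer-assessment mark is the mean of all
# ratings received from group members.
peer_marks = {student: round(mean(scores), 2) for student, scores in received.items()}
print(peer_marks)   # {'UB222': 4.0}
```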

Or potentially more detailed reports per student – if there were time to process so much detail.

Results on MCQ and Peer Assessment Tasks

Rank correlation between the scores of the 333 students who were assessed in both tests indicates a small but significant correlation (rho = .25), significant at the .01 level. The Spearman test was used in preference to the Pearson test because of the skewness of the peer assessment results.
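For reference, the same statistic can be computed directly from the two exported score columns with SciPy; the score lists below are placeholders standing in for the 333 real pairs:

```python
from scipy.stats import spearmanr

# Placeholder lists; in practice these are the 333 paired MCQ and
# peer-assessment scores exported from QMP, in the same student order.
mcq_scores  = [62, 74, 58, 81, 69, 77, 55, 90]
peer_scores = [3.5, 4.2, 3.1, 4.8, 3.0, 4.0, 2.9, 4.6]

# Spearman works on ranks, so no normality assumption is needed for the
# skewed peer-assessment distribution.
rho, p_value = spearmanr(mcq_scores, peer_scores)
print(f"rho = {rho:.2f}, p = {p_value:.3f}")
```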

Item Analysis
Ensure the tests are as reliable and fair as possible
QMP reporting tools provide a wealth of data that enables instructors to analyse the performance of each item, including how each distractor performs
This enables improvements to the bank to be made from one assessment period to the next

Assessment Analysis
Classical item analysis of the 84 items shows a fair distribution of levels of difficulty, although the largest group is in the top band (facility of 0.9 and above).
Items were selected at random (35 out of 84), so standard tests of reliability (e.g. Cronbach's alpha or split-half) were not suitable.
Latent trait analysis (e.g. Rasch analysis) was not appropriate because there were insufficient attempts for some questions.
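The facility values reported by QMP can also be reproduced by hand; a minimal sketch of classical item analysis over an assumed 0/1 response matrix (in practice each item's column covers only the students who were served that item):

```python
import numpy as np

# Illustrative 0/1 response matrix (rows = students, columns = items);
# the real data come from the QMP item-level export.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
    [0, 1, 0, 1],
])

# Facility (difficulty) index: proportion of candidates answering correctly.
facility = responses.mean(axis=0)

# Simple discrimination: correlation of each item with the total score.
totals = responses.sum(axis=1)
discrimination = [np.corrcoef(responses[:, i], totals)[0, 1]
                  for i in range(responses.shape[1])]

print(np.round(facility, 2))         # items with facility >= 0.9 fall in the top band
print(np.round(discrimination, 2))
```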

Conclusions
Combining methods of assessment benefits a greater number of students
Combining methods of assessment enables multidimensional testing to occur
Preparation is time-consuming, but results can be returned quickly
Use of QMP enables statistical analysis of the MCQs and enhances validity over time
Use of QMP enabled large numbers of students to peer assess
There was a small but significant correlation between levels of achievement in the MCQ and in the peer assessment