Evaluating the User Experience in CAA Environments: What affects User Satisfaction? Gavin Sim, Janet C. Read, Phil Holifield

Presentation transcript:

1 Evaluating the User Experience in CAA Environments: What affects User Satisfaction? Gavin Sim, Janet C. Read, Phil Holifield

2 Introduction
- Increase in the use of CAA
- Increase in the use of CAA software: dedicated systems (e.g. Questionmark, TOIA) and learning management systems
- Companies increase the features of their software to attract new customers and retain existing ones; one method is increasing the number of question styles

3 Introduction
- Students (users) have little influence on the design of CAA software
- Usability is generally measured by considering the effectiveness of the interface, the efficiency of the system, and the user experience
- User experience is a facet of usability
- User experience is often assessed through measures of user satisfaction: questionnaires and observations

4 Experiment
- Three CAA applications were used to provide a variety of interface design characteristics for the users to evaluate
- User satisfaction was considered to be affected by:
  - Accessing and finishing the test
  - Navigation within the test
  - Visual layout
  - Interface for answering questions
- The purpose of the study was not to claim that one application was better than another, but to examine the attributes of the interface that affect user satisfaction

5 Choice of CAA Applications – S1
- An example of a CAA application integrated into a learning management system
- All the questions were displayed on the screen at once, and three question styles were used: Multiple Choice, Multiple Response and Text Entry (topic: Football)

6 Choice of CAA Applications – S2
- A dedicated CAA application, offering far more functionality and question styles than learning management systems
- Question-by-question delivery and six question styles: Multiple Choice, Multiple Response, Order, Text Entry, Matrix and Drag and Drop (topic: Films)

7 Choice of CAA Applications – S3
- A CAA application offering more advanced question styles than the other two applications
- Four sections of the demonstration were selected, featuring sophisticated question styles such as drawing lines, assertion-reason and matrix (topics: Geology, Maths, etc.)

8 Survey Design
- Q1: a questionnaire consisting of 13 Likert-style questions and 1 open-ended question
- Q2: a variation on a repertory grid, in which participants ranked each application against nine constructs (an illustrative coding follows below)
- Q2 also contained two questions asking students which CAA application they would prefer for formative and for summative assessment
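For illustration, a single Q2 response might be coded as below. This is only a sketch in Python: the construct labels are taken from the results slides later in the deck (the full grid had nine), and the structure itself is an assumption, not the authors' coding scheme.

```python
# One participant's Q2 response: each construct maps to a ranking of the
# three applications (1 = best, 3 = worst). Labels come from the results
# slides; the full grid used nine constructs.
repertory_grid_response = {
    "easiest to log in":   {"S1": 1, "S2": 2, "S3": 3},
    "best screen layout":  {"S1": 2, "S2": 1, "S3": 3},
    "easiest to navigate": {"S1": 1, "S2": 2, "S3": 3},
    # ... remaining six constructs
}

def first_place_counts(responses, construct):
    """Count how many participants ranked each application first on one
    construct; with the full data this would reproduce the counts
    reported on the results slides."""
    counts = {"S1": 0, "S2": 0, "S3": 0}
    for response in responses:
        ranks = response[construct]
        counts[min(ranks, key=ranks.get)] += 1
    return counts
```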

9 Procedure
- Students attended an experiment (Part 1) and then, a week later, completed a post hoc survey (Part 2)
- Part 1 was conducted in three labs
- The order in which students met the three packages was counterbalanced to remove any learning effects that might otherwise have affected the results (see the sketch below)
- As each student completed a single application, they completed questionnaire Q1
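The slides do not say how the counterbalancing was implemented. A minimal sketch, assuming participants were rotated through all six possible orderings of the three packages (function and variable names are illustrative), might look like this:

```python
from itertools import permutations

# All six orderings of the three packages: a full counterbalance, so each
# package appears equally often in each position across participants.
ORDERS = list(permutations(["S1", "S2", "S3"]))

def assign_order(participant_index: int) -> tuple:
    """Rotate participants through the six orderings in turn."""
    return ORDERS[participant_index % len(ORDERS)]

for i in range(6):
    print(f"Participant {i + 1}: {' -> '.join(assign_order(i))}")
```

With 44 participants, each ordering would be used seven or eight times.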

10 Participants
- 44 second-year HCI students completed Part 1 of the experiment
- Only 25 completed Part 2
- In keeping with their course, they were asked to focus on usability

11 Analysis
- Likert questions were scored ordinally from 1 to 5, where 5 represented Strongly Agree and 1 represented Strongly Disagree; if a question was negatively worded, the scoring was reversed
- The repertory grid (Q2), completed the week after the initial experiment, was again coded ordinally, 1-3, for each criterion
- A reliability test was carried out on the main instrument, Q1; the alpha reliability of the scale was 0.888 (a computational sketch follows below)
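The slides report the alpha value but not the computation. A self-contained sketch of the reverse-coding and of Cronbach's alpha (here using population variances; sample variances would shift the value slightly) might look like this:

```python
from statistics import pvariance

def reverse_code(score: int, scale_max: int = 5) -> int:
    """Flip a negatively worded Likert item: 5 -> 1, 4 -> 2, ... on a 1-5 scale."""
    return scale_max + 1 - score

def cronbach_alpha(item_columns: list) -> float:
    """Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of totals).
    Each inner list holds one item's scores across all respondents."""
    k = len(item_columns)
    sum_item_var = sum(pvariance(col) for col in item_columns)
    totals = [sum(scores) for scores in zip(*item_columns)]
    return (k / (k - 1)) * (1 - sum_item_var / pvariance(totals))

# Toy example with three items and four respondents (not the study's data).
items = [[4, 5, 3, 4], [3, 5, 4, 4], [4, 5, 3, 5]]
print(round(cronbach_alpha(items), 3))
```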

12 Results
- Students were asked whether they had any prior experience of using the software:
  - 17 had prior experience of S1
  - 20 had experience of using S2
  - only 2 had used S3 before
- For S1 and S2 there was no significant difference in satisfaction between the two groups (experience and no experience)
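The slide does not name the statistical test used. For ordinal Likert data split into two independent groups, a Mann-Whitney U test would be a typical choice; the sketch below uses placeholder scores, since the raw data are not in the slides:

```python
from scipy.stats import mannwhitneyu

# Placeholder satisfaction scores (hypothetical, not the study's data).
experienced = [4, 5, 3, 4, 4, 5, 3, 4]
no_experience = [3, 4, 4, 5, 3, 4, 4, 3]

u_stat, p_value = mannwhitneyu(experienced, no_experience, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")  # p >= 0.05 matches "no significant difference"
```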

13 Accessing the Test
Q1 statements, rated for S1, S2 and S3 (scores not preserved in this transcript):
- I had no problem gaining access to the test
- I encountered difficulties starting the test
Q2 (number of the 25 students ranking each application first):
- Easiest to log in: S1 = 18, S2 = 6, S3 = 1

14 Accessing the Test
- The high scores for S1 could be due to the fact that the majority of students already access the associated LMS for teaching material
- Amount of interaction required before the user reaches the first question: S1 and S2 both required 5 tasks; S3 required 6

15 Visual Layout
Q1 statements, rated for S1, S2 and S3 (scores not preserved in this transcript):
- The interface required too much scrolling
- The amount of scrolling was acceptable
- It was difficult to read the text on the screen
- I would have preferred an alternative font
- The screen layout was clear
- The screen layout was consistent
- I liked the way the test looked

16 Visual Layout
Q2 (number of the 25 students ranking each application first):
- Best for screen layout: S1 = 6, S2 = 16, S3 = 3
- Least amount of scrolling: S1 = 6, S2 = 8, S3 = 11
- Easiest to read the text: S1 = 12, S2 = 11, S3 = 2
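The slides report only raw first-place counts. If one wanted to check whether such counts depart from a chance (uniform) split across the three applications, a chi-square goodness-of-fit test is one option; this is a sketch, not the authors' analysis:

```python
from scipy.stats import chisquare

# First-place counts from the slide: "Easiest to read the text" (S1, S2, S3).
counts = [12, 11, 2]
stat, p = chisquare(counts)  # expected frequencies default to uniform
print(f"chi-squared = {stat:.2f}, p = {p:.3f}")
```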

17 Visual Layout
- The poor legibility ratings for S3 may have been due to this application being evaluated with its ready-made questions: there was far more text in both the questions and the feedback compared with S1 and S2
- Layout: the scores may be attributed to the fact that each question in S3 used a different style, so there was no continuity in the interface compared with the other applications

18 Visual Layout – Student Comments
- S2: "Simple but uninspiring"
- S3: "Didn't have the familiar feel of using a Windows application so I didn't feel as comfortable"
- S1: "Background off-putting" (x2)
- S3: "Liked single question per page"
- S3: "Nice colour scheme"
- S3: "Looked professional"

19 Navigation
Q1 statements, rated for S1, S2 and S3 (scores not preserved in this transcript):
- The button names are meaningful
- I always knew where I was within the software
- The navigation was logical
- The navigation was clear
Q2 (number of the 25 students ranking each application first):
- Easiest to navigate: S1 = 12, S2 = 11, S3 = 2

20 Navigation
- The low results for S3 may have been due to its linear navigational structure: students were required to select an option and then work through the questions in order
- It was more difficult to establish location within S3, which reported only a question number and a percentage of the test completed, whereas S2 displayed position explicitly, e.g. "11 of 17"

21 Navigation – Student Comments
- S2: "Could not navigate or work using Firefox"
- S2: "The red and green colours made it easy to see what was right or wrong"
- S2: "Next and Previous buttons too small"
- S3: "Buttons are too small"
- S1: "Right hand side tracking the progress looked good and responded well"

22 Answering Questions
Q2 (number of the 25 students ranking each application first):
- Easiest to input answer: S1 = 11, S2 = (not preserved in this transcript), S3 = 3
- Easiest to change answer: S1 = 11, S2 = 9, S3 = 0
It is possible that, because the level of interaction was more complex, students found the process of answering questions more difficult within S3.
Student comments:
- S1: "Save all button"
- S1: "Can't tell whether answer has been submitted"
- S3: "Pressing enter after data entry was unnatural"

23 Preference for Software Depending on Context
- Only 10 students stated they would use the same application for both contexts
- 9 students stated that their preference would be S2 for summative assessment and S1 for formative assessment

24 Conclusions
- For S1 and S2, prior experience had no bearing on user satisfaction
- It is difficult to identify the most important factors affecting user satisfaction
- With regard to navigation, students appeared to prefer the ability to navigate freely and were less satisfied with the linear structure presented in S3
- Further work may be needed to determine whether there is a complexity threshold within CAA environments in relation to question styles
- Students appear to prefer different systems depending on whether the software is being used for formative or summative assessment

25 Further Work
- Comparing systems is complex due to variations in functionality, in particular question styles
- Try to establish why students would choose different systems depending on context
- Further investigate question styles