
1 Evaluating the User Experience in CAA Environments: What affects User Satisfaction?
Gavin Sim, Janet C Read, Phil Holifield

2 Introduction
Increase in the use of CAA
Software
 Dedicated systems, e.g. Questionmark, TOIA
 Learning Management Systems
Companies increase the features of their software to attract new customers and retain existing ones
 One method is increasing the number of question styles

3 Introduction
Students (users) have little influence on the design of CAA software
Usability is generally measured by considering the effectiveness of the interface, the efficiency of the system, and the user experience
User experience is a facet of usability
User experience is often based on a measure of user satisfaction – questionnaires, observations

4 Experiment
Three CAA applications were used to provide a variety of interface design characteristics for the users to evaluate
User satisfaction was considered to be affected by:
 Accessing and finishing the test
 Navigation within the test
 Visual layout
 Interface for answering questions
The purpose of the study was not to claim that one application was better than another, merely to examine the attributes of the interface that affect user satisfaction

5 Choice of CAA Applications – S1
An example of a CAA application integrated into a Learning Management System
All the questions were displayed on the screen at once, and three question styles were used: Multiple Choice, Multiple Response and Text Entry (Football)

6 Choice of CAA Applications – S2
A dedicated CAA software application, offering far more functionality and question styles than learning management systems
Question-by-question delivery and six question styles: Multiple Choice, Multiple Response, Order, Text Entry, Matrix and Drag and Drop (Films)

7 Choice of CAA Applications – S3
A CAA software application offering more advanced question styles than the other two applications
Four sections of the demonstration were selected, featuring sophisticated question styles such as drawing lines, assertion reason and matrix (Geology, Maths etc.)

8 Survey Design
Q1 – a questionnaire consisting of 13 Likert-style questions and 1 open-ended question
Q2 – a variation on a repertory grid
 The participants ranked each application according to nine constructs
Q2 also included two questions that required the students to identify which of the CAA applications would be their preference for formative and for summative assessment

9 Procedure
Students attended an experiment (Part 1) and then, a week later, completed a post hoc survey (Part 2)
Part 1 was conducted in three labs
The order in which students met the three packages was counterbalanced to remove any learning effects that might otherwise have affected the results (see the sketch below)
As each student completed a single application, they completed questionnaire Q1
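The slides do not describe the counterbalancing scheme in detail; the following is a minimal sketch, assuming participants are rotated round-robin through all six possible orderings of the three packages (assign_orders and the assignment policy are illustrative assumptions, not the authors' documented method):

from itertools import permutations

PACKAGES = ("S1", "S2", "S3")
ORDERS = list(permutations(PACKAGES))  # all 6 possible presentation orders

def assign_orders(n_participants: int):
    """Rotate participants through the 6 orders so each package appears
    equally often in each position, spreading out learning effects."""
    return [ORDERS[i % len(ORDERS)] for i in range(n_participants)]

# 44 participants -> each of the 6 orders is used 7 or 8 times
schedule = assign_orders(44)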

10 Participants
44 second-year HCI students completed Part 1 of the experiment
Only 25 completed Part 2
To keep in line with their course, they were asked to focus on usability

11 Analysis
Likert questions – scored in an ordinal way from 1 to 5, where 5 represented Strongly Agree and 1 represented Strongly Disagree
 If a question was negatively worded, the scoring was reversed
The repertory grid (Q2), completed the week after the initial experiment, was again coded in an ordinal manner, 1-3, for each of the criteria
A test of reliability was carried out on the major instrument, Q1; the alpha reliability of the scale was 0.888
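The slides report the scale's alpha reliability but not its computation; below is a minimal sketch of the reverse scoring and Cronbach's alpha calculation, assuming a 1-5 scale and a respondents-by-items score matrix (the variable names and NEGATIVE_ITEMS are hypothetical):

import numpy as np

def reverse_score(scores: np.ndarray) -> np.ndarray:
    """Reverse a 1-5 Likert item so that 5 always means the positive pole."""
    return 6 - scores

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: q1 is a (44, 13) array of Likert responses;
# NEGATIVE_ITEMS holds the column indices of negatively worded questions.
# q1[:, NEGATIVE_ITEMS] = reverse_score(q1[:, NEGATIVE_ITEMS])
# print(cronbach_alpha(q1))  # the slides report 0.888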

12 Results
Students were asked whether they had any prior experience of using the software:
 17 had prior experience of S1
 20 had experience of using S2
 2 had used S3 before
For S1 and S2 there was no significant difference in satisfaction between the two groups (experience and no experience)
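The slides do not name the statistical test used; given the ordinal Likert data, a non-parametric comparison such as the Mann-Whitney U test would be a natural fit. A hypothetical sketch (compare_groups is illustrative, not the authors' code):

from scipy.stats import mannwhitneyu

def compare_groups(experienced, novice, alpha=0.05):
    """Compare two independent groups of ordinal satisfaction scores."""
    stat, p = mannwhitneyu(experienced, novice, alternative="two-sided")
    return stat, p, p < alpha  # True in the last slot means a significant difference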

13 Accessing the Test

Q1 (mean Likert scores):
                                               S1    S2    S3
I had no problem gaining access to the test    4.21  3.84  3.53
I encountered difficulties starting the test   4.28  3.95  3.40

Q2 (number of students ranking each application first):
                  S1  S2  S3
Easiest to login  18   6   1

14 Accessing the Test
The high scores for S1 could be due to the fact that the majority of students access the associated LMS for teaching material
The amount of interaction required before the user reaches the first question:
 S1 and S2 both required 5 tasks
 S3 required 6 tasks

15 Visual Layout

Q1 (mean Likert scores):
                                                 S1    S2    S3
The interface required too much scrolling        3.44  4.19  3.81
The amount of scrolling was acceptable           4.28  3.95  3.40
It was difficult to read the text on the screen  3.86  3.84  3.09
I would have preferred an alternative font       3.40  3.23  2.86
The screen layout was clear                      3.88  3.67  2.56
The screen layout was consistent                 4.12  4.02  2.72
I liked the way the test looked                  3.49  3.33  2.35

16 Visual Layout

Q2 (number of students ranking each application first):
                           S1  S2  S3
Best for screen layout      6  16   3
Least amount of scrolling   6   8  11
Easiest to read the text   12  11   2

17 Visual Layout
The poor legibility of the text within S3 may have been due to this application being evaluated with ready-made questions; there was far more text in both the questions and the feedback compared to S1 and S2
Layout – the low scores can be attributed to the fact that each question in S3 used a different style, so there was no continuity in the interface compared to the other applications

18 Visual Layout – Student Comments
S2 – Simple but uninspiring
S3 – Didn't have the familiar feel of using a Windows application, so I didn't feel as comfortable
S1 – Background off-putting (x2)
S3 – Liked single question per page
S3 – Nice colour scheme
S3 – Looked professional

19 Navigation

Q1 (mean Likert scores):
                                               S1    S2    S3
The button names are meaningful                4.02  3.88  3.33
I always knew where I was within the software  4.02  4.05  2.23
The navigation was logical                     4.05  3.84  2.65
The navigation was clear                       3.95  3.77  2.58

Q2 (number of students ranking each application first):
                     S1  S2  S3
Easiest to navigate  12  11   2

20 Navigation
The low results for S3 may have been due to its linear navigational structure: students were required to select an option and then work through the questions in order
It was more difficult to establish location in S3, which informed users of the question number and a percentage of the test completed, whereas S2 displayed e.g. "11 of 17"

21 Navigation – Student Comments
S2 – Could not navigate or work using Firefox
S2 – The red and green colours made it easy to see what was right or wrong
S2 – Next and Previous buttons too small
S3 – Buttons are too small
S1 – Right-hand side tracking of progress looked good and responded well

22 Answering Questions

Q2 (number of students ranking each application first):
                          S1  S2  S3
Easiest to input answer   11  11   3
Easiest to change answer  11   9   0

It is possible that, because the level of interaction was more complex, students found the process of answering questions more difficult within S3.

Student comments:
S1 – Save all button
S1 – Can't tell whether an answer has been submitted
S3 – Pressing enter after data entry was unnatural

23 Preference for Software Depending on Context
Only 10 students stated they would use the same application for both contexts
9 students stated their preference for summative assessment would be S2, with S1 for formative assessment

24 Conclusions
For S1 and S2, prior experience had no bearing on user satisfaction
It is difficult to identify the most important factors that affect user satisfaction
With regard to navigation, students appeared to prefer the ability to navigate freely and were less satisfied with the linear structure presented in S3
Further work may be needed to determine whether there is a complexity threshold within CAA environments in relation to question styles
Students appear to prefer different systems depending on whether the software is being used for formative or summative assessment

25 Further Work
Comparing systems is complex due to variations in functionality, in particular question styles
Try to establish why students would choose different systems depending on context
Further investigate question styles

