Evaluating the usability of the BlueJ environment. Marilyn Barallon. Supervisor: Dr Linda McIver.

Outline.
What is BlueJ?
What is usability?
How we measure usability.
What has been done in the past.
Studying BlueJ.
What we found: a 'picture' of what students are doing with BlueJ.
Conclusion.
Further work.

What is BlueJ?
A visual introductory programming environment designed to help teach Java and object orientation (OO).
Based on the Blue system developed at the University of Sydney and Monash University.
Currently maintained by Deakin University, Australia, and the University of Kent, UK.
Currently used by 412 institutions worldwide.
Encourages an 'objects first' approach to teaching object orientation.

What is BlueJ? (cont'd). Designed to be simple to use:
 A simple main window: a UML-style picture of the program, plus a 'workbench' area displaying objects that can be interacted with.
 A simple debugger: call stack window; instance, local and static variable windows; program state window (running, finished).
 A simple syntax-directed editor.

What is usability? Subjective. There are three main views from which usability can be observed and assessed (Bevan and Kirakowski):
 Product-oriented view: features of the product.
 User-oriented view: mental effort and attitude of the user.
 User-performance view: how the user interacts with the product.
Our research focuses on an assessment of BlueJ's usability based on the last two views.

Measuring usability. Evaluation frameworks (examples):
 Price et al.'s framework includes 6 broad and general categories: scope, content, form, method, interaction, effectiveness.
 Phillips and Mehandjiska's framework includes 5 categories, each of which is further broken down into specific subcategories: learnability, efficiency, flexibility, robustness, feedback.

What has been done? Only formal evaluations so far: surveys and data loggers.
Problems with survey data:
 Requires students to reflect on past interactions.
 Requires students to rank thoughts and/or feelings according to their own judgement of what is 'very useful'.
 Most students have never really used anything else, so how would they know? ('How well did it help you learn Java?')
Problems with program logger data:
 Fails to tell us what was happening between compilations.
 Fails to tell us what the 'substantial edits' were.

Understanding the user. Usability assessment requires an understanding of users. Surveys and data loggers do not provide a mechanism for understanding users. Video-taped 'think-out-loud' observational experiments let us:
 Directly observe and analyse what students are doing: their natural behaviour.
 Ask 'why' at the time.
 Discuss problems they encounter, at the time (is it a Java problem or an interface problem?).
 Create an overall 'picture' of BlueJ user behaviour.

Problems with observational experiments: researcher bias, reactive behaviour, and the limited degree to which we can generalise behaviour to other people. To mitigate these risks, long-term observational studies and training could be done.

Studying BlueJ. Observations of students using BlueJ give a better understanding of users: a first step towards a comprehensive usability study, and a detailed picture of student understanding of BlueJ and Java.

Studying BlueJ (cont'd). Observations involved debugging a program using BlueJ.
 Two-hour video-taped 'think-out-loud' sessions including a short questionnaire.
 Conducted 4 pilot-study experiments with final-year students (1 from software engineering, 2 from digital systems, 1 from computer science).
 Conducted 5 final experiments with 2nd-year students (computing degrees) who had completed cse1203, Programming with Java 2.

The program. The program draws pictures using graphical output, making it easy to 'see' what is an error. Students were first shown what the correct output of the program should look like. The program contained 5 semantic bugs.
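The study program itself is not reproduced in these slides, but a small sketch can illustrate the kind of semantic bug involved: code that compiles and runs without errors, yet draws the wrong picture. The class, method and constant names below are invented for illustration, not taken from the study.

```java
// Hypothetical sketch of a semantic (logic) bug: the code compiles and
// runs, but the geometry it computes is visibly wrong in the output.
public class Staircase {

    static final int STEP_HEIGHT = 10;

    // BUG: each step rises twice as far as intended, so the drawn
    // staircase is twice as steep as the reference picture.
    static int buggyStepY(int i) {
        return i * STEP_HEIGHT * 2;
    }

    // Intended behaviour: each step rises by exactly STEP_HEIGHT.
    static int correctStepY(int i) {
        return i * STEP_HEIGHT;
    }

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) {
            System.out.println("step " + i + ": buggy y=" + buggyStepY(i)
                    + ", correct y=" + correctStepY(i));
        }
    }
}
```

Because a bug like this raises no compiler error and no exception, spotting it depends entirely on comparing the drawn output against the correct picture the students were shown.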

Problem-solving abilities. Students did not use BlueJ to explore the program:
 Minimal exploratory behaviour with the object workbench.
 Jumped straight into the code.
They lacked independent exploration and testing abilities, and sought guidance whenever they could. As a result, they were poor at establishing and refining hypotheses about the behaviour of errors.

Can students use the debugger? Only one student could. Common misconceptions and difficulties:
 Setting breakpoints on non-executable lines of code.
 Instantiation of the debugger window (which method to call).
 Not understanding how the debugger traces the program: 'Should I set one on each line? Will it go from here to there? Can I set a breakpoint on each line in this class and use the debugger to step through this class? Can I move backwards?'
These problems were seen in both final-year and 2nd-year students.
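The first misconception above can be made concrete with a short sketch (invented for illustration, not from the study program): debuggers such as BlueJ's stop only on executable statements, not on declarations, braces or blank lines.

```java
// Illustrative sketch: which lines a breakpoint can usefully be set on.
public class BreakpointDemo {          // class declaration: not executable

    private int total;                 // field declaration: not executable

    public int addUpTo(int n) {        // method header: not a statement
        total = 0;                     // executable: the debugger stops here
        for (int i = 1; i <= n; i++) { // executable: checked each iteration
            total += i;                // executable: stepped once per iteration
        }                              // closing brace: not executable
        return total;                  // executable
    }

    public static void main(String[] args) {
        System.out.println(new BreakpointDemo().addUpTo(4));
    }
}
```

A student who sets a breakpoint on the field declaration or a closing brace sees the debugger never stop there, which is exactly the kind of confusion the observations recorded.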

Common debugging strategies. All students used code comparison at least once. Students appeared to edit 'suspicious' statements for no apparent reason:
 For example, in a loop condition, changing '<' to '<='.
We hypothesise that students were selecting particular statements to change based on the unfamiliar ways in which they were being used:
 For example, in a loop structure, count += 2 instead of count += 1.
Interestingly, most students named print statements as their preferred testing strategy.
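The behaviours on this slide can be sketched in a small invented example: a correct but unfamiliar-looking loop, the kind of speculative '&lt;' to '&lt;=' edit students made, and the print-statement tracing they said they preferred.

```java
// Invented illustration of the debugging behaviours described above.
public class LoopTrace {

    // Correct but unfamiliar-looking: sums the even numbers below n by
    // stepping the counter in twos -- the kind of statement students
    // flagged as 'suspicious' simply because it looked unusual.
    static int sumEvens(int n) {
        int sum = 0;
        for (int count = 0; count < n; count += 2) {
            sum += count;
        }
        return sum;
    }

    // The kind of speculative edit students made: '<' changed to '<='
    // with no hypothesis about what the change should fix. The loop now
    // also includes n itself, silently changing the result.
    static int sumEvensEdited(int n) {
        int sum = 0;
        for (int count = 0; count <= n; count += 2) {
            sum += count;
        }
        return sum;
    }

    public static void main(String[] args) {
        // The print-statement tracing most students preferred over the debugger.
        for (int count = 0; count < 10; count += 2) {
            System.out.println("count = " + count);
        }
        System.out.println(sumEvens(10));       // 0+2+4+6+8 = 20
        System.out.println(sumEvensEdited(10)); // 0+2+4+6+8+10 = 30
    }
}
```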

Problems with BlueJ. Compiler and Java exception error messages:
 Students failed to take notice of them.
 Students failed to understand what the messages were telling them.
Status bar:
 Students failed to use it to distinguish between when the machine is busy and when it is free.
'Remove' option on the object workbench:
 A problem for new users learning OO, language syntax and BlueJ at the same time.
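A generic example (not one of the study's five bugs) of the kind of Java runtime error message students overlooked: run uncaught, the code below terminates with a stack trace naming the exception type, the bad index, and the offending line.

```java
// Generic example of a Java exception message of the kind students
// failed to read or understand.
public class MessageDemo {

    static int lastElement(int[] values) {
        return values[values.length];   // BUG: the last valid index is length - 1
    }

    public static void main(String[] args) {
        try {
            lastElement(new int[] {1, 2, 3});
        } catch (ArrayIndexOutOfBoundsException e) {
            // e.g. "Index 3 out of bounds for length 3" (exact wording
            // varies between JVM versions)
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

The message pinpoints both the cause and the location, so students who ignore it are discarding exactly the information a debugging session needs.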

Conclusion. Final-year and 2nd-year students do not possess independent debugging and problem-solving ability. Students cannot use the debugger, even though:
 They were shown the correct workings of the program beforehand.
 The graphical output makes it easy to understand 'what' is happening.
 They could ask questions at any time.

Conclusion (cont'd). Students lack understanding of Java and OO. We need to find ways of teaching and developing debugging and problem-solving ability:
 Enforcing use of the debugger; students need to be made aware of how it facilitates exploration.
The ways BlueJ handles its error messages, displays its status bar and presents the object workbench need to be investigated further for possible usability issues and redesign.

Further work. We can now move forward and construct a usability framework specific to BlueJ. Redesign the way error messages are displayed:
 Relocate the error message area to another section of the screen.
 Remove the error message area and use pop-up windows.
 Keep the error message area and better direct the user's attention to it.
Investigate further what new BlueJ users think the workbench 'remove' option does. Relocate the status bar so that it encourages students to use it to determine when they should perform their next action.