
1 Empirical Evaluation of Web Survey Software Tools: Powerful or Friendly?
Vasja Vehovar, Nejc Berzelak, Katja Lozar Manfreda, Tina Horvat (University of Ljubljana, Faculty of Social Sciences)
Matjaž Debevc (University of Maribor, Faculty of Electrical Engineering and Computer Science)
General Online Research Conference – GOR 2009, April 8, 2009

2 Introduction
Reasons for evaluation:
- Many software tools for web surveys are on the market (more than 350 entries in the WebSM database at www.websm.org).
- Large differences in:
  - number of features (powerfulness);
  - usability (user-friendliness);
  - prices and costs.
Hence the problem of selection.

3 Introduction
The problem of selection
Users often experience that more powerful computer applications are harder to use and less user-friendly. Key questions:
- Is this also true for web survey software?
- How can usability be evaluated, and is it important at all?
- What other criteria should be taken into account when deciding on a specific web survey software tool?

4 Introduction
Different approaches to evaluation:
1. Availability of features in selected software tools.
2. Speed of use and other aspects of usability.
3. Specific requirements of the research organization (project types and size, future needs, organizational specifics).
4. Full cost-benefit analysis (cost models, support, learning, documentation, technical requirements, ...).

5 State of the art
Past evaluations
Only a few evaluations are available:
- Crawford (2002, 2006): review of main characteristics and features of current and future web survey software.
- Kaczmirek (2004, 2008): general overview of web survey software tools and criteria for evaluating their capabilities.
- Berzelak et al. (2006): basic features of a sample of software tools on the market.

6 State of the art
WebSM database overview
Overview of the basic characteristics of all software tools in the WebSM database. (Predominantly software tools with English and German interfaces are included; there may also be tools in other languages.)

7 State of the art
WebSM database overview (continued)
Brief summary:
- 94% closed source, only a few open source;
- 68% completely chargeable, 11% completely free of charge (including open source applications);
- 28% require the user's own server to host the survey questionnaire.

8 State of the art
WebSM database overview (continued)
Code availability (chart; N=351)

9 State of the art
WebSM database overview (continued)
Pricing category (chart; N=353)

10 State of the art
WebSM database overview (continued)
Language of user interface (N=351)

Language   Percent
English    91%
German     19%
French     7%
Dutch      5%
Spanish    4%
Italian    3%
Swedish    3%
Other      15%

11 State of the art
WebSM database overview (continued)
Hosting survey questionnaire (chart; N=327)

12 State of the art
WebSM database overview (continued)
Offices in countries (N=300)

Country           Percent
USA               54
United Kingdom    17
Germany           16
Canada            11
Australia         6
Sweden            5
The Netherlands   4
France            4
Switzerland       4
Other             37

13 State of the art
Development between 2006 and 2008
13 basic features reviewed for 72 software tools.

14 Methodology of the current study
Sample selection
27 software tools fulfilling the following criteria were included:
- freely and immediately accessible trial with sufficient functionality;
- fully web-based user interface (no installation required);
- English user interface.
High-end solutions are likely to be excluded (lack of immediately accessible trials). The data are not used for making generalizations.

15 Methodology of the current study
Evaluation of features
Two experts reviewed the selected tools for available features. Several basic and advanced features were reviewed:
- questionnaire design (question types, simple/advanced skip logic, types of answer control, randomizations, multi-language support, visual customizations, mixed-mode support, ...);
- management of sample and survey access (automated e-mail invitation support, access restrictions, quota management, data editing, ...);
- data analysis and export.

16 Methodology of the current study
Evaluation of usability
Six first-time users created a predefined survey project, consisting of:
- questionnaire design (welcome and closing page, 5 different basic question types, inclusion of an image, one skip condition, different basic and intermediate question properties);
- survey activation;
- reporting and data export.
All users had completed basic training in web survey implementation beforehand.

17 Methodology of the current study
Evaluation of usability (continued)
SUMI (Software Usability Measurement Inventory): 50 items measuring five dimensions of usability:
- efficiency;
- affect;
- helpfulness;
- control;
- learnability.
Automated measurement of time and number of clicks for each part of the survey project.

18 Results Number of features

19 Results
Time and mouse clicks: rho = 0.75, p < 0.01

20 Results
Time and SUMI scores: rho = -0.40, p < 0.05

21 Results
Number of features and SUMI scores: rho = 0.08

22 Results
Relations between different dimensions (Spearman's rank correlation coefficients)

SUMI dimension   Median time for quest. design   Number of features
Global           -0.40*                          0.08
Efficiency       -0.52**                         0.12
Affect           -0.40*                          0.12
Helpfulness      -0.32                           0.11
Control          -0.45*                          0.13
Learnability     -0.46*                          0.12

* p < 0.05; ** p < 0.01
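The correlations above are Spearman's rank correlation coefficients (rho). As a reminder of how this statistic is computed, here is a minimal, self-contained sketch: both variables are converted to ranks (with ties getting average ranks) and Pearson's correlation is taken on the ranks. The numbers in the example are illustrative only, not the study's raw measurements.

```python
# Minimal sketch of Spearman's rank correlation (rho).
# Example data are illustrative, not taken from the study.

def ranks(xs):
    """Return 1-based ranks, assigning tied values their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over a run of tied values.
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Perfectly monotone data give rho = 1.0; reversed order gives -1.0.
print(spearman_rho([1, 2, 3, 4, 5], [10, 20, 30, 40, 50]))  # -> 1.0
print(spearman_rho([1, 2, 3, 4, 5], [50, 40, 30, 20, 10]))  # -> -1.0
```

In practice one would use a library routine such as `scipy.stats.spearmanr`, which also reports the p-values shown in the table.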

23 Conclusions
Consideration of usability aspects
Two aspects of usability were evaluated among first-time software users:
- efficiency (measured using time);
- satisfaction (measured using SUMI).
Strong correlation between the two aspects. No evidence of an explicit tradeoff between usability and powerfulness (the picture might even be reversed). A larger number of evaluators and the inclusion of high-end software tools are necessary to further clarify the picture and obtain more reliable results.

24 Conclusions
Does usability matter?
Higher usability means higher efficiency and higher satisfaction. Usability evaluation is important for successful and effective use of the software tool. The criteria and the way of measuring usability might strongly depend on organizational specifics (e.g. who implements web surveys and how frequently).

25 Conclusions
Other important selection criteria:
- Availability of needed features and services, and flexibility for changing needs.
- Technically and methodologically sound operation and compliance with standards.
- Costs and available resources:
  - software prices;
  - equipment;
  - personnel and knowledge.
- Other specific requirements.

26 Empirical Evaluation of Web Survey Software Tools: Powerful or Friendly? Thank you!
