Virtual University: Human-Computer Interaction
Lecture 30: Evaluation – Part II
Imran Hussain, University of Management and Technology (UMT)
In the Last Lecture
Introduction to evaluation:
- What evaluation is
- The significance and importance of evaluation
- Different evaluation paradigms
- The techniques that constitute each paradigm
Evaluation
The process of systematically collecting data that informs us about what it is like for a particular group of users to use a product for a particular task in a certain type of environment.
In Today's Lecture
- A framework to guide the evaluation process
- Usability testing
A Framework to Guide Evaluation: DECIDE
The DECIDE framework has six phases:
- Determine the overall goals that the evaluation addresses.
- Explore the specific questions to be answered.
- Choose the evaluation paradigm and techniques to answer the questions.
- Identify the practical issues that must be addressed, such as selecting participants.
- Decide how to deal with the ethical issues.
- Evaluate, interpret, and present the data.
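The six phases above are sequential, so a team can treat them as an ordered checklist. A minimal sketch (not part of the lecture; the `EvaluationPlan` class and its methods are invented for illustration) that tracks which DECIDE phases remain:

```python
from dataclasses import dataclass, field

# The six DECIDE phases, in the order the framework prescribes.
DECIDE_PHASES = [
    "Determine the overall goals the evaluation addresses",
    "Explore the specific questions to be answered",
    "Choose the evaluation paradigm and techniques",
    "Identify practical issues (e.g. selecting participants)",
    "Decide how to deal with ethical issues",
    "Evaluate, interpret, and present the data",
]

@dataclass
class EvaluationPlan:
    name: str
    completed: set = field(default_factory=set)  # indices of finished phases

    def complete(self, phase_index: int) -> None:
        self.completed.add(phase_index)

    def next_phase(self):
        """Return the first DECIDE phase not yet completed, in order."""
        for i, phase in enumerate(DECIDE_PHASES):
            if i not in self.completed:
                return phase
        return None

plan = EvaluationPlan("e-ticketing study")
plan.complete(0)  # goals have been determined
print(plan.next_phase())  # prints the "Explore ..." phase
```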
Determine the Goals
- What are the high-level goals of the evaluation?
- Who wants it, and why?
Explore the Questions
To make the goals operational, specific questions must be answered. For example:
- What are customers' attitudes toward these new tickets?
- Do customers have adequate access to computers to make bookings?
- Are they concerned about security?
- Does this electronic system have a bad reputation?
Choose the Evaluation Paradigm and Techniques
Identify the Practical Issues That Must Be Addressed
- Users and participants
- Facilities and equipment
- Schedule and budget constraints: time and budget are important considerations to keep in mind.
- Expertise: does the evaluation team have the expertise needed to carry out the evaluation?
Decide How to Deal with the Ethical Issues
Evaluate, Interpret, and Present the Data
- Reliability: the reliability, or consistency, of a technique is how well it produces the same results on separate occasions under the same circumstances.
- Validity: validity is concerned with whether the evaluation technique measures what it is supposed to measure.
- Biases: biases occur when the results are distorted.
- Ecological validity: ecological validity is concerned with how the environment in which an evaluation is conducted influences or even distorts the results.
Usability Testing
Plan and prepare for the test:
- Planning the usability test
- Defining goals and concerns
- Deciding on participants
- Recruiting participants
- Selecting and organizing tasks to test
- Deciding how to measure usability
- Preparing test materials
- Preparing the test environment
- Preparing the test team
- Conducting a pilot test
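For the "deciding how to measure usability" step, two common quantitative measures are task success rate and mean time-on-task. A hypothetical sketch with invented pilot data (the participant records and field names are illustrative, not from the lecture):

```python
# Invented pilot-test results: did each participant complete the
# task, and how long did their attempt take?
pilot_results = [
    {"participant": "P1", "success": True,  "seconds": 48.2},
    {"participant": "P2", "success": True,  "seconds": 61.0},
    {"participant": "P3", "success": False, "seconds": 120.0},
    {"participant": "P4", "success": True,  "seconds": 55.4},
]

# Fraction of participants who completed the task.
success_rate = sum(r["success"] for r in pilot_results) / len(pilot_results)

# Time-on-task is commonly reported for successful attempts only.
times = [r["seconds"] for r in pilot_results if r["success"]]
mean_time = sum(times) / len(times)

print(f"success rate: {success_rate:.0%}")    # prints "success rate: 75%"
print(f"mean time-on-task: {mean_time:.1f}s")
```

Measures like these should be chosen before the test so that the selected tasks actually produce the data needed to answer the evaluation questions.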
Usability Testing
Define the goals and concerns, and then decide who the participants in the usability test should be.
Selecting Tasks, Creating Task Scenarios, and Preparing Test Materials