Improving the Usability and Accessibility of Voting Systems and Products
Dr. Sharon Laskowski
July 9, 2004 TGDC Meeting

3. Human Factors/Usability Perspective on Voting Systems: Voters
- Cognitive and physical nature of the voters
- Physical environment
- Psychological environment
- Voting product
- Usability is determined by the demands of the system and the voter's ability to perform under those demands

4. Measuring Accessibility and Usability
- Accessibility: the degree to which a system is available to, and usable by, individuals with disabilities
- Usability: a measure of the effectiveness, efficiency, and satisfaction achieved by a specified set of users performing specified tasks with a given product
- Metrics: errors causing a vote cast not as intended or a vote not cast (errors prior to success), and time to cast a vote
- Designing and measuring process:
  - User-centered design
  - Diagnostic usability evaluation
  - Testing performance: usability testing
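To make these definitions concrete, here is a minimal sketch in Python of the metrics named on this slide. It is an illustration only: the Session record, its field names, and the function name are assumptions for this example, not part of any NIST or VSS test protocol.

```python
# Minimal sketch of the slide's usability metrics; the Session record
# and all field names are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Session:
    ballot_cast: bool        # did the voter manage to cast a ballot at all?
    cast_as_intended: bool   # does the cast ballot match the voter's intent?
    seconds_to_cast: float   # time from ballot activation to casting

def usability_metrics(sessions: list[Session]) -> dict[str, float]:
    assert sessions, "need at least one test session"
    n = len(sessions)
    not_cast = sum(1 for s in sessions if not s.ballot_cast)
    miscast = sum(1 for s in sessions if s.ballot_cast and not s.cast_as_intended)
    times = [s.seconds_to_cast for s in sessions if s.ballot_cast]
    return {
        # effectiveness: share of voters whose ballot was cast as intended
        "effectiveness": (n - not_cast - miscast) / n,
        # the two error types named on the slide
        "vote_not_cast_rate": not_cast / n,
        "vote_not_as_intended_rate": miscast / n,
        # efficiency: mean time to cast among voters who finished
        "mean_seconds_to_cast": sum(times) / len(times) if times else float("nan"),
    }
```

For example, usability_metrics([Session(True, True, 212.0), Session(True, False, 340.0)]) reports an effectiveness of 0.5, a vote-not-as-intended rate of 0.5, and a mean time to cast of 276 seconds.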

5. State of Usability of US Voting Systems
- In general, voting systems have not been measured for usability, nor have they been developed using a user-centered design process
- We do not know the degree to which voters cast their votes NOT as they intended due to confusion with the user interface
- Note observations by the Caltech/MIT Voting Technology Project, Herrnson et al., and others such as Doug Jones

6. Design and Performance Standards
- Design standards: how the product is designed
  - For example: font size, ballot instructions
- Performance standards: how the product functions
  - No overvoting (tested by demonstration)
  - Time to cast a vote, failures in casting the vote as intended
- Requires:
  - Measuring with users against benchmarks
  - Sample ballots of different complexity
  - Well-defined test protocols and user groups
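As a hedged illustration of "measuring with users against benchmarks", the fragment below compares the metrics from the earlier sketch against pass/fail thresholds. The threshold values are invented for this example; real benchmarks would come from the baseline user data called for later in this presentation.

```python
# Illustrative benchmark check; the threshold values are hypothetical,
# not taken from any published standard.
BENCHMARKS = {
    "effectiveness": 0.98,          # at least 98% of ballots cast as intended
    "mean_seconds_to_cast": 300.0,  # mean time to cast at most five minutes
}

def meets_benchmarks(metrics: dict[str, float]) -> bool:
    """True only if measured performance clears every benchmark threshold."""
    return (metrics["effectiveness"] >= BENCHMARKS["effectiveness"]
            and metrics["mean_seconds_to_cast"] <= BENCHMARKS["mean_seconds_to_cast"])
```

The point of the design is that the standard constrains measured outcomes, not the product's internal design choices.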

7. Measurement for Qualification and Certification
- We currently cannot measure the usability of voting systems (e.g., select/deselect)
- We need a high degree of usability
- Following design guidelines does not necessarily ensure usability
- Usability engineering provides measurement methods, but not necessarily to the degree we need specifically for voting
- We need standards and conformance tests that do measure the degree of usability and accessibility, if systems are going to be qualified and certified for usability and accessibility
[Diagram: a spectrum of evaluation methods, from informal evaluation (easy, but variable) through conformance testing (feasible and reproducible) to rigorous research and experiments (complex, but reliable)]

8. Current Voting Standards and Testing
- The current VSS has some accessibility standards, but only a usability appendix
- ITAs currently perform qualification tests
- Can we test for usability and accessibility?
  - Standards must be clear, unambiguous, and testable
  - Requires procedures for testing the voting product against the standards (conformance testing), for example by inspection, demonstration, or operation
- IEEE P1583 draft standards: Task Group 3 has made some progress
- But lack of resources and a small vendor base have been barriers to developing performance-based standards, benchmarks, and conformance tests

9. The HF Report
- Our report recommends an approach that will produce measurable voting system standards for usability and accessibility
  - Does not need a lot of research
  - Does need:
    - Expertise in conformance test development
    - Some applied research to develop user testing protocols
    - Neutral third parties to accomplish this
- No cheap, quick fixes
  - Could require some usability testing to avoid major usability blunders, but this is no guarantee

10. Ten Recommendations
1. Performance-based, high-level usability standards
2. A complete set of user-related functional requirements
3. Avoid low-level design specifications; use only product design requirements that have been validated as necessary
4. Applied research to support the development of usability and accessibility standards
5. Review current requirements (Access Board, the current VSS, draft IEEE standards) for possible adoption
6. Ballot design guidelines
7. Guidelines for facility and equipment layout; design and usability testing guidelines for vendor- and state-supplied documentation and training materials
8. Vendors should incorporate a user-centered design approach
9. Conformance tests for voting products against the applicable accessibility requirements
10. A valid, reliable, repeatable, and reproducible process for usability conformance testing of voting products against the standards described in recommendation 1, with agreed-upon usability pass/fail requirements

11. Most Critical Need
- A set of usability standards for voting systems that are performance-based, with:
  - Objective measures
  - Conformance test procedures
- Voting products and systems can then be certified as meeting the usability standards
- This is the only way to guarantee high levels of usability
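One way to make such a pass/fail rule objective and reproducible is to require that a statistical lower bound on the measured rate, rather than the raw sample rate, clear the benchmark, so a pass is unlikely to be a lucky sample. The sketch below uses a one-sided Wilson score lower bound; this particular statistic, the benchmark value, and the confidence level are my assumptions for illustration, not from the presentation or any standard.

```python
# Hedged sketch of a reproducible pass/fail rule: pass only if the lower
# confidence bound on the cast-as-intended rate clears the benchmark.
# The benchmark and confidence level are illustrative assumptions.
import math

def wilson_lower_bound(successes: int, n: int, z: float = 1.645) -> float:
    """One-sided ~95% Wilson score lower bound on a success proportion."""
    if n == 0:
        return 0.0
    p = successes / n
    center = p + z * z / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (center - margin) / (1 + z * z / n)

def passes_standard(cast_as_intended: int, n_voters: int, benchmark: float = 0.95) -> bool:
    return wilson_lower_bound(cast_as_intended, n_voters) >= benchmark
```

With 97 of 100 test voters casting as intended, the lower bound is roughly 0.93, so the system would fail a 0.95 benchmark even though the raw rate is 0.97; a larger or cleaner sample would be needed to pass.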

12. Roadmap (Details in Report)
- Short term: encourage usability and user-centered design
- Long term:
  - Use the best of the IEEE and other standards, plus ballot design guidance
  - Develop user test procedures
  - Collect user data to define performance baselines
  - Develop performance standards and conformance tests