VVSG, Part 1, Chapter 3: Usability, Accessibility, and Privacy
December 6, 2007
Dr. Sharon Laskowski


Page 2: Goal of the Usability, Accessibility, and Privacy Requirements
- The goal of these requirements is to provide a voting system that voters can use comfortably, efficiently, and with justified confidence that they have cast their votes correctly, while preserving the privacy of the contents of a voter's ballot.
- The focus is primarily on the voter's interaction with the voting system, but there are also requirements for poll workers.

Page 3: Key Concept: Universal Usability
- View the voting station as a public kiosk designed for "everyone," not as special-purpose equipment.
- Many people have some special needs but do not identify themselves as having disabilities: an aging population, and voters with language or reading issues.
- Move as much accessibility into the general voting station as possible.
- ALL usability requirements apply to the accessible voting station (ACC-VS).

Page 4: Key Concept: Accessibility
- The goal is to make the voting system independently accessible to as many voters as possible.
- The requirements are harmonized with other accessibility standards; we collaborated with the US Access Board.
- Section 3.3 is organized according to the type of disability being addressed.
- NOTE: features intended primarily to address one kind of disability may assist voters with other kinds.

Page 5: Key Concept: Accessible Voter Verification
- Software independence (SI) for security, and its implications for accessibility, has been a large issue for the TGDC.
- SI in current systems can only be achieved using a paper independent voter-verifiable record (IVVR).
- But paper by itself is not accessible: some voters cannot verify it directly.
- So there are a number of requirements that address how to make the IVVR accessible, including "observational testing" to verify the reliability of indirect means of verification.

Page 6: Key Concept: Design vs. Performance
Design requirements:
- Specify the "look and feel" of the voter interface for general classes of voting systems.
- Based on best practice from other, similar domains.
- Typically tested by inspection or expert review.
Performance requirements:
- Based on usability testing best practices.
- Specify a benchmark that must be met when voters interact with the system.
- Tested in a tightly controlled environment with human test participants.

Page 7: Usability Performance Requirements
Goal: develop a test method that distinguishes systems with poor usability from those with good usability. The method is:
- Based on performance, not on evaluation of the design.
- Able to reliably detect and count the errors one might see when voters interact with a voting system.
- Reproducible by test laboratories.
- Technology-independent.

Page 8: Calculating Benchmarks
- Given such a test method, benchmarks can be calculated: a system meeting the benchmarks has good usability and passes the test.
- The values chosen for the benchmarks become the performance requirements.

Page 9: Usability Testing for Certification in a Lab
- We measure the performance of the system in a lab, controlling for other variables (including the test participants), so that we measure the effect of the system itself on usability.
- The test ballot is designed to detect different types of usability errors and to be typical of many types of ballots.
- The test environment is tightly controlled, e.g., for lighting, setup, and instructions, and no assistance is given.
- The test participants are chosen so that repeated tests of the same system reliably yield the same performance.

Page 10: Usability Testing for Certification in a Lab (continued)
- Test participants are told exactly how to vote, so errors can be measured.
- The test results measure the relative degree of usability between systems; they are NOT intended to predict performance in a specific election:
  - The ballot is different.
  - The environment is different (e.g., help is provided).
  - The voter demographics are different: a general sample of the US voting population is never truly representative, because all elections are "local."

Page 11: Components of the Test Method (Voting Performance Protocol)
- A well-defined test protocol that describes the number and characteristics of the "voters" participating in the test and how to conduct it,
- A test ballot that is relatively complex, to ensure that the entire voting system is evaluated and significant errors are detected,
- Instructions to the "voters" on exactly how to vote, so that errors can be accurately counted,
- A description of the test environment,
- A method of analyzing and reporting the results, and
- Performance benchmarks with associated threshold values.

Page 12: Performance Benchmarks: Recap of Research
- Validity: tested on 2 different systems with 47 participants. The test protocol detected differences between the systems and produced the errors that were expected.
- Repeatability/Reliability: 4 tests on the same system with 195 participants yielded similar results.

Page 13: Benchmark Tests
- 4 systems tested: a selection of DREs, EBMs, and PCOS systems
- 187 test participants
- 5 measurements: 3 with benchmark thresholds, 2 whose values are reported only

Page 14: The Performance Measures: Base Accuracy Score
- We first count the errors test participants made on the test ballot. There are 28 voting opportunities; for each participant we count how many were completed correctly.
- We then calculate the Base Accuracy Score: the mean percentage of all ballot choices that are correctly cast by the test participants.
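The Base Accuracy Score calculation can be sketched in a few lines. This is an illustrative example, not NIST's scoring code; the function name and the sample counts are ours. Only the figure of 28 voting opportunities comes from the slide.

```python
# Illustrative sketch: Base Accuracy Score from per-participant counts
# of correctly cast choices on the test ballot.
OPPORTUNITIES = 28  # voting opportunities on the test ballot (from the slide)

def base_accuracy_score(correct_counts):
    """Mean percentage of ballot choices cast correctly across participants."""
    per_participant = [100.0 * c / OPPORTUNITIES for c in correct_counts]
    return sum(per_participant) / len(per_participant)

# Three hypothetical participants with 28, 27, and 26 correct choices.
print(round(base_accuracy_score([28, 27, 26]), 2))  # 96.43
```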

Page 15: Total Completion Score
We calculate 3 effectiveness measures. The first is the Total Completion Score: the percentage of test participants who were able to complete the process of voting and have their ballot choices recorded by the system.

Page 16: Voter Inclusion Index (VII)*
- A measure of overall voting accuracy that uses both the Base Accuracy Score and its standard deviation.
- If 2 systems have the same Base Accuracy Score (BAS), the system with the larger variability gets the lower VII.
- The formula uses S, the standard deviation, and LSL, a lower specification limit that spreads out the measurement (we used .85).
*The range is 0 to ~1, assuming the best value is 100% BAS with S = .05, but it may be higher.
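The formula itself was rendered as an image in the original slide and did not survive the transcript. A process-capability-style index, VII = (BAS - LSL) / (3 * S), is consistent with the footnote: it yields exactly 1 when BAS = 100% and S = .05 with LSL = .85. The sketch below uses that reconstructed form; treat it as our assumption, not the normative definition.

```python
# Reconstructed formula (our assumption; the slide's formula image is
# missing from the transcript): VII = (BAS - LSL) / (3 * S).
def voter_inclusion_index(bas, s, lsl=0.85):
    """Accuracy index that penalizes variability: larger S gives a lower VII."""
    return (bas - lsl) / (3.0 * s)

# Sanity check against the footnote: BAS = 100%, S = .05 gives VII of 1.
print(round(voter_inclusion_index(1.00, 0.05), 3))  # 1.0
# Same BAS, more variability => lower VII, as the slide describes.
print(round(voter_inclusion_index(0.95, 0.05), 3))  # 0.667
print(round(voter_inclusion_index(0.95, 0.10), 3))  # 0.333
```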

Page 17: Perfect Ballot Index (PBI)*
- The ratio of the number of cast ballots containing no erroneous votes to the number of cast ballots containing at least one error.
- This measure deliberately magnifies the effect of even a single error: it identifies systems that may have a high Base Accuracy Score but still have at least one error made by many participants, which might be caused by a single voting system design problem that leads participants to make similar errors.
- The higher the value of the index, the better the performance of the system.
*The range is 0 to infinity (infinite if no errors occur at all).
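The PBI definition translates directly to code. A minimal sketch (the function name and the sample data are ours, for illustration only):

```python
# Illustrative sketch: Perfect Ballot Index from per-ballot error counts.
# PBI = (# ballots with zero errors) / (# ballots with at least one error).
def perfect_ballot_index(errors_per_ballot):
    perfect = sum(1 for e in errors_per_ballot if e == 0)
    flawed = sum(1 for e in errors_per_ballot if e > 0)
    if flawed == 0:
        return float("inf")  # no errors at all: the index is unbounded
    return perfect / flawed

# 7 perfect ballots and 3 ballots with errors give PBI = 7/3, about 2.33.
print(round(perfect_ballot_index([0] * 7 + [1, 2, 1]), 2))  # 2.33
```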

Page 18: Efficiency and Confidence Measures
- Average Voting Session Time: the mean time taken for test participants to complete the process of activating, filling out, and casting the ballot.
- Average Voter Confidence: the mean confidence level expressed by the voters that they believed they voted correctly and that the system successfully recorded their votes.
- Neither of these measures was correlated with effectiveness; most people were confident in the system and in their ability to use it.

Page 19: Benchmark Thresholds
Voting systems, when tested by laboratories designated by the EAC using the methodology specified here, must meet or exceed ALL of these benchmarks:
- Total Completion Score of 98%
- Voter Inclusion Index of .35
- Perfect Ballot Index of 2.33
Note: 2 of the systems we tested failed the VII, and 1 failed the PBI.
Voting session time and voter confidence are reported but have no thresholds.
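A pass/fail check against the three thresholds is straightforward. This sketch assumes the scores have already been computed; the dictionary keys and function name are ours, and the Total Completion Score is expressed here as a fraction rather than a percentage.

```python
# Illustrative sketch: check a system's scores against the three VVSG
# benchmark thresholds from this slide (the key names are ours, not NIST's).
THRESHOLDS = {
    "total_completion_score": 0.98,  # 98%, expressed as a fraction
    "voter_inclusion_index": 0.35,
    "perfect_ballot_index": 2.33,
}

def meets_benchmarks(scores):
    """A system passes only if it meets or exceeds ALL three thresholds."""
    return all(scores[name] >= limit for name, limit in THRESHOLDS.items())

passing = {"total_completion_score": 0.99,
           "voter_inclusion_index": 0.40,
           "perfect_ballot_index": 3.0}
failing = dict(passing, voter_inclusion_index=0.30)  # fails the VII only
print(meets_benchmarks(passing), meets_benchmarks(failing))  # True False
```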

Page 20: Additional Research
- Reproducibility: how much flexibility can be allowed in the test protocol? Will variability in test participants' experience, due to labs being located in different geographic regions, affect the results?
- Should we factor in an older or less-educated population? Benchmark thresholds are always tied, to some extent, to the demographics of the test participants.
- Accessible voting system performance?