
Designing Interfaces for Voting Machines Benjamin B. Bederson Computer Science Department Human-Computer Interaction Lab University of Maryland



1 Designing Interfaces for Voting Machines
Benjamin B. Bederson
Computer Science Department, Human-Computer Interaction Lab
University of Maryland
February 4, 2005

2 Frustrated Voters
• Voting technology and ballot design can influence election outcomes
• Minorities and the poor are more likely to cast their ballots on outdated systems
• Technology is in need of updating

3 When Interfaces Get in the Way
• Ballot design: butterfly ballot
• Interaction: hanging chad; changing a vote (i.e., how to unselect a candidate)
• Write-in problems:
  2004: A NY Times editorial reported on the San Diego mayoral election, where voters for write-in candidate Frye didn't darken a bubble.
  2002: The Mt. Airy, MD mayoral result went from Holt to Johnson to Holt, based on which spellings were deemed acceptable.

4 Usability Part of Larger Issues
• Florida 2000: traditional technologies flawed
  Mechanical levers: break down; difficult to maintain, store, and transport
  Paper ballots: errors; difficult to process and interpret
  Punch cards: hanging chad, etc.
• Economics de-emphasizes usability
• Focus on security de-emphasizes usability
• Lack of research because of proprietary systems and the number of designs

5 Our Study
• Funded by:
  NSF (National Science Foundation), Grant #0306698, "Project to Assess Voting Technology and Ballot Design"
  Carnegie Corporation, Grant #D05008
• Consists of:
  Expert review <= focus today
  Lab study <= focus today
  New technology <= focus today
  Field test
  Natural experiments
• Co-researchers:
  Paul Herrnson, Univ. of Maryland (project leader)
  Michael Traugott & Fred Conrad, Univ. of Michigan
  Richard Niemi, Univ. of Rochester
Notes: These are small-scale studies meant to demonstrate potential challenges and inform future research; they do not address accuracy, affordability, accessibility, durability, or ballot design. This represents partial results midway through a 3-year study; future work will address accuracy, ballot design, and more.

6 Partners
• Federal Election Commission (FEC)
• Maryland State Board of Elections
• National Institute of Standards and Technology (NIST)
• Vendors: Diebold, Hart InterCivic, ES&S, NEDAP, Avante
• Advisory Board

7 Machines Looked At
• Avante Vote Trakker
• Diebold AccuVote TS
• ES&S Optical Scan
• Hart eSlate
• NEDAP LibertyVote
• UMD Zoomable system
Notes: Machines as available for testing; some have been deployed with different options, and some have since been updated. Vendors (except NEDAP) implemented ballots for best presentation. Machines were selected to represent specific features.

8 Avante Vote Trakker
All photos taken by our research group – not provided by vendors.

9 Diebold AccuVote TS

10 ES&S Optical Scan

11 Hart eSlate

12 NEDAP LibertyVote

13 UMD Zoomable System Demo

14 Expert Review
• 12 HCI experts, one evening:
  1 voting interaction specialist
  1 government usability practitioner
  5 academic HCI researchers
  6 private usability practitioners
• Each used 6 machines:
  2 ballot types where available (office block, party column)
  ~15 minutes each
• Asked to list concerns
• Followed worst-case perspectives of: novice voters, poor language skills, older voters, stressed voters, system errors
Notes: Most experts did not have a background in voting systems; subjective responses require interpretation.

15 Expert Review Rating System
• Each issue given a severity rating (1 = low, 5 = high)
• Concerns listed with average severity and number of instances
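The per-machine concern lists on the following slides pair each concern with its average severity and the number of experts who raised it. A minimal sketch of that aggregation (the function name and sample ratings are hypothetical, not from the study's materials):

```python
from collections import defaultdict

def summarize_concerns(ratings):
    """Aggregate expert ratings into (average severity, instance count) rows.

    `ratings` is a list of (concern, severity) pairs, one per expert mention,
    with severity on the 1 (low) to 5 (high) scale used in the study.
    Rows are sorted highest average severity first, as on the slides.
    """
    by_concern = defaultdict(list)
    for concern, severity in ratings:
        by_concern[concern].append(severity)
    return sorted(
        ((sum(v) / len(v), len(v), concern) for concern, v in by_concern.items()),
        reverse=True,
    )

# Hypothetical ratings from experts reviewing one machine.
ratings = [
    ("Write-in requires last name", 5),
    ("No previous button", 4),
    ("No previous button", 4),
    ("Auto-forward confusing", 3),
]
for avg, count, concern in summarize_concerns(ratings):
    print(f"{avg:.1f}  {count}  {concern}")
# → 5.0  1  Write-in requires last name
#   4.0  2  No previous button
#   3.0  1  Auto-forward confusing
```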

16 Avante VoteTrakker Concerns (average severity, number of instances)
5.0  1  Write-in requires last name
4.0  2  Record shown too fast and without instructions
4.0  2  No previous button [1]
3.0  2  Auto-forward confusing [1]
3.0  1  Smiley face inappropriate
3.0  1  Title too small
3.0  1  Instruction text poorly written
3.0  1  Didn't like this one at all
3.0  1  "Cast ballot to continue" not clear - it actually finishes [1]
[1] Note: Navigation focuses on progress, with later review, by design

17 Avante VoteTrakker (more I) Concerns (average severity, number of instances)
3.0  1  Timed out, but didn't see warning
3.0  1  Angle of machine is awkward
3.0  1  Lot of reflection on screen
3.0  1  Flashing instruction is distracting
3.0  1  Colors of text poor (green/white, black/blue)
3.0  1  No progress feedback
3.0  1  No way to cancel and leave [2]
3.0  1  No way to start over
3.0  1  "Please make selection" message is distracting
3.0  1  No error-checking on write-in [2]
[2] Note: Can time-out to cancel

18 Avante VoteTrakker (more II) Concerns (average severity, number of instances)
3.0  1  Write-in association very small
3.0  1  No way to go to end and cast ballot [3]
3.0  1  Lack of color on amendment screen may appear to be an error
3.0  1  Disabled button is "white", which is very difficult to understand
3.0  1  Cast ballot button requires 2 presses
3.0  1  Can't say "no" to paper record - so why bother?
3.0  1  Have to pick contrast/text size before starting
3.0  1  No instructions after starting
2.0  1  Not clear what to do at beginning [3]
[3] Note: By design, to minimize under-votes

19 Diebold AccuVote TS Concerns (average severity, number of instances)
5.0  1  Ballot review confusing; review colors don't match voting colors
5.0  1  No help on some screens
5.0  1  Write-in has no instructions
4.0  1  Contrast and text size controls not clear
4.0  1  Some font colors unclear (black on blue, red/blue)
4.0  1  Party not clearly indicated
4.0  1  Difficult to use while seated
4.0  1  Large font is good, but "issues" text runs over the screen display area, requiring arrow navigation
3.0  2  Wait icon is too computerish and not clear
3.0  1  Card hard to enter

20 Diebold AccuVote TS (more) Concerns (average severity, number of instances)
3.0  1  Poor depiction of voting vs. reviewing state
3.0  1  "Card not inserted" error needs a diagram
3.0  1  Buttons have poor visual affordance
3.0  1  Instructions refer to "backspace" key, but it is actually labeled "back"
3.0  1  Instructions unclear (e.g., "Vote for one")
3.0  1  Some text unclear (e.g., "2 of 4")
3.0  1  Multiple write-in unclear
3.0  1  Write-in not well associated with race being voted
1.0  1  Extra dots on help/instruction screens

21 ES&S Optical Scan Concerns (average severity, number of instances)
5.0  1  Instructions not mandatory, errors likely
5.0  1  Write-in has high error mode (enter name, but not fill in circle)
4.0  2  Changing-vote process is punitive - must start over, which could cause some to give up
4.0  1  Poor visual grouping (title could be associated with items below)
4.0  1  Could fold, bend, or tear ballot
4.0  1  No instructions to review ballot before submitting
4.0  1  Instructions to turn over page not conspicuous enough
3.5  2  Font size is fixed, and will be too small for some older and other voters
3.5  2  No error checking on under-vote
3.0  2  No error checking on over-vote

22 ES&S Optical Scan (more) Concerns (average severity, number of instances)
3.0  1  Should use different highlight/feedback that vote was correct
3.0  1  Why two sets of matching instructions?
3.0  1  Instructions somewhat difficult for voters with limited English proficiency
3.0  1  Instructions should say something about no extra marks on ballot
2.7  3  Needs a better table - low and shaky [1]
2.0  1  Seated operation awkward
1.0  1  "Vote in next column" unclear
1.0  1  Appears to be an entry field at top of column
[1] Note: Different cost/quality tables available

23 Hart eSlate Concerns (average severity, number of instances)
5.0  1  Combining summary and cast ballot confuses actual casting
4.0  1  No way to jump to end
4.0  4  Dial slow to learn, hard to use [1]
4.0  1  Red-on-blue text and light fonts hard to read
3.5  2  After reviewing, it's hard to get back to a choice to change it
3.5  2  Blue movement on screen is disconcerting
3.0  1  Cast ballot button didn't accept push - required 3 presses [1]
[1] Note: Compare to subjective/objective data later

24 Hart eSlate (more) Concerns (average severity, number of instances)
3.0  1  Poor progress indicator
3.0  1  May be confused with a touch screen
3.0  1  Can't clear entire vote and start over in one step
3.0  1  Write-in screen does not indicate office being voted for
3.0  1  Next/Prev and Dial ambiguous
3.0  1  Auto-forward on select, but not unselect (inconsistent interface)

25 NEDAP LibertyVote Concerns (average severity, number of instances)
5.0  2  Write-in message after OK is confusing
5.0  2  No way to confirm/review write-in name
5.0  1  "No vote" light should be a different color (difficult to see what wasn't finished)
5.0  1  No clear way to handle multiple write-ins
5.0  1  Poor feeling of privacy due to size
4.5  2  "Enter write-in" button doesn't seem to work
4.3  3  Under-vote message easy to miss
4.0  3  OK button for write-in too far away
4.0  2  Too much reflection
4.0  1  OK button with 4 arrows is weird
4.0  1  Propositions too far away
4.0  1  Hard to read/access from seated position

26 NEDAP LibertyVote (more) Concerns (average severity, number of instances)
4.0  1  Number pad unclear - what is it for?
4.0  1  Blue light coding (voted/unvoted) unclear
4.0  1  "Enlarge" scrollbar un-obvious (to left of little message screen)
4.0  1  Buttons hard to press, with poor tactile feedback
4.0  1  Scroll bar to right of message box unclear
3.7  3  Difficult to correct a vote
3.5  2  Write-in area too far away
3.0  1  "Partisan offices" unclear terminology
3.0  1  Can change language accidentally
3.0  1  Same color for race and candidate is unclear
3.0  1  Prefer sequence to "jump around" model of full-face ballot
2.0  1  No second chance to cast vote - review is implicit

27 NEDAP Actual Ballot

28 UMD Zoomable Concerns (average severity, number of instances)
4.0  3  Color of review & cast ballot buttons should be different than progress indicator and selected items
3.0  1  Not clear how to get started
3.0  1  Feels like a game - possibly inappropriate
3.0  1  "Not voted" confusing when multiple choices available
3.0  1  Peripheral races too visually confusing
2.5  2  Progress/navigation buttons are partly a progress indicator, but not clear enough
2.0  1  Overview buttons shouldn't split 4 sub-types

29 Lab Study
• 42 members of the Ann Arbor, MI community voted on 6 machines
  Paid $50 for 1-2 hours
  Different random orders for different people (Latin Square design)
  Over-selected for potential difficulty:
    Most (69%) >= 50 years old
    Most (62%) use computers once every 2 weeks or less
  Most (30) voted on an office-block ballot
  Indicated their intended (fictional) candidates by circling names on a paper form
  Study not controlled for prior experience, but Ann Arbor uses optical scan
• Data:
  Satisfaction ratings reported after voting on each machine
  Time measurement
  Videotaped interactions
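A Latin Square design rotates the machine order across participants so each machine appears in each position equally often, countering order effects such as fatigue or practice. A minimal sketch using a simple cyclic square (the study's actual counterbalancing scheme may have differed; the function name is our own):

```python
def latin_square_orders(machines):
    """Build a cyclic Latin square of presentation orders.

    Row i is the machine list rotated by i positions, so across the n rows
    every machine appears exactly once in every position.
    """
    n = len(machines)
    return [[machines[(i + j) % n] for j in range(n)] for i in range(n)]

machines = ["Avante", "Diebold", "ES&S", "Hart", "NEDAP", "UMD"]
orders = latin_square_orders(machines)
# If participants are assigned rows round-robin, 42 participants over
# 6 rows means each order is used 7 times.
for row in orders:
    print(row)
```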

30 Lab Study (more)
• Looked at:
  Time voters spend reading instructions
  Response to paper or on-screen ballots
  Response to the reporting of under- or over-voting
  Ability to change a vote
• Complications and malfunctions of DRE or optical scan readers

31 Lab Study – Satisfaction Data
• Usability studies typically measure: speed, accuracy, satisfaction
• We are currently reporting on two of these (speed, satisfaction)

32 “The voting system was easy to use”

33 “I felt comfortable using the system”

34 “Correcting my mistakes was easy”

35 “Casting a write-in vote was easy to do”

36 “Changing a vote was easy to do”

37 Lab Study - Time to Cast Ballot

38 Lab Study – Analysis Remains
• Why are some machines consistently most preferred and others least preferred?
• Detailed coding of video interactions is underway
• Planned analyses of video interactions:
  Tally of problems by machine that do and do not lead to unintended votes cast
  Explanation of satisfaction data in terms of voters' actions
• Remember that usability is only one characteristic of overall performance: accuracy, accessibility, affordability, durability, security, transportability, etc.

39 Future Parts of the Project
• Field test:
  Assess usability among a larger, more representative sample
  Assess impact of ballot designs on usability issues
  Assess accuracy on different systems
• Natural experiments:
  Assess impact of voting systems and ballot designs on over-voting, under-voting, straight-party voting, and other measures across jurisdictions and over time
  Assess impact of changing from one type of voting system (or ballot) to another

40 Implications and Reflections
• Voter intention is the key goal
• Usability is as important as security (as are accuracy, accessibility, affordability, and durability)
• Being able to update the interface is important (i.e., certification may be interfering with usability)
• The ballot/machine combination is important (i.e., one size doesn't fit all)
Note: This talk is available with the vendors' responses.
