1 TGDC Meeting, Jan 2011
VVSG 1.1 Test Suite Status
Mary Brady
National Institute of Standards and Technology
http://vote.nist.gov

2 Background
Status quo: labs have been testing to VVSG 1.0 (2005) using proprietary, custom tooling and review processes.
In 2007-08, NIST developed a set of public test suites for VVSG 2.0, to be used as part of the EAC Testing and Certification Program.
In 2009, to support VVSG 1.1, test methods for new and changed material were back-ported from the 2.0 test suites; the status quo prevails for everything else.

3 Why Public Test Suites?
To achieve consistency across testing labs and promote transparency of the testing process.
To review the VVSG for ambiguities, completeness, and correctness.
To assist manufacturers by providing precise test specifications.
To assist testing labs by lowering the overall cost of testing.

4 Test Development Timeline
2007-2008: VVSG 2.0 tests, internal (NIST voting team)
Summer 2008: VVSG 2.0 tests, expert review (VSTL, selected experts)
Fall 2008: VVSG 2.0 tests, general public (posted on web site)
2009: VVSG 1.1 tests (subset of VVSG 2.0 test suites), expert and public review
June 2010: VVSG 1.1 tests, EAC comments (nomenclature, traceability)
Spring 2011: VVSG 1.1 tests, VSTL review (work within VSTLs)
October 2011: VVSG 1.1 tests, procedural validation
December 2011: integrated into certification process

5 VVSG 1.1 Test Suite
The VVSG 1.1 test suite is based on the VVSG 2.0 test methods associated with back-ported requirements:
Accessibility and usability
Operational temperature and humidity
Electronic records, security specifications, and VVPAT
Core functionality, reliability, and accuracy
A new test method was developed for the updated software setup validation requirement.

6 Accessibility & Usability
System-independent test narratives with pass/fail criteria.
Highly structured process surrounding the usability test protocols for performance-based testing with test participants.
ISO Common Industry Format (CIF) for reporting usability test results.
CIF templates and how-to's for manufacturers and test labs.

7 Accidental Activation (EXAMPLE TEST CASE)
Input mechanisms SHALL be designed to minimize accidental activation.
Covers requirements:
3.2.6c Accidental Activation: Input mechanisms SHALL be designed to minimize accidental activation.
3.2.6c.i Size and Separation of Touch Areas: On touch screens, the sensitive touch areas SHALL have a minimum height of 0.5 inches and minimum width of 0.7 inches. The vertical distance between the centers of adjacent areas SHALL be at least 0.6 inches, and the horizontal distance at least 0.8 inches.
3.2.6c.ii No Repeating Keys: No key or control on a voting system SHALL have a repetitive effect as a result of being held in its active position.

8 Accidental Activation, cont. (EXAMPLE TEST CASE)
Test method includes 7 test requirements, covering 2 pages. Excerpt:
For touchscreen systems, the tester shall examine the touch areas for at least contests #4 (Governor) and #9 (County Commissioners). Using a ruler to measure distance and a stylus to perform the touching, the tester shall determine first that the touch areas used to vote for at least the first t candidates in each contest are separated as required.
F => If any vertical distance between centers of adjacent touch areas for voting is less than 0.6 inches, then, for requirement "Size and Separation of Touch Areas", the system fails.
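
Where a lab automates this geometry check (e.g., from a manufacturer-supplied screen layout), the pass/fail logic reduces to comparing dimensions and center-to-center distances against the 3.2.6c.i minima. A minimal Python sketch, assuming a hypothetical TouchArea record and a deliberately conservative adjacency rule (neither is specified by the test suite):

```python
# Hypothetical sketch of a requirement 3.2.6c.i check
# ("Size and Separation of Touch Areas"). The TouchArea record and
# the adjacency rule are illustrative, not from the NIST suite.

from dataclasses import dataclass

# Minima from requirement 3.2.6c.i, in inches.
MIN_HEIGHT, MIN_WIDTH = 0.5, 0.7
MIN_VERTICAL_GAP, MIN_HORIZONTAL_GAP = 0.6, 0.8

@dataclass
class TouchArea:
    center_x: float  # inches from screen origin
    center_y: float
    width: float
    height: float

def check_touch_areas(areas: list[TouchArea]) -> list[str]:
    """Return failure messages; an empty list means the check passes."""
    failures = []
    for i, a in enumerate(areas):
        if a.height < MIN_HEIGHT or a.width < MIN_WIDTH:
            failures.append(f"area {i}: size {a.width}x{a.height} in below minimum")
    for i in range(len(areas)):
        for j in range(i + 1, len(areas)):
            dx = abs(areas[i].center_x - areas[j].center_x)
            dy = abs(areas[i].center_y - areas[j].center_y)
            # Conservative adjacency rule: every pair of centers must be
            # separated by at least 0.6 in vertically or 0.8 in horizontally.
            if dy < MIN_VERTICAL_GAP and dx < MIN_HORIZONTAL_GAP:
                failures.append(
                    f"areas {i},{j}: centers only {dx:.2f} in apart "
                    f"horizontally and {dy:.2f} in vertically"
                )
    return failures
```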

9 Hardware
Operational temperature and humidity

10 Operating Temperature and Humidity (EXAMPLE TEST CASE)
Covers requirements:
Volume I, Section 4.1.2.13, Environmental Control - Operating Environment: Voting systems shall be capable of operation in temperatures ranging from 41 °F to 104 °F (5 °C to 40 °C) and relative humidity from 5% to 85%, non-condensing. For testing information, see Volume II, Section 4.7.1.
Volume II, Section 4.7.1, Operating Temperature and Humidity Tests: All voting systems shall be tested in accordance with the appropriate procedures of MIL-STD-810D, "Environmental Test Methods and Engineering Guidelines".

11 Operating Temperature and Humidity, cont. (EXAMPLE TEST CASE)
Covers requirements, cont.:
Operating Temperature: All voting systems shall be tested according to the low temperature and high temperature testing specified by MIL-STD-810D: Method 502.2, Procedure II - Operation, and Method 501.2, Procedure II - Operation, with test conditions that simulate system operation.
Operating Humidity: All voting systems shall be tested according to the humidity testing specified by MIL-STD-810D: Method 507.2, Procedure II - Natural (Hot-Humid), with test conditions that simulate system operation.

12 Operating Temperature and Humidity, cont. (EXAMPLE TEST CASE)
Test method includes 15 steps, covering 3 pages. Excerpt:
Step 8: Set the chamber to 104 degrees Fahrenheit and 85% relative humidity (see Comment 1), observing precautions against thermal shock and condensation (see Comment 2). Allow relative humidity and VSUT temperature to stabilize. All paper, including ballots, used by the system must be stabilized at the specified testing temperature and humidity levels prior to testing (see Comment 4).
Step 9: Perform an operational status check. If the VSUT shows evidence of damage, or any examined function or feature is not working correctly, then record that the VSUT fails the Operating Temperature and Humidity test. End the test.
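
The excerpt above is a manual procedure, but the chamber-control portion of such steps is often scripted. A hedged Python sketch of Steps 8-9, assuming hypothetical chamber and VSUT control interfaces (no such API is defined in the test suite):

```python
# Hypothetical harness for Steps 8-9 of the operating temperature and
# humidity test. The chamber and vsut objects stand in for lab
# equipment control; their methods are assumptions, not suite APIs.

import time

class TestFailed(Exception):
    pass

def run_hot_humid_step(chamber, vsut):
    # Step 8: drive the chamber to the hot/humid corner of the
    # operating envelope (104 F, 85% RH) and wait for the VSUT and
    # its paper stock to stabilize at those conditions.
    chamber.ramp_to(temp_f=104.0, humidity_pct=85.0)  # assumed API
    while not chamber.is_stable():                     # assumed API
        time.sleep(60)

    # Step 9: operational status check; any evidence of damage or
    # malfunction fails the whole test immediately.
    status = vsut.operational_status_check()           # assumed API
    if not status.ok:
        raise TestFailed(
            "The VSUT fails the Operating Temperature and Humidity test: "
            + "; ".join(status.problems)
        )
```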

13 Security
Electronic records
Security specifications
VVPAT
New test method developed for the updated software setup validation requirement

14 Electronic and Paper Record Structure (EXAMPLE TEST CASE)
Covers Requirement 7.9.3c: Electronic ballot images shall be digitally signed by the voting system. The digital signature shall be generated using a NIST-approved digital signature algorithm with a security strength of at least 112 bits, implemented within a FIPS 140-2 validated cryptographic module operating in FIPS mode.
Discussion: "NIST approved" means an algorithm or technique that meets at least one of the following: 1) is specified in a FIPS or NIST Recommendation, 2) is adopted in a FIPS or NIST Recommendation, or 3) is specified in a list of NIST-approved security functions (e.g., specified as approved in the annexes of FIPS 140-2/3). The security strengths of cryptographic algorithms can be found in NIST Special Publication 800-57, Recommendation for Key Management - Part 1: General.

15 Procedure (EXAMPLE TEST CASE)
Step 1: Obtain five electronic ballot images from the VSUT.
Step 2: Verify the digital signature on each of the ballot images individually.
Step 3: If any of the digital signature verifications fails, record "The VSUT fails the Cryptographic Protection of Records test." End the test.
Step 4: Execute the Sections 6.1.3 and 6.2.3 cryptographic tests for the digital signature cryptographic module used to sign the electronic ballot images.
Step 5: If any one of the above tests fails, record "The VSUT fails the Cryptographic Protection of Records test." End the test.
Step 6: Record "The VSUT passes the Cryptographic Protection of Records test." End the test.
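
For illustration, Steps 1-3 can be expressed with an off-the-shelf cryptographic library. The sketch below assumes ECDSA over P-256 with SHA-256 (security strength 128 bits, which meets the 112-bit minimum) and a detached-signature file layout; both are assumptions. A real conformance test must also confirm the module is FIPS 140-2 validated and operating in FIPS mode, which this sketch does not check.

```python
# Minimal sketch of Steps 1-3: verifying the digital signatures on
# electronic ballot images, using the Python "cryptography" package.
# The signing algorithm (ECDSA P-256 / SHA-256) and the file layout
# (image bytes plus a detached .sig file) are assumptions.

from pathlib import Path
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.serialization import load_pem_public_key

def verify_ballot_images(image_paths: list[Path], pubkey_pem: bytes) -> bool:
    """Return True iff every ballot image has a valid detached signature."""
    public_key = load_pem_public_key(pubkey_pem)
    for image in image_paths:
        data = image.read_bytes()
        signature = image.with_suffix(".sig").read_bytes()  # hypothetical layout
        try:
            # Step 2: verify each image's signature individually.
            public_key.verify(signature, data, ec.ECDSA(hashes.SHA256()))
        except InvalidSignature:
            # Step 3: any failure fails the test and ends it.
            print("The VSUT fails the Cryptographic Protection of Records test.")
            return False
    return True
```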

16 Core Functionality
Votetest: basic, essential voting system logic
Ability to define elections
Capture, count, and report votes
Voting variations
92 tests formalized as SQL scripts
Tests are intentionally simple (89 use about 10 ballots, 3 use 100 ballots), but they exercise the complete elections and voting process.
A volume test (mock election) is a significant test of all supported functions together.
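
The suite formalizes these tests as SQL scripts; the Python sketch below merely illustrates the kind of logic they exercise: tallying votes, undervotes, and overvotes for one "vote for at most 1" contest over a ten-ballot test deck. The counting conventions shown are common ones, not quotations from the suite.

```python
# Illustrative sketch (not the suite's SQL) of single-contest tallying.

from collections import Counter

def tally(ballots: list[list[str]], candidates: list[str], vote_for: int = 1):
    votes = Counter({c: 0 for c in candidates})
    undervotes = overvotes = 0
    for selections in ballots:
        if len(selections) > vote_for:
            # Common convention: an overvoted contest yields no countable
            # votes, and each lost vote allowance counts as an overvote.
            overvotes += vote_for
        else:
            undervotes += vote_for - len(selections)
            for c in selections:
                votes[c] += 1
    return votes, undervotes, overvotes

# Ten-ballot example deck for a "vote for at most 1" contest.
ballots = [["Alice"], ["Bob"], [], ["Alice", "Bob"], ["Alice"]] + [[]] * 5
votes, under, over = tally(ballots, ["Alice", "Bob"])
print(dict(votes), under, over)  # {'Alice': 2, 'Bob': 1} 6 1
```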

17 Printed Report (EXAMPLE TEST CASE)
Counted report of contest, including votes, undervotes, and overvotes.
Covers Requirement I.2.4.3.d: All systems shall provide capabilities to produce a consolidated printed report of the results for each contest of all votes cast (including the count of ballots from other sources supported by the system as specified by the manufacturer) that includes the votes cast for each selection, the count of undervotes, and the count of overvotes.

18 Printed Report, cont. (EXAMPLE TEST CASE)
Counted report of contest, including votes, undervotes, and overvotes.
General Procedure:
1. Establish initial state (clean out data from previous tests, verify resident software/firmware);
2. Program election and prepare ballots and/or ballot styles;
3. Generate pre-election audit reports;
4. Configure voting devices;
5. Run system readiness tests;
6. Generate system readiness audit reports;
7. Precinct count only: open poll, run precinct count test ballots, and close poll;
8. Run central count test ballots (central count / absentee ballots only);
9. Generate in-process audit reports;
10. Generate data reports for the specified reporting contexts;
11. Inspect ballot counters; and
12. Inspect reports.

19 Printed Report, cont. (EXAMPLE TEST CASE)
Test method includes 38 steps. Excerpt:
Step 26: Compute the absolute value of the difference between the reported number of votes for "Car Tay Fower" in the "President, vote for at most 1" contest in Precinct 1 and the value 4, and add it to the Report Error. If the needed value does not appear in the report, increment the Report Error by one (1). …
Step 34: For each spurious ballot count or vote total reported by the VSUT (e.g., ascribing votes to a candidate that did not run in a particular contest or reporting one or more overvotes on a VSUT that prevents overvoting), increase the Report Error by one (1).
Step 35: Record the Report Error.
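
The Report Error bookkeeping follows a simple pattern: absolute differences against expected totals, plus one for each missing or spurious entry. A minimal sketch, with the report modeled as a hypothetical mapping from (contest, precinct, candidate) to a reported total:

```python
# Hedged sketch of the Report Error pattern in Steps 26, 34, and 35.
# The dict-based report model is illustrative, not from the suite.

def score_report(reported: dict, expected: dict) -> int:
    report_error = 0
    # Step 26 pattern: per expected total, add |reported - expected|;
    # a value missing from the report adds one (1) instead.
    for key, expected_total in expected.items():
        if key in reported:
            report_error += abs(reported[key] - expected_total)
        else:
            report_error += 1
    # Step 34 pattern: each spurious entry (a total that should not
    # appear in the report at all) adds one (1).
    for key in reported:
        if key not in expected:
            report_error += 1
    # Step 35: record (here, return) the Report Error.
    return report_error

# Example: 4 votes expected for "Car Tay Fower" in Precinct 1, 3 reported.
contest = ("President, vote for at most 1", "Precinct 1", "Car Tay Fower")
assert score_report({contest: 3}, {contest: 4}) == 1
```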

20 Reliability, Accuracy, Misfeed Rate
Improved test method replaces material that was historically included in the VSS/VVSG (hence, included in drafts).
Now evaluated using data collected during all tests, rather than a single, isolated test.
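
The pooled evaluation amounts to treating every executed test as an observation of (volume, errors) and estimating each rate from the combined totals. A toy sketch with illustrative numbers (the record format and figures are not from VVSG 1.1):

```python
# Sketch of pooling evidence across all tests instead of relying on
# a single isolated benchmark run. Inputs are illustrative.

def pooled_rate(observations: list[tuple[int, int]]) -> float:
    """observations: (volume, errors) pairs from each executed test."""
    total_volume = sum(v for v, _ in observations)
    total_errors = sum(e for _, e in observations)
    return total_errors / total_volume if total_volume else 0.0

# e.g., ballot positions read per test vs. read errors observed
accuracy_obs = [(10_000, 0), (2_500, 1), (100_000, 2)]
print(f"pooled error rate: {pooled_rate(accuracy_obs):.2e}")
```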

21 Test Validation
Rigorous traceability to VVSG requirements.
Reviewed by independent parties: VSTLs, experts, the public.
EAC: updated to consistent nomenclature and traceability.
Procedural validation (ongoing): operating temperature and humidity is complete; other VVSG 1.1 test suite components are under consideration and will be conducted in 2011.

22 Next Steps
Continue procedural validation.
Round-trip with testing laboratories to discuss methods for integrating the test methods into their workflow; success here will pave the way for the rest of the VVSG 2.0 test suites.
Continue to work with all parties to improve the VVSG, manufacturer implementations, testing practices, and the test suites.

23 Discussion

