TGDC Meeting, July 2011 Update on the UOCAVA Working Group Andrew Regenscheid Mathematician, Computer Security Division, ITL

Presentation transcript:
Page 2: Overview
The TGDC UOCAVA working group has three outstanding task items:
- High-level guidelines for UOCAVA voting systems
- Narrative risk analysis of the current UOCAVA voting process and the demonstration project system
- Low-level guidelines for the demonstration project system

Page 3: Meeting Objectives
This session of today's meeting has three objectives:
- Decide how to proceed on the high-level guidelines, including decisions on:
  - Intended scope and purpose
  - Auditability/verifiability guidelines
  - Usability/accessibility guidelines
  - Resolution of FVAP's comments
- Decide on a course of action for conducting a risk analysis of the current UOCAVA voting process
- Discuss the process and timeline for approaching the demonstration project guidelines

Page 4: High-Level Guidelines
From the EAC/NIST/FVAP UOCAVA Roadmap:
"EAC and the TGDC, with technical support from NIST, and input from FVAP, will identify high-level, non-testable guidelines for remote electronic absentee voting systems. This effort will focus on the desirable characteristics of such systems and serve as a needs analysis for future pilots and research; and for the purposes of driving industry to implement solutions."

Page 5: High-Level Guidelines
Purpose:
- Fulfill the charge from the UOCAVA Roadmap
- We interpreted the Roadmap language as asking for aspirational, high-level guidelines intended to identify goals for future UOCAVA voting systems
- The intent is that these high-level guidelines will form the basis for developing low-level guidelines for the demonstration project and future UOCAVA voting systems
Scope:
- Includes both demonstration project systems and future systems
- The guidelines are intended to be all-encompassing, covering roughly the same scope as future low-level guidelines

Page 6: High-Level Guidelines
- The goal was to identify a small number (~25) of high-level guidelines covering all important topics
- Build consensus around high-level concepts, and flesh out the details in low-level guidelines in the future
- Emphasis on aspirational goals; we recognized some may not be achievable today

Page 7: High-Level Guidelines: Topics
The current high-level guidelines draft includes:
- Voting functions
- Auditability
- Quality assurance and configuration management
- Reliability and availability
- Usability and accessibility
- Security
- Interoperability

Page 8: High-Level Guidelines: Process
NIST staff initially drafted high-level guidelines in sections using:
- Earlier drafts of high-level guidelines
- The Council of Europe's Legal, Operational and Technical Standards for E-Voting
- Research done to support VVSG development
- Existing relevant standards
UOCAVA and U&A working group members then reviewed and edited the guidelines. Properties of the current UOCAVA voting system were taken into consideration, but did not limit the guidelines.

Page 9: Voting Functions
Primary, basic guidelines expected of any voting system, e.g.:
- One cast ballot counted per voter (hlg-2, 3)
- Accurate and reproducible vote counts (hlg-4)
- Supply voters with the correct ballot style (hlg-5)
Some were derived from the CoE E-Voting standard.

Page 10: Auditability
Primary guideline:
"The UOCAVA voting system shall create and preserve evidence to enable auditors to verify that it has operated correctly in an election, and to identify the cause if it has not."
Two controversial proposed guidelines:
- "The audit system shall provide the ability to compare records and verify the correct operation of the UOCAVA voting system and the accuracy of the result, in an effort to detect fraud, to prove that all counted votes are authentic and that all authentic votes have been counted as cast."
- "The UOCAVA voting system shall make it possible for voters to check whether their vote was cast and recorded as they intended, and shall make it possible for observers to check whether all cast votes have been counted and tallied correctly."
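One common mechanism behind the second controversial guideline (voters checking that their vote was recorded as intended) is a cast-as-intended receipt: the system publishes a salted commitment to each recorded ballot, and the voter later confirms that commitment appears on a public bulletin board. The sketch below is purely illustrative and not any system proposed in these slides; real end-to-end verifiable schemes are considerably more involved:

```python
import hashlib
import secrets

def issue_receipt(ballot: bytes) -> tuple[str, bytes]:
    """Commit to the recorded ballot with a salted hash; the voter keeps the salt."""
    salt = secrets.token_bytes(16)
    receipt = hashlib.sha256(salt + ballot).hexdigest()
    return receipt, salt

def voter_check(ballot: bytes, salt: bytes, receipt: str,
                bulletin_board: set[str]) -> bool:
    """Voter recomputes the commitment and confirms it was published."""
    recomputed = hashlib.sha256(salt + ballot).hexdigest()
    return recomputed == receipt and receipt in bulletin_board
```

Note the tension the slide alludes to: giving the voter a checkable receipt must not let a third party learn how the vote was cast, which is why the salt stays with the voter.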

Page 11: Quality Assurance and Configuration Management
- The system must be "fit for use"
- The system must be developed, monitored, and maintained in accordance with applicable best practices for quality assurance
- Documented, tested, and stable configuration
Guidelines are based on research done to support the VVSG 2.0 draft.

Page 12: Reliability and Availability
Definition of a critical failure: any functional failure whose occurrence jeopardizes the validity of the election or casts doubt on the credibility of the election result.
- The probability of critical failures and the overall system availability must be fit for the intended use (hlg-1, 3)
- Assure system reliability through the application of best reliability engineering practices and standard reliability analysis procedures
Based on CoE guidelines and research supporting VVSG 2.0.
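For context on the availability guideline: standard reliability analysis conventionally estimates steady-state availability from mean time between failures (MTBF) and mean time to repair (MTTR). A quick sketch, with figures invented purely for illustration (the slides set no numeric target):

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: the long-run fraction of time the
    system is operational, MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical figures: a system that fails on average every 2,000 hours
# and is restored in 4 hours is available about 99.8% of the time.
print(f"{availability(2000, 4):.4f}")
```

Whether ~99.8% is "fit for intended use" depends on when the outage occurs; an hour of downtime on election day matters far more than a day of downtime months earlier, which is why the guideline is stated qualitatively.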

Page 13: Security
Security guidelines were developed accepting the risks of the current mail system, e.g.:
- Low-level compromise of ballot secrecy is accepted (hlg-2)
- Some low-level fraud is accepted; the goal is to prevent an undetectable change in the outcome of the election (hlg-3)
Some issues are unique to electronic systems:
- Strong user authentication for voters, administrators, and officials (hlg-1)
- Systems must be free of vulnerabilities that allow remote attacks (hlg-4)
- Prevent malicious software on terminals from impacting election integrity (hlg-5)
- Recommended use of penetration testing (hlg-6)

Page 14: User-Centered Development (hlg-1)
- Develop with best practices in user-centered design and user testing
- Incorporate these principles throughout the system development cycle and as part of certification
- Evaluate system usability and accessibility via user testing with representative test participants
- Include usability evaluation of procedures and documentation for system administration

Page 15: Accessibility (hlg-2, 5, 7)
- Make the system accessible to voters with disabilities:
  - Built-in access features
  - Interoperability with personal assistive technology (PAT)
  - PAT as supplemental, rather than necessary, to ensure system accessibility
- Maintain privacy and independence throughout all phases of the voting process:
  - Ballot marking, verification, and casting
  - The voter has the same accessibility accommodations throughout
- Comply with legal mandates

Page 16: Best Design Practices (hlg-3, 4, 6)
Follow human factors design best practices for both system and ballot design where possible:
- The EAC report "Effective Designs for the Administration of Federal Elections"
- The American Institute of Graphic Arts (AIGA) report "Top 10 Election Design Guidelines"
Adhere to current standards and guidelines:
- VVSG
- The World Wide Web Consortium (W3C) Web Accessibility Initiative (WAI), specifically the Web Content Accessibility Guidelines (WCAG 2.0) and Accessible Rich Internet Applications (WAI-ARIA)

Page 17: More on Ballot Design
FVAP expressed some concern over including ballot design in the high-level guidelines. To clarify:
- The high-level guidelines are not intended to supersede State laws
- Election officials control the formatting of ballot content
- The high-level guidelines address only those ballot design features controlled by the UOCAVA system, for example, navigation and user interface controls
- The UOCAVA system should support the implementation of good ballot design

Page 18: More on Accessibility
FVAP requested that the high-level guidelines focus on the demonstration project, which would limit the scope of accessibility:
- FVAP suggested that only Section 508 be referenced
The implications of this are unclear:
- Section 508 does require accessible design and some PAT interoperability
- A Section 508 "refresh" is on the horizon
- How much of the W3C WAI guidelines should be implemented in the demonstration project?
- Will we learn enough about accessibility from the demonstration project to inform future work?

Page 19: Discussion/Questions
Open issues:
- Intended scope and purpose
- Auditability/verifiability guidelines
- Usability/accessibility guidelines
- Resolution of FVAP's comments
Next topic: Risk Analysis

Page 20: Risk Analysis
TGDC Resolution #02-11 directs the UOCAVA Working Group to "prepare a narrative risk assessment comparing the current UOCAVA voting process to electronic absentee voting systems used in a demonstration project with military voters."
- Currently, the demonstration project system is not defined
- First step: analyzing risks in the current UOCAVA voting process

Page 21: Risk Analysis: Transactional Failures
The current UOCAVA process has a number of transactional failure points between voter registration and ballot canvassing:
- Voter registration failures
- Ballot delivery failures
- Ballot marking errors
- Ballot return failures
These failures are observable and measurable; analyzing them can yield an overall failure rate for the current process.
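To illustrate how per-stage failure rates combine into the overall rate mentioned above: a ballot succeeds only if every stage succeeds, so (assuming independent stages) the overall failure rate is one minus the product of the per-stage success rates. The rates below are invented for illustration, not measured UOCAVA data:

```python
def overall_failure_rate(stage_failure_rates: list[float]) -> float:
    """A transaction fails if any stage fails; stages are assumed independent."""
    success = 1.0
    for rate in stage_failure_rates:
        success *= (1.0 - rate)
    return 1.0 - success

# Hypothetical per-stage rates: registration, delivery, marking, return.
# Even modest per-stage rates compound to a double-digit overall rate.
stages = [0.02, 0.05, 0.01, 0.04]
print(f"{overall_failure_rate(stages):.3f}")  # about 11.5% overall
```

The independence assumption is itself a modeling choice; correlated failures (e.g., a mail disruption affecting both delivery and return) would change the arithmetic, which is one reason the slides call for measurement rather than estimation.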

Page 22: Risk Analysis: Identifying Risks
Transactional failures are only one type of risk. The UOCAVA working group can analyze one or more representative current UOCAVA voting processes to identify other potential risks:
- What is the potential vulnerability?
- Who is in a position to exploit it?
- What is the impact of a successful exploit?
- What is the probability of a successful exploit?
Challenge #1: Impacts are not always easily quantifiable in comparable units. What is the value of a vote?
Challenge #2: Probabilities for malicious attacks are notoriously difficult to estimate.
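The four questions above map onto the classic expected-loss formulation, risk = probability × impact. A minimal sketch of that bookkeeping (the entries and numbers are hypothetical, and, as the two challenges note, real probabilities and impacts for malicious attacks resist precise estimation):

```python
from dataclasses import dataclass

@dataclass
class Risk:
    vulnerability: str   # what could go wrong
    actor: str           # who is in a position to exploit it
    probability: float   # estimated chance of a successful exploit (0-1)
    impact: float        # estimated harm, in some common unit

    @property
    def expected_loss(self) -> float:
        return self.probability * self.impact

# Hypothetical entries, for illustration only.
risks = [
    Risk("ballot lost in transit", "postal system", 0.04, 1.0),
    Risk("ballot tampered in transit", "insider", 0.001, 10.0),
]
for r in sorted(risks, key=lambda r: r.expected_loss, reverse=True):
    print(r.vulnerability, round(r.expected_loss, 4))
```

The point of Challenge #1 shows up immediately in this sketch: the ranking is only meaningful if "impact" is measured in a single comparable unit, which is exactly what is hard to define for votes.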

Page 23: Risk Analysis: Comparing Risks
It will be important to compare and balance risks between different types of systems, as well as among different types of risks within a given system.
- We can create quantifiable comparisons of impact
  - Example: comparing the impact of lost ballots and tampered ballots on the outcome of the election
  - Collaboration with the NIST Statistical Engineering Division
  - Explore use of the EAC Election Operations Assessment Tool
- Qualitative comparisons will be made in other areas, such as risks from malicious attacks
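One way to make the lost-vs-tampered example concrete: a lost ballot removes at most one vote from the leader's margin, while a ballot flipped to the opponent swings the margin by two (one vote lost, one gained). A toy sketch with invented numbers, not a substitute for the statistical work mentioned above:

```python
def margin_after(margin: int, lost_for_leader: int,
                 flipped_from_leader: int) -> int:
    """Effect on the leading candidate's margin: a lost ballot costs 1 vote,
    a ballot flipped to the opponent costs 2."""
    return margin - lost_for_leader - 2 * flipped_from_leader

# Invented numbers: a 100-vote margin survives 50 lost ballots,
# but 60 tampered (flipped) ballots would reverse the outcome.
print(margin_after(100, 50, 0))   # lost ballots only
print(margin_after(100, 0, 60))   # tampered ballots only
```

This asymmetry is one reason tampering is usually weighted more heavily than loss when comparing impacts, even before considering detectability.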

Page 24: Discussion/Questions
- Feedback on the risk analysis path forward
Next topic: Demonstration Project Guidelines

Page 25: UOCAVA Demonstration Project
- Work is building toward the implementation of a remote voting demonstration project for military voters
- The EAC has tasked the TGDC with developing guidelines for the demonstration project system
- TGDC Resolution #02-11 stated the TGDC's acceptance of this task and directed the TGDC to develop guidelines for a demonstration project under simplifying assumptions:
  - Military voters only
  - Use of the Common Access Card (CAC) for authentication
  - Use of professionally administered machines

Page 26: (image-only slide)

Page 27: Mitigated Risks
The simplifying assumptions mitigate some risks identified in NISTIR 7551, A Threat Analysis on UOCAVA Voting Systems:
- Use of the CAC mitigates authentication-related risks, including voter impersonation and phishing attacks
- Digitally signed ballots using the CAC could mitigate some malicious attacks on servers
- Use of professionally administered machines mitigates the risk of malicious software on voting terminals impacting ballot secrecy or integrity
- Use of a military network could help mitigate some remote attacks on servers
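The "digitally signed ballots" point works because any post-signing modification of the ballot invalidates the signature the server checks. A real CAC signs with a private key on the card and the server verifies against the voter's certificate; since Python's standard library has no public-key signing, the sketch below uses an HMAC as a stand-in purely to show the sign-then-verify shape, and every name in it is hypothetical:

```python
import hashlib
import hmac

# Stand-in for the voter's CAC credential. A real CAC keeps the private
# key on the card; the server would verify with the matching certificate.
VOTER_KEY = b"hypothetical-per-voter-secret"

def sign_ballot(ballot: bytes, key: bytes = VOTER_KEY) -> bytes:
    """Produce an integrity tag over the ballot contents."""
    return hmac.new(key, ballot, hashlib.sha256).digest()

def server_verify(ballot: bytes, tag: bytes, key: bytes = VOTER_KEY) -> bool:
    """Any tampering with the ballot after signing invalidates the tag."""
    expected = hmac.new(key, ballot, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

This is why a signed ballot limits what a compromised server can do: it can drop ballots, but it cannot silently alter their contents without detection.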

Page 28: Other Risks
Other risks may need to be mitigated by other means, pending the results of the risk analysis:
- Network-based attacks may not be mitigated by the architecture
- Internet voting systems inherit many of the same potential risks as electronic polling place systems

Page 29: Demonstration Project Prerequisites
Several items need to be completed prior to development of the demonstration project guidelines. The TGDC is tasked with the high-level guidelines and risk analysis. The TGDC and NIST also need:
- A concept of operations for the demonstration system
- The expected high-level system architecture
- A clearly defined scope for the demonstration project system:
  - How extensive will this project be? One time only?
  - What functions must be provided?
  - Who decides the appropriate tradeoffs and accepts the risks?

Page 30: Demonstration Project: Timeline
- Current work: complete the near-term deliverables (i.e., high-level guidelines and risk analysis) intended to inform low-level guidelines development
- The demonstration project guidelines are expected to take 24 months to develop, vet through a public comment period, and approve in the TGDC and EAC:
  - 12-month development process
  - 6-month vetting process
  - 6-month revision process
- For a 2016 demonstration project, the guidelines would be needed by mid-2014

Page 31: Discussion/Questions