TGDC Meeting, July 2011 Voting System Software Assurance: SAMATE Automated Source Code Conformance Verification Study Michael Kass Computer Scientist, Software and Systems Division, ITL


TGDC Meeting, July 2011 Voting System Software Assurance: SAMATE Automated Source Code Conformance Verification Study Michael Kass Computer Scientist, Software and Systems Division, ITL

TGDC Meeting, July 2011 Software Assurance
National Information Assurance Glossary definition: “The level of confidence that software is free from vulnerabilities, either intentionally designed into the software or accidentally inserted at any time during its lifecycle, and that the software functions in the intended manner.” [1]
[1] CNSS Instruction, “National Information Assurance Glossary”
Page 2

TGDC Meeting, July 2011 Assurance Through Source Code Analysis
Static source code analysis can identify weaknesses and vulnerabilities in source code that could compromise a voting system’s security, availability, integrity, and privacy
Many open source and commercial tools are available
Used during both development and assessment
Page 3

TGDC Meeting, July 2011 The Problem
Today, Voting System Test Laboratories (VSTLs) are not leveraging automated tools to verify voting system source code against the Voluntary Voting System Guidelines (VVSG) “Software Design and Coding Standards.”
Human analysis alone can result in:
Discrepancies in VSTL assessment repeatability and accuracy
Increased test lab assessment time and cost
Page 4

TGDC Meeting, July 2011 A Solution
The NIST Software Assurance Metrics And Tool Evaluation (SAMATE) team can assist the VSTLs in automating source code conformance verification by:
Customizing freely available tools that verify source code conformance to VVSG coding requirements
Verifying tool effectiveness through testing
Page 5

TGDC Meeting, July 2011 Current Assurance Work
The SAMATE project is:
Automating verification of source code conformance to 2005 VVSG software requirements
Developing an assurance case for open-ended vulnerability testing of voting systems
Page 6

TGDC Meeting, July 2011 SAMATE Background A U.S. Department of Homeland Security (DHS) and NIST co-sponsored effort to measure the effectiveness of software assurance tools (specifically source code analyzers) Through testing against a corpus of source code examples (large and small) with known software weaknesses and vulnerabilities Page 7

TGDC Meeting, July 2011 SAMATE Reference Dataset (SRD) Over 70,000 online source code analysis tool effectiveness tests Across 125 known software weaknesses Indexed against the Common Weakness Enumeration (CWE) An online dictionary of weaknesses in software Page 8

TGDC Meeting, July 2011 SAMATE Tool Test Example

#include <stdio.h>
#include <dlfcn.h>

int main(int argc, char **argv)
{
    void *linuxHandle;
    linuxHandle = dlopen("libm.so", RTLD_LAZY); /* bad: dynamically loaded code */
    if (!linuxHandle) {
        fprintf(stderr, "%s\n", dlerror());
        return (1);
    }
    return (0);
}

Page 9

TGDC Meeting, July 2011 “Bootstrapping” an Automated Source Code Verification Capability for VSTLs SAMATE customized two open source tools against VVSG 2005, Volume 1:5.2 “Software Design and Coding Standards” and verified tool effectiveness through testing Page 10

TGDC Meeting, July 2011 VVSG Tool Customization and Testing Specifics
SAMATE identified 49 software design and coding requirements in VVSG 2005 covering:
Software integrity
Software modularity and programming
Control constructs
Naming conventions
Comment conventions
Page 11

TGDC Meeting, July 2011 VVSG-Customization of Tools SAMATE evaluated current capabilities of freely available source code analysis tools for potential use in VVSG source code conformance verification against those 49 VVSG requirements Page 12

TGDC Meeting, July 2011 Tool Selection Criteria
SAMATE specifically looks for tools that:
Are freely available
Are extensible, allowing tool customization for VVSG-specific requirements
Have a relaxed licensing agreement (for re-distribution of SAMATE customization)
Provide an Abstract Syntax Tree (AST) traversal mechanism
Page 13

TGDC Meeting, July 2011 Initial Tool Selection PMD (a Java code analysis tool) Open source tool with BSD license Runs in Windows/Linux environments Focuses on finding “bugs” in source code Page 14

TGDC Meeting, July 2011 Initial Tool Selection Compass/ROSE (a C/C++ code analysis tool) A research project of Lawrence Livermore National Laboratory (LLNL) An open source compiler infrastructure to build source-to-source program transformation and analysis tools for Fortran, C/C++ and other languages Page 15

TGDC Meeting, July 2011 Methodology for Automation
Classify VVSG requirements for automation potential (complete, partial, or none)
Build generic tool search rules for the requirements
Create example test code to verify that the tools function correctly by “seeding” the code with non-conformant constructs
Run the tools against the example test code to verify tool correctness
Document why each requirement could (or could not) be automated
Page 16

TGDC Meeting, July 2011 Requirement Classification
Requirements were grouped into 6 types:
Completely automatable
Completely automatable, but requires customization to the voting system-specific coding style or API
Automatable, but not with the tools used in this study
Partially automatable; the tool could “point” to potential non-conformance, but human analysis is required for verification
Not automatable
Requirement not applicable to a particular language
Page 17

TGDC Meeting, July 2011 Tool Effectiveness Tests
SAMATE wrote source code examples in C and Java to verify tool correctness in reporting non-conformance
Each test consists of a small (30 lines or fewer) source code program containing non-VVSG-conformant code constructs that violate a VVSG code workmanship requirement
When scanned by the source code analysis tool, the test files elicit a report from the tool indicating the filename and line number where the non-conformant construct was found
Page 18

TGDC Meeting, July 2011 Tool Reports
Tools (run in command-line mode) report the file path and name, the line number of the non-conformance, and an error message:
tool_tests\java\AssertStatements.java:26 Assert statements should be absent from a production compilation
tool_tests\java\AssertStatements.java:37 Assert statements should be absent from a production compilation
tool_tests\java\DefaultCase.java:28 Switch statements should have a default label
tool_tests\java\DefaultCase.java:70 Switch statements should have a default label
tool_tests\java\DynamicallyLoadedCode.java:18 Dynamically loaded code is prohibited
tool_tests\java\ExceptionAsFlowControl.java:31 Avoid using exceptions as flow control.
tool_tests\java\ExceptionAsFlowControl.java:36 Avoid using exceptions as flow control.
Reports were verified against expected results
Page 19

TGDC Meeting, July 2011 “Sanity Checking”
Creating both the customized tools and the tool tests established a feedback loop for verifying our understanding of the semantics of each VVSG requirement
This feedback also exposed questions regarding the semantics of some requirements that require additional clarification by the EAC
Page 20

TGDC Meeting, July 2011 Results of our Study to Date
VVSG Software Design and Coding Standards requirement classification (counts shown as Java / C):
Completely automatable:
1) A generic search rule can be written to identify all instances of non-conformance in source code
2) A custom search rule specific to the coding style of each voting system can be written to identify all instances of non-conformance: 4 / 6
3) This tool cannot, but another tool could, verify this requirement: 8 / 10
Partially automatable (requires additional human analysis to verify):
4) A generic rule can “point to” a possible conformance violation, but human analysis is required to verify it: 3 / 3
Not automatable:
5) No tool can verify conformance to the requirement (requires 100% human analysis)
6) The requirement is not relevant to the programming language: 10 / 4

TGDC Meeting, July 2011 Other Possible Languages and Tools
Language: Tools
C/C++: Commercial tools
Java: Findbugs, Checkstyle
C#: StyleCop, FXCop, commercial tools
VB.NET: FXCop, commercial tools
COBOL: Commercial tools only
Page 22
Trade names and company products are mentioned or identified in this presentation. In no case does such identification imply recommendation or endorsement by the National Institute of Standards and Technology, nor does it imply that the products are necessarily the best available for the purpose.

TGDC Meeting, July 2011 Summary
The majority of VVSG 1.0 coding convention requirements can be fully or partially verified via tool automation
SAMATE created a “bootstrap” automated conformance verification capability (for C and Java) covering most VVSG 1.0 coding requirements
Other tools could “fill the gaps” for the remaining requirements
A few VVSG 1.0 requirements were not relevant to a given programming language
A few VVSG 1.0 requirements are simply not verifiable by automated tools
Page 23

TGDC Meeting, July 2011 Follow-On
The NIST SAMATE team met with the Wyle and SLI VSTLs in June 2011
The VSTLs were supportive of NIST providing guidance and automated tooling
The labs acknowledged that source code analysis is one of the most expensive and resource-intensive parts of their work
A beta demonstration of the customized tools raised questions regarding the semantics of VVSG coding requirements
The labs suggested a “roundtable” discussion between NIST, the labs, and voting system manufacturers regarding automated verification of VVSG coding requirements
Page 24

TGDC Meeting, July 2011 Discussion/Questions Page 25