Generalized Symbolic Execution for Model Checking and Testing

Our novel symbolic execution framework:
- extends model checking to programs that have complex inputs with unbounded (very large) data
- automates test input generation

Framework:
- inputs: a program and a correctness specification
- program instrumentation produces an instrumented program
- model checking (heap + constraint + thread scheduling) explores the instrumented program, calling decision procedures to decide whether to continue or backtrack
- output: counterexample(s) / test suite
- modular architecture: can use different model checkers / decision procedures

Motivation: future mission software
- concurrent
- manipulates complex, dynamically allocated data structures (e.g., lists or trees)
- highly interactive, with complex inputs and a large environment
- should be extremely reliable

Example: Rover Executive
- executes actions based on an input plan and the environment / rover status
- complex input structure, large environment data
- concurrency, dynamic data (lists, trees)

void Executive::startExecutive() { runThreads(); ... }
void Executive::executePlan(...) { while (!empty) executeCurrentPlanNode(); }

Current practice in checking complex software:
- testing:
  - requires manual input
  - typically done for a few nominal input cases
  - not good at finding concurrency bugs
  - not good at dealing with complex data structures
- model checking:
  - automatic, good at finding concurrency bugs
  - not good at dealing with complex data structures
  - feasible only with a small environment and a small set of input values

Code example:

class Node {
  int elem;
  Node next;
  Node deleteFirst() {
    if (elem < 10) return next;
    else if (elem < 0) assert(false);
    ...
  }
}

Analysis of deleteFirst with our framework:
- "simulate" the code using symbolic values instead of program data (a symbolic value = "unknown yet"); enumerate the input structures lazily
- precondition: acyclic list
- structural constraints, handled by lazy initialization: enumerate all structures (e.g., the one-node list [e0] with next = null, the two-node list [e0, e1], ...)
- numeric constraints, handled by decision procedures: on the true branch the path condition is e0 < 10; on the false branch it is e0 >= 10, and taking the inner branch would require e0 >= 10 /\ e0 < 0, which is infeasible (FALSE), so that path is pruned
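The deleteFirst analysis hinges on the decision procedure pruning the infeasible path constraint e0 >= 10 /\ e0 < 0. Below is a minimal, hypothetical sketch (the class and method names are invented for illustration, not part of the framework) that checks the two path constraints by brute-force search over a small range where a real tool would query a decision procedure:

```java
// Hypothetical sketch: enumerating the path constraints that symbolic
// execution of deleteFirst() produces for a symbolic element e0, and
// checking their feasibility. Not the actual framework implementation.
public class DeleteFirstPaths {
    // Path 1: elem < 10, return next            -> constraint: e0 < 10
    // Path 2: elem >= 10, then the elem < 0 test -> constraint: e0 >= 10 && e0 < 0
    static boolean feasibleLt10(int e0)       { return e0 < 10; }
    static boolean feasibleGe10AndLt0(int e0) { return e0 >= 10 && e0 < 0; }

    public static void main(String[] args) {
        boolean path1 = false, path2 = false;
        // A real framework asks a decision procedure; we scan a range instead.
        for (int e0 = -100; e0 <= 100; e0++) {
            path1 |= feasibleLt10(e0);
            path2 |= feasibleGe10AndLt0(e0);
        }
        System.out.println("e0 < 10 feasible: " + path1);            // true
        System.out.println("e0 >= 10 && e0 < 0 feasible: " + path2); // false
    }
}
```

Because the second constraint is unsatisfiable, the assert(false) site is unreachable along that path, and the model checker backtracks instead of reporting a spurious error.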

Explanation of Accomplishment

POC: Corina Pasareanu (Kestrel Technology LLC) and Willem Visser (RIACS/USRA). Joint work with Sarfraz Khurshid (MIT).

Background: Future mission software systems, which will be concurrent and will manipulate complex, dynamically allocated data structures, must be extremely reliable. Such systems are known to be hard to test, and even model checking is inadequate for discovering errors in them because of their large and complex input domains.

Accomplishment: A novel symbolic execution framework was developed, and an algorithm was implemented that enables model checking of programs that have complex inputs with unbounded (very large) data. The algorithm also automates test input generation. The symbolic execution algorithm has been implemented in the Java PathFinder model checker toolset and has been used to generate test inputs for code coverage (specifically, condition coverage) of an Altitude Switch used in flight control software.

Future Plans: We intend to apply our framework to the analysis of other complex systems, including the Mars K9 Executive prototype, and to investigate the application of abstraction techniques in the context of our framework.

Overview of Algorithm: We provide a two-fold generalization of traditional symbolic execution methods. First, we define a source-to-source translation that instruments a program, enabling standard model checkers to perform symbolic execution of it. The instrumentation enables a model checker to automatically explore different configurations of the input data structures and to manipulate logical formulae over the program's numeric values (using a decision procedure). Second, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), primitive data (e.g., integers and strings), and concurrency.
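The source-to-source instrumentation idea can be illustrated with a small hypothetical sketch: a SymInt stand-in (an invented name, not the actual instrumentation API) records each branch's constraint in a growing path condition, while the model checker's search chooses which branch to explore:

```java
// Hypothetical sketch of instrumented symbolic comparison. In the real
// framework the instrumentation is generated by a source-to-source
// translation and constraints are checked by a decision procedure.
import java.util.ArrayList;
import java.util.List;

public class SymSketch {
    static List<String> pathCondition = new ArrayList<>();

    // Stand-in for a symbolic integer; a real tool keeps a symbolic term.
    static class SymInt {
        final String name;
        SymInt(String name) { this.name = name; }
        // Instrumented comparison: record the constraint for the branch
        // being explored, then return that branch's outcome. The 'take'
        // flag models the model checker's nondeterministic choice.
        boolean lt(int c, boolean take) {
            pathCondition.add(take ? name + " < " + c : name + " >= " + c);
            return take;
        }
    }

    public static void main(String[] args) {
        SymInt e0 = new SymInt("e0");
        // Explore the path: elem >= 10, then elem < 0 (the infeasible one).
        if (!e0.lt(10, false) && e0.lt(0, true)) { /* assert(false) site */ }
        System.out.println(pathCondition); // [e0 >= 10, e0 < 0]
    }
}
```

After each branch, the real framework hands the accumulated path condition to a decision procedure and backtracks as soon as it becomes unsatisfiable, as it does here.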
To symbolically execute a program, the algorithm uses lazy initialization: it initializes the components of the program's inputs on an "as-needed" basis, without requiring an a priori bound on input sizes. The algorithm uses preconditions to initialize inputs only with valid values.
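As a rough illustration of what lazy initialization explores (the names and the bounded enumeration below are hypothetical, not the actual algorithm), each read of an uninitialized symbolic "next" field branches over the possibilities — null or a fresh node — so the acyclic-list shapes up to a given depth are enumerated on demand:

```java
// Hypothetical sketch: the list shapes that lazy initialization of an
// acyclic-list input would materialize up to a small depth. A real tool
// creates each possibility on demand during model checking, with no
// a priori depth bound.
import java.util.ArrayList;
import java.util.List;

public class LazyInit {
    // Enumerate the shapes deleteFirst() can see, up to 'depth' nodes.
    static List<String> shapes(int depth) {
        List<String> out = new ArrayList<>();
        enumerate("head", depth, out);
        return out;
    }

    static void enumerate(String prefix, int depth, List<String> out) {
        out.add(prefix + " -> null");                        // next is null
        if (depth > 0) enumerate(prefix + " -> node", depth - 1, out); // fresh node
    }

    public static void main(String[] args) {
        for (String s : shapes(2)) System.out.println(s);
        // head -> null
        // head -> node -> null
        // head -> node -> node -> null
    }
}
```

The acyclic-list precondition is reflected here by never aliasing "next" back to an earlier node; in general, lazy initialization also branches over previously created nodes when the precondition permits cycles or sharing.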