Choosing SATE Test Cases Based on CVEs
Sue Wang, October 1, 2010
The SAMATE Project, SATE 2010 Workshop


1 Choosing SATE Test Cases Based on CVEs
Sue Wang, suewang_2000@yahoo.com
October 1, 2010
The SAMATE Project, http://samate.nist.gov/
SATE 2010 Workshop

2 Purpose and Motivation
Provide test cases with exploitable vulnerabilities
–In an ideal world, a tool detects significant bugs
Also provide a fixed version of each test case
–To confirm a low false positive rate
Mentioned by SATE organizers; detailed proposal by Paul Anderson (SATE 2009 workshop)
Brought up by tool makers and supported by users (SATE 2010 organization meeting)

3 Selection Criteria
Open source software in C/C++ and Java
–AND with known security-related bugs
–AND get older versions
–AND manually pinpoint the bugs
–AND find a fixed version
–AND compile the source code

4 Primary Sources
Brainstorm and exchange ideas within the SAMATE team and with others
Search for open source projects, for instance
–java-source.net
–sourceforge.net
–Other lists of scanned projects
Search for related vulnerabilities
–CVE – Common Vulnerabilities and Exposures (cve.mitre.org)
–NVD – National Vulnerability Database (nvd.nist.gov)
–CVE Details – enhanced CVE data (www.cvedetails.com)
–OSVDB – The Open Source Vulnerability Database (osvdb.org)

5 Selection Process
Identify potential SW -> List of Open Source SW
Identify CVEs for each SW -> List of Relevant CVEs
Collect factors for each CVE -> Analyzed CVEs
Determine multi-factor eligibility -> Selected Candidates
Find path & sink of CVE flaws -> SW with identified CVEs
Narrowed down to 12 open source software
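The staged process above can be sketched as a filtering pipeline. The Python below is a hypothetical illustration only; the predicate functions, field names, and sample data are invented for this sketch and are not the actual SAMATE tooling.

```python
# Hypothetical sketch of the staged selection pipeline; the checks and
# sample data are invented for illustration.

def has_relevant_cves(sw):
    """Stage 2 stand-in: keep software with at least one relevant CVE."""
    return bool(sw.get("cves"))

def cve_is_eligible(cve):
    """Stages 3-4 stand-in for the multi-factor eligibility checks."""
    return cve.get("sink_located", False)

def select_candidates(projects):
    """Run each project through the pipeline stages in order."""
    candidates = []
    for sw in projects:
        if not has_relevant_cves(sw):
            continue
        eligible = [c["id"] for c in sw["cves"] if cve_is_eligible(c)]
        if eligible:
            candidates.append((sw["name"], eligible))
    return candidates

projects = [
    {"name": "projA", "cves": [{"id": "CVE-2010-0001", "sink_located": True}]},
    {"name": "projB", "cves": []},
]
print(select_candidates(projects))  # [('projA', ['CVE-2010-0001'])]
```

Each stage discards candidates, which matches the slide's note that 12 projects survived out of a much larger initial list.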

6 Additional Selection Criteria (multi-factor eligibility)
–Quantity of CVEs
–Versions with and w/o flaws
–Versions have similar folder structures
–Resources to find CVE locations
–Versions available for Linux

7 Pinpointing the CVE Flaw
Goal: the path and sink for the CVE
–CVE's description and references
–Patch, bug tracking, & version control info
–Follow information across various resources
–diff based on path and file name
–Code reviews and analysis
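One way to realize the diff-based step: compare the vulnerable and fixed versions of the patched file and let the added lines point toward the sink. A minimal sketch using Python's standard difflib; the file contents here are invented, and in practice both versions would come from the project's version control.

```python
import difflib

# Invented contents of the same function in a vulnerable and a fixed
# version; real inputs would come from the project's repository.
vulnerable = [
    "int read_packet(char *buf, int len) {",
    "    memcpy(dst, buf, len);",
    "}",
]
fixed = [
    "int read_packet(char *buf, int len) {",
    "    if (len > MAX_LEN) return -1;",  # the patch adds a bounds check
    "    memcpy(dst, buf, len);",
    "}",
]

# Lines unique to the fixed version locate the patched code, which sits
# next to the vulnerable sink (the unchecked memcpy).
diff = list(difflib.unified_diff(vulnerable, fixed, lineterm=""))
added = [l[1:] for l in diff if l.startswith("+") and not l.startswith("+++")]
print(added)  # ['    if (len > MAX_LEN) return -1;']
```

The diff only narrows the search to the patch site; as the slide notes, confirming the actual path and sink still takes code review and analysis.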

8 Selected Test Cases

Test case        Language  Size      Files      CVEs evaluated  CVEs selected
Wireshark        C         1.6M LOC  3k+ files  100             17 (17%)
Google Chrome 5  C++       4.7M LOC  32k files  103             9 (8.7%)
Apache Tomcat 5  Java      174k LOC  2k+ files  91              29 (32%)
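The selection rates follow directly from the counts in the table; a quick arithmetic check (Tomcat's 32% is 31.9% rounded to the nearest percent):

```python
# Recompute the CVE selection rate quoted for each test case.
cases = {
    "Wireshark": (17, 100),
    "Google Chrome 5": (9, 103),
    "Apache Tomcat 5": (29, 91),
}
for name, (selected, evaluated) in cases.items():
    print(f"{name}: {100 * selected / evaluated:.1f}%")
# Wireshark: 17.0%
# Google Chrome 5: 8.7%
# Apache Tomcat 5: 31.9%
```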

9 Observations
Took far more time and effort than expected
–CVEs are not created equal: newer CVEs have higher-quality info, while some CVEs required large amounts of research
–Locating the path and sink is much harder than finding the fix
Reasons for the low CVE selection rate
–Flaw not present in the selected version
–Could not locate the source code or could not locate the sink
Useful resources and tips
–The project's patches, bug tracking, and version control info
–Combine information from multiple resources (e.g., version -> bug # -> tracking -> patches)

10 Possible Future Work?
Re-use the 3 test cases
–Pinpoint more CVE flaws
–Involve developers to confirm some of the pinpointed flaws
–Invite tool makers to map warnings to CVEs
–Analyze the warning-to-CVE mappings among different tool makers and SATE findings
–Store well-understood CVE-related test cases in the SRD
Other suggestions?

