
Choosing SATE Test Cases Based on CVEs








  1. Choosing SATE Test Cases Based on CVEs
  Sue Wang (suewang_2000@yahoo.com)
  October 1, 2010
  The SAMATE Project, http://samate.nist.gov/
  SATE 2010 Workshop

  2. Purpose and Motivation
  • Provide test cases with exploitable vulnerabilities
  • In an ideal world, a tool detects significant bugs
  • Also provide a fixed version of each test case
  • To confirm a low false positive rate
  • Mentioned by SATE organizers; detailed proposal by Paul Anderson (SATE 2009 workshop)
  • Brought up by tool makers and supported by users (SATE 2010 organization meeting)

  3. Selection Criteria
  • Open source software in C/C++ or Java
  • AND with known security-related bugs
  • AND older (vulnerable) versions are available
  • AND the bugs can be manually pinpointed
  • AND a fixed version can be found
  • AND the source code compiles
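The criteria above are conjunctive: a candidate project is kept only if every check passes. A minimal sketch of that AND chain (the record fields and project names below are illustrative, not from the talk):

```python
# Each selection criterion from the slide, as a boolean field on a
# hypothetical candidate record. All must hold for a project to be kept.
CRITERIA = (
    "open_source",          # open source software
    "c_cpp_or_java",        # written in C/C++ or Java
    "has_known_cves",       # known security-related bugs
    "old_version_exists",   # an older, vulnerable version is available
    "bugs_pinpointed",      # the bugs can be manually pinpointed
    "fixed_version_exists", # a fixed version can be found
    "compiles",             # the source code compiles
)

def passes_selection(candidate):
    """True only when every selection criterion holds (logical AND)."""
    return all(candidate.get(c, False) for c in CRITERIA)

# A candidate meeting every criterion, and one missing most of them.
full_candidate = {c: True for c in CRITERIA}
partial_candidate = {"open_source": True, "c_cpp_or_java": True}

print(passes_selection(full_candidate))     # True
print(passes_selection(partial_candidate))  # False
```

Missing fields default to False, so an unvetted criterion counts as a failure, which matches the conservative spirit of the selection process.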

  4. Primary Sources
  • Brainstorm and exchange ideas within the SAMATE team and with others
  • Search for open source projects, for instance:
  • java-source.net
  • sourceforge.net
  • Other lists of scanned projects
  • Search for related vulnerabilities:
  • CVE – Common Vulnerabilities and Exposures (cve.mitre.org)
  • NVD – National Vulnerability Database (nvd.nist.gov)
  • CVE Details – enhanced CVE data (www.cvedetails.com)
  • OSVDB – The Open Source Vulnerability Database (osvdb.org)
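Searches against these databases return structured CVE records that must be sifted for candidate flaws. A minimal sketch of extracting CVE IDs and descriptions from such a record; the JSON shape below is an assumption modeled on NVD-style responses, and the sample entry is illustrative:

```python
import json

# Illustrative NVD-style response for a keyword search (assumed layout;
# real field names vary across feed versions).
SAMPLE_RESPONSE = """
{
  "vulnerabilities": [
    {
      "cve": {
        "id": "CVE-2010-0001",
        "descriptions": [
          {"lang": "en", "value": "Integer underflow in the affected project."}
        ]
      }
    }
  ]
}
"""

def summarize_cves(raw_json):
    """Return (cve_id, English description) pairs from an NVD-style response."""
    data = json.loads(raw_json)
    pairs = []
    for item in data.get("vulnerabilities", []):
        cve = item["cve"]
        # Pick the English description, if one is present.
        desc = next((d["value"] for d in cve.get("descriptions", [])
                     if d.get("lang") == "en"), "")
        pairs.append((cve["id"], desc))
    return pairs

print(summarize_cves(SAMPLE_RESPONSE))
```

In practice each extracted (ID, description) pair would then be checked against the selection criteria, e.g. whether the affected version is still downloadable.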

  5. Selection Process
  Narrowed the candidates down to 12 open source projects

  6. Additional Selection Criteria

  7. Pinpointing the CVE Flaw

  8. Selected Test Cases

  9. Observations
  • Took far more time and effort than expected
  • CVEs are not created equal
  • Newer CVEs have higher-quality info
  • Some CVEs required a large amount of research
  • Locating the path and sink is much harder than finding the fix
  • Reasons for the low CVE selection rate
  • Flaw not present in the selected version
  • Could not locate the source code or could not locate the sink
  • Useful resources and tips
  • The project's patches, bug tracking, and version control info
  • Combine information from multiple resources (e.g., version -> bug # -> tracking -> patches)

  10. Possible Future Work?
  • Re-use the 3 test cases
  • Pinpoint more CVE flaws
  • Involve developers to confirm some of the pinpointed flaws
  • Invite tool makers to map warnings to CVEs
  • Analyze the warning-to-CVE mappings among different tool makers and SATE findings
  • Store well-understood CVE-related test cases in the SRD
  • Other suggestions?
