Reporting
Writing the report is easy if you have been collecting the information during each stage of testing. It is important that the customer always gets a systematic assessment that looks for classes of issues in unfamiliar developer code.
A question I always ask is: is this project a baseline report, or the 12th report on a single application or system this year? Either way, the report needs to be consistent in its delivery to customers, management, and so on. This matters because a common rule in security is to use multiple vendors for testing, but unless you specify your threat model type (see 5.1 How to value the real risk), each vendor's report will carry its own interpretation of risk and threats. If it does happen to be, say, the 5th report on the same system, I typically submit an example report and ask the customer to compare apples to apples: if they liked certain areas of the other firm's report, I want to review ahead of time how that content will look in our report. It is all about meeting expectations. On the technical level, I am a fan of using the Visa/MasterCard PCI standard for host reporting.
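Consistency of risk values across vendors and reports is easier to enforce when the scoring is mechanical rather than interpretive. Below is a minimal sketch, loosely following the likelihood-times-impact averaging idea from 5.1 How to value the real risk; the factor names, 0-9 scales, and severity thresholds are illustrative assumptions, not a normative formula.

```python
# Minimal sketch of a repeatable risk score, loosely following the
# likelihood/impact averaging idea from 5.1 "How to value the real risk".
# Factor names, 0-9 scales, and thresholds are illustrative assumptions.

def _severity(score):
    """Map a 0-9 score to a label (assumed thresholds)."""
    if score < 3:
        return "LOW"
    if score < 6:
        return "MEDIUM"
    return "HIGH"

def risk_rating(likelihood_factors, impact_factors):
    """Average each factor group on a 0-9 scale and label the result."""
    likelihood = sum(likelihood_factors.values()) / len(likelihood_factors)
    impact = sum(impact_factors.values()) / len(impact_factors)
    return {
        "likelihood": (round(likelihood, 1), _severity(likelihood)),
        "impact": (round(impact, 1), _severity(impact)),
    }

# The same finding, scored with the same model in every report:
print(risk_rating(
    {"ease_of_discovery": 7, "ease_of_exploit": 5, "skill_required": 5},
    {"loss_of_confidentiality": 7, "loss_of_integrity": 5, "financial_damage": 3},
))
```

Whatever model you use, the point is that two reports on the same system should reach the same risk value from the same facts.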
The following sections are typically standard:
I. Executive Summary – If this report covers many systems, there should be a summary paragraph that outlines the total number of devices tested, the dates of the testing, and a summary of the results. We all know that management likes visual graphics to illustrate problems, so provide some that cover all of the systems tested.

II. Systems Summary – If more than one application or system is tested, we recommend that the systems summary define each system, its boundaries, and its purpose. Each "system" may contain many sub-systems, so it is important to summarize at the executive level as well as within the systems summary. Appendix A, below, is ideal for illustrating the status of an application assessment per system; with a few added items it could cover the host/network level as well.
III. System Detail – This section contains the details of each system tested: the individual hosts or applications, the issues found, and the recommended corrective actions. Again, Appendix A can serve as the summary at the individual-system level, followed by the detailed information.
Within the system detail, I like to use the following sections:
Findings
Observations
Recommendations
Keep in mind, as the assessor, that you can find the biggest issues in the world, but it is the customer or employer who has to accept the facts and elect to fix them. The report is exactly that: as a subject matter expert you are providing the facts, good or bad. Think of it in the context of testifying in a court of law as to what you observed, what you found, and what you recommended; this document is your professional deliverable.
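One way to keep findings, observations, and recommendations consistent from the baseline report to the twelfth is to record each issue in a fixed structure while testing, then render the report from it. Below is a hypothetical sketch; the field names mirror the Appendix A columns, but the classes themselves are an illustration, not part of the guide.

```python
# Hypothetical structure for collecting report material during testing.
# Field names mirror the Appendix A columns; everything else is assumed.
from dataclasses import dataclass, field

@dataclass
class Finding:
    category: str        # e.g. "Data Validation Testing"
    name: str            # e.g. "SQL Injection"
    affected_item: str   # host, URL, or parameter where it was found
    observation: str     # what was observed, stated as fact
    recommendation: str  # suggested corrective action
    risk_value: str      # e.g. "HIGH", scored as in 5.1

@dataclass
class SystemDetail:
    system: str
    findings: list = field(default_factory=list)

    def summary(self):
        """One line per finding, ready for the system-level summary."""
        return "\n".join(
            f"[{f.risk_value}] {f.name} @ {f.affected_item}" for f in self.findings
        )

detail = SystemDetail("example payment application")
detail.findings.append(Finding(
    category="Data Validation Testing",
    name="SQL Injection",
    affected_item="/login (username parameter)",
    observation="A single quote in the username produced a database error.",
    recommendation="Use parameterized queries; validate all input.",
    risk_value="HIGH",
))
print(detail.summary())
```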
IV. Toolbox – This section describes to the customer the commercial and open-source tools that were used in conducting the assessment. Keep in mind that the proposal is typically where you submit the bios of the human capital (people) doing the testing. When custom scripts or code are used during the assessment, they should be disclosed in this section or noted as an attachment. The customer paid for the assessment and should be provided with a list of the tools used to deliver the service, in addition to the digital knowledge of the tester.
Some may argue that other professionals (doctors, mechanics, lawyers, etc.) do not disclose the tools they use when providing a service, so why should we do this as a standard practice? I answer that question with this: the OWASP Testing Guide is free and open source, and every section of this guide provides a manual explanation of how to test. Some will automate parts of the testing process to reach a result, but tools are only 40-60% of the work. This guide is meant to provide details and best practices not only for conducting testing but also, as an information security professional, for full disclosure of how a result was reached; there are always false positives, right? ;)
APPENDIX A
| Category | Ref Number | Name | Finding | Affected Item | Comment/Solution | Risk Value |
|----------|------------|------|---------|---------------|------------------|------------|
| Information Gathering | | Application Discovery | | | | |
| | | Spidering and googling | | | | |
| | | Analysis of error code | | | | |
| | | SSL/TLS Testing | | | | |
| | | DB Listener Testing | | | | |
| | | File extensions handling | | | | |
| | | Old, backup and unreferenced files | | | | |
| | | Business logic testing | | | | |
| Authentication Testing | | Default or guessable account | | | | |
| | | Brute Force | | | | |
| | | Bypassing authentication schema | | | | |
| | | Directory traversal/file include | | | | |
| | | Vulnerable remember password and pwd reset | | | | |
| | | Logout and Browser Cache Management Testing | | | | |
| Session Management Testing | | Session Management Schema | | | | |
| | | Session Token Manipulation | | | | |
| | | Exposed Session Variables | | | | |
| | | Session Riding | | | | |
| | | HTTP Exploit | | | | |
| Data Validation Testing | | Cross site scripting | | | | |
| | | HTTP Methods and XST | | | | |
| | | SQL Injection | | | | |
| | | Stored procedure injection | | | | |
| | | ORM Injection | | | | |
| | | LDAP Injection | | | | |
| | | XML Injection | | | | |
| | | SSI Injection | | | | |
| | | XPath Injection | | | | |
| | | IMAP/SMTP Injection | | | | |
| | | Code Injection | | | | |
| | | OS Commanding | | | | |
| | | Buffer overflow | | | | |
| | | Incubated vulnerability | | | | |
| Denial of Service Testing | | Locking Customer Accounts | | | | |
| | | User Specified Object Allocation | | | | |
| | | User Input as a Loop Counter | | | | |
| | | Writing User Provided Data to Disk | | | | |
| | | Failure to Release Resources | | | | |
| | | Storing too Much Data in Session | | | | |
| Web Services Testing | | XML Structural Testing | | | | |
| | | XML content-level Testing | | | | |
| | | HTTP GET parameters/REST Testing | | | | |
| | | Naughty SOAP attachments | | | | |
| | | Replay Testing | | | | |
| AJAX Testing | | AJAX Vulnerabilities | | | | |
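If the checklist above is kept in machine-readable form, the per-system status matrix suggested in the Systems Summary can be generated rather than maintained by hand. A small sketch follows; the checklist here is abbreviated to a few rows, and the Pass/Fail/N/A status values are an assumption, not part of the guide.

```python
# Sketch of rendering the Appendix A matrix for one system.
# The checklist is abbreviated; the status vocabulary is assumed.
CHECKLIST = {
    "Information Gathering": ["Application Discovery", "Spidering and googling"],
    "Authentication Testing": ["Default or guessable account", "Brute Force"],
    "Data Validation Testing": ["Cross site scripting", "SQL Injection"],
}

def render_matrix(system, results):
    """Print one row per test; untested items default to 'N/A'."""
    print(f"Appendix A status for: {system}")
    for category, tests in CHECKLIST.items():
        for test in tests:
            status = results.get(test, "N/A")
            print(f"{category:<28}| {test:<32}| {status}")

render_matrix("example application", {
    "Application Discovery": "Pass",
    "SQL Injection": "Fail",
})
```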