Reporting


Performing the assessment is only half of the overall process; the final product is a well-written and informative report. A report should be easy to understand, highlight all the risks found during the assessment phase, and appeal to both management and technical staff.

The report needs to have four major sections and should be created in a manner that allows each section to be split off, printed, and given to the appropriate team, such as the developers or system managers.

The sections are:

I. Executive Summary

The executive summary sums up the overall findings of the assessment and gives managers or system owners an idea of the overall risk they face. The language should suit readers who are not technically aware, and graphs or other charts should be used to show the risk level. It is also recommended that the summary state when the testing commenced and when it was completed.

Another element that is often overlooked is a paragraph on implications and actions. This allows the system owners to understand what needs to be done to ensure the system remains secure.
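Purely as an illustration and not part of the original guide: the roll-up behind those executive charts can be as simple as counting findings per risk level. The finding list, the "risk" field name, and the labels below are hypothetical placeholders.

    # Hypothetical sketch: tally findings by risk level to feed an executive chart.
    # The finding data, field names, and risk labels are illustrative assumptions.
    from collections import Counter

    findings = [
        {"ref": "F-001", "risk": "High"},
        {"ref": "F-002", "risk": "Medium"},
        {"ref": "F-003", "risk": "High"},
        {"ref": "F-004", "risk": "Low"},
    ]

    counts = Counter(f["risk"] for f in findings)
    for level in ("High", "Medium", "Low"):
        # Simple text bar; a real report would turn these counts into a chart.
        print(f"{level:<6} {'#' * counts[level]} ({counts[level]})")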

II. Technical Management Overview

The technical management overview often appeals to technical managers who require more detail than the executive summary provides. This section should include details about the scope of the assessment, the targets included, and any caveats, such as system availability. It also needs to introduce the risk rating used throughout the report and end with a technical summary of the findings.
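The guide does not fix a particular rating scheme at this point; as a hedged sketch only, one common approach is to derive the rating from likelihood and impact scores. The 1-3 scales and band thresholds below are assumptions for illustration.

    # Hypothetical likelihood x impact rating; scales and thresholds are assumed.
    def risk_rating(likelihood: int, impact: int) -> str:
        """Map likelihood (1-3) and impact (1-3) to a risk band."""
        score = likelihood * impact  # ranges from 1 to 9
        if score >= 6:
            return "High"
        if score >= 3:
            return "Medium"
        return "Low"

    print(risk_rating(likelihood=3, impact=2))  # -> High
    print(risk_rating(likelihood=1, impact=2))  # -> Low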

III. Assessment Findings

The assessment findings section contains the detailed technical description of the vulnerabilities found and the approaches needed to resolve them. It is aimed at a technical audience and should include all the information the technical teams need to understand each issue and fix it.

The findings section should include:

- A reference number for easy reference with screenshots
- The host/application
- The risk rating and impact value
- A technical description of the issue
- A section on resolving the issue

Each finding should be clear and concise and give the reader of the report a full understanding of the issue at hand.
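As a minimal sketch of how those fields might be captured per finding (the class and field names are illustrative assumptions, not a format the guide prescribes):

    # Hypothetical record for the per-finding fields listed above.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        ref: str          # reference number, cross-referenced with screenshots
        target: str       # host or application affected
        risk: str         # risk rating (e.g. High / Medium / Low)
        impact: str       # impact value
        description: str  # technical description of the issue
        resolution: str   # how to resolve the issue

    example = Finding(
        ref="F-001",
        target="www.example.com",
        risk="High",
        impact="Exposure of customer records",
        description="SQL injection in the login form's username parameter.",
        resolution="Use parameterised queries and validate all user input.",
    )
    print(example.ref, example.risk)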


IV. Toolbox

This section describes the commercial and open-source tools that were used in conducting the assessment. When custom scripts or code are used during the assessment, they should be disclosed in this section or noted as an attachment.

Some may argue that the tools used by other professionals when providing a service (doctors, mechanics, lawyers, etc.) are not disclosed, so why should we do this as a standard practice? I answer that question with this: the OWASP Testing Guide is free and open source, and every section of it provides a manual explanation of how to test. Some will automate the testing process to reach a result, but tools are only 40-60% of the work. As information security professionals, we believe in providing not only details and best practices for conducting the tests, but also full disclosure of how a result was reached, as there are always false positives, right?
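One way to keep this section accurate, sketched here only as an assumed workflow, is to record each tool and version as the assessment proceeds; the tool names, versions, and categories below are placeholders.

    # Hypothetical toolbox manifest; names, versions, and types are placeholders.
    toolbox = [
        {"name": "ExampleScanner", "version": "1.2.3", "type": "commercial"},
        {"name": "example-fuzzer", "version": "0.9", "type": "open-source"},
        {"name": "check_login.py", "version": "n/a", "type": "custom script (attached)"},
    ]

    for tool in toolbox:
        print(f"{tool['name']} ({tool['version']}) - {tool['type']}")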


APPENDIX A

The appendix is a results template with the columns: Category, Ref Number, Name, Finding, Affected Item, Comment/Solution, Risk Value. The categories and test names it covers are:

Information Gathering
- Application Discovery
- Spidering and googling
- Analysis of error code
- SSL/TLS Testing
- DB Listener Testing
- File extensions handling
- Old, backup and unreferenced files
- Business logic testing

Authentication Testing
- Default or guessable account
- Brute Force
- Bypassing authentication schema
- Directory traversal/file include
- Vulnerable remember password and pwd reset
- Logout and Browser Cache Management Testing

Session Management Testing
- Session Management Schema
- Session Token Manipulation
- Exposed Session Variables
- Session Riding
- HTTP Exploit

Data Validation Testing
- Cross site scripting
- HTTP Methods and XST
- SQL Injection
- Stored procedure injection
- ORM Injection
- LDAP Injection
- XML Injection
- SSI Injection
- XPath Injection
- IMAP/SMTP Injection
- Code Injection
- OS Commanding
- Buffer overflow
- Incubated vulnerability

Denial of Service Testing
- Locking Customer Accounts
- User Specified Object Allocation
- User Input as a Loop Counter
- Writing User Provided Data to Disk
- Failure to Release Resources
- Storing too Much Data in Session

Web Services Testing
- XML Structural Testing
- XML content-level Testing
- HTTP GET parameters/REST Testing
- Naughty SOAP attachments
- Replay Testing

AJAX Testing
- AJAX Vulnerabilities



