Reporting
This article is part of the new OWASP Testing Guide v4.
Back to the OWASP Testing Guide v4 ToC: https://www.owasp.org/index.php/OWASP_Testing_Guide_v4_Table_of_Contents
Back to the OWASP Testing Guide Project: https://www.owasp.org/index.php/OWASP_Testing_Project


Performing the technical side of the assessment is only half of the overall assessment process. The final product is a well-written and informative report. A report should be easy to understand and should highlight all the risks found during the assessment phase. The report should appeal to both executive management and technical staff.

The report needs to have three major sections. It should be created in a manner that allows each separate section to be printed and given to the appropriate teams, such as the developers or system managers. The recommended sections are outlined below.


1. Executive Summary

The executive summary sums up the overall findings of the assessment and gives business managers and system owners a high-level view of the vulnerabilities discovered. The language used should be suited to people who are not technically aware, and should include graphs or other charts that show the risk level. Keep in mind that executives will likely only have time to read this summary and will want two questions answered in plain language: 1) What's wrong? 2) How do I fix it? You have one page to answer these questions.
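
The data behind such a chart is usually just a tally of findings per severity. The following is a minimal sketch, assuming the findings have already been collected as (title, severity) pairs and that matplotlib is available; the sample data and file name are hypothetical:

    from collections import Counter

    import matplotlib.pyplot as plt

    # Hypothetical findings collected during the assessment: (title, severity).
    findings = [
        ("SQL injection in login form", "High"),
        ("Reflected XSS in search page", "Medium"),
        ("Verbose error messages", "Low"),
        ("Weak password policy", "Medium"),
    ]

    # Tally findings per severity in a fixed, report-friendly order.
    order = ["Critical", "High", "Medium", "Low"]
    counts = Counter(severity for _, severity in findings)

    plt.bar(order, [counts.get(s, 0) for s in order], color="steelblue")
    plt.title("Findings by severity")
    plt.ylabel("Number of findings")
    plt.savefig("severity_chart.png")  # embed this image in the executive summary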

The executive summary should plainly state that the vulnerabilities and their severity ratings are an input to the organization's risk management process, not an outcome or remediation. It is safest to explain that the tester does not fully know the threats faced by the organization or the business consequences if the vulnerabilities are exploited. That is the job of the risk professional, who calculates risk levels based on this and other information. Risk management will typically be part of the organization's IT Security Governance, Risk and Compliance (GRC) regime, and this report will simply provide an input to that process.

2. Test Parameters

The introduction should outline the parameters of the security testing, the findings, and the remediation. Some suggested section headings include:

2.1 Project Objective: This section outlines the project objectives and the expected outcome of the assessment.

2.2 Project Scope: This section outlines the agreed scope.

2.3 Project Schedule: This section outlines when the testing commenced and when it was completed.

2.4 Targets: This section lists the number of applications or targeted systems.

2.5 Limitations: This section outlines every limitation that was faced throughout the assessment, for example, limitations of project-focused tests, limitations of the security testing methods, and performance or technical issues that the tester came across during the course of the assessment.

2.6 Findings Summary: This section outlines the vulnerabilities that were discovered during testing.

2.7 Remediation Summary: This section outlines the action plan for fixing the vulnerabilities that were discovered during testing.
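
Sections 2.6 and 2.7 stay consistent most easily when both are generated from the same findings list. The following is a minimal sketch of a prioritized remediation summary; the findings, IDs, and field names are illustrative, not a format mandated by the guide:

    # Hypothetical findings; each maps to a remediation action in the report.
    findings = [
        {"id": "F-03", "title": "Verbose error messages", "severity": "Low",
         "remediation": "Return generic error pages; log details server-side."},
        {"id": "F-01", "title": "SQL injection in login form", "severity": "High",
         "remediation": "Use parameterized queries for all database access."},
        {"id": "F-02", "title": "Weak password policy", "severity": "Medium",
         "remediation": "Enforce minimum length and complexity requirements."},
    ]

    # Sort by severity so the action plan leads with the most urgent fixes.
    rank = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}
    for f in sorted(findings, key=lambda f: rank[f["severity"]]):
        print(f"{f['id']} ({f['severity']}): {f['remediation']}")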

3. Findings

The last section of the report includes detailed technical information about the vulnerabilities found and the actions needed to resolve them. This section is aimed at a technical audience and should include all the information necessary for the technical teams to understand the issue and resolve it. Each finding should be clear and concise and give the reader of the report a full understanding of the issue at hand.


The findings section should include:

  • Screenshots and command lines to indicate what tasks were undertaken during the execution of the test case
  • The affected item
  • A technical description of the issue and the affected function or object
  • A section on resolving the issue
  • The severity rating [1], with vector notation if using CVSS (a sketch follows this list)
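
One way to keep findings uniform is to capture each one as a structured record and derive the severity rating from its CVSS vector rather than assigning it by hand. The following is a minimal sketch using the CVSS v2 base score equations published by FIRST [1]; the Finding record and its field names are hypothetical, not a format mandated by the guide:

    from dataclasses import dataclass, field

    # CVSS v2 base metric weights, as published by FIRST (http://www.first.org/cvss).
    AV = {"L": 0.395, "A": 0.646, "N": 1.0}    # Access Vector
    AC = {"H": 0.35, "M": 0.61, "L": 0.71}     # Access Complexity
    AU = {"M": 0.45, "S": 0.56, "N": 0.704}    # Authentication
    CIA = {"N": 0.0, "P": 0.275, "C": 0.660}   # Confidentiality/Integrity/Availability impact

    def cvss2_base_score(vector: str) -> float:
        """Compute the CVSS v2 base score from a vector like 'AV:N/AC:L/Au:N/C:P/I:P/A:P'."""
        m = dict(part.split(":") for part in vector.split("/"))
        impact = 10.41 * (1 - (1 - CIA[m["C"]]) * (1 - CIA[m["I"]]) * (1 - CIA[m["A"]]))
        exploitability = 20 * AV[m["AV"]] * AC[m["AC"]] * AU[m["Au"]]
        f_impact = 0.0 if impact == 0 else 1.176
        return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f_impact, 1)

    @dataclass
    class Finding:
        affected_item: str    # e.g. a URL or parameter
        description: str      # technical description of the issue and affected function
        resolution: str       # how to fix the issue
        cvss_vector: str      # severity rating with vector notation
        evidence: list = field(default_factory=list)  # screenshots, command lines

    f = Finding(
        affected_item="/login (username parameter)",
        description="User input is concatenated directly into a SQL query.",
        resolution="Use parameterized queries for all database access.",
        cvss_vector="AV:N/AC:L/Au:N/C:P/I:P/A:P",
    )
    print(cvss2_base_score(f.cvss_vector))  # 7.5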

The following is the list of controls that were tested during the assessment. In the report template the Findings, Severity, and Recommendations columns are left blank, to be completed for each control during the engagement:

Test ID | Test Description | Findings | Severity | Recommendations
Information Gathering
OTG-INFO-001 Conduct Search Engine Discovery and Reconnaissance for Information Leakage
OTG-INFO-002 Fingerprint Web Server
OTG-INFO-003 Review Webserver Metafiles for Information Leakage
OTG-INFO-004 Enumerate Applications on Webserver
OTG-INFO-005 Review Webpage Comments and Metadata for Information Leakage
OTG-INFO-006 Identify application entry points
OTG-INFO-007 Map execution paths through application
OTG-INFO-008 Fingerprint Web Application Framework
OTG-INFO-009 Fingerprint Web Application
OTG-INFO-010 Map Application Architecture
Configuration and Deployment Management Testing
OTG-CONFIG-001 Test Network/Infrastructure Configuration
OTG-CONFIG-002 Test Application Platform Configuration
OTG-CONFIG-003 Test File Extensions Handling for Sensitive Information
OTG-CONFIG-004 Backup and Unreferenced Files for Sensitive Information
OTG-CONFIG-005 Enumerate Infrastructure and Application Admin Interfaces
OTG-CONFIG-006 Test HTTP Methods
OTG-CONFIG-007 Test HTTP Strict Transport Security
OTG-CONFIG-008 Test RIA cross domain policy
Identity Management Testing
OTG-IDENT-001 Test Role Definitions
OTG-IDENT-002 Test User Registration Process
OTG-IDENT-003 Test Account Provisioning Process
OTG-IDENT-004 Testing for Account Enumeration and Guessable User Account
OTG-IDENT-005 Testing for Weak or unenforced username policy
OTG-IDENT-006 Test Permissions of Guest/Training Accounts
OTG-IDENT-007 Test Account Suspension/Resumption Process
Authentication Testing
OTG-AUTHN-001 Testing for Credentials Transported over an Encrypted Channel
OTG-AUTHN-002 Testing for default credentials
OTG-AUTHN-003 Testing for Weak lock out mechanism
OTG-AUTHN-004 Testing for bypassing authentication schema
OTG-AUTHN-005 Test remember password functionality
OTG-AUTHN-006 Testing for Browser cache weakness
OTG-AUTHN-007 Testing for Weak password policy
OTG-AUTHN-008 Testing for Weak security question/answer
OTG-AUTHN-009 Testing for weak password change or reset functionalities
OTG-AUTHN-010 Testing for Weaker authentication in alternative channel
Authorization Testing
OTG-AUTHZ-001 Testing Directory traversal/file include
OTG-AUTHZ-002 Testing for bypassing authorization schema
OTG-AUTHZ-003 Testing for Privilege Escalation
OTG-AUTHZ-004 Testing for Insecure Direct Object References
Session Management Testing
OTG-SESS-001 Testing for Bypassing Session Management Schema
OTG-SESS-002 Testing for Cookies attributes
OTG-SESS-003 Testing for Session Fixation
OTG-SESS-004 Testing for Exposed Session Variables
OTG-SESS-005 Testing for Cross Site Request Forgery
OTG-SESS-006 Testing for logout functionality
OTG-SESS-007 Test Session Timeout
OTG-SESS-008 Testing for Session puzzling
Input Validation Testing
OTG-INPVAL-001 Testing for Reflected Cross Site Scripting
OTG-INPVAL-002 Testing for Stored Cross Site Scripting
OTG-INPVAL-003 Testing for HTTP Verb Tampering
OTG-INPVAL-004 Testing for HTTP Parameter pollution
OTG-INPVAL-005 Testing for SQL Injection
Oracle Testing
MySQL Testing
SQL Server Testing
Testing PostgreSQL
MS Access Testing
Testing for NoSQL injection
OTG-INPVAL-006 Testing for LDAP Injection
OTG-INPVAL-007 Testing for ORM Injection
OTG-INPVAL-008 Testing for XML Injection
OTG-INPVAL-009 Testing for SSI Injection
OTG-INPVAL-010 Testing for XPath Injection
OTG-INPVAL-011 IMAP/SMTP Injection
OTG-INPVAL-012 Testing for Code Injection
Testing for Local File Inclusion
Testing for Remote File Inclusion
OTG-INPVAL-013 Testing for Command Injection
OTG-INPVAL-014 Testing for Buffer overflow
Testing for Heap overflow
Testing for Stack overflow
Testing for Format string
OTG-INPVAL-015 Testing for incubated vulnerabilities
OTG-INPVAL-016 Testing for HTTP Splitting/Smuggling
Error Handling
OTG-ERR-001 Analysis of Error Codes
OTG-ERR-002 Analysis of Stack Traces
Cryptography
OTG-CRYPST-001 Testing for Weak SSL/TLS Ciphers, Insufficient Transport Layer Protection
OTG-CRYPST-002 Testing for Padding Oracle
OTG-CRYPST-003 Testing for Sensitive information sent via unencrypted channels
Business Logic Testing
OTG-BUSLOGIC-001 Test Business Logic Data Validation
OTG-BUSLOGIC-002 Test Ability to Forge Requests
OTG-BUSLOGIC-003 Test Integrity Checks
OTG-BUSLOGIC-004 Test for Process Timing
OTG-BUSLOGIC-005 Test Number of Times a Function Can be Used Limits
OTG-BUSLOGIC-006 Testing for the Circumvention of Work Flows
OTG-BUSLOGIC-007 Test Defenses Against Application Mis-use
OTG-BUSLOGIC-008 Test Upload of Unexpected File Types
OTG-BUSLOGIC-009 Test Upload of Malicious Files
Client Side Testing
OTG-CLIENT-001 Testing for DOM based Cross Site Scripting
OTG-CLIENT-002 Testing for JavaScript Execution
OTG-CLIENT-003 Testing for HTML Injection
OTG-CLIENT-004 Testing for Client Side URL Redirect
OTG-CLIENT-005 Testing for CSS Injection
OTG-CLIENT-006 Testing for Client Side Resource Manipulation
OTG-CLIENT-007 Test Cross Origin Resource Sharing
OTG-CLIENT-008 Testing for Cross Site Flashing
OTG-CLIENT-009 Testing for Clickjacking
OTG-CLIENT-010 Testing WebSockets
OTG-CLIENT-011 Test Web Messaging
OTG-CLIENT-012 Test Local Storage
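
The checklist above doubles as the skeleton of the Findings section: each test ID either maps to one or more findings or is recorded as tested with no issue. The following is a minimal sketch of tracking coverage this way; the status values and sample entries are illustrative:

    # Hypothetical coverage tracker keyed by OTG test ID.
    checklist = {
        "OTG-INFO-001": "Conduct Search Engine Discovery and Reconnaissance for Information Leakage",
        "OTG-AUTHN-002": "Testing for default credentials",
        "OTG-SESS-005": "Testing for Cross Site Request Forgery",
        # ... remaining rows of the table above ...
    }

    results = {"OTG-AUTHN-002": "Finding F-04: default admin credentials accepted"}

    # Every test appears in the report, whether or not it produced a finding.
    for test_id, description in checklist.items():
        print(f"{test_id}: {description} -> {results.get(test_id, 'No issues found')}")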

Appendix

This section is often used to describe the commercial and open-source tools that were used in conducting the assessment. When custom scripts or code are utilized during the assessment, they should be disclosed in this section or noted as an attachment. Customers appreciate it when the methodology used by the consultants is included, as it gives them an idea of the thoroughness of the assessment and what areas were included.

References

[1] Industry standard vulnerability severity and risk rankings (CVSS) – http://www.first.org/cvss