Top 10 2013/ProjectMethodology
About
The purpose of this page is to provide greater clarity on the development methodology of the OWASP Top 10. It describes the data sources used as input to the Top 10, the current development process, suggestions for improving involvement and participation, and an FAQ covering common questions and concerns.
This is a wiki and is editable by anyone with an OWASP account. Please contribute constructively to the conversation. Additional discussion should also take place on the OWASP Top 10 mailing list.
Current Methodology
The 2010 and later versions of the OWASP Top 10 are organized according to the OWASP_Risk_Rating_Methodology, adjusted for the fact that the Top 10 is independent of any particular system. This adjusted methodology is documented in the OWASP Top 10 for 2010 here: Top_10_2010-Notes_About_Risk.
This resulted in 4 risk factors used to calculate the order of the Top 10: 3 Likelihood Factors and 1 Impact Factor. These factors are:
- Likelihood of an Application Having that Vulnerability (Prevalence)
- Likelihood of an Attacker Discovering that Vulnerability (Detectability)
- Likelihood of An Attacker Successfully Exploiting that Vulnerability (Exploitability)
- Typical Technical Impact if that Vulnerability is Successfully Exploited (Impact)
Each of these factors is scored on a scale from 1 to 3, with one exception: XSS is assigned a Prevalence score of 4.
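The sketch below is a minimal, purely illustrative example of how the four factor scores might be combined into a single ordering score. It assumes the three likelihood factors are averaged and the result is multiplied by the impact score; the factor values and the combination rule are assumptions for illustration, not the official calculation.

  # Illustrative only: one plausible way to combine the four Top 10 risk factors.
  # Assumption: likelihood = average of prevalence, detectability and exploitability,
  # and the overall score = likelihood * impact. Higher numbers mean higher risk.

  def risk_score(prevalence, detectability, exploitability, impact):
      """Combine the four factor scores (each normally 1-3) into one number."""
      likelihood = (prevalence + detectability + exploitability) / 3.0
      return likelihood * impact

  # Hypothetical category: widespread (3), easy to detect (3),
  # average to exploit (2), moderate technical impact (2).
  print(risk_score(prevalence=3, detectability=3, exploitability=2, impact=2))  # ~5.33

Categories would then be ranked by this score, highest first.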
- The Top 10 uses data sources provided by a variety of companies (see Top_10_2013/ProjectMethodology#Current_Prevalence_Data_Sources) to calculate Vulnerability Prevalence; a sketch of one possible way such data could be turned into scores follows this list. We would love to use similar data to help calculate the scores for the other risk factors, if that data is available (which is one of the improvement suggestions below).
- Data & professional opinion used to create initial Top 10 rankings and items
- <dave> List involved individuals here
- Public comment period for RC1 held from February through the end of March
- All comments evaluated and the Top 10 updated appropriately by:
- <dave> List involved individuals here
- All comments and responses posted publicly
- <dave> RC2 issued?
- Final version published
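The exact mapping from contributed vendor statistics to the 1-3 Prevalence scores is not spelled out on this page; the following sketch is one plausible, purely illustrative approach. It assumes each contributor reports the percentage of tested applications in which a given vulnerability class was found, and that these percentages are averaged and then bucketed; the thresholds and figures are hypothetical.

  # Hypothetical sketch: turning contributed per-vendor incidence rates into a
  # 1-3 Prevalence score. Input format, averaging step and bucket thresholds
  # are assumptions for illustration only.

  def prevalence_score(incidence_rates):
      """Average per-vendor '% of apps affected' figures and bucket them 1-3."""
      avg = sum(incidence_rates) / len(incidence_rates)
      if avg >= 50:
          return 3  # widespread
      if avg >= 10:
          return 2  # common
      return 1      # uncommon

  # Made-up figures from four contributors for a single vulnerability class.
  print(prevalence_score([62, 55, 71, 48]))  # -> 3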
Current Prevalence Data Sources
- Aspect Security
- HP (Results for both Fortify and WebInspect)
- Minded Security - Statistics
- Softtek
- TrustWave Spiderlabs
- Veracode – Statistics
- WhiteHat Security – Statistics
If you would like to contribute your vulnerability statistics to the OWASP Top 10 project, please send your data to: dave.wichers@owasp.org. Please indicate whether it is OK for OWASP to publish this raw data. If you have already published this data, please provide us a link to the public posting.
Note: The first version of the Top 10 in 2003 started with the MITRE CVE data, and each update expanded the number of prevalence data contributors. Unfortunately, the CVE data for 2011/2012 wasn't available for the 2013 release, which is why it is not included this year.
Suggested Enhancements
- Use a public wiki or Google Issues to capture feedback; mailing lists are hard to follow and comments get lost
- Establish a Top 10 panel to evaluate and make final decisions on inclusion & ranking
- Not feasible for everyone to vote on every item
- A diverse panel representing various verticals (vendor, enterprise, offense/defense, etc.)
- Additional data sources could be considered (please add links)
- WASC Web Hacking Incident Database
- Akamai State of the Internet Reports
- FireHost's Web Application Attack Reports
- Imperva's Web Application Attack Reports
- Additional reports could be considered:
- Annual Symantec Internet Threat Reports
- Datalossdb
- IBM XForce threat reports
- Public forum to brainstorm and discuss key topics
FAQ
- TBD