Top 10 2013/ProjectMethodology
About
The purpose of this page is to provide greater clarity on the development methodology of the OWASP Top 10. It describes the data sources used as input to the Top 10, the current development process, suggestions to improve involvement and participation, and an FAQ covering common questions and concerns.
This page is a wiki and is editable by anyone with an OWASP account. Please contribute constructively to the conversation. Additional discussion should also take place on the OWASP Top 10 mailing list.
Current Methodology
The 2010 and later versions of the OWASP Top 10 are organized based on the OWASP_Risk_Rating_Methodology, adjusted for the fact that the Top 10 is independent of any particular system. This adjusted methodology is documented in the OWASP Top 10 for 2010 here: Top_10_2010-Notes_About_Risk.
This resulted in four risk factors being used to calculate the order of the Top 10: three likelihood factors and one impact factor. These factors are:
- Likelihood of an Application Having that Vulnerability (Prevalence)
- Likelihood of an Attacker Discovering that Vulnerability (Detectability)
- Likelihood of An Attacker Successfully Exploiting that Vulnerability (Exploitability)
- Typical Technical Impact if that Vulnerability is Successfully Exploited (Impact)
Each of these factors is scored on a scale from 1 through 3; the single exception is XSS, whose Prevalence is scored 4.
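To make the ranking arithmetic concrete, here is a minimal sketch in Python. It assumes the factors combine as described in Top_10_2010-Notes_About_Risk: the three likelihood factors are averaged, and that average is multiplied by the impact score. The factor values below are illustrative placeholders, not the official ratings.

```python
# Minimal sketch of the Top 10 ordering arithmetic.
# Assumption: likelihood = mean(prevalence, detectability, exploitability),
# and risk = likelihood * impact, per the Top 10 risk-rating notes.
# The scores below are illustrative, NOT the official 2013 ratings.

RISKS = {
    # name: (prevalence, detectability, exploitability, impact)
    "Injection": (2, 2, 3, 3),
    "XSS": (4, 3, 2, 2),  # prevalence 4: the one exception to the 1-3 scale
    "CSRF": (2, 3, 2, 2),
}

def risk_score(prevalence, detectability, exploitability, impact):
    """Average the three likelihood factors, then weight by impact."""
    likelihood = (prevalence + detectability + exploitability) / 3
    return likelihood * impact

# Order the risks from highest to lowest score to produce the list.
for rank, name in enumerate(
        sorted(RISKS, key=lambda n: risk_score(*RISKS[n]), reverse=True),
        start=1):
    print(f"A{rank}: {name} (score {risk_score(*RISKS[name]):.2f})")
```

Ordering by this score is essentially what the "Calculate the Top 10 order" step below amounts to, once the factor values have been agreed.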
The Top 10 uses data sources provided by a variety of companies (see Top_10_2013/ProjectMethodology#Current_Prevalence_Data_Sources below) to calculate Vulnerability Prevalence. We would love to use similar data to help calculate the scores for the other risk factors, if such data is available (which is one of the improvement suggestions below).
Starting in 2010, the process for producing an update to the OWASP Top 10 is as follows:
- Collect prevalence data from data suppliers.
- Rank prevalence data for each supplier and then aggregate the results to create an overall prevalence ranking for this update to the Top 10 (a sketch of this rank-and-aggregate step appears after this list).
- Determine the values of the other risk factors based on professional opinion. (This step was not performed prior to 2010.)
- Calculate the Top 10 order.
- Write a Release Candidate.
- For 2010, the Release Candidate was reviewed by the Data Contributors and other members of the OWASP Community, and the core commenters/contributors were acknowledged here: Top_10_2010-Introduction
- Note: for 2013, this internal project review step was eliminated in order to get the Release Candidate out for public comment faster.
- Publish the Release Candidate for public comment.
- Accept comments during a comment period.
- Interact with comment providers to update the Top 10.
- Publish all provided comments.
- Publish a Final Release.
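As a sketch of the first two steps above, one plausible approach is to rank each supplier's prevalence counts independently and then average the per-supplier ranks into an overall prevalence ranking. The supplier names and counts below are hypothetical, and averaging ranks is an assumption on our part; the project has not published its exact aggregation scheme.

```python
# Sketch of "rank per supplier, then aggregate" prevalence handling.
# Assumption: aggregation = mean of per-supplier ranks (lower = more prevalent).
# Supplier names and vulnerability counts are hypothetical.
from statistics import mean

supplier_data = {
    "SupplierA": {"Injection": 120, "XSS": 300, "CSRF": 80},
    "SupplierB": {"Injection": 95, "XSS": 210, "CSRF": 130},
}

def ranks(counts):
    """Rank vulnerability classes 1..n within one supplier, most prevalent first."""
    ordered = sorted(counts, key=counts.get, reverse=True)
    return {vuln: pos for pos, vuln in enumerate(ordered, start=1)}

per_supplier = [ranks(counts) for counts in supplier_data.values()]
overall = sorted(per_supplier[0], key=lambda v: mean(r[v] for r in per_supplier))
print(overall)  # lowest average rank (most prevalent overall) first
```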
In 2010, the Release Candidate process worked like this:
- It was open for public comment for several months.
- Dave Wichers primarily interacted via email with each comment provider, either to address their comments or to explain why making no change was considered most appropriate.
- All provided comments were published at:
- The final version was published.
For 2013, the process currently stands as follows:
- The public comment period for the Release Candidate runs from February through the end of March 2013. (This can be extended if necessary.)
- We had planned to follow the same process as in 2010 to complete the Top 10, but clearly the OWASP Community wants to be more heavily involved in producing the Final Release, which is great.
- So, at this point, the process for completing the 2013 Top 10 is TBD, subject to your input and suggestions.
Current Prevalence Data Sources
- Aspect Security
- HP (Results for both Fortify and WebInspect)
- Minded Security – Statistics
- Softtek
- Trustwave SpiderLabs
- Veracode – Statistics
- WhiteHat Security – Statistics
If you would like to contribute your vulnerability statistics to the OWASP Top 10 project, please send your data to: dave.wichers@owasp.org. Please indicate whether it is OK for OWASP to publish the raw data. If you have already published this data, please provide a link to the public posting.
Note: In the first version of the Top 10 in 2003, we started with the MITRE CVE data, and each update has expanded the number of prevalence data contributors. Unfortunately, the CVE data for 2011/2012 wasn't available for the 2013 release, which is why it's not included this year.
Suggested Enhancements
- Use a public wiki or a Google issue tracker to capture feedback; mailing lists are hard to follow and comments get lost.
- Establish a Top 10 panel to evaluate and make final decisions on inclusion and ranking.
- It is not feasible for everyone to vote on every item.
- The panel should be diverse, representing various verticals (vendor, enterprise, offense/defense, etc.).
- Additional data sources could be considered (please add links)
- WASC Web Hacking Incident Database
- Akamai State of the Internet Reports
- FireHost's Web Application Attack Reports
- Imperva's Web Application Attack Reports
- Additional reports could be considered:
- Annual Symantec Internet Threat Reports
- DataLossDB
- IBM X-Force Threat Reports
- Public forum to brainstorm and discuss key topics
FAQ
- TBD