|}
= OWASP Top 10 for 2017 Release Candidate 2 =

<div style="width:100%;height:160px;border:0;margin:0;overflow:hidden;">[[File:OWASP_Project_Header.jpg|link=]]</div>

RC2 is available for download [https://github.com/OWASP/Top10/blob/master/2017/OWASP%20Top%2010%202017%20RC2%20Final.pdf from GitHub].

We have worked extensively to validate the methodology, gathered data on over 114,000 applications, and obtained qualitative survey data from 550 community members on the two new categories: insecure deserialization and insufficient logging and monitoring.

We strongly urge that any corrections or issues be raised on the project's [https://github.com/OWASP/Top10/issues GitHub issue list].

Through public transparency, we provide traceability and ensure that all voices are heard during this final month before publication.

(We will be reaching out to translators shortly.)

Andrew van der Stock<br/>
Brian Glas<br/>
Neil Smithline<br/>
Torsten Gigler<br/>

==Historical/Outdated Information - for historical reference only==

The 2017 OWASP Top 10 RC1 has been rejected. A [https://goo.gl/forms/ltbKrdYrp4Qdl7Df2 new survey for security professionals] and a [https://goo.gl/forms/tLgyvK9O74r7wMkt2 reopened data call] are now open. More details can be found in [https://owasp.blogspot.com/2017/08/owasp-top-10-2017-project-update.html this blog post].

The release candidate for public comment was published 10 April 2017 and can be [https://github.com/OWASP/Top10/raw/master/2017/drafts/OWASP%20Top%2010%20-%202017%20RC1-English.pdf downloaded here]. OWASP plans to release the final OWASP Top 10 - 2017 in July or August 2017, after a public comment period ending June 30, 2017.

Constructive comments on this [https://github.com/OWASP/Top10/raw/master/2017/drafts/OWASP%20Top%2010%20-%202017%20RC1-English.pdf OWASP Top 10 - 2017 Release Candidate] should be forwarded via email to the [https://lists.owasp.org/mailman/listinfo/Owasp-topten OWASP Top 10 Project Email List]. Private comments may be sent to [mailto:vanderaj@owasp.org Andrew van der Stock]. Anonymous comments are welcome. All non-private comments will be catalogued and published at the same time as the final public release. Comments recommending changes to the Top 10 should include a complete suggested list of changes, along with a rationale for each change. All comments should indicate the specific relevant page and section.

This release of the OWASP Top 10 marks this project's fourteenth year of raising awareness of the importance of application security risks. This release follows the 2013 update, whose main change was the addition of 2013-A9 Use of Known Vulnerable Components. We are pleased to see that since the 2013 Top 10 release, a whole ecosystem of both free and commercial tools has emerged to help combat this problem, as the use of open source components has continued to expand rapidly across practically every programming language. The data also suggests that the use of known vulnerable components is still prevalent, but not as widespread as before. We believe the awareness of this issue that the Top 10 - 2013 generated has contributed to both of these changes.

We also noticed that since CSRF was introduced to the Top 10 in 2007, it has dropped from a widespread vulnerability to an uncommon one. Many frameworks now include automatic CSRF defenses, which has significantly contributed to its decline in prevalence, along with much greater awareness among developers that they must protect against such attacks.

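The automatic framework defenses mentioned above typically implement the synchronizer token pattern. Below is a minimal sketch in Python; the function names and the dict-based session store are illustrative assumptions, not taken from any particular framework.

```python
import hmac
import secrets

def issue_csrf_token(session: dict) -> str:
    """Generate a random token and bind it to the user's session."""
    token = secrets.token_hex(32)
    session["csrf_token"] = token
    return token  # the template embeds this in a hidden form field

def verify_csrf_token(session: dict, submitted: str) -> bool:
    """Reject state-changing requests whose token does not match the session."""
    expected = session.get("csrf_token")
    if expected is None or submitted is None:
        return False
    # Constant-time comparison avoids leaking the token through timing.
    return hmac.compare_digest(expected, submitted)
```

The token is issued when the form is rendered and verified on every state-changing request; a cross-site forged request cannot read the victim's token, so its submission fails validation.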
For 2017, the OWASP Top 10 Most Critical Web Application Security Risks (in the Release Candidate) are:

* A1 Injection
* A2 Broken Authentication and Session Management
* A3 Cross-Site Scripting (XSS)
* A4 Broken Access Control (as it was in 2004)
* A5 Security Misconfiguration
* A6 Sensitive Data Exposure
* A7 Insufficient Attack Protection (NEW)
* A8 Cross-Site Request Forgery (CSRF)
* A9 Using Components with Known Vulnerabilities
* A10 Underprotected APIs (NEW)

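To illustrate the class of flaw behind A1 Injection, here is a minimal sketch using Python's standard-library sqlite3 module (the table, sample data, and function names are invented for illustration), contrasting string-built SQL with a parameterized query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name: str):
    # Injection: attacker-controlled input is concatenated into the SQL,
    # so input such as ' OR '1'='1 changes the query's meaning.
    query = "SELECT name, role FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_user_safe(name: str):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()
```

With the payload `' OR '1'='1`, the unsafe version returns every row, while the safe version matches the payload only as a literal string and returns nothing.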
== 2017 Update Data Call Data ==

DATA CALL RESULTS ARE NOW PUBLIC: The [https://github.com/OWASP/Top10/blob/master/2017/datacall/OWASP%20Top%2010%20-%202017%20Data%20Call-Public%20Release.xlsx?raw=true results of this data call have been made public here] as an Excel spreadsheet with four tabs. Three of the tabs contain raw data as submitted, organized into three vulnerability data size categories: large, small, and none. The fourth tab includes some basic analysis of the large submissions. The OWASP Top 10 project thanks all the submitters for their input to the OWASP Top 10 - 2017.

On May 20, 2016, the Top 10 project made a public announcement of the data call for the 2017 update to the OWASP Top 10. Contributors filled out the Google form posted here: [https://docs.google.com/forms/d/1sBMHN5nBicjr5xSo04xkdP5JlCnXFcKFCgEHjwPGuLw/viewform?c=0&w=1&usp=mail_form_link OWASP Top 10 - 2017 Data Call], which had the questions listed below.

Page 1 of 5: Submitter Info

* Name of Company/Organization *
* Company/Organization Web Site *
* Point of Contact Name *
* Point of Contact E-Mail *

Page 2 of 5: Background on Applications

* During what year(s) was this data collected? *
** 2014
** 2015
** Both 2014 & 2015
*** If the application vulnerability data you are submitting was extracted from a publicly available report, please provide a link to that report (or reports), and the relevant page number(s)

* How many web applications do the submitted results cover? * We consider web apps, web services, and the server side of mobile apps to all be web apps.

* What were the primary programming languages the applications you reviewed were written in? Primary being 5% or more of the supplied results - Check all that apply
** Java
** .NET
** Python
** PHP
** Ruby
** Grails
** Play
** Node.js
** Other:

* Please supply the exact percentage of applications per language checked off above:

* What were the primary industries these applications supported? Primary being 5% or more of the supplied results - Check all that apply
** Financial
** Healthcare
** eCommerce
** Internet/Social Media
** Airline
** Energy
** Entertainment (Games/Music/Movies)
** Government
** Other:

* Where in the world were the application owners primarily? Again - select those where 5% or more of your results came from
** North America
** Europe
** AsiaPac
** South America
** Middle East
** Africa
** Other:

Page 3 of 5: Assessment Team and Detection Approach

* What type of team did the bulk of this work? *
** Internal Assessment Team(s)
** Consulting Organization
** Product Vendor/Service Provider (e.g., SaaS)
** Other:

* What type of analysis tools do they use? * Check all that apply.
** Free/Open Source Static Application Security Testing (SAST) Tools
** Free/Open Source Dynamic Application Security Testing (DAST) Tools
** Free/Open Source Interactive Application Security Testing (IAST) Tools
** Commercial Static Application Security Testing (SAST) Tools
** Commercial Dynamic Application Security Testing (DAST) Tools
** Commercial Interactive Application Security Testing (IAST) Tools
** Commercial DAST/IAST Hybrid Analysis Tools
** Other:

* Which analysis tools do you frequently use? This includes free, commercial, and custom (in-house) tools - List tools by name

* What is your primary assessment methodology? * Primary being the majority of your assessments follow this approach
** Raw (untriaged) output of automated analysis tool results using default rules
** Automated analysis tool results - with manual false positive analysis/elimination
** Output from manually tailored automated analysis tool(s)
** Output from manually tailored automated analysis tool(s) - with manual false positive analysis/elimination
** Manual expert penetration testing (Expected to be tool assisted w/ free DAST tool(s))
** Manual expert penetration testing with commercial DAST tool(s)
** Manual expert code review (Using IDE and other free code review aids)
** Manual expert code review with commercial SAST tool(s)
** Combined manual expert code review and penetration testing with only free tools
** Combined manual expert code review and penetration testing with only commercial tools
** Other:

Page 4 of 5: Application Vulnerability Data

Each question asks for the number of vulnerabilities found for a particular type of vulnerability. At the end is one catch-all text question where you can add other types of vulnerabilities and their counts. If you prefer, just send your vulnerability data in a spreadsheet to brian.glas@owasp.org with these columns: CATEGORY NAME, CWE #, COUNT, after you submit the rest of your input via this data call. Ideally, it should come from the email address you specified in the Point of Contact E-Mail question on Page 1 so it is easy to correlate the two.

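For illustration, the requested columns can be produced as a simple CSV. This is only a sketch - the sample categories and counts below are invented, not real submission data.

```python
import csv
import io

# Columns requested by the data call: CATEGORY NAME, CWE #, COUNT.
rows = [
    ("SQL Injection", "CWE-89", 120),
    ("Cross-Site Scripting (XSS)", "CWE-79", 310),
    ("Clickjacking", "No CWE", 14),  # "No CWE" when no CWE # exists
]

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["CATEGORY NAME", "CWE #", "COUNT"])
writer.writerows(rows)
print(buffer.getvalue())
```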
* Number of SQL Injection Vulnerabilities Found (CWE-89)?
* Number of Hibernate Injection Vulnerabilities Found (CWE-564)?
* Number of Command Injection Vulnerabilities Found (CWE-77)?
* Number of Authentication Vulnerabilities Found (CWE-287)?
* Number of Session Fixation Vulnerabilities Found (CWE-384)?
* Number of Cross-Site Scripting (XSS) Vulnerabilities Found (CWE-79)?
* Number of DOM-Based XSS Vulnerabilities Found (No CWE)?
* Number of Insecure Direct Object Reference Vulnerabilities Found (CWE-639)?
* Number of Path Traversal Vulnerabilities Found (CWE-22)?
* Number of Missing Authorization Vulnerabilities Found (CWE-285)?
* Number of Security Misconfiguration Vulnerabilities Found (CWE-2)?
* Number of Cleartext Transmission of Sensitive Information Vulnerabilities Found (CWE-319)?
* Number of Cleartext Storage of Sensitive Information Vulnerabilities Found (CWE-312)?
* Number of Weak Encryption Vulnerabilities Found (CWE-326)?
* Number of Cryptographic Vulnerabilities Found (CWEs 310/326/327/etc.)?
** You can report them all lumped together under CWE-310 or in their individual categories, whichever you prefer.
* Number of Improper (Function Level) Access Control Vulnerabilities Found (CWE-285)?
* Number of Cross-Site Request Forgery (CSRF) Vulnerabilities Found (CWE-352)?
* Number of Use of Known Vulnerable Libraries Found (No CWE)?
* Number of Unchecked Redirect Vulnerabilities Found (CWE-601)?
* Number of Unvalidated Forward Vulnerabilities Found (No CWE)?
* Number of Clickjacking Vulnerabilities Found (No CWE)?
* Number of XML eXternal Entity Injection (XXE) Vulnerabilities Found (CWE-611)?
* Number of Server-Side Request Forgery (SSRF) Vulnerabilities Found (CWE-918)?
* Number of Denial of Service (DoS) Vulnerabilities Found (CWE-400)?
* Number of Expression Language Injection Vulnerabilities Found (CWE-917)?
* Number of Error Handling Vulnerabilities Found (CWE-388)?
* Number of Information Leakage/Disclosure Vulnerabilities Found (CWE-200)?
* Number of Insufficient Anti-automation Vulnerabilities Found (CWE-799)?
* Number of Insufficient Security Logging Vulnerabilities Found (CWE-778)?
* Number of Insufficient Intrusion Detection and Response Vulnerabilities Found (No CWE)?
* Number of Mass Assignment Vulnerabilities Found (CWE-915)?
* What other vulnerabilities did you find?
** Please provide them in this format: CATEGORY NAME, CWE #, COUNT (one line per category). Say "No CWE" if there isn't a CWE # for that category. If you plan to send all your vulnerability data via email, please say so here so we know to expect it.

− |
| |
− | Page 5 of 5: Suggestions for the next OWASP Top 10
| |
− |
| |
− | What do you think we should change?
| |
− |
| |
− | * Vulnerability types you think should be added to the T10? Because they are an unappreciated risk, widespread, becoming more prevalent, a new type of vulnerability, etc.
| |
− | * Vulnerability types you think should be removed from the T10?
| |
− | * Suggested changes to the Top 10 Document/Wiki?
| |
− | * Suggestions on how to improve this call for data?
| |
− |
| |
== Project Sponsors ==

The OWASP Top 10 - 2017 project is sponsored by:

{{MemberLinks|link=https://www.autodesk.com|logo=Autodesk-logo.png}}

Thanks to [https://www.aspectsecurity.com Aspect Security] for sponsoring earlier versions.

= OWASP Top 10 for 2013 =