OWASP Code Review Guide Table of Contents

Crawling Code
Crawling code is the practice of scanning the code base of the review target in question. It is, in effect, looking for key pointers wherein a possible security vulnerability might reside. Certain APIs relate to interfacing with the external world, file IO, or user management, which are key areas for an attacker to focus on. In crawling code we look for APIs relating to these areas. We also need to look for business logic areas which may cause security issues, but generally these are bespoke methods with bespoke names and cannot be detected directly, even though we may touch on certain methods due to their relationship with a certain key API.
We also need to look for common issues relating to the specific language; issues that may not be *security* related but which may affect the stability or availability of the application under extraordinary circumstances. Other items to check for when performing a code review include areas such as a simple copyright notice in order to protect one’s intellectual property.
Crawling code can be done manually or in an automated fashion using tools. Tools as simple as grep or wingrep can be used. Other tools are available that search for keywords relating to a specific programming language.
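As an illustration, the following is a minimal sketch of such an automated crawl, written in Java. The keyword list is a hypothetical sample only, not a canonical list from this guide; a real review would tailor it to the target language and to the key APIs discussed in the sections that follow.

  import java.io.IOException;
  import java.nio.file.Files;
  import java.nio.file.Path;
  import java.nio.file.Paths;
  import java.util.List;
  import java.util.stream.Stream;

  /**
   * Minimal code-crawling sketch: walks a source tree and flags lines that
   * contain keywords associated with security-sensitive APIs. The keyword
   * list below is only an illustrative sample for Java/J2EE code.
   */
  public class CodeCrawler {

      // Hypothetical sample of key APIs to flag; tailor to the code under review.
      private static final List<String> KEYWORDS = List.of(
              "Runtime.exec",        // OS command execution
              "java.io.File",        // file IO
              "createStatement",     // dynamic SQL (possible SQL injection)
              "getParameter",        // untrusted HTTP input
              "HttpSession",         // session / user management
              "doPrivileged"         // elevated-privilege blocks
      );

      public static void main(String[] args) throws IOException {
          Path root = Paths.get(args.length > 0 ? args[0] : ".");
          try (Stream<Path> paths = Files.walk(root)) {
              paths.filter(p -> p.toString().endsWith(".java"))
                   .forEach(CodeCrawler::scanFile);
          }
      }

      private static void scanFile(Path file) {
          try {
              List<String> lines = Files.readAllLines(file);
              for (int i = 0; i < lines.size(); i++) {
                  String line = lines.get(i);
                  for (String keyword : KEYWORDS) {
                      if (line.contains(keyword)) {
                          // Report file, line number and matching line for manual review.
                          System.out.printf("%s:%d: [%s] %s%n",
                                  file, i + 1, keyword, line.trim());
                      }
                  }
              }
          } catch (IOException e) {
              System.err.println("Could not read " + file + ": " + e.getMessage());
          }
      }
  }

The same result can be obtained with a grep over the source tree; either way, the output is simply a list of candidate hotspots for the manual review to focus on, not a list of confirmed vulnerabilities.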
The following sections cover crawling code for Java/J2EE, .NET, and Classic ASP. This section is best used in conjunction with the transactional analysis section also detailed in this guide.