Testing: Information Gathering
A security test needs a first phase focused on collecting as much information as possible about the target application. Information Gathering is a necessary step of a penetration test.
This task can be carried out in many different ways.
Using public tools (search engines) or purposely crafted false requests, it is possible to force the application to send back error messages revealing the versions and technologies it uses.
The tester should also discover and analyze the front-end/back-end infrastructure and the application itself, with the purpose of collecting as much useful information as possible.
Often it is possible to gather this information from the application's responses which, as a consequence of insecure default configuration of the application server or web server, may expose unused or backup files.
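As a minimal sketch of this idea, the helper below requests a page that is unlikely to exist in order to trigger the server's default error handler, then parses the Server banner from the response. The probe path and any target host are illustrative assumptions, not part of the guide.

```python
import re


def parse_server_banner(server_header):
    """Split a banner such as 'Apache/2.4.41 (Ubuntu)' into (product, version)."""
    m = re.match(r"([^/\s]+)/([\d.]+)", server_header)
    return (m.group(1), m.group(2)) if m else (server_header, None)


def probe_banner(host, port=80):
    """Hypothetical probe: a request for a nonexistent page often returns a
    default error page whose headers disclose the server product and version."""
    import http.client
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request("GET", "/nonexistent-page-for-fingerprinting")
    resp = conn.getresponse()
    return parse_server_banner(resp.getheader("Server", ""))
```

In practice the error page body is worth inspecting too, since default templates (and stack traces) are themselves version-specific.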
Application discovery is an activity aimed at identifying the web applications hosted on a web server/application server.
This analysis is important because often there is no direct link to the main application back end, so a discovery analysis can reveal details such as web applications used for administrative purposes, old versions of files, or artifacts such as scripts not properly deleted after the test/development phase or after maintenance.
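A simple way to start such a discovery analysis is to request a list of common application paths on the target and note which ones do not return 404. The wordlist below is a small illustrative assumption; real assessments use much larger lists.

```python
# Illustrative wordlist of paths where secondary applications are often found.
COMMON_APPS = ["admin", "phpmyadmin", "manager", "test", "old", "backup"]


def candidate_urls(base_url, names=COMMON_APPS):
    """Build candidate URLs for applications that may be hosted alongside the
    main one (admin consoles, leftover test deployments, renamed copies)."""
    base = base_url.rstrip("/")
    return ["%s/%s/" % (base, name) for name in names]
```

Each candidate would then be requested; any response other than 404 (including redirects and authentication prompts) suggests an application worth closer inspection.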
This phase of the Information Gathering process consists of browsing and capturing resources related to the application being tested. Search engines, such as Google, can be used to discover issues related to the web application structure, or error pages produced by the application and exposed to the public domain.
Web applications may divulge information during a penetration test which is not intended to be seen by an end user. Information (such as error codes) can inform the tester about the technologies and products used by the application.
Such error codes can be easy to elicit without any particular skill, due to a poor error handling strategy.
Often an analysis of the infrastructure and topology architecture can reveal a lot of information about a web application, such as source code, permitted HTTP methods, administrative functionality, authentication methods, and infrastructural configuration.
For these reasons, focusing only on the web application may not be an exhaustive test: the information collected during such an assessment is less complete than what could be gathered through a wider test that includes an infrastructure analysis.
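One of the infrastructure details mentioned above, the permitted HTTP methods, can be enumerated with an OPTIONS request. The sketch below (target host hypothetical) parses the Allow header and flags methods that frequently indicate misconfiguration when enabled.

```python
def parse_allow_header(allow_value):
    """Turn an Allow header such as 'GET, POST, OPTIONS' into a method list."""
    return [m.strip().upper() for m in allow_value.split(",") if m.strip()]


def risky_methods(methods):
    """Flag methods whose presence often warrants further testing: PUT/DELETE
    may permit content modification, TRACE enables cross-site tracing."""
    return sorted(set(methods) & {"PUT", "DELETE", "TRACE", "CONNECT"})


def allowed_methods(host, port=80):
    """Hypothetical live check: send OPTIONS / and read the Allow header."""
    import http.client
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request("OPTIONS", "/")
    resp = conn.getresponse()
    return parse_allow_header(resp.getheader("Allow", ""))
```

Note that servers are not obliged to answer OPTIONS truthfully, so findings should be confirmed by attempting the flagged methods directly.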
SSL and TLS are two protocols which, with the support of cryptography, provide a secure channel for communications to protect the confidentiality of information.
Considering the importance of these security mechanisms, it is necessary to verify that a strong cipher algorithm is used, along with every other piece of information collectable during a security assessment.
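A rough sketch of such a check, using Python's standard ssl module: connect to the target, report the cipher suite actually negotiated, and apply a simple (assumed, not exhaustive) weakness heuristic based on key length and known-bad algorithm names.

```python
import socket
import ssl

# Illustrative markers of historically weak ciphers; not an exhaustive list.
WEAK_MARKERS = ("RC4", "DES", "NULL", "EXPORT", "MD5")


def is_weak_cipher(name, bits):
    """Heuristic: short keys or ciphers containing a known-weak algorithm."""
    return bits < 128 or any(marker in name.upper() for marker in WEAK_MARKERS)


def negotiated_cipher(host, port=443):
    """Connect and return (cipher_name, protocol_version, secret_bits)
    for the suite the server actually negotiated."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.cipher()
```

A full assessment would also enumerate every suite the server accepts, not just the preferred one, since weak fallbacks are a common finding.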
During the configuration of a database server, many DB administrators do not consider the importance of securing the DB listener component, which can reveal sensitive data such as configuration settings or running database instances.
Collecting this information can provide useful hints for compromising the confidentiality, integrity, and availability of the stored data.
An accurate security analysis of the DB listener configuration allows this information to be acquired.
Web applications hide some information which is usually not considered during the development or configuration of the application itself.
This data can be discovered in the source code, in log files, or in the default error pages of the web servers, so a correct approach to this topic is fundamental during a security assessment.
By observing the file extensions present on a web server and used by a web application, it is possible to identify the technologies which compose the target application (for example, .jsp and .asp extensions in a server-side architecture) and sometimes the other systems connected to it.
Files on a web server (such as old, backup, and renamed files) that are freely readable and downloadable are a rich source of information, so it is necessary to verify their presence: they often contain parts of the source code, installation paths, and even passwords for applications and/or databases.
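This extension-based fingerprinting can be sketched as a lookup over the URLs collected while browsing the application. The mapping table below is a small illustrative assumption; real fingerprinting uses far larger tables and additional signals.

```python
import posixpath
from urllib.parse import urlparse

# Illustrative extension-to-technology mapping (assumption, not exhaustive).
EXTENSION_TECH = {
    ".jsp": "Java (JSP)",
    ".asp": "Microsoft ASP",
    ".aspx": "Microsoft ASP.NET",
    ".php": "PHP",
    ".pl": "Perl",
}


def infer_technologies(urls):
    """Map file extensions seen in crawled URLs to likely server-side technologies."""
    found = set()
    for url in urls:
        ext = posixpath.splitext(urlparse(url).path)[1].lower()
        if ext in EXTENSION_TECH:
            found.add(EXTENSION_TECH[ext])
    return sorted(found)
```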
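A common way to hunt for such files is to derive backup-style variants from every known path and request each one. The suffix list below is a typical but assumed selection.

```python
def backup_candidates(path):
    """Generate common backup/renamed variants of a known file path,
    e.g. '/index.php' -> '/index.php.bak', '/index.php~', '/index.old', ...
    Any variant that downloads successfully may expose raw source code."""
    stem, dot, _ext = path.rpartition(".")
    variants = [path + suffix for suffix in (".bak", ".old", ".orig", "~")]
    if dot:  # also try replacing the extension itself
        variants += [stem + ".bak", stem + ".old"]
    return variants
```

Because these variants are usually not mapped to a server-side handler, a hit typically returns the file verbatim rather than its executed output.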
OWASP Testing Guide v2