Industry:FTC Protecting Consumer Privacy


Revision as of 10:46, 25 January 2011

Return to Global Industry Committee

Activity Name: FTC Protecting Consumer Privacy
Short Description: Provide a response to the FTC Staff Report "Protecting Consumer Privacy in an Era of Rapid Change - A Framework for Businesses and Policymakers"
Related Projects: None
Email Contacts & Roles: Primary: Colin Watson
Mailing list: Please use the Industry Committee list
  • Review report - in particular issues relating to web application security
  • Where appropriate, draft a response for submission
  • Submit the response as an official OWASP statement
  • 20 Jan 2011 - Complete first draft response
  • 20 Jan 2011 - Circulate draft
  • 28 Jan 2011 - Prepare final version
  • 29 Jan 2011 - Submit to FTC
  • In Progress
Resources: FTC Staff Report "Protecting Consumer Privacy in an Era of Rapid Change - A Framework for Businesses and Policymakers"

FTC Press release, 1 Dec 2010

Response submission using the comment form by 31st January 2011

Submission Response

Latest first

Final version


Draft Text version 2


Draft Text version 1


This official response has been submitted on behalf of the Open Web Application Security Project (OWASP) by the OWASP Global Industry Committee, following our own consultation process.


The OWASP response addresses six of the FTC's questions for comment, which we have labelled a) to f) for our own purposes. The questions responded to relate to aspects within OWASP's mission "to make application security visible, so that people and organizations can make informed decisions about true application security risks". In some cases, questions are not answered explicitly where OWASP does not have an agreed opinion; instead, application-related matters are highlighted which may affect the topic of the question.

a) Companies should promote consumer privacy throughout their organizations and at every stage of the development of their products and services > Incorporate substantive privacy protections > When it is not feasible to update legacy data systems, what administrative or technical procedures should companies follow to mitigate the risks posed by such systems?

i) Although it may be economically infeasible to change legacy systems themselves, these systems are often modified, or access to them is provided through new mechanisms (e.g. a web service, or a mobile application). Such changes and additions should still build privacy in.

ii) For legacy data systems which cannot be altered, the techniques of data minimization, data integrity checking and data tokenization should be considered first. Thereafter, conventional administrative and technical controls (e.g. segregation of duties, the principle of least privilege, etc.) should be applied, but we would like to highlight the following application-specific controls:

  • documentation of security defaults and options that affect security
  • secure configuration of the application and application environment
  • application event log analysis
  • application layer firewalls
  • application surface exposure minimization
  • application layer intrusion detection and prevention
  • data egress monitoring
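As an illustration of the data tokenization technique mentioned above, the following is a minimal sketch in Python. All names are our own, and the in-memory dictionary stands in for a separately secured token vault; the point is only that a legacy system can retain a random surrogate while the mapping to the original value is held elsewhere.

```python
import secrets

# Minimal vault-style tokenization sketch (illustrative only).
# The in-memory dict stands in for a hardened, separately secured vault.
class Tokenizer:
    def __init__(self):
        self._vault = {}  # token -> original value

    def tokenize(self, value):
        # A random surrogate carries no information about the original value
        token = secrets.token_urlsafe(16)
        self._vault[token] = value
        return token

    def detokenize(self, token):
        # Only the vault holder can recover the original value
        return self._vault[token]

tokenizer = Tokenizer()
token = tokenizer.tokenize("4111-1111-1111-1111")
# The legacy data system stores and processes only `token`
```

Because the token is random rather than derived from the value, compromise of the legacy system alone does not expose the consumer data.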

iii) If multiple methods (e.g. desktop application, web site, web service, mobile application, accessible web site) are used to access a business process, all of them should have similar levels of privacy protection built in, so that one channel cannot be used to circumvent another; the same applies to non-application alternatives (e.g. customer call center, walk-in shop, telephone self-service, etc.).

b) Companies should promote consumer privacy throughout their organizations and at every stage of the development of their products and services > Maintain comprehensive data management procedures > How can the full range of stakeholders be given an incentive to develop and deploy privacy-enhancing technologies?

i) Privacy requirements should be built into development and procurement practices. Verification processes must be undertaken to ensure these requirements have been delivered; a risk-based approach should be used so that effort is applied proportionately across the portfolio of processes.

ii) It is not sufficient merely to use PETs; they must be installed, configured and operated correctly. For example, selection of weak algorithms, exposure of keys, or provision of access to decrypted data through an application can all circumvent "encryption as a solution". The use of TLS (SSL) can be undermined by exposing session variables over plain HTTP; SSL can be set up incorrectly (it is not "secure by default"); and many applications (e.g. websites, mobile apps, email clients) make it very hard for users to tell whether SSL is being used correctly, with little consistency in the visual signals presented to users.
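To illustrate the session-variable point above, the following sketch uses Python's standard library (the cookie name and value are illustrative) to mark a session cookie so that it is only ever transmitted over TLS and is hidden from page scripts:

```python
from http.cookies import SimpleCookie

# Illustrative sketch: cookie attributes that stop a session identifier
# leaking over plain HTTP or into page scripts.
cookie = SimpleCookie()
cookie["sessionid"] = "opaque-random-value"  # hypothetical name and value
cookie["sessionid"]["secure"] = True         # transmit only over TLS (HTTPS)
cookie["sessionid"]["httponly"] = True       # not readable via document.cookie
cookie["sessionid"]["path"] = "/"

header = cookie["sessionid"].OutputString()
# `header` is the value for a Set-Cookie response header
```

Omitting the Secure attribute is one common way the protection of TLS is silently undermined, since the browser will then send the identifier over any plain HTTP request to the same site.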

c) Companies should promote consumer privacy throughout their organizations and at every stage of the development of their products and services > Maintain comprehensive data management procedures > What roles should different industry participants – e.g., browser vendors, website operators, advertising companies – play in addressing privacy concerns with more effective technologies for consumer control?

For services delivered over web technologies, besides the operators of the services, the programming languages, code libraries, frameworks, host environment, network and browser can all affect the efficacy of privacy protection. Every part of the supply chain, throughout the software development lifecycle (including operation and disposal), needs to be able to understand the privacy-affecting aspects and effects. There is a need for increased visibility.

d) Companies should simplify consumer choice > Commonly accepted practices > Is the list of proposed “commonly accepted practices” set forth in Section V(C)(1) of the report too broad or too narrow?

i) The commonly accepted practice of "internal operations" may not always be obvious to consumers. The example given is of a website collecting information about visits and click-through rates. Consumers may well expect the organisation they are interacting with to do this, but they are probably not expecting their data to be collected, stored and processed directly by a third party, which is often the case. This is not at all obvious to consumers and is quite different to the physical example of hotels and restaurants collecting customer satisfaction surveys themselves. Apart from web analytics, the inclusion of other third-party hosted code (e.g. code libraries, widgets, syndicated content) has a similar effect.

ii) In the existing practice of "fraud detection", the use of "web server logs" is explicitly mentioned. These are not the only source of event data and, in any case, are not considered sufficient for most data protection purposes. Practices should include the collection, aggregation and analysis of all types of event information (which could contain data about consumers). We would suggest changing the phrase "ordinary web server logs" to "web server, security event, audit, local client and other logs".
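The kind of application-level security event record suggested above can be sketched as follows; the logger name and field names are our own illustration, not a standard:

```python
import json
import logging

# Illustrative sketch: an application-layer security event emitted in a
# structured form, so it can be aggregated with web server and other logs.
logger = logging.getLogger("appsec")
logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.INFO)

def log_security_event(event_type, user_id, outcome, **details):
    record = {"event": event_type, "user": user_id, "outcome": outcome}
    record.update(details)
    line = json.dumps(record)
    logger.info(line)
    return line

log_security_event("authn_failure", "user123", "denied", source_ip="203.0.113.7")
```

A structured record like this captures application-specific context (who, what, outcome) that an ordinary web server access log cannot.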

iii) The list of commonly accepted practices should include misuse detection and prevention. This is quite similar to the existing "fraud prevention" practice already included, but not all suspicious activity and attacks involve deception or personal gain as the intent, and not all unwanted activity is necessarily a crime. The intent may be to alter, delete or steal data, to view unauthorized information, or to prevent access to the service by others. Included in this should be process verification activities (e.g. testing and audit), which are not malicious but do form part of detection and prevention.

iv) The list of commonly accepted practices should include data required for session management purposes. Many applications will not work at all unless session management data are allowed. These data are, of course, still subject to privacy controls, limitations on use, etc.

e) Companies should increase the transparency of their data practices > Reasonable access to consumer data > Should companies inform consumers of the identity of those with whom the company has shared data about the consumer, as well as the source of the data?

i) With applications, data sharing can occur at the same time the data are provided, not necessarily subsequently; for example, when code from a third party is included within a web page where the user is identifiable. In such cases, it may make sense to inform the consumer in advance.

ii) Consumers may not expect some data practices and uses within organizations, e.g. intra-departmental data sharing, such as the use of real consumer data in development and test environments.

f) Companies should increase the transparency of their data practices > Reasonable access to consumer data > Should consumers receive notice when data about them has been used to deny them benefits? How should such notice be provided? What are the costs and benefits of providing such notice?

Denial of benefits may occur when a user is logged out, locked out or otherwise prevented from accessing an application (e.g. by IP address blacklisting). In some cases a particular consumer may be known (e.g. by the relationship to a particular user account name), or may be known only through some other identifier (e.g. a network address), from which it may not always be possible to identify an individual consumer at the time. Notices in these cases should include messages given by the application electronically.


to be added in final draft

Return to Global Industry Committee