
Software Quality Assurance

From OWASP
Revision as of 12:03, 28 August 2008 by KirstenS (talk | contribs)

Development Guide Table of Contents

Objective

The goal of software quality assurance is to confirm that the confidentiality and integrity of private user data are protected as the data is handled, stored, and transmitted. QA testing should also confirm that the application cannot be hacked, broken, commandeered, overloaded, or blocked by denial-of-service attacks, within acceptable risk levels. This implies that the acceptable risk levels and threat modeling scenarios are established up front, so that developers and QA engineers know what to expect and what to work towards.

Platforms Affected

All

Best practices

  • Leverage available resources such as the OWASP Top Ten list, CLASP, the policy compliance frameworks described in Chapter 5, and the threat modeling processes described in Chapter 7. These processes will help identify design parameters, establish measurable goals, and ensure that security testing proceeds in a systematic, thorough, and quantified fashion.
  • Effective software quality assurance involves three complementary factors: Process, Metrics, and Automation.
  • Plan to test and quantify application security behavior during the QA process, just like any other system functionality.
  • Include the following considerations in your test plans:
    1. The policy compliance framework requirements
    2. Overviews of security testing methods, tools, training, and resource allocations
    3. The operating budget and schedule considerations
    4. The preferred vulnerability scoring system (CVSS, OVAL, etc.) and defect management/tracking system (Bugzilla, a third-party vulnerability management package or service, etc.)
    5. The metrics that will be collected to facilitate decision making (for example, the count of open defects by severity and category, the arrival count over time, the close rate, and total testing coverage)
    6. The testing activities that are candidates for automation, and how the automation will be done
  • Have a set of QA entry criteria, which identifies the items necessary to begin testing:
    1. Policy compliance validation requirements
    2. The applicable threat modeling scenarios
    3. The testing schedule, resource list, and budget
    4. The metric and vulnerability scoring system selections
    5. An organizationally meaningful certification, which shows the QA team participated in design reviews and was satisfied with the security parameters of the system.
    6. The completed test plans
  • The QA exit criteria should include proof of application security integrity and readiness including:
    1. A summary report with charts, which summarize the collected metrics.
    2. A security testing report, which describes how well the application performed, compared to the policy compliance requirements and threat modeling scenarios, and its readiness compared to the established security baselines.
    3. No outstanding high-severity security defects (for example, a simple list showing that all severity 1 security bugs have been resolved and verified).
    4. An assessment which uses metrics to show that application security meets or exceeds established baselines, and that all security-related design goals have been met (that is, proof that the job is well done).

Note: Ideally, the reports should present information visually via charts, graphs, and similar techniques, so the numbers are easy to comprehend and facilitate the decision-making process.
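The decision-making metrics named above (open defect counts by severity and category, close rate) can be computed mechanically from the defect tracker's records. A minimal sketch, using hypothetical defect records rather than any particular tracker's export format:

```python
from collections import Counter
from datetime import date

# Hypothetical defect records: (id, severity, category, opened, closed).
# A real QA team would pull these from Bugzilla or another tracker.
defects = [
    ("D-1", 1, "injection", date(2008, 8, 1), date(2008, 8, 5)),
    ("D-2", 2, "xss",       date(2008, 8, 2), None),
    ("D-3", 1, "auth",      date(2008, 8, 3), None),
    ("D-4", 3, "xss",       date(2008, 8, 4), date(2008, 8, 10)),
]

def open_by_severity(defects):
    """Count defects that have not yet been closed, grouped by severity."""
    return Counter(sev for _, sev, _, _, closed in defects if closed is None)

def close_rate(defects):
    """Fraction of all reported defects that have been resolved."""
    closed = sum(1 for d in defects if d[4] is not None)
    return closed / len(defects)

print(open_by_severity(defects))
print(close_rate(defects))
```

Once these functions run against live tracker data, the summary charts the note recommends become a matter of plotting the returned counts over time.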

Process

Description

Utilize the test planning, test results, and metrics data to demonstrate quantitatively that application security meets or exceeds the policy compliance and risk assessment goals.

How to identify if you are vulnerable

The presence of a working process produces an operating culture with certain distinguishing characteristics. Make sure you can see some or all of these operating in yours.

For example,

  • The development team members are routinely updated on secure coding practices.
  • Design reviews incorporate and encourage security considerations.
  • The QA process includes planning and testing time for security assessments, instead of covering them as an afterthought or in an ad-hoc fashion.
  • Security-related bugs are specifically tracked and have an established escalation policy.

How to protect yourself

  • Make “Security” an operating word in the engineering team’s vocabulary. Encourage training opportunities, discussions, coding examples, and on-going interest.
  • Select and employ a vulnerability scoring system, such as CVSS, OVAL, or the like. At a minimum, make sure that security-related defects have a dedicated tracking method or tag.
  • Make sure the question of “Got security?” comes up during design reviews.
  • Establish a working escalation procedure for security-related defects.
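An escalation procedure for security-related defects can be as simple as a routing rule keyed on severity. A minimal sketch; the routing targets and thresholds below are illustrative assumptions, not prescribed by this guide:

```python
# Hypothetical escalation rule: route security defects by severity.
# Severity 1 -> immediate notification of the security lead; severity 2 ->
# the next triage meeting; everything else -> the security backlog.
def escalation_target(severity: int, is_security: bool) -> str:
    if not is_security:
        return "backlog"           # non-security defects follow the normal flow
    if severity == 1:
        return "page-security-lead"
    if severity == 2:
        return "next-triage-meeting"
    return "security-backlog"

print(escalation_target(1, True))   # page-security-lead
print(escalation_target(3, False))  # backlog
```

The point is not the code itself but that the escalation path is written down and mechanical, so no severity-1 security bug can sit unnoticed in a general queue.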

Metrics

Description

The QA group will identify, select, and employ meaningful metrics to provide a baseline measurement of application security. This baseline will also serve as a comparison point for future assessments.

How to identify if you are vulnerable

A good system of metrics provides a basis for the following:

  • Summary charts, showing the security-related bug counts over time, their open and closure rates, and the progress towards policy compliance and risk assessment goals.
  • The numbers necessary to answer management’s questions about “How secure is the application?” or “Is risk increasing or decreasing over time?”
  • A known security defect density (that is, the average number of security bugs per unit of code is being monitored and the rate is going in the right direction: Down!)
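Security defect density, as described above, is simply the number of security bugs per unit of code, tracked across releases so the trend is visible. A minimal sketch using KLOC (thousand lines of code) as the unit; the release figures are invented for illustration:

```python
def security_defect_density(defect_count: int, kloc: float) -> float:
    """Security defects per thousand lines of code (KLOC)."""
    return defect_count / kloc

# Hypothetical (defect_count, kloc) pairs for three successive releases.
releases = [(12, 40.0), (9, 42.5), (5, 45.0)]
history = [security_defect_density(n, kloc) for n, kloc in releases]

# The density should move in the right direction: down.
trending_down = all(a > b for a, b in zip(history, history[1:]))
print(history, trending_down)
```

A decreasing density even as the codebase grows is exactly the signal this metric is meant to surface.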

How to protect yourself

  • Establish a working set of metrics. For example, count the number of high, medium, and low severity security bugs as a start. Follow with rate assessments, which will answer questions like, “How fast are security-related bugs being discovered in QA testing?”, “How severe are the bugs that are being detected?”, and “How complete is the testing coverage for the areas prioritized by our policy compliance or risk assessment goals?”
  • Track that all security-related tests have been run (a simple spreadsheet will do).
  • Automate the calculation and charting of the metrics where possible, so accurate information is available on demand, even in a dashboard summary fashion.
  • Make sure all high-priority security bugs are fixed and regression-checked, prior to software release.
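The last point above can be enforced as a release gate: the build is not ready until every high-severity security bug is both resolved and regression-verified. A minimal sketch, with a hypothetical record shape rather than any particular tracker's schema:

```python
# Hypothetical release gate: block the release if any severity-1 security
# bug is unresolved or has not passed regression verification.
def release_ready(bugs):
    """bugs: list of dicts with 'severity', 'resolved', 'regression_ok' keys."""
    return all(b["resolved"] and b["regression_ok"]
               for b in bugs
               if b["severity"] == 1)

bugs = [
    {"id": "S-7", "severity": 1, "resolved": True,  "regression_ok": True},
    {"id": "S-9", "severity": 2, "resolved": False, "regression_ok": False},
]
print(release_ready(bugs))  # True: no severity-1 bug blocks this release
```

Wiring such a check into the build or release pipeline turns the exit criterion "no outstanding high-severity security defects" from a manual checklist item into an automatic stop.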

Testing Activities

Description

How to identify if you are vulnerable

Not every QA team will employ all of the following testing activities, but the more you employ strategically, the better your security assurance will be:

  • Cross-site scripting and SQL injection tests have been run.
  • The application's handling of user input, including special or multibyte characters, excessively long strings, null inputs, and invalid values, has been assessed.
  • Cookie or credentials manipulation testing has been performed.
  • Denial-of-service scenarios have been checked, and it is understood how the application will perform in the presence of connection, login, or transaction flooding.

How to protect yourself

  • Run user agent injection tests (cross-site scripting, SQL query injections, data manipulation checks).
  • Check how the application handles user input that is ill-formed, too short or too long, or that contains special or multibyte characters.
  • Check how sensitive the application is to cookie manipulation or session tampering.
  • Verify the application’s behavior under load. For example, what happens if 1,000 users log in simultaneously? Or if a flood of TCP connection attempts arrives but the handshakes are never completed (a SYN flood)?
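The input-handling checks above amount to probing a validator with the classic trouble spots: null input, empty and oversized strings, special characters, and multibyte text. A minimal sketch; `validate_username` is a hypothetical stand-in, and a real QA suite would exercise the application's own input handling instead:

```python
# Hypothetical validator under test: accepts 1-64 alphanumeric characters
# plus a small set of punctuation, and rejects everything else.
def validate_username(value):
    if value is None:
        return False
    if not (1 <= len(value) <= 64):
        return False
    return all(c.isalnum() or c in "._-" for c in value)

# Probe the classic trouble spots named in the checklist above.
probes = {
    "null": None,                          # null input
    "empty": "",                           # too short
    "too_long": "a" * 10_000,              # excessively long string
    "special": "'; DROP TABLE users; --",  # special characters / injection
    "multibyte": "ユーザー名",              # multibyte text (valid here)
}
results = {name: validate_username(v) for name, v in probes.items()}
print(results)
```

Note that the multibyte probe is accepted: rejecting it would be a functional bug, not a security win, which is why these probes test both directions.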

Links

Guide Table of Contents