
Types of application security metrics

Revision as of 01:22, 31 May 2009 by Deleted user (talk | contribs)



Metrics Overview


It's been said that you can't improve what you can't measure. We currently don't have any good metrics for application security. Everyone understands what we want to measure: how secure is it? But we're really not sure which low-level measurements we should be making, nor do we know how to roll them up into something meaningful for the buyer or user of software.

The difficulty of this problem is essentially the same as determining if there are any loopholes in a legal contract. Like legalese, programming languages are arbitrarily complex. A malicious developer, like a crafty lawyer, will use all their skill to obfuscate their attack.

Direct Metrics

Ideally, we could just measure the software itself. If we could count all the vulnerabilities and determine their likelihood and impact, we'd know how secure it is. Unfortunately, even the best static analysis tools can't come close to doing this. Still, there are things we can measure, and perhaps we can figure out which of these things directly correlate with increased security.

  • How many lines of code?
  • What languages are used?
  • What libraries does this application use (and how)?
  • What type of network access is required (client, server, none)?
  • What security mechanisms are used?
  • What configuration files are associated with the application?
  • How are sensitive assets protected?
  • What vulnerabilities have been identified?
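Some of the direct metrics above can be gathered mechanically from a source tree. The sketch below is illustrative only (the function name and the use of file extensions as a language proxy are assumptions, not an OWASP-defined measurement):

```python
import os
from collections import Counter

def direct_metrics(root):
    """Gather crude direct metrics for a source tree: total line count
    and a per-extension file tally (a rough proxy for languages used)."""
    total_lines = 0
    languages = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1] or "(none)"
            languages[ext] += 1
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="replace") as f:
                    total_lines += sum(1 for _ in f)
            except OSError:
                continue  # unreadable file: skip rather than abort the scan
    return {"total_lines": total_lines, "languages": dict(languages)}
```

Counts like these are easy to collect, but the hard part remains showing which of them actually correlate with security.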

Indirect Metrics

If you can't measure the security of software directly, another option is to measure the people, process, and technology associated with creating the software in the first place.

  • Is there security documentation (design, test results, vulnerabilities)?
  • Is the documentation accurate and complete?
  • Is there a process for reporting security flaws?
  • Who developed this code (training, experience, background check)?
  • What assurance activities were performed (threat modeling, analysis, code review, test, evaluation)?
  • What was the outcome of those assurance activities?
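In practice, questions like these are often rolled up into a weighted checklist. A minimal scoring sketch, where the question texts and weights are purely hypothetical and not a standardized scheme:

```python
# Hypothetical weighted checklist for indirect metrics; the questions
# and weights below are illustrative only, not an OWASP standard.
INDIRECT_CHECKS = [
    ("security documentation exists", 1),
    ("documentation is accurate and complete", 2),
    ("flaw-reporting process exists", 2),
    ("developers have security training", 2),
    ("threat modeling performed", 3),
    ("code review performed", 3),
]

def indirect_score(answers):
    """Return (achieved, possible) points for yes/no answers keyed by question."""
    possible = sum(weight for _question, weight in INDIRECT_CHECKS)
    achieved = sum(weight for question, weight in INDIRECT_CHECKS
                   if answers.get(question))
    return achieved, possible

achieved, possible = indirect_score({
    "security documentation exists": True,
    "threat modeling performed": True,
})
print(f"{achieved}/{possible}")  # prints 4/13
```

The weighting is where the real judgment lives: a checklist score is only as meaningful as the evidence behind each weight.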