Revision as of 12:15, 28 May 2014

Proposal : Project Reviews Quality Assurance approach

Background

OWASP has a project inventory of more than 150 projects. Many of these projects are in the Incubator phase, and a selected group of approximately 30 projects are in the Lab or Flagship phase. Keeping track of and verifying activity levels has been a challenging task for the organization. In February 2013, a new initiative to set up a Board of Project Advisers delivered an assessment methodology using a set of criteria, ready by the end of September 2013. Unfortunately, very few volunteers have the time and true commitment to actually do reviews. Another problem is that reviewing these projects at a quality level requires a proper understanding of the technologies involved, so not everyone can execute a review, let alone an unbiased one.

Proposal approach

Code and Tool projects are software products produced by the efforts of volunteers. Incubators are a sandbox for innovation and experimentation, but once a project has reached a maturity level such as Flagship, the project becomes an integral part of the OWASP image. The main purpose of this project is to determine the quality of the current flagship projects and to set a standard procedure for measuring the quality of projects that want or claim to have "Flagship" status.

The current review process does not deliver a realistic view of a project's real status, and it cannot keep proper track of activity level or quality.

Quantitative & Quality Assurance approach

Measuring activity level is a time-consuming task, therefore it is imperative to automate this process. With the help of a new tool proposed by Enrico Branca, this could be achieved: http://www.pythonsecurity.org/stats

In the past we used Ohloh; however, the information in Ohloh is not accurate (I have tested this using my own project's repository).

Some of the measurements that will be considered to gauge activity are:

  • Number of commits
  • Number of contributors
  • Ratings and reviews by the community
  • Code data, such as the language distribution used in the project
  • Kudos for project leaders
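As a rough illustration of how these metrics could be computed once commit records are exported from a project's repository, here is a minimal sketch. The function and the `(author, language)` record layout are hypothetical, for illustration only; in practice the records would come from the repository history (e.g. `git log`) or a stats service like the one linked above.

```python
from collections import Counter

def activity_metrics(commits):
    """Summarize activity from (author, language) commit records."""
    authors = Counter(author for author, _ in commits)
    languages = Counter(lang for _, lang in commits)
    total = len(commits)
    return {
        "commits": total,
        "contributors": len(authors),
        "language_distribution": {lang: n / total for lang, n in languages.items()},
    }

# Hypothetical sample history: 4 commits by 3 contributors.
sample = [("alice", "Java"), ("alice", "Java"), ("bob", "XML"), ("carol", "Java")]
print(activity_metrics(sample))
```

Because the summary is pure data, the same function works whether the records come from Git, SVN, or an external stats tool.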

[Screenshot: Ohloh project metrics]

In order to verify the quality of projects, but also to support flagship projects in this process, this proposal advises using a Quality Assurance approach. This means that projects that are candidates to become flagship will be tested at a functional level using known methodologies such as Tmap.

Candidate Flagship projects (Tools & Code)

A candidate flagship project should fulfill the following criteria:

Excluding the following:

  • Industry participation ==> very difficult to measure
  • Usability ==> will be replaced by functional testing
  • Creation and quality of training/tutorial material for the project ==> quality is also difficult to measure, so we should at least require the existence of materials as a criterion

And including the following:

  • A measurable activity level (using the Python-based statistics tool)
  • At least one contributor, excluding the project leader


Focusing on Quality instead of popularity

The Ohloh system might be vulnerable to “popularity” votes instead of quality. In order to measure quality at a functional level, each candidate flagship project will be functionally tested using Tmap as the methodology. For this part, OWASP will hire one or two testers to work part-time on this project, supported by a group of volunteers who will help test.

PLAN

Pilot project

This plan will be executed as a pilot project using the current flagship projects. As mentioned before, this will let us determine whether these projects indeed have the production quality needed to be categorized as "Flagship".

Candidate projects to be tested for pilot project:

Code

  • OWASP AntiSamy Project
  • OWASP Enterprise Security API
  • OWASP ModSecurity Core Rule Set Project
  • OWASP CSRFGuard Project

Tools

  • OWASP Web Testing Environment Project
  • OWASP WebGoat Project
  • OWASP Zed Attack Proxy


Budget Pilot project

Approach to Tools and Code project

1st Phase - Focus on Functional Testing. Assuming we agree on Tmap as the testing methodology and Scrum as the working methodology, we will apply Tmap but concentrate on creating test cases.

Therefore we will:

  • Create test cases for these projects. This plan focuses on creating specific test cases for the main and most important functionality described by each project, using the Scrum methodology in a system like JIRA.
  • Aim for at least 70 to 85% coverage of all main functionality for each project. Project leaders will have access to the system to give feedback during the testing period. All test results are logged in JIRA, and automated reports will be published for the community to follow and comment on.
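As a minimal sketch of the coverage target above, coverage can be tracked as the fraction of a project's main features that have at least one test case. The feature names below are hypothetical, loosely modeled on a ZAP-like tool, purely for illustration.

```python
def functional_coverage(main_features, features_with_tests):
    """Percentage of main functionality covered by at least one test case."""
    covered = [f for f in main_features if f in features_with_tests]
    return 100.0 * len(covered) / len(main_features)

# Hypothetical feature list: 4 of 5 main features have test cases.
features = ["proxy", "spider", "active scan", "report", "fuzzer"]
tested = {"proxy", "spider", "active scan", "report"}
print(f"{functional_coverage(features, tested):.0f}% coverage")  # prints "80% coverage"
```

Keeping the feature list in the tracking system (e.g. as JIRA components) would let this number be recomputed automatically after each test run.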

Hiring of Testers

The test team will work on a contract or hourly basis. Ideally we want to keep working with a team that knows the applications and can easily regression-test them anytime we need. Volunteers can also join us to test at any time. We can look for university students seeking an internship at OWASP.

Testing environment

Some tools require a very specific testing environment, including hacking/attacking a website; for example, OWASP ZAP is installed and its functionality is tested against WebGoat in the test environment. A VM containing all the projects, installed and ready for testing, will be configured.

Approach for Reviewing OWASP Code & Tool projects

Reviewing code projects is more complex because it requires the testers to know the programming language; for this part, testers must know and understand the code. The main focus will be to follow the instructions as described by each code project and to test how well the pieces integrate as claimed. Code reviews will also be needed. If we set up a testing environment, the testers' main focus is on testing functionality rather than configuring the project; however, in the case of a code project, configuration is itself a test. The Project Review Task Force will monitor this part.

Approach for Reviewing Document projects

It is difficult to quantify how well written and accurate a document is unless the reviewer has a broad body of knowledge on the subject, so finding the right reviewer is more challenging: they will have to spend time reading and creating a report. An alternative might be to hire a freelance technical editor/writer who can assess the document from a documentation and editing point of view, judging how well written and consistent it is. This is more subjective, which is why we may be better off leaving documentation projects to the rating system.

Sustainability & Automation

The initial work will be to set up the entire environment and infrastructure, but once that is done, it is a question of maintenance. The idea is that test cases are continuously reused; we can create automated test cases with Selenium (for web apps) and code review tools, and use automated tools such as Ranorex.
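To make the reuse concrete: once a test case is written, it can be re-run unchanged at every review cycle. A minimal, hypothetical example of such a reusable automated test case is sketched below; `sanitize` is a stand-in for a project function under test (e.g. an AntiSamy-style filter), and a Selenium- or Ranorex-driven case would follow the same write-once, re-run pattern.

```python
def sanitize(html):
    # Hypothetical stand-in for a project function under test,
    # e.g. an AntiSamy-style filter that strips <script> tags.
    return html.replace("<script>", "").replace("</script>", "")

def test_strips_script_tags():
    # A reusable regression test: re-run unchanged at every review cycle.
    assert sanitize("hi<script>evil()</script>") == "hievil()"

test_strips_script_tags()
print("regression test passed")
```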

Time

First, the following must be approved and decided:

  • Testers ==> obtain a quotation for 2, ideally 3, testers. They must be able to configure tools and code projects.
  • Buying a JIRA account as the platform. So far this is one of the best I have worked with. It is in the cloud and very cheap for a low number of users; the cloud SaaS costs USD 15/month (see https://www.atlassian.com/software/jira/pricing).
  • Automated tools such as Ranorex, which can automate tests for multiple kinds of apps (web or smart), cost EUR 3500 and are among the best for all types of apps.
  • A big cloud account for the VM (Dropbox with 100GB will be fine); maybe OWASP already has something like this.

Once this is decided, here is a rough planning:

  • Setting up and installing the OWASP projects on the VM and setting up JIRA accounts ==> +/- 5 days
  • Creating test cases and starting testing, including the creation of automated test cases ==> 3 to 5 days on average per project
  • Results are logged into the system, so it is easy to create reports from there and follow up with project leaders and the tester team. Reports will be published on the OWASP website.

Final Score Rating

The project should score at least 75% on the QA tests in order to become flagship. These tests will be executed every 6 months to verify progress and help the project leader achieve better quality.
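Under these assumptions, the pass/fail decision reduces to a simple threshold check against the 75% figure above (the function and sample numbers here are illustrative, not part of the proposal):

```python
FLAGSHIP_THRESHOLD = 75.0  # minimum QA score from the proposal above

def flagship_review(tests_passed, tests_total):
    """Return the QA score and whether it meets the flagship bar."""
    score = 100.0 * tests_passed / tests_total
    return score, score >= FLAGSHIP_THRESHOLD

score, qualifies = flagship_review(tests_passed=39, tests_total=50)
print(f"score={score:.0f}% flagship={qualifies}")  # prints "score=78% flagship=True"
```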

So I’m a flagship, what’s next?

Flagship is a status that OWASP as an organization should maintain and support; its reputation also depends on how well these projects endure and progress. The major objective is that a flagship project should get a package that helps sustain it at many levels. For example:

  • Marketing campaigns
  • Paying for traveling cost to OWASP conferences
  • Getting sponsors
  • A budget for side activities