Proposal: Project Reviews Quality Assurance Approach
Background
OWASP has a project inventory of more than 150 projects. Many of these projects are in the Incubator phase, and a selected group of approximately 30 projects is part of the Lab or Flagship phase. Keeping track of projects and verifying their activity level has been a challenging task for the organization. In February 2013, a new initiative to set up a Board of Project Advisers delivered an assessment methodology using a set of criteria, which was ready by the end of September 2013. Unfortunately, very few volunteers have the time and true commitment to actually do reviews. Another problem is that reviewing these projects at a quality level requires a proper understanding of the technologies involved, so not everyone can actually execute a review, let alone an unbiased one.
Proposed approach
Code and Tool projects are software products produced by the efforts of volunteers. Incubators are a sandbox for innovation and experimentation, but once a project has reached a maturity level such as Flagship, the project becomes an integral part of OWASP's image. The main purpose of this project is to determine the quality of the current flagship projects and to set a standard procedure for measuring the quality of projects that want, or claim to deserve, "Flagship" status.
The current review process does not deliver a realistic view of a project's real status, and it is not able to keep proper track of activity level or quality.
Quantitative & Quality Assurance approach
Measuring activity level is a time-consuming task, so it is imperative to automate this process. With the help of a new tool proposed by Enrico Branca, this could be achieved: http://www.pythonsecurity.org/stats
Some of the metrics that will be considered to measure activity are:
- Number of commits
- Number of contributors
- Ratings and reviews by the community
- Code data, such as the distribution of languages used in the project
- Kudos for project leaders
We are using Ohloh to gather activity data and reviews.
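
As an illustration of what the automated measurement could look like, here is a minimal Python sketch that gathers two of the metrics above from a local Git clone; the repository path and time window are hypothetical, and this is not the tool referenced above:

# Minimal sketch: gather basic activity metrics from a local Git clone.
# The repository path and the time window are placeholders.
import subprocess

def activity_metrics(repo_path, since="1 year ago"):
    def git(*args):
        return subprocess.check_output(
            ["git", "-C", repo_path, *args], text=True
        )

    # Count commits in the window, then count distinct authors.
    commits = int(git("rev-list", "--count", f"--since={since}", "HEAD").strip())
    contributors = len(git("shortlog", "-sn", f"--since={since}", "HEAD").splitlines())
    return {"commits": commits, "contributors": contributors}

print(activity_metrics("/path/to/owasp-project"))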
Code Analysis and Continuous Assurance using SWAMP
Integrating this project with the SWAMP initiative would be of great benefit to OWASP. The pilot program will indeed integrate this continuous-assurance process for long-term automated testing and further development of OWASP tools.
More info: https://www.owasp.org/index.php/SWAMP_OWASP
Qualitative Approach
In order to verify the quality of projects, but also to support flagship projects in this process, this proposal advises using a Quality Assurance approach. This means that projects that are candidates to become flagship will be tested at a functional level using known methodologies such as TMap.
Candidate Flagship projects (Tools & Code)
A candidate flagship project should fulfill the following criteria:
- A solid health activity record, as described in the Project Health Criteria: https://www.owasp.org/index.php/Assessing_Project_Health
Excluding the following:
- Industry participation ==> very difficult to measure
- Usability ==> will be replaced by functional testing
- Creation and quality of training/tutorial material for the project ==> quality is again difficult to measure, so the criterion should simply be that such materials exist
- At least one contributor other than the project leader
Focusing on Quality instead of popularity
The Ohloh system might be vulnerable to "popularity" votes rather than reflecting quality. In order to measure quality at a functional level, each candidate flagship project will be functionally tested using TMap as the methodology. For this part, OWASP will hire one or two testers to work part-time on this project, supported by a group of volunteers who help test.
Plan
This is a rough plan for testing the projects: File:PilotProjectQA-approachPlan.pdf
Pilot project
This plan will be executed as a pilot project using the current flagship projects. As mentioned before, this lets us determine whether these projects indeed have the production quality necessary to be categorized as "Flagship".
Candidate projects to be tested for pilot project:
Code
- OWASP AntiSamy Project
- OWASP Enterprise Security API
- OWASP ModSecurity Core Rule Set Project
- OWASP CSRFGuard Project
Tools
- OWASP Web Testing Environment Project
- OWASP WebGoat Project
- OWASP Zed Attack Proxy
Budget and Planning for the Pilot Project
File:PilotProjectQA-approachPlan.pdf
Resources
- Jira account has been donated by Atlassian (thanks to Norman Yue)
- Ranorex tool (1 license) ==> USD 2,706.40
- 2 virtual servers (1 Linux, 1 Windows) from Leaseweb, USD 130 per server for a 6-month period ==> USD 262,-
- 1 tester @ USD 25/hour, maximum 160 hours ==> USD 4,000,-

Total: USD 6,968.40
Approach to Tools and Code Projects
1st phase: the focus is on functional testing. Assuming we agree on using TMap as the testing methodology and Scrum as the working methodology, we will apply a known QA methodology such as TMap, but we will focus on creating test cases.
We will therefore continue to:
- Create test cases for these projects. This plan focuses on creating specific test cases for the main and most important functionality described by each project, using the Scrum methodology in a system like JIRA.
- Aim for at least 70 to 85% coverage of all main functionality for each project. Project leaders will have access to the system to give feedback during the testing period. All test results are logged in JIRA, and automated reports will be published for the community to follow and comment on (see the sketch below).
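
To make the JIRA logging step concrete, here is a hedged Python sketch that files a test result through JIRA's standard REST API; the server URL, project key, and credentials are placeholders, not the actual instance:

# Sketch: log a failed test case in JIRA via its REST API (v2).
# The URL, project key, and credentials below are hypothetical.
import requests

JIRA_URL = "https://owasp-qa.example.net"

def log_test_result(summary, description, project_key="QA"):
    issue = {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Bug"},
        }
    }
    resp = requests.post(
        f"{JIRA_URL}/rest/api/2/issue",
        json=issue,
        auth=("qa-bot", "api-token"),  # placeholder credentials
    )
    resp.raise_for_status()
    return resp.json()["key"]  # e.g. "QA-123"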
Hiring of Testers
The test team will work on a contract or per-hour basis. Ideally, we want to keep working with a team that knows the applications and can easily regression-test them whenever we need. Volunteers can also join us to test at any time, and we can look for university students seeking an internship at OWASP.
Testing environment
Some tools require a very specific testing environment, including a website to hack/attack; for example, OWASP ZAP is installed and its functionality is tested against WebGoat in the test environment. A VM with all the projects installed and ready for testing will be configured.
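
As a sketch of how such a check could be automated, the following uses the official ZAP Python client (python-owasp-zap-v2.4) to spider a WebGoat instance and list any alerts; the host name, proxy port, and API key are assumptions for this environment:

# Sketch: drive OWASP ZAP against a WebGoat instance on the test VM.
# Host, port, and API key are hypothetical placeholders.
import time
from zapv2 import ZAPv2

target = "http://testvm:8080/WebGoat"  # assumed WebGoat location
zap = ZAPv2(apikey="changeme",
            proxies={"http": "http://127.0.0.1:8090",
                     "https": "http://127.0.0.1:8090"})

scan_id = zap.spider.scan(target)             # crawl the application
while int(zap.spider.status(scan_id)) < 100:  # wait for the spider
    time.sleep(2)

# Print any alerts ZAP raised for the target.
for alert in zap.core.alerts(baseurl=target):
    print(alert["risk"], "-", alert["alert"])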
Approach for Reviewing OWASP Code & Tool projects
Reviewing code projects is more complex because it requires the testers to know the programming language; for this part, testers must know and understand the code. The main focus will be to follow the instructions described by each code project and to test whether its components integrate as well as they claim to. Here again we need code reviews; if we set up a testing environment, the testers' main focus is on testing functionality rather than on configuring the project, although for a code project the configuration is itself a test. The Project Review Task Force will monitor this part.
Approach for Reviewing Document projects
It is difficult to quantify how well written and accurate a document is unless the reviewer has a broad body of knowledge on the subject, so finding the right reviewer is more challenging. The reviewer will have to spend considerable time reading and writing a report. An alternative might be to hire a freelance technical editor/writer who can assess the document from a documentation and editing point of view: how well written and consistent it is. This is more subjective, which is why I think we had better leave documentation to the rating system; hiring a technical writer/editor to provide an opinion remains an option, but it is subjective.
Sustainability & Automation
The initial work will be to set up the entire environment and infrastructure, but once that is done, it is a question of maintenance. The idea is that test cases are continuously reused, and that we create automated test cases with Selenium (for web apps) and code-review tools, and use automation tools such as Ranorex.
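
As an example of what a reusable automated test case could look like, here is a minimal Selenium sketch for one of the web apps; the WebGoat URL, expected title, and form field names are assumptions for illustration:

# Sketch: a reusable automated functional test case with Selenium.
# The URL, page title, and field names below are assumptions.
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_login_page_loads():
    driver = webdriver.Firefox()
    try:
        driver.get("http://testvm:8080/WebGoat/login")  # assumed URL
        assert "WebGoat" in driver.title
        # find_element raises NoSuchElementException if a field is
        # missing, which fails the test case.
        driver.find_element(By.NAME, "username")
        driver.find_element(By.NAME, "password")
    finally:
        driver.quit()

test_login_page_loads()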
Time
First, the following must be approved and decided:
- Testers ==> obtain a quotation for 2, ideally 3, testers. They must be able to configure both tools and code projects.
- Buying a JIRA account as the platform. So far this is one of the best systems I have worked with; the cloud (SaaS) version is very cheap for a low number of users, at USD 15/month (see https://www.atlassian.com/software/jira/pricing).
- An automation tool such as Ranorex, which can automate tests for multiple kinds of apps (web or mobile), costs EUR 3,500,- and is one of the best for all types of apps.
- A large cloud storage account for the VM image (Dropbox with 100 GB will be fine); OWASP may already have something like this.
Once this is decided, the rough planning is:
- Setting up and installing the OWASP projects on the VM, and setting up the JIRA accounts ==> +/- 5 days
- Creating test cases and starting testing, including the creation of automated test cases ==> 3 to 5 days on average per project
- Results are logged into the system, so it is easy to create reports from there and follow up with project leaders and the test team. Reports will be published on the OWASP website.
Final Score Rating
A project should pass at least 75% of the QA tests in order to become flagship. These tests will be executed every 6 months to verify quality and to help the project leader achieve better quality.
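
For clarity, the pass criterion amounts to a simple ratio check, sketched here with hypothetical numbers:

# Sketch: a project qualifies if at least 75% of its QA tests pass.
def qualifies_for_flagship(passed, total, threshold=0.75):
    return total > 0 and passed / total >= threshold

print(qualifies_for_flagship(passed=30, total=40))  # True: exactly 75%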
So I’m a flagship, what’s next?
Flagship is a status that OWASP as an organization should maintain and support; its reputation also depends on how well these projects endure and progress. The major objective is that a flagship project gets a support package that helps it sustain itself at many levels. For example:
- Marketing campaigns
- Paying travel costs to OWASP conferences
- Getting sponsors
- A budget for side activities