
Community Engagement Results QA Testing 2014


Revision as of 18:10, 6 September 2015

Origin of Activity

In 2014, the board decided to downgrade all the flagship projects at the time to LAB status: https://groups.google.com/forum/#!topic/owasp-projects-task-force/X2b9J1eSC5E

(an OWASP account is needed to view this): https://docs.google.com/a/owasp.org/document/d/1KGwq6dT5LWfRPUfmSLD-ZPxGJcWFcoOrvgPOjKFFVY8/edit?usp=sharing

Action taken

A budget was provided to thoroughly test all demoted projects (from Flagship to LAB) after the board decided that many projects holding Flagship status were lacking the quality they once had. There were also projects at LAB status that had not yet been reviewed, and these could potentially become Flagship.

Proposal for testing

A proposal was submitted for testing the projects using clear criteria and a QA approach: https://www.owasp.org/index.php/Proposal_Project_Review_QA_Approach

Budget used

A budget of USD 7,000 was set aside for this purpose; however, only about half was used, thanks to donations. Original proposal: https://www.owasp.org/index.php/Proposal_Project_Review_QA_Approach

Community fund allocation as it appears today (I would like to see this corrected): https://www.owasp.org/index.php/Community_Engagement_-_Payments#2015_Community_Engagement_Allocations.2FPayments

Proposed Budget

  • Jira account donated by Atlassian (thanks to Norman Yue)
  • Ranorex Tool, 1 license ==> USD 2,706.40 (donated by Ranorex)
  • 2 virtual servers (1 Linux, 1 Windows) from Leaseweb, USD 130/server for a 6-month period ==> USD 262
  • 1 tester @ USD 25/hour, maximum 160 hours ==> USD 4,000

Total budget proposed: USD 6,968.40

Actual budget used

JIRA and Ranorex were sponsored with licenses (no cost). We hired one tester for only about half the time we budgeted (per the attached breakdown of hours, we used only 98 hours). Attached record of the tester's hours spent testing project builds: https://docs.google.com/a/owasp.org/spreadsheets/d/1_WRxMKrjbVLfctcQcg6oxAc8fymcqOfb7m3tVrB4Hsw/edit?usp=sharing

We had a VM with a Jenkins/TeamCity server for a period of one year at USD 750 (set up by Jason Johnson). In the end we used about half of the overall budget:

*Payment to tester (98 hours) = USD 2,450
*Payment for VM with Jenkins = USD 750
*Payment for VM for manual testing (Ranorex tool installed) = USD 262
Total actually used: ~USD 3,500

Results of Testing and further developments

October- November 2014

We used some of the reviews done in 2013 by the Advisor team: http://owasp.blogspot.com/2013/09/meet-our-new-technical-project-advisors.html https://drive.google.com/folderview?id=0B1lOCxlYdf1AeUwzWlFfeWg0Mmc&usp=gmail

Together with Marios Kourtesis and other members, I did a major review, using the information from the 2013 reviews for the major reviews done in 2014. Jason Johnson also helped us set up a VM with automated Jenkins builds so that project leaders could build their projects automatically and check for errors; I also set up some projects on this machine, which we later stopped using due to maintenance issues. Some leaders used this tool, and I worked with Marios on another VM, set up for the three months we spent testing the projects. Norman Yue helped us get a JIRA instance for a better review process and communication.

Results of the testing done by Johanna Curiel and Marios Kourtesis: https://www.owasp.org/index.php/LAB_Projects_Code_Analysis_Report

OpenHub automation

We did a major automation using OpenHub. We registered a total of 80 projects that were not yet registered. From there on, maintenance was easier: Kait-Disney helped us review the basic criteria, managed communication with project leaders, and handled maintenance and setup of the projects in OpenHub for automated tracking: https://www.openhub.net/orgs/OWASP

2015 - Present

Kate helped us as support staff when Samantha and Kait-Disney left the staff. She handled the major part of setting up and helping new project leaders. Other members such as Timo Goosen and I helped review some projects that required it, and other members contributed their input on the list. https://www.owasp.org/index.php/Category:OWASP_Project#tab=Project_Task_Force

Results 2015