<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://wiki.owasp.org/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Edalci</id>
		<title>OWASP - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="https://wiki.owasp.org/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Edalci"/>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php/Special:Contributions/Edalci"/>
		<updated>2026-04-16T01:53:00Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.27.2</generator>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=68872</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=68872"/>
				<updated>2009-09-15T01:10:23Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, since people may want to pair up, but there may be a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will complete a short interview before the session so the instructors can gauge their skill level and motivation, and they are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. This is NOT a competition: the purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the vendors’ tools are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give an introduction to what static analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, September 17th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Erik Klein)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Location: AOL, 22260 Pacific Blvd, Sterling, VA 20166&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, date: TBD ===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
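To make the control-flow rule type concrete, here is a minimal Java sketch (the class and guard logic are hypothetical, not from any real codebase) of the pattern such a rule enforces: securityCheck() must always be called before downloadFile().&lt;br /&gt;

```java
// Hypothetical illustration of the control-flow rule
// "always call securityCheck() before downloadFile()".
// A custom rule would flag any path that reaches downloadFile()
// without a preceding securityCheck() call on the same object.
public class FileService {
    private boolean authorized = false;

    public void securityCheck() {
        // ... authenticate/authorize the caller ...
        authorized = true;
    }

    public void downloadFile(String name) {
        if (!authorized) {
            // runtime guard mirroring what the static rule checks at scan time
            throw new IllegalStateException("securityCheck() was not called first");
        }
        // ... stream the file to the client ...
    }

    public static void main(String[] args) {
        FileService ok = new FileService();
        ok.securityCheck();
        ok.downloadFile("report.pdf");      // compliant ordering

        FileService bad = new FileService();
        try {
            bad.downloadFile("report.pdf"); // ordering the rule would flag
        } catch (IllegalStateException e) {
            System.out.println("flagged: " + e.getMessage());
        }
    }
}
```
&lt;br /&gt;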
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab (date to be confirmed)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
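As a concrete illustration of the data-flow source, sink, and cleanse ideas in the agenda above, the sketch below (all class and method names are hypothetical stubs) shows the kind of flow a custom rule pair would flag: a sensitive value from getEmployeeSSN() (source) reaching a custom logger (sink), unless it first passes through a masking cleanse function.&lt;br /&gt;

```java
// Hypothetical sketch of a data-flow finding: a sensitive source
// (getEmployeeSSN) flowing into a logging sink (CustomLogger.log).
// A custom source/sink rule pair would flag the tainted call below;
// the masked call models a cleanse rule removing the taint.
public class PayrollAudit {
    static class CustomLogger {
        static void log(String msg) { System.out.println("LOG: " + msg); }
    }

    // Source: returns private data (illustrative stub value).
    static String getEmployeeSSN(int employeeId) {
        return "123-45-6789";
    }

    // Cleanse: masks the value so the flow is no longer tainted.
    static String mask(String ssn) {
        return "***-**-" + ssn.substring(ssn.length() - 4);
    }

    public static void main(String[] args) {
        String ssn = getEmployeeSSN(42);
        CustomLogger.log("ssn=" + ssn);        // tainted flow: rule would flag
        CustomLogger.log("ssn=" + mask(ssn));  // cleansed flow: no finding
    }
}
```
&lt;br /&gt;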
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell them dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=68871</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=68871"/>
				<updated>2009-09-15T01:10:03Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 3: Customization Lab for Fortify SCA, September 27th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, since people may want to pair up, but there may be a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will complete a short interview before the session so the instructors can gauge their skill level and motivation, and they are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. This is NOT a competition: the purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the vendors’ tools are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give an introduction to what static analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, September 17th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Erik Klein)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Location: AOL, 22260 Pacific Blvd, Sterling, VA 20166&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, date: TBD ===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell them dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=68870</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=68870"/>
				<updated>2009-09-15T01:09:44Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, since people may want to pair up, but there may be a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will complete a short interview before the session so the instructors can gauge their skill level and motivation, and they are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. This is NOT a competition: the purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the vendors’ tools are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give an introduction to what static analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, September 17th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Erik Klein)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Location: AOL, 22260 Pacific Blvd, Sterling, VA 20166&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, September 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own, and should feel free to bring their own code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
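The semantic and data-flow rule examples above can also be grounded in code. Below is a hypothetical Java sketch of the pattern those rules would flag; every name here (getEmployeeSSN, customLog) is an illustrative placeholder, not an Ounce Labs API or rule format:

```java
// Hypothetical code the rules above would flag: a semantic rule fires on
// any call to the sensitive API getEmployeeSSN(), and a data-flow rule
// traces its return value (a private-data source) into a custom logger
// (the sink). All names are illustrative placeholders.
public class SemanticExample {

    static final StringBuilder LOG = new StringBuilder();

    // Placeholder sensitive API; a semantic rule matches this method name.
    static String getEmployeeSSN(String employeeId) {
        return "123-45-6789";
    }

    // Placeholder custom logger; a data-flow rule marks this as a sink.
    static void customLog(String message) {
        LOG.append(message).append('\n');
    }

    public static void main(String[] args) {
        String ssn = getEmployeeSSN("E42");  // semantic rule: sensitive API used
        customLog("Fetched SSN: " + ssn);    // data-flow rule: private data reaches sink
        System.out.print(LOG);
    }
}
```

The point of the lab is that neither finding exists out of the box: the tool only reports them once getEmployeeSSN() is registered as a source and customLog() as a sink in a custom rule pack.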
===Session 6: Tool Adoption and Deployment, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell these tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=68869</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=68869"/>
				<updated>2009-09-15T01:07:43Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 3: Customization Lab for Fortify SCA, August 27th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but there is a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will complete a short questionnaire before each session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions related to tool comparison between the vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: Open. &lt;br /&gt;
This presentation will give a taste of what static analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Mike Mauro)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Fortify SCA Static Analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, September 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own, and should feel free to bring their own code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell these tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=File:Owasp_SAtrack_plan.png&amp;diff=68868</id>
		<title>File:Owasp SAtrack plan.png</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=File:Owasp_SAtrack_plan.png&amp;diff=68868"/>
				<updated>2009-09-15T01:05:35Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: uploaded a new version of &amp;quot;File:Owasp SAtrack plan.png&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=File:Owasp_SAtrack_plan.png&amp;diff=68867</id>
		<title>File:Owasp SAtrack plan.png</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=File:Owasp_SAtrack_plan.png&amp;diff=68867"/>
				<updated>2009-09-15T01:03:11Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: uploaded a new version of &amp;quot;File:Owasp SAtrack plan.png&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Virginia&amp;diff=68866</id>
		<title>Virginia</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Virginia&amp;diff=68866"/>
				<updated>2009-09-15T00:57:46Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Next Meeting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==== About ====&lt;br /&gt;
[[Image:Owasp-nova.JPG|275px|right]]The '''OWASP Washington VA Local Chapter''' meetings are FREE and OPEN to anyone interested in learning more about application security. We encourage individuals to provide knowledge transfer via hands-on training, presentations of specific OWASP projects and research topics, and sharing of SDLC knowledge. &lt;br /&gt;
&lt;br /&gt;
We encourage vendor-agnostic presentations that utilize the OWASP PowerPoint template when applicable, and individual volunteerism to enable perpetual growth. As a 501(c)(3) non-profit association, donations of meeting space or refreshment sponsorship are encouraged; simply contact the local chapter leaders listed on this page to discuss. Prior to participating with OWASP, please review the Chapter Rules.&lt;br /&gt;
&lt;br /&gt;
The original DC Chapter was founded in June 2004 by [mailto:jeff.williams@owasp.org Jeff Williams] and has had members from Virginia to Delaware. In April 2005 a new chapter, OWASP Washington VA Local Chapter, was formed and the DC Chapter was renamed to DC-Maryland. The two are sister chapters and include common members and shared discourse. The chapters meet in opposite halves of the month to facilitate this relationship.&lt;br /&gt;
&lt;br /&gt;
{{Chapter Template|chaptername=Virginia|extra=The chapter leader is [mailto:John.Steven@owasp.org John Steven]|mailinglistsite=http://lists.owasp.org/mailman/listinfo/owasp-CHANGEME|emailarchives=http://lists.owasp.org/pipermail/owasp-CHANGEME}}&lt;br /&gt;
* [http://lists.owasp.org/mailman/listinfo/owasp-wash_dc_va Click here to join local chapter mailing list]&lt;br /&gt;
* Add August 6th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=NjNhbjFsZ3FrdXRqYzVuNW11amhsbmRqZHMgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [http://www.google.com/calendar/event?action=VIEW&amp;amp;eid=djRqZHZramU3bmZtdXQyMnA4aGVmcGxlMzQganN0ZXZlbkBjaWdpdGFsLmNvbQ&amp;amp;tok=MjEjam9obi5zdGV2ZW5Ab3dhc3Aub3JnY2ZmNDM1Nzc1YmZlNDVhMzE2NTcyNzIzMjgwNzNjZGVkZDgxYTJhYQ&amp;amp;ctz=America%2FNew_York&amp;amp;hl=en Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Locations ====&lt;br /&gt;
'''If you plan to attend in person:'''&lt;br /&gt;
&lt;br /&gt;
Directions to Booz Allen's One Dulles facility:&lt;br /&gt;
&lt;br /&gt;
13200 Woodland Park Road&lt;br /&gt;
Herndon, VA 20171&lt;br /&gt;
&lt;br /&gt;
From Tyson's Corner:&lt;br /&gt;
&lt;br /&gt;
* Take LEESBURG PIKE / VA-7 WEST&lt;br /&gt;
* Merge onto VA-267 WEST / DULLES TOLL ROAD (Portions Toll)&lt;br /&gt;
* Take the VA-657 Exit (Exit Number 10 towards Herndon / Chantilly)&lt;br /&gt;
* Take the ramp toward CHANTILLY&lt;br /&gt;
* Turn Left onto CENTERVILLE ROAD (at end of ramp)&lt;br /&gt;
* Turn Left onto WOODLAND PARK ROAD (less than 1⁄2 mile)&lt;br /&gt;
* End at 13200 WOODLAND PARK ROAD&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;'''If you plan to attend via Webinar:'''&lt;br /&gt;
&lt;br /&gt;
You can attend through [[OWASPNoVA WebEx]] &lt;br /&gt;
&lt;br /&gt;
==== Schedule ====&lt;br /&gt;
Meetings are held the first Thursday of the month.&lt;br /&gt;
&lt;br /&gt;
===== Next Meeting =====&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
'''DATE''': Thursday, September 17, 2009. 6:00pm Eastern Daylight Time&amp;lt;BR/&amp;gt;&lt;br /&gt;
'''LOCATION''': 22260 Pacific Blvd, Sterling, VA 20166&amp;lt;BR&amp;gt;&lt;br /&gt;
'''TOPIC''': &amp;quot;Fortify 360&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
'''SPEAKER''': Erik Klein (Fortify Software), Eric Dalci (Cigital)&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''INSTRUCTIONS''': RSVP through [mailto:wade.woolwine@owasp.org?Subject=OWASP%20RSVP Wade Woolwine] with “OWASP RSVP” in the subject.&lt;br /&gt;
&lt;br /&gt;
'''DESCRIPTION''':&lt;br /&gt;
&amp;lt;p&amp;gt; We're pleased to invite you to next week's OWASP session (Thursday, September 17th). We will be hosting a presentation, demo, and hands-on session of Fortify 360 (http://www.fortify.com). Fortify 360 includes Fortify SCA (Source Code Analyzer) and the Fortify 360 Server, which is Fortify's solution for an enterprise deployment of SCA. The session will start with a presentation by Fortify engineers, followed by a demo, and finally a hands-on session where the audience will be free to install Fortify SCA on their machines and try the SCA tool on a sample application that we will provide. The audience will also be introduced to the Fortify 360 Server and can try some of the enterprise-level features such as collaborative code review, metrics, and so on. Bring your laptop if you want to try Fortify 360!&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
The target audience is anyone interested in secure code review with a static analysis tool at the desktop level and/or enterprise level. We will need to register visitors beforehand; please email wade.woolwine@owasp.org to register and confirm attendance. Pizza and refreshments will be served.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
'''DATE''': Thursday, September 3, 2009. 6:00pm.&amp;lt;BR/&amp;gt;&lt;br /&gt;
'''LOCATION''': 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
'''TOPIC''': &amp;quot;Conducting Application Assessment&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
'''SPEAKER''': Jeremy Epstein, SRI&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''INSTRUCTIONS''': RSVP through [mailto:wisseman_stan@bah.com?Subject=OWASP%20RSVP Stan Wisseman] with “OWASP RSVP” in the subject.&lt;br /&gt;
&lt;br /&gt;
'''DESCRIPTION''':&lt;br /&gt;
&amp;lt;P&amp;gt;After the 2000 election, many states launched headlong into electronic&lt;br /&gt;
voting systems to avoid the problems with &amp;quot;hanging chads&amp;quot;.  Once&lt;br /&gt;
problems with those systems started appearing, many localities started&lt;br /&gt;
moving to optical scan, which was used by a majority of US voters in&lt;br /&gt;
the 2008 election.  There are other technologies in use around the&lt;br /&gt;
country, including lever machines, vote-by-mail, vote-by-phone, and&lt;br /&gt;
Internet voting.  What are the tradeoffs among these technologies?&lt;br /&gt;
Particularly relevant to OWASP, what are the security issues&lt;br /&gt;
associated with different types of equipment, and what measures do&lt;br /&gt;
vendors of voting equipment use to try to address the security&lt;br /&gt;
problems?  Are software security problems important, or can&lt;br /&gt;
non-technical measures protect against them?  In this talk, we'll&lt;br /&gt;
discuss a wide variety of voting technologies, and their pros and cons&lt;br /&gt;
from both a technical and societal perspective.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''ABOUT THE SPEAKER''':&lt;br /&gt;
Jeremy Epstein is Senior Computer Scientist at SRI International.  His&lt;br /&gt;
background includes more than 20 years experience in computer security&lt;br /&gt;
research, product development, and consulting.  Prior to joining SRI&lt;br /&gt;
International, he was Principal Consultant with Cigital, and before&lt;br /&gt;
that spent nine years as Senior Director of Product Security at&lt;br /&gt;
Software AG, an international business software company. Within the area&lt;br /&gt;
of voting systems, Jeremy has been involved for over&lt;br /&gt;
five years in voting technology and advocacy, both as an employee and&lt;br /&gt;
as an independent consultant.&lt;br /&gt;
&lt;br /&gt;
===== Upcoming Speakers =====&lt;br /&gt;
&lt;br /&gt;
If you want to present, please contact John. We're very open to hearing from all our members.&lt;br /&gt;
Future speakers include Gunnar Peterson and more.&lt;br /&gt;
&lt;br /&gt;
[http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York View the OWASP NoVA Chapter Calendar]&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but there is a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on first come and first served basis. '''REGISTRATION IS OPEN!'''&lt;br /&gt;
&lt;br /&gt;
Please send an email to [mailto:John.Steven@owasp.org John Steven] with your skill level with Static Analysis tools, your motivation, and '''the dates''' that you want to sign up for.  &lt;br /&gt;
Students are required to bring their own laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops.&lt;br /&gt;
&lt;br /&gt;
=== Past meetings ===&lt;br /&gt;
July 9th 6pm-9pm EST&amp;lt;br&amp;gt;&lt;br /&gt;
LOCATION: 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
TOPIC: &amp;quot;Ounce's 02&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
SPEAKER(S): Dinis Cruz, OWASP, Ounce Labs.&amp;lt;BR&amp;gt;&lt;br /&gt;
PANEL: TBD&amp;lt;BR&amp;gt;&lt;br /&gt;
INSTRUCTIONS: RSVP through  Stan Wisseman wisseman_stan@bah.com with “OWASP RSVP” in the subject.&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
DESCRIPTION: So what is O2?&lt;br /&gt;
&lt;br /&gt;
Well, in my mind O2 is a combination of advanced tools (Technology) which are designed to be used in a particular way (Process) by knowledgeable individuals (People).&lt;br /&gt;
&lt;br /&gt;
Think of it as a fighter jet that is able to go very fast, has tons of controls, needs to be piloted by somebody who knows what they are doing, and needs to have a purpose (i.e. a mission).&lt;br /&gt;
&lt;br /&gt;
Basically, what I did with O2 was automate the workflow that I have when I'm engaged in a source-code security review.&lt;br /&gt;
&lt;br /&gt;
Now, here is the catch: this version is NOT for the faint-of-heart. I designed this to suit my needs, which, although similar to those of most other security consultants, have their own particularities :)&lt;br /&gt;
&lt;br /&gt;
The whole model of O2 development is based around the concept of automating a security consultant’s brain, so I basically ensure that the main O2 developer (Dinis Cruz) has a very good understanding of the feature requirements of the targeted security consultant (Dinis Cruz) :). And this proved (even to my surprise) spectacularly productive, since suddenly I (i.e. the security consultant) didn't have to wait months for new features to be added to my toolkit. If something needed to be added, it would just be added in days or hours.&lt;br /&gt;
&lt;br /&gt;
* View the OWASP NoVA Chapter [http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York Calendar ]&lt;br /&gt;
&lt;br /&gt;
* The next meeting is '''Thursday, July 9th, 2009.''' &lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Knowledge ====&lt;br /&gt;
&lt;br /&gt;
The Northern Virginia (NoVA) chapter is committed to compiling resources on interesting and valuable topic areas. We hope that this structure helps you access information pertinent to your tasks at hand as you move through a secure application development life cycle. Currently, our topic areas of focus include activities such as:&lt;br /&gt;
&lt;br /&gt;
* Threat Modeling&lt;br /&gt;
* [[Code Review and Static Analysis with tools]]&lt;br /&gt;
* Penetration Testing and Dynamic Analysis tools&lt;br /&gt;
* Monitoring/Dynamic patching (WAFs)&lt;br /&gt;
&lt;br /&gt;
Certain projects our members are involved in cross-cut these activities, providing value throughout. They include:&lt;br /&gt;
&lt;br /&gt;
* ASVS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Contributors and Sponsors ====&lt;br /&gt;
&lt;br /&gt;
'''Chapter Leader'''&lt;br /&gt;
&lt;br /&gt;
* [mailto:John.Steven@owasp.org John Steven], with assistance from [mailto:paco@cigital.com Paco Hope]&lt;br /&gt;
&lt;br /&gt;
'''Refreshment Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Cigital_OWASP.GIF]]&lt;br /&gt;
&lt;br /&gt;
'''Facility Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Bah-bw.JPG|215px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__NOTOC__&lt;br /&gt;
&amp;lt;headertabs/&amp;gt;&lt;br /&gt;
&amp;lt;paypal&amp;gt;Northern Virginia&amp;lt;/paypal&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Past Meetings ==&lt;br /&gt;
&lt;br /&gt;
===June 2009===&lt;br /&gt;
''Gary McGraw, Cigital Inc.'':''Building Security In Maturity Model''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, an interview:&lt;br /&gt;
''Jim Routh, formerly of DTCC'':''The Economic Advantages of a Resilient Supply Chain- Software Security&lt;br /&gt;
''&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Gary McGraw talked about the experience he, Sammy Migues, and Brian Chess gained conducting a survey of some of America's top Software Security groups. Study results are available under the [http://creativecommons.org/licenses/by-sa/3.0/ Creative Commons Share Alike license] at [http://www.bsi-mm.com www.bsi-mm.com]. Gary described the common structural elements and activities of successful software security programs, presented the maturity model that resulted from the survey data, and discussed lessons learned from listening to those leading these groups. &lt;br /&gt;
&lt;br /&gt;
Jim Routh gave an incredibly insightful interview regarding his own experiences crafting their security group. &lt;br /&gt;
&lt;br /&gt;
Download presentation notes at: [http://www.owasp.org/images/0/03/JMR-Economics_of_Security_Goups.ppt The Economic Advantages of a Resilient Supply Chain- Software Security]&lt;br /&gt;
&lt;br /&gt;
===May 2009 ===&lt;br /&gt;
''Eric Dalci, Cigital Inc.'':''Introduction to Static Analysis''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, a panel:&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Steven Lavenhar, Booz Allen Hamilton;&lt;br /&gt;
&amp;lt;LI&amp;gt;Eric Dalci, Cigital Inc.&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
Panel moderated by John Steven&lt;br /&gt;
&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This session is an introduction to Static Analysis. It presents the different types of analysis used by today's Static Analysis tools. Examples of direct application to finding vulnerabilities will be shown (e.g., Data Flow Analysis, Semantic Analysis, Control Flow Analysis). Current limitations of Static Analysis will also be discussed. This session is tool-agnostic, but will cover the approach taken by various leading commercial (as well as open-source) tools.&lt;br /&gt;
&lt;br /&gt;
Download: [http://www.owasp.org/images/e/ea/OWASP_Virginia_Edalci_May09.pdf Intro to Static Analysis]&lt;br /&gt;
&lt;br /&gt;
===April 2009 ===&lt;br /&gt;
''Jeremiah Grossman, Whitehat Security'': '''Top 10 Web Hacking Techniques 2008'''&amp;lt;br&amp;gt;&lt;br /&gt;
Jeremiah spoke on (what he and colleagues determined were the) top ten web hacking techniques of 2008. This talk was a preview of his RSA '09 talk.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Download http://www.whitehatsec.com/home/assets/presentations/09PPT/PPT_OWASPNoVA04082008.pdf&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Later,&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Nate Miller, Stratum Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Jeremiah Grossman, Whitehat Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Tom Brennan, Whitehat Security; and&lt;br /&gt;
&amp;lt;LI&amp;gt;Wade Woolwine, AOL&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
served as a penetration testing panel, answering questions posed by moderator Ken Van Wyk.&lt;br /&gt;
&lt;br /&gt;
=== February 2009 ===&lt;br /&gt;
&lt;br /&gt;
''Ryan C. Barnett, Breach Security'': '''Patching Challenge: Securing WebGoat with ModSecurity'''&lt;br /&gt;
&lt;br /&gt;
Identification of web application vulnerabilities is only half the battle, with remediation being the other half. Let's face the facts: there are many real-world business scenarios where it is not possible to update web application code in a timely manner, or at all. This is where the tactical use-case of implementing a web application firewall to address identified issues proves its worth.&lt;br /&gt;
&lt;br /&gt;
This talk will provide an overview of recommended practices for utilizing a web application firewall for virtual patching. After discussing the framework to use, we will then present a very interesting OWASP Summer of Code project, where the challenge was to mitigate as many of the OWASP WebGoat vulnerabilities as possible using the open-source ModSecurity web application firewall. During the talk, we will discuss both WebGoat and ModSecurity and provide in-depth walk-throughs of some of the complex fixes. Examples will include addressing not only attacks but also the underlying vulnerabilities, using data persistence for multi-step processes, content injection, and even examples of the new Lua scripting API. The goal of this talk is both to highlight cutting-edge mitigation options using a web application firewall and to show how it can effectively be used by security consultants who traditionally could only offer source code fixes.&lt;br /&gt;
&lt;br /&gt;
Ryan C. Barnett is the Director of Application Security Research at Breach Security and leads Breach Security Labs. He is also a Faculty Member for the SANS Institute, Team Lead for the Center for Internet Security Apache Benchmark Project, and a Member of the Web Application Security Consortium, where he leads the Distributed Open Proxy Honeypot Project. Mr. Barnett has also authored a web security book for Addison-Wesley entitled &amp;quot;Preventing Web Attacks with Apache.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
(This talk is a preview of Ryan's talk at Blackhat Federal the following week - see https://www.blackhat.com/html/bh-dc-09/bh-dc-09-speakers.html#Barnett )&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Virtual_Patching_Ryan_Barnett_Blackhat_Federal_09.zip| WAF Virtual Patching Challenge: Securing WebGoat with ModSecurity]]&lt;br /&gt;
&lt;br /&gt;
''John Steven, Cigital'': '''Moving Beyond Top N Lists'''&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Moving_Beyond_Top_N_Lists.ppt.zip| Moving Beyond Top N Lists]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cigital published an article: The Top 11 Reasons Why Top 10 (or 25) Lists Don’t Work. Yet, these lists are a staple of conference abstracts, industry best practice lists, and the like. Are they good or bad? We’ll explore how to get beyond the Top 10 (or 25) list in making your software security effort real.&lt;br /&gt;
&lt;br /&gt;
John is Senior Director, Advanced Technology Consulting at Cigital. His experience includes research in static code analysis and hands-on architecture and implementation of high-performance, scalable Java EE systems. John has provided security consulting services to a broad variety of commercial clients including two of the largest trading platforms in the world and has advised America's largest internet provider in the Midwest on security and forensics. John led the development of Cigital's architectural analysis methodology and its approach to deploying enterprise software security frameworks. He has demonstrated success in building Cigital's intellectual property for providing cutting-edge security. He brings this experience and a track record of effective strategic innovation to clients seeking to change, whether to adopt more cutting-edge approaches, or to solidify ROI. John currently chairs the SD Best Practices security track and co-edits the building security in department of IEEE's Security and Privacy magazine. John has served on numerous conference panels regarding software security, wireless security and Java EE system development. He holds a B.S. in Computer Engineering and an M.S. in Computer Science from Case Western Reserve University.&lt;br /&gt;
&lt;br /&gt;
=== January 2009 ===&lt;br /&gt;
&lt;br /&gt;
To kick off 2009, our January meeting featured a discussion of the relationship between application security and CMMI, and an overview of the OWASP ASVS project.&lt;br /&gt;
&lt;br /&gt;
''Michele Moss, Booz Allen Hamilton'': '''Evolutions In The Relationship Between Application Security And The CMMI'''&lt;br /&gt;
 &lt;br /&gt;
Addressing new and complex threats and IT security challenges requires repeatable, reliable, rapid, and cost-effective solutions. To implement these solutions, organizations have begun to align their security improvement efforts with their system and software development practices. During a “Birds of a Feather” at the March 2007 SEPG, a group of industry representatives initiated an effort which led to the definition of assurance practices that can be applied in the context of the CMMI. This presentation will provide an understanding of how applying the assurance practices in the context of security contributes to the overall increased quality of products and services, illustrate how a focus on assurance in the context of CMMI practices relates to application security practices, and present an approach to evaluate and improve the repeatability and reliability of assurance practices. &lt;br /&gt;
 &lt;br /&gt;
Michele Moss, CISSP, is a security engineer with more than 12 years of experience in process improvement. She specializes in integrating assurance processes and practices into project lifecycles. Michele is the Co-Chair of the DHS Software Assurance Working Group on Processes &amp;amp; Practices. She has assisted numerous organizations with maturing their information technology, information assurance, project management, and support practices through the use of capability maturity models, including the CMMI and the SSE-CMM. She is one of the key contributors to an effort to apply an assurance focus to CMMI.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Moss-AppSecurityAndCMMI.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Mike Boberski, Booz Allen Hamilton'': '''About OWASP ASVS'''&lt;br /&gt;
&lt;br /&gt;
The primary aim of the OWASP ASVS Project is to normalize the range&lt;br /&gt;
of coverage and level of rigor available in the market when it comes to&lt;br /&gt;
performing application-level security verification. The goal is to&lt;br /&gt;
create a set of commercially-workable open standards that are tailored&lt;br /&gt;
to specific web-based technologies.&lt;br /&gt;
&lt;br /&gt;
Mike Boberski works at Booz Allen Hamilton. He has a background in&lt;br /&gt;
application security and the use of cryptography by applications. He is&lt;br /&gt;
experienced in trusted product evaluation, security-related software&lt;br /&gt;
development and integration, and cryptomodule testing. For OWASP, he is&lt;br /&gt;
the project lead and a co-author of the  OWASP Application Security&lt;br /&gt;
Verification Standard, the first OWASP standard.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[https://www.owasp.org/images/5/52/About_OWASP_ASVS_Web_Edition.ppt]]&lt;br /&gt;
&lt;br /&gt;
=== November 2008 ===&lt;br /&gt;
For our November 2008 meeting, we had two great presentations on software assurance and security testing.&lt;br /&gt;
&lt;br /&gt;
''Nadya Bartol, Booz Allen Hamilton'': '''Framework for Software Assurance'''&lt;br /&gt;
&lt;br /&gt;
Nadya's presentation will provide an update on the Software Assurance&lt;br /&gt;
Forum efforts to establish a comprehensive framework for software&lt;br /&gt;
assurance (SwA) and security measurement.  The Framework addresses&lt;br /&gt;
measuring achievement of SwA goals and objectives within the context of&lt;br /&gt;
individual projects, programs, or enterprises.  It targets a variety of&lt;br /&gt;
audiences including executives, developers, vendors, suppliers, and&lt;br /&gt;
buyers.  The Framework leverages existing measurement methodologies,&lt;br /&gt;
including Practical Software and System Measurement (PSM); CMMI Goal,&lt;br /&gt;
Question, Indicator, Measure (GQ(I)M);  NIST SP 800-55 Rev1; and ISO/IEC&lt;br /&gt;
27004 and identifies commonalities among the methodologies to help&lt;br /&gt;
organizations integrate SwA measurement in their overall measurement&lt;br /&gt;
efforts cost-effectively and as seamlessly as possible, rather than&lt;br /&gt;
establish a standalone SwA measurement effort within an organization.&lt;br /&gt;
The presentation will provide an update on the SwA Forum Measurement&lt;br /&gt;
Working Group work, present the current version of the Framework and underlying measures&lt;br /&gt;
development and implementation processes, and propose example SwA&lt;br /&gt;
measures applicable to a variety of SwA stakeholders.  The presentation&lt;br /&gt;
will update the group on the latest NIST and ISO standards on&lt;br /&gt;
information security measurement that are being integrated into the&lt;br /&gt;
Framework as the standards are being developed.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Bartol-MeasurementForOWASP11-13-08.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Paco Hope, Cigital'': '''The Web Security Testing Cookbook'''&lt;br /&gt;
&lt;br /&gt;
The Web Security Testing Cookbook (O'Reilly &amp;amp; Associates, October 2008)&lt;br /&gt;
gives developers and testers the tools they need to make security&lt;br /&gt;
testing a regular part of their development lifecycle. Its recipe style&lt;br /&gt;
approach covers manual, exploratory testing as well as automated techniques&lt;br /&gt;
that you can make part of your unit tests or regression cycle. The&lt;br /&gt;
recipes range from basics, like observing messages between clients and&lt;br /&gt;
servers, to multi-phase tests that script the login and execution of web&lt;br /&gt;
application features. This book complements many of the security texts&lt;br /&gt;
in the market that tell you what a vulnerability is, but not how to&lt;br /&gt;
systematically test it day in and day out. Leverage the recipes in this&lt;br /&gt;
book to add significant security coverage to your testing without adding&lt;br /&gt;
significant time and cost to your effort.&lt;br /&gt;
&lt;br /&gt;
Congratulations to Tim Bond who won an autographed copy of Paco's book.&lt;br /&gt;
Get your copy here [[http://www.amazon.com/Security-Testing-Cookbook-Paco-Hope/dp/0596514832]]&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/PacoHope-WebSecCookbook.pdf]]&lt;br /&gt;
&lt;br /&gt;
=== October 2008 ===&lt;br /&gt;
For our October 2008 meeting, we had two fascinating talks relating to forensics.&lt;br /&gt;
&lt;br /&gt;
''Dave Merkel, Mandiant'': '''Enterprise Grade Incident Management - Responding to Persistent Threats'''&lt;br /&gt;
&lt;br /&gt;
Dave Merkel is Vice President of Products at Mandiant, a leading provider of information security services, education, and products. Mr. Merkel has worked in the information security and incident response industry for over 10 years. His background includes service as a federal agent in the US Air Force and over 7 years of experience directing security operations at America Online. He currently oversees the product business at Mandiant and is in charge of building Mandiant Intelligent Response, an enterprise incident response solution. But no, he won't be selling you anything today.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Mandiant-EnterpriseIRandAPTpresentation.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Inno Eroraha, NetSecurity'': '''Responding to the Digital Crime Scene: Gathering Volatile Data'''&lt;br /&gt;
&lt;br /&gt;
Inno Eroraha is the founder and chief strategist of NetSecurity Corporation, a company that provides digital forensics, hands-on security consulting, and Hands-on How-To® training solutions that are high-quality, timely, and customer-focused. In this role, Mr. Eroraha helps clients plan, formulate, and execute the best security and forensics strategy that aligns with their business goals and priorities. He has consulted with Fortune 500 companies, IRS, DHS, VA, DoD, and other entities.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData-TechnoForensics-102908.pdf]]&lt;br /&gt;
&lt;br /&gt;
==Knowledge==&lt;br /&gt;
On the [[Knowledge]] page, you'll find links to this chapter's contributions organized by topic area.&lt;br /&gt;
 &lt;br /&gt;
[[Category:Virginia]]&lt;br /&gt;
[[Category:Washington, DC]]&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Virginia&amp;diff=68865</id>
		<title>Virginia</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Virginia&amp;diff=68865"/>
				<updated>2009-09-15T00:56:47Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Next Meeting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==== About ====&lt;br /&gt;
[[Image:Owasp-nova.JPG|275px|right]]The '''OWASP Washington VA Local Chapter''' meetings are FREE and OPEN to anyone interested in learning more about application security. We encourage individuals to provide knowledge transfer via hands-on training, presentations of specific OWASP projects and research topics, and sharing of SDLC knowledge. &lt;br /&gt;
&lt;br /&gt;
We encourage vendor-agnostic presentations, which should use the OWASP PowerPoint template when applicable, and individual volunteerism to enable perpetual growth. As a 501(c)(3) non-profit association, we welcome donations of meeting space or refreshment sponsorship; simply contact the local chapter leaders listed on this page to discuss. Prior to participating with OWASP, please review the Chapter Rules.&lt;br /&gt;
&lt;br /&gt;
The original DC Chapter was founded in June 2004 by [mailto:jeff.williams@owasp.org Jeff Williams] and has had members from Virginia to Delaware. In April 2005 a new chapter, OWASP Washington VA Local Chapter, was formed and the DC Chapter was renamed to DC-Maryland. The two are sister chapters and include common members and shared discourse. The chapters meet in opposite halves of the month to facilitate this relationship.&lt;br /&gt;
&lt;br /&gt;
{{Chapter Template|chaptername=Virginia|extra=The chapter leader is [mailto:John.Steven@owasp.org John Steven]|mailinglistsite=http://lists.owasp.org/mailman/listinfo/owasp-CHANGEME|emailarchives=http://lists.owasp.org/pipermail/owasp-CHANGEME}}&lt;br /&gt;
* [http://lists.owasp.org/mailman/listinfo/owasp-wash_dc_va Click here to join local chapter mailing list]&lt;br /&gt;
* Add August 6th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=NjNhbjFsZ3FrdXRqYzVuNW11amhsbmRqZHMgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [http://www.google.com/calendar/event?action=VIEW&amp;amp;eid=djRqZHZramU3bmZtdXQyMnA4aGVmcGxlMzQganN0ZXZlbkBjaWdpdGFsLmNvbQ&amp;amp;tok=MjEjam9obi5zdGV2ZW5Ab3dhc3Aub3JnY2ZmNDM1Nzc1YmZlNDVhMzE2NTcyNzIzMjgwNzNjZGVkZDgxYTJhYQ&amp;amp;ctz=America%2FNew_York&amp;amp;hl=en Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Locations ====&lt;br /&gt;
'''If you plan to attend in person:'''&lt;br /&gt;
&lt;br /&gt;
Directions to Booz Allen's One Dulles facility:&lt;br /&gt;
&lt;br /&gt;
13200 Woodland Park Road&lt;br /&gt;
Herndon, VA 20171&lt;br /&gt;
&lt;br /&gt;
From Tyson's Corner:&lt;br /&gt;
&lt;br /&gt;
* Take LEESBURG PIKE / VA-7 WEST&lt;br /&gt;
* Merge onto VA-267 WEST / DULLES TOLL ROAD (Portions Toll)&lt;br /&gt;
* Take the VA-657 Exit (Exit Number 10 towards Herndon / Chantilly)&lt;br /&gt;
* Take the ramp toward CHANTILLY&lt;br /&gt;
* Turn Left onto CENTERVILLE ROAD (at end of ramp)&lt;br /&gt;
* Turn Left onto WOODLAND PARK ROAD (less than 1⁄2 mile)&lt;br /&gt;
* End at 13200 WOODLAND PARK ROAD&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;'''If you plan to attend via Webinar:'''&lt;br /&gt;
&lt;br /&gt;
You can attend through [[OWASPNoVA WebEx]] &lt;br /&gt;
&lt;br /&gt;
==== Schedule ====&lt;br /&gt;
Meetings are held the first Thursday of the month.&lt;br /&gt;
&lt;br /&gt;
===== Next Meeting =====&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
'''DATE''': Thursday, September 17, 2009. 6:00pm Eastern Daylight Time&amp;lt;BR/&amp;gt;&lt;br /&gt;
'''LOCATION''': 22260 Pacific blvd, Sterling, VA. 20166&amp;lt;BR&amp;gt;&lt;br /&gt;
'''TOPIC''': &amp;quot;Fortify 360&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
'''SPEAKER''': Erik Klein (Fortify Software), Eric Dalci (Cigital)&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''INSTRUCTIONS''': RSVP through [mailto:wade.woolwine@owasp.org?Subject=OWASP%20RSVP Wade Woolwine] with “OWASP RSVP” in the subject.&lt;br /&gt;
&lt;br /&gt;
'''DESCRIPTION''':&lt;br /&gt;
 &amp;lt;p&amp;gt;We're pleased to invite you to next week's OWASP session (Thursday, September 17th). We will be hosting a presentation, demo, and hands-on session on Fortify 360 (http://www.fortify.com). Fortify 360 includes Fortify SCA (Source Code Analyzer) and the Fortify 360 Server, which is Fortify's solution for an enterprise deployment of SCA. The session will start with a presentation by Fortify engineers, followed by a demo, and finally a hands-on session where the audience will be free to install Fortify SCA on their machines and try the SCA tool on a sample application that we will provide. The audience will also be introduced to the Fortify 360 Server and try some of its enterprise-level features, such as collaborative code review, metrics, and so on. Bring your laptop if you want to try Fortify 360!&lt;br /&gt;
&lt;br /&gt;
The target audience is anyone interested in secure code review with a Static Analysis tool at the desktop and/or enterprise level. We will need to register visitors beforehand; please email wade.woolwine@owasp.org to register and confirm attendance. Pizza and refreshments will be served.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
'''DATE''': Thursday, September 3, 2009. 6:00pm.&amp;lt;BR/&amp;gt;&lt;br /&gt;
'''LOCATION''': 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
'''TOPIC''': &amp;quot;Conducting Application Assessment&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
'''SPEAKER''': Jeremy Epstein, SRI&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''INSTRUCTIONS''': RSVP through [mailto:wisseman_stan@bah.com?Subject=OWASP%20RSVP Stan Wisseman] with “OWASP RSVP” in the subject.&lt;br /&gt;
&lt;br /&gt;
'''DESCRIPTION''':&lt;br /&gt;
&amp;lt;P&amp;gt;After the 2000 election, many states launched headlong into electronic&lt;br /&gt;
voting systems to avoid the problems with &amp;quot;hanging chads&amp;quot;.  Once&lt;br /&gt;
problems with those systems started appearing, many localities started&lt;br /&gt;
moving to optical scan, which was used by a majority of US voters in&lt;br /&gt;
the 2008 election.  There are other technologies in use around the&lt;br /&gt;
country, including lever machines, vote-by-mail, vote-by-phone, and&lt;br /&gt;
Internet voting.  What are the tradeoffs among these technologies?&lt;br /&gt;
Particularly relevant to OWASP, what are the security issues&lt;br /&gt;
associated with different types of equipment, and what measures do&lt;br /&gt;
vendors of voting equipment use to try to address the security&lt;br /&gt;
problems?  Are software security problems important, or can&lt;br /&gt;
non-technical measures protect against them?  In this talk, we'll&lt;br /&gt;
discuss a wide variety of voting technologies, and their pros and cons&lt;br /&gt;
from both a technical and societal perspective.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''ABOUT THE SPEAKER''':&lt;br /&gt;
Jeremy Epstein is Senior Computer Scientist at SRI International.  His&lt;br /&gt;
background includes more than 20 years of experience in computer security&lt;br /&gt;
background includes more than 20 years experience in computer security&lt;br /&gt;
research, product development, and consulting.  Prior to joining SRI&lt;br /&gt;
International, he was Principal Consultant with Cigital, and before&lt;br /&gt;
that spent nine years as Senior Director of Product Security at&lt;br /&gt;
Software AG, an international business software company. Within the area&lt;br /&gt;
of voting systems, Jeremy has been involved for over&lt;br /&gt;
five years in voting technology and advocacy, both as an employee and&lt;br /&gt;
as an independent consultant.&lt;br /&gt;
&lt;br /&gt;
===== Upcoming Speakers =====&lt;br /&gt;
&lt;br /&gt;
If you want to present, please contact John. We're very open to hearing from all our members.&lt;br /&gt;
Future speakers include Gunnar Peterson and more.&lt;br /&gt;
&lt;br /&gt;
[http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York View the OWASP NoVA Chapter Calendar]&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis. '''REGISTRATION IS OPEN!'''&lt;br /&gt;
&lt;br /&gt;
Please send an email to [mailto:John.Steven@owasp.org John Steven] with your skill level with Static Analysis tools, your motivation, and '''the dates''' that you want to sign up for. &lt;br /&gt;
Students are required to bring their own laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptop.&lt;br /&gt;
&lt;br /&gt;
=== Past meetings ===&lt;br /&gt;
July 9th 6pm-9pm EST&amp;lt;br&amp;gt;&lt;br /&gt;
LOCATION: 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
TOPIC: &amp;quot;Ounce's O2&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
SPEAKER(S): Dinis Cruz, OWASP, Ounce Labs.&amp;lt;BR&amp;gt;&lt;br /&gt;
PANEL: TBD&amp;lt;BR&amp;gt;&lt;br /&gt;
INSTRUCTIONS: RSVP through  Stan Wisseman wisseman_stan@bah.com with “OWASP RSVP” in the subject.&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
DESCRIPTION: So what is O2?&lt;br /&gt;
&lt;br /&gt;
Well, in my mind, O2 is a combination of advanced tools (Technology) which are designed to be used in a particular way (Process) by knowledgeable individuals (People).&lt;br /&gt;
&lt;br /&gt;
Think of it as a fighter jet that can go very fast, has tons of controls, needs to be piloted by somebody who knows what they are doing, and needs to have a purpose (i.e. a mission).&lt;br /&gt;
&lt;br /&gt;
Basically, what I did with O2 was automate the workflow that I have when I'm engaged in a source-code security review.&lt;br /&gt;
&lt;br /&gt;
Now, here is the catch: this version is NOT for the faint-of-heart. I designed this to suit my needs, which, although similar to those of most other security consultants, have their own particularities :)&lt;br /&gt;
&lt;br /&gt;
The whole model of O2 development is based around the concept of automating a security consultant’s brain, so I basically ensure that the main O2 developer (Dinis Cruz) has a very good understanding of the feature requirements of the targeted security consultant (Dinis Cruz) :). And this proved (even to my surprise) spectacularly productive, since suddenly I (i.e. the security consultant) didn't have to wait months for new features to be added to my toolkit. If something needed to be added, it would just be added in days or hours.&lt;br /&gt;
&lt;br /&gt;
* View the OWASP NoVA Chapter [http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York Calendar ]&lt;br /&gt;
&lt;br /&gt;
* The next meeting is '''Thursday, July 9th, 2009.''' &lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Knowledge ====&lt;br /&gt;
&lt;br /&gt;
The Northern Virginia (NoVA) chapter is committed to compiling resources on interesting and valuable topic areas. We hope that this structure helps you access information pertinent to your tasks at hand as you move through a secure application development life cycle. Currently, our topic areas of focus include activities such as:&lt;br /&gt;
&lt;br /&gt;
* Threat Modeling&lt;br /&gt;
* [[Code Review and Static Analysis with tools]]&lt;br /&gt;
* Penetration Testing and Dynamic Analysis tools&lt;br /&gt;
* Monitoring/Dynamic patching (WAFs)&lt;br /&gt;
&lt;br /&gt;
Certain projects our members are involved in cross-cut these activities, providing value throughout. They include:&lt;br /&gt;
&lt;br /&gt;
* ASVS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Contributors and Sponsors ====&lt;br /&gt;
&lt;br /&gt;
'''Chapter Leader'''&lt;br /&gt;
&lt;br /&gt;
* [mailto:John.Steven@owasp.org John Steven], with assistance from [mailto:paco@cigital.com Paco Hope]&lt;br /&gt;
&lt;br /&gt;
'''Refreshment Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Cigital_OWASP.GIF]]&lt;br /&gt;
&lt;br /&gt;
'''Facility Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Bah-bw.JPG|215px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__NOTOC__&lt;br /&gt;
&amp;lt;headertabs/&amp;gt;&lt;br /&gt;
&amp;lt;paypal&amp;gt;Northern Virginia&amp;lt;/paypal&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Past Meetings ==&lt;br /&gt;
&lt;br /&gt;
===June 2009===&lt;br /&gt;
''Gary McGraw, Cigital Inc.'':''Building Security In Maturity Model''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, an interview:&lt;br /&gt;
''Jim Routh, formerly of DTCC'': ''The Economic Advantages of a Resilient Supply Chain - Software Security''&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Gary McGraw talked about the experience he, Sammy Migues, and Brian Chess gained conducting a survey of some of America's top Software Security groups. Study results are available under the [http://creativecommons.org/licenses/by-sa/3.0/ Creative Commons Share Alike license] at [http://www.bsi-mm.com www.bsi-mm.com]. Gary described the common structural elements and activities of successful software security programs, presented the maturity model that resulted from the survey data, and discussed lessons learned from listening to those leading these groups. &lt;br /&gt;
&lt;br /&gt;
Jim Routh gave an incredibly insightful interview regarding his own experiences crafting DTCC's security group. &lt;br /&gt;
&lt;br /&gt;
Download presentation notes at: [http://www.owasp.org/images/0/03/JMR-Economics_of_Security_Goups.ppt The Economic Advantages of a Resilient Supply Chain- Software Security]&lt;br /&gt;
&lt;br /&gt;
===May 2009 ===&lt;br /&gt;
''Eric Dalci, Cigital Inc.'':''Introduction to Static Analysis''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, a panel:&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Steven Lavenhar, Booz Allen Hamilton;&lt;br /&gt;
&amp;lt;LI&amp;gt;Eric Dalci, Cigital Inc.&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
Panel moderated by John Steven&lt;br /&gt;
&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This session is an introduction to Static Analysis. It presents the different types of analysis used by today's Static Analysis tools (e.g., Data Flow Analysis, Semantic Analysis, Control Flow Analysis), with examples of their direct application to finding vulnerabilities. Current limitations of Static Analysis will also be covered. The session is tool agnostic, but will cover the approach taken by various leading commercial (as well as open-source) tools.&lt;br /&gt;
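To give a flavor of what the session covers, here is a deliberately tiny sketch of taint-style data flow analysis. The statement format, source list, and sink list are made up for illustration; real tools work over full ASTs and control-flow graphs rather than straight-line statement lists.&lt;br /&gt;

```python
# Toy taint (data-flow) analysis: track values from an untrusted source
# to a sensitive sink. The (target, function, argument) statement shape
# is an illustrative simplification, not any real tool's model.

SOURCES = {"request.getParameter"}   # where untrusted data enters
SINKS = {"db.execute"}               # where tainted data must not flow

def analyze(statements):
    """Each statement is (target_var, function_called, argument_var)."""
    tainted = set()
    findings = []
    for target, func, arg in statements:
        if func in SOURCES:
            tainted.add(target)               # value now carries taint
        elif arg in tainted:
            if func in SINKS:
                findings.append((func, arg))  # a source reached a sink
            elif target:
                tainted.add(target)           # taint propagates via assignment
    return findings

program = [
    ("name", "request.getParameter", None),   # source: user input
    ("query", "buildQuery", "name"),          # taint propagates
    (None, "db.execute", "query"),            # sink: SQL injection finding
]
print(analyze(program))  # → [('db.execute', 'query')]
```

Data flow is only one of the analysis types above; semantic and control-flow rules look at API usage and call ordering rather than value propagation.&lt;br /&gt;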
&lt;br /&gt;
Download: [http://www.owasp.org/images/e/ea/OWASP_Virginia_Edalci_May09.pdf Intro to Static Analysis]&lt;br /&gt;
&lt;br /&gt;
===April 2009 ===&lt;br /&gt;
''Jeremiah Grossman, Whitehat Security'': '''Top 10 Web Hacking Techniques 2008'''&amp;lt;br&amp;gt;&lt;br /&gt;
Jeremiah spoke on (what he and colleagues determined were the) top ten web hacking techniques of 2008. This talk was a preview of his RSA '09 talk.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Download: [http://www.whitehatsec.com/home/assets/presentations/09PPT/PPT_OWASPNoVA04082008.pdf Top 10 Web Hacking Techniques 2008]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Later,&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Nate Miller, Stratum Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Jeremiah Grossman, Whitehat Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Tom Brennan, Whitehat Security; and&lt;br /&gt;
&amp;lt;LI&amp;gt;Wade Woolwine, AOL&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
served as a penetration testing panel, answering questions in a discussion moderated by Ken Van Wyk.&lt;br /&gt;
&lt;br /&gt;
=== February 2009 ===&lt;br /&gt;
&lt;br /&gt;
''Ryan C. Barnett, Breach Security'': '''Patching Challenge: Securing WebGoat with ModSecurity'''&lt;br /&gt;
&lt;br /&gt;
Identification of web application vulnerabilities is only half the battle; remediation is the other half. Let's face the facts: there are many real-world business scenarios where it is not possible to update web application code in a timely manner, or at all. This is where the tactical use-case of implementing a web application firewall to address identified issues proves its worth.&lt;br /&gt;
&lt;br /&gt;
This talk will provide an overview of the recommended practices for utilizing a web application firewall for virtual patching. After discussing the framework to use, we will then present a very interesting OWASP Summer of Code project where the challenge was to mitigate as many of the OWASP WebGoat vulnerabilities as possible using the open source ModSecurity web application firewall. During the talk, we will discuss both WebGoat and ModSecurity and provide in-depth walk-throughs of some of the complex fixes. Examples will include addressing not only attacks but the underlying vulnerabilities, using data persistence for multiple-step processes, content injection, and even examples of the new Lua programming language API. The goal of this talk is both to highlight cutting-edge mitigation options using a web application firewall and to show how they can effectively be used by security consultants who traditionally could only offer source code fixes.&lt;br /&gt;
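The core idea behind virtual patching can be sketched outside of ModSecurity's rule language as a simple request filter placed in front of the vulnerable application. The pattern and the WSGI wrapper below are illustrative only, not ModSecurity syntax:&lt;br /&gt;

```python
import re

# Minimal virtual-patch sketch: block exploit payloads in front of an
# application you cannot immediately fix. A real WAF such as ModSecurity
# offers a far richer rule language; this shows only the core idea.

# Hypothetical patch rule: reject requests whose query string contains
# an obvious SQL injection probe.
PATCH_RULES = [re.compile(r"('|--|\bunion\b\s+select\b)", re.IGNORECASE)]

def virtual_patch(app):
    """Wrap a WSGI app; short-circuit requests matching a patch rule."""
    def guarded(environ, start_response):
        query = environ.get("QUERY_STRING", "")
        if any(rule.search(query) for rule in PATCH_RULES):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Request blocked by virtual patch"]
        return app(environ, start_response)
    return guarded
```

Wrapping the deployed application this way buys time until the underlying code fix ships, which is precisely the tactical use-case the talk describes.&lt;br /&gt;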
&lt;br /&gt;
Ryan C. Barnett is the Director of Application Security Research at Breach Security and leads Breach Security Labs. He is also a Faculty Member for the SANS Institute, Team Lead for the Center for Internet Security Apache Benchmark Project and a Member of the Web Application Security Consortium where he leads the Distributed Open Proxy Honeypot Project. Mr. Barnett has also authored a web security book for Addison/Wesley Publishing entitled &amp;quot;Preventing Web Attacks with Apache.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
(This talk is a preview of Ryan's talk at Blackhat Federal the following week - see https://www.blackhat.com/html/bh-dc-09/bh-dc-09-speakers.html#Barnett )&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Virtual_Patching_Ryan_Barnett_Blackhat_Federal_09.zip| WAF Virtual Patching Challenge: Securing WebGoat with ModSecurity]]&lt;br /&gt;
&lt;br /&gt;
''John Steven, Cigital'': '''Moving Beyond Top N Lists'''&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Moving_Beyond_Top_N_Lists.ppt.zip| Moving Beyond Top N Lists]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cigital published an article: The Top 11 Reasons Why Top 10 (or 25) Lists Don’t Work. Yet, these lists are a staple of conference abstracts, industry best practice lists, and the like. Are they good or bad? We’ll explore how to get beyond the Top 10 (or 25) list in making your software security effort real.&lt;br /&gt;
&lt;br /&gt;
John is Senior Director, Advanced Technology Consulting at Cigital. His experience includes research in static code analysis and hands-on architecture and implementation of high-performance, scalable Java EE systems. John has provided security consulting services to a broad variety of commercial clients, including two of the largest trading platforms in the world, and has advised America's largest internet provider in the Midwest on security and forensics. John led the development of Cigital's architectural analysis methodology and its approach to deploying enterprise software security frameworks. He has demonstrated success in building Cigital's intellectual property for providing cutting-edge security. He brings this experience and a track record of effective strategic innovation to clients seeking to change, whether to adopt more cutting-edge approaches or to solidify ROI. John currently chairs the SD Best Practices security track and co-edits the Building Security In department of IEEE's Security and Privacy magazine. John has served on numerous conference panels regarding software security, wireless security, and Java EE system development. He holds a B.S. in Computer Engineering and an M.S. in Computer Science from Case Western Reserve University.&lt;br /&gt;
&lt;br /&gt;
=== January 2009 ===&lt;br /&gt;
&lt;br /&gt;
To kick off 2009, our January meeting featured a discussion of the relationship between application security and CMMI, and an overview of the OWASP ASVS project.&lt;br /&gt;
&lt;br /&gt;
''Michele Moss, Booz Allen Hamilton'': '''Evolutions In The Relationship Between Application Security And The CMMI'''&lt;br /&gt;
 &lt;br /&gt;
Addressing new and complex threats and IT security challenges requires repeatable, reliable, rapid, and cost effective solutions.  To implement these solutions, organizations have begun to align their security improvement efforts with their system and software development practices.  During a “Birds of a Feather” at the March 2007 SEPG, a group of industry representatives initiated an effort which led to the definition of assurance practices that can be applied in the context of the CMMI. This presentation will provide an understanding of how applying the assurance practices in the context of security contributes to the overall increased quality of products and services, illustrate how a focus on assurance in the context of CMMI practices is related to application security practices, and present an approach to evaluating and improving the repeatability and reliability of assurance practices. &lt;br /&gt;
 &lt;br /&gt;
Michele Moss, CISSP, is a security engineer with more than 12 years of experience in process improvement. She specializes in integrating assurance processes and practices into project lifecycles. Michele is the Co-Chair of the DHS Software Assurance Working Group on Processes &amp;amp; Practices. She has assisted numerous organizations with maturing their information technology, information assurance, project management, and support practices through the use of capability maturity models, including the CMMI and the SSE-CMM. She is one of the key contributors in an effort to apply an assurance focus to CMMI.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Moss-AppSecurityAndCMMI.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Mike Boberski, Booz Allen Hamilton'': '''About OWASP ASVS'''&lt;br /&gt;
&lt;br /&gt;
The primary aim of the OWASP ASVS Project is to normalize the range of coverage and level of rigor available in the market when it comes to performing application-level security verification. The goal is to create a set of commercially-workable open standards that are tailored to specific web-based technologies.&lt;br /&gt;
&lt;br /&gt;
Mike Boberski works at Booz Allen Hamilton. He has a background in application security and the use of cryptography by applications. He is experienced in trusted product evaluation, security-related software development and integration, and cryptomodule testing. For OWASP, he is the project lead and a co-author of the OWASP Application Security Verification Standard, the first OWASP standard.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[https://www.owasp.org/images/5/52/About_OWASP_ASVS_Web_Edition.ppt]]&lt;br /&gt;
&lt;br /&gt;
=== November 2008 ===&lt;br /&gt;
For our November 2008 meeting, we had two great presentations on software assurance and security testing.&lt;br /&gt;
&lt;br /&gt;
''Nadya Bartol, Booz Allen Hamilton'': '''Framework for Software Assurance'''&lt;br /&gt;
&lt;br /&gt;
Nadya's presentation will provide an update on the Software Assurance Forum efforts to establish a comprehensive framework for software assurance (SwA) and security measurement. The Framework addresses measuring achievement of SwA goals and objectives within the context of individual projects, programs, or enterprises, and targets a variety of audiences including executives, developers, vendors, suppliers, and buyers. It leverages existing measurement methodologies, including Practical Software and System Measurement (PSM); CMMI Goal, Question, Indicator, Measure (GQ(I)M); NIST SP 800-55 Rev1; and ISO/IEC 27004, and identifies commonalities among the methodologies to help organizations integrate SwA measurement into their overall measurement efforts cost-effectively and as seamlessly as possible, rather than establish a standalone SwA measurement effort within an organization. The presentation will provide an update on the SwA Forum Measurement Working Group's work, present the current version of the Framework and the underlying measures development and implementation processes, and propose example SwA measures applicable to a variety of SwA stakeholders. It will also update the group on the latest NIST and ISO standards on information security measurement that are being integrated into the Framework as the standards are developed.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Bartol-MeasurementForOWASP11-13-08.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Paco Hope, Cigital'': '''The Web Security Testing Cookbook'''&lt;br /&gt;
&lt;br /&gt;
The Web Security Testing Cookbook (O'Reilly &amp;amp; Associates, October 2008) gives developers and testers the tools they need to make security testing a regular part of their development lifecycle. Its recipe-style approach covers manual, exploratory testing as well as automated techniques that you can make part of your unit tests or regression cycle. The recipes range from basics like observing messages between clients and servers to multi-phase tests that script the login and execution of web application features. This book complements the many security texts on the market that tell you what a vulnerability is, but not how to systematically test for it day in and day out. Leverage the recipes in this book to add significant security coverage to your testing without adding significant time and cost to your effort.&lt;br /&gt;
&lt;br /&gt;
Congratulations to Tim Bond, who won an autographed copy of Paco's book.&lt;br /&gt;
Get your copy here: [[http://www.amazon.com/Security-Testing-Cookbook-Paco-Hope/dp/0596514832]]&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/PacoHope-WebSecCookbook.pdf]]&lt;br /&gt;
&lt;br /&gt;
=== October 2008 ===&lt;br /&gt;
For our October 2008 meeting, we had two fascinating talks relating to forensics.&lt;br /&gt;
&lt;br /&gt;
''Dave Merkel, Mandiant'': '''Enterprise Grade Incident Management - Responding to Persistent Threats'''&lt;br /&gt;
&lt;br /&gt;
Dave Merkel is Vice President of Products at Mandiant, a leading provider of information security services, education and products. Mr. Merkel has worked in the information security and incident response industry for over 10 years. His background includes service as a federal agent in the US Air Force and over 7 years experience directing security operations at America Online. He currently oversees the product business at Mandiant, and is in charge of building Mandiant Intelligent Response - an enterprise incident response solution. But no, he won't be selling you anything today.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Mandiant-EnterpriseIRandAPTpresentation.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Inno Eroraha, NetSecurity'': '''Responding to the Digital Crime Scene: Gathering Volatile Data'''&lt;br /&gt;
&lt;br /&gt;
Inno Eroraha is the founder and chief strategist of NetSecurity Corporation, a company that provides digital forensics, hands-on security consulting, and Hands-on How-To® training solutions that are high-quality, timely, and customer-focused. In this role, Mr. Eroraha helps clients plan, formulate, and execute the best security and forensics strategy that aligns with their business goals and priorities. He has consulted with Fortune 500 companies, IRS, DHS, VA, DoD, and other entities.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData-TechnoForensics-102908.pdf]]&lt;br /&gt;
&lt;br /&gt;
==Knowledge==&lt;br /&gt;
On the [[Knowledge]] page, you'll find links to this chapter's contributions organized by topic area.&lt;br /&gt;
 &lt;br /&gt;
[[Category:Virginia]]&lt;br /&gt;
[[Category:Washington, DC]]&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=67215</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=67215"/>
				<updated>2009-08-04T18:06:07Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Tool license and Vendor IP */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will have to fill out a short questionnaire before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions related to tool comparison between the vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Mike Mauro)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course to the Fortify SCA Static Analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will have an open discussion after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
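As a rough illustration of what a control-flow custom rule from the agenda enforces (always call securityCheck() before downloadFile()), the toy checker below scans a linear call trace. Fortify SCA expresses such rules declaratively over the program's real control-flow graph, so this is only a conceptual sketch:&lt;br /&gt;

```python
# Sketch of a control-flow rule: on every path, the guard function must
# run before the guarded one. The linear call-trace model is a teaching
# simplification; real tools analyze all paths through the program.

def check_call_order(calls, guard="securityCheck", guarded="downloadFile"):
    """Return indexes where guarded() runs without a prior guard() call."""
    guard_seen = False
    violations = []
    for i, call in enumerate(calls):
        if call == guard:
            guard_seen = True
        elif call == guarded and not guard_seen:
            violations.append(i)   # rule violated at this call site
    return violations

print(check_call_order(["login", "downloadFile"]))          # → [1]
print(check_call_order(["securityCheck", "downloadFile"]))  # → []
```

The data flow, semantic, structural, and configuration rule types on the agenda follow the same pattern: encode an expectation about the code, then let the tool flag every place it does not hold.&lt;br /&gt;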
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course to the Ounce Lab Static Analysis tool:&lt;br /&gt;
* Ounce Lab will demo its tool and scan WebGoat. We will have an open discussion after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring their own code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell these tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=67214</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=67214"/>
				<updated>2009-08-04T18:05:12Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may enforce a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Mike Mauro)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Fortify SCA Static Analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion session after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Ounce Lab Static Analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will hold an open discussion session after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integrating a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66598</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66598"/>
				<updated>2009-07-27T00:40:33Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 6: Tool Adoption and Deployment, September 2009 (date not set) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may enforce a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Mike Mauro)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Fortify SCA Static Analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion session after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Ounce Lab Static Analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will hold an open discussion session after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integrating a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66597</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66597"/>
				<updated>2009-07-27T00:40:07Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 5: Customization Lab for Ounce Lab, September 2009 (date not set) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may enforce a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Mike Mauro)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Fortify SCA Static Analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion session after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to the Ounce Lab Static Analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will hold an open discussion session after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
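The source/sink and cleanse rule types above can be sketched with a minimal, hypothetical Java example; getEmployeeSSN() and the custom logger follow the agenda's illustrative names, and mask() stands in for a cleanse function:&lt;br /&gt;

```java
// Hypothetical sketch of the source-to-sink pattern a custom data flow
// rule would catch: a sensitive source reaching a custom logger sink
// without cleansing. All names are illustrative, not a real API.
public class DataFlowExample {
    static String getEmployeeSSN() {        // rule: mark as a taint source
        return "078-05-1120";               // a well-known example SSN
    }

    static void customLog(String msg) {     // rule: mark as a taint sink
        System.out.println("[LOG] " + msg);
    }

    static String mask(String ssn) {        // rule: mark as a cleanse function
        return "***-**-" + ssn.substring(ssn.length() - 4);
    }

    public static void main(String[] args) {
        String ssn = getEmployeeSSN();
        customLog(mask(ssn));  // compliant: taint cleansed before the sink
        // customLog(ssn);     // would be flagged: source flows to sink
    }
}
```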
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&amp;lt;p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation lays out a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66596</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66596"/>
				<updated>2009-07-27T00:39:53Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date not set) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will have to complete a short questionnaire before the session so the instructors can get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required in all sessions except the last one. We will start registration by email mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed.&lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not have the tool installed on their machines; they will be piloting copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions related to tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital), Mike Mauro (Fortify)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring their own code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA)&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g., getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date to be confirmed)===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital), Ounce Labs (TBD)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring their own code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&amp;lt;p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation lays out a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66595</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66595"/>
				<updated>2009-07-27T00:38:43Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 5: Tool Adoption and Deployment, September 2009 (date not set) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will have to complete a short questionnaire before the session so the instructors can get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required in all sessions except the last one. We will start registration by email mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed.&lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not have the tool installed on their machines; they will be piloting copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions related to tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital), Mike Mauro (Fortify)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring their own code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA)&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g., getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital), Ounce Labs (TBD)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring their own code to scan.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&amp;lt;p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation lays out a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now-defunct &amp;quot;CodeAssure&amp;quot; tool from what was then Secure Software.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66594</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66594"/>
				<updated>2009-07-27T00:36:35Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 5: Tool Adoption and Deployment, September 17th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will complete a short pre-session interview so the instructors can gauge their skill level and motivation, and they are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will open registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Students’ prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of each tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Mike Mauro)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own, and should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.)&lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
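To make the control-flow rule item above concrete, here is a minimal Java sketch of the call-order contract such a rule checks. The method names come from the agenda; the class and bodies are hypothetical stand-ins, not code from any tool.&lt;br /&gt;

```java
// Illustrative sketch only: the contract behind the control-flow rule
// "always call securityCheck() before downloadFile()". FileService and
// the method bodies are invented for illustration.
public class FileService {
    private boolean checkPassed = false;

    // Must run before any download; a control-flow rule flags code
    // paths that reach downloadFile() without passing through here.
    public void securityCheck() {
        checkPassed = true;
    }

    public String downloadFile(String name) {
        if (!checkPassed) {
            // The situation a static control-flow rule reports at scan time,
            // without ever running the code.
            throw new IllegalStateException("securityCheck() was not called");
        }
        return "contents of " + name;
    }
}
```

A custom control-flow rule encodes exactly this ordering, so the scanner can report every path that skips the check.&lt;br /&gt;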
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own, and should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
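As an illustration of the sensitive-API item in the agenda above: a semantic rule simply flags every call site of a named method such as getEmployeeSSN(). The Java sketch below is hypothetical (the Employee class and the masking helper are invented for illustration):&lt;br /&gt;

```java
// Hypothetical illustration of a sensitive API that a semantic rule
// would flag at every call site, plus a safer masked accessor.
public class Employee {
    private final String ssn;

    public Employee(String ssn) {
        this.ssn = ssn;
    }

    // Sensitive API: a semantic rule reports each use of this method so
    // reviewers can verify the raw SSN never reaches logs or the UI.
    public String getEmployeeSSN() {
        return ssn;
    }

    // Safe alternative: expose only the last four digits.
    public String getMaskedSSN() {
        int len = ssn.length();
        return "***-**-" + ssn.substring(len - 4);
    }
}
```

Pointing the rule at getEmployeeSSN() turns an organization-specific coding standard (mask SSNs before display) into an automated finding.&lt;br /&gt;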
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integrating a static analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE (level of effort)&lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing organizations with a force multiplier. The following presentation lays out a comprehensive set of steps an organization can take to adopt such a tool successfully: who should adopt it, what steps they should take, whom they should involve, how long it will take, and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis lies predominantly in a tool's &amp;quot;out of the box&amp;quot; capabilities sell it dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now-defunct &amp;quot;CodeAssure&amp;quot; tool from what was then Secure Software.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66593</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66593"/>
				<updated>2009-07-27T00:36:19Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 5: Customization Lab for Ounce Lab, August 27th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will complete a short pre-session interview so the instructors can gauge their skill level and motivation, and they are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will open registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Students’ prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of each tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Mike Mauro)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own, and should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.)&lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Ounce Lab (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own, and should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, September 2009 (date not set)===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integrating a static analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE (level of effort)&lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing organizations with a force multiplier. The following presentation lays out a comprehensive set of steps an organization can take to adopt such a tool successfully: who should adopt it, what steps they should take, whom they should involve, how long it will take, and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis lies predominantly in a tool's &amp;quot;out of the box&amp;quot; capabilities sell it dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now-defunct &amp;quot;CodeAssure&amp;quot; tool from what was then Secure Software.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66592</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66592"/>
				<updated>2009-07-27T00:35:49Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 4: Customization Lab (Ounce Lab), August 27th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will complete a short pre-session interview so the instructors can gauge their skill level and motivation, and they are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will open registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Students’ prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of each tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools, and different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Fortify (Mike Mauro)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own, and should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
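As a concrete illustration of the control-flow rule item above, the following is a hypothetical Java sketch (the FileService class and its field are assumptions for illustration; only securityCheck() and downloadFile() come from the agenda). A rule of the form "always call securityCheck() before downloadFile()" flags any path that reaches the download without the check:

```java
// Hypothetical sketch of the pattern a custom control-flow rule targets:
// every path to downloadFile() must first pass through securityCheck().
public class FileService {
    private boolean checked = false;

    // The guard the rule expects on every path to the sensitive call.
    public void securityCheck() {
        checked = true;
    }

    // A compliant caller invokes securityCheck() first; a static analysis
    // control-flow rule would flag call sites that skip it.
    public String downloadFile(String name) {
        if (!checked) {
            throw new IllegalStateException("securityCheck() not called before downloadFile()");
        }
        return "contents of " + name;
    }

    public static void main(String[] args) {
        FileService svc = new FileService();
        svc.securityCheck();                          // compliant ordering
        System.out.println(svc.downloadFile("report.pdf"));
    }
}
```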
&lt;br /&gt;
===Session 4: Tool Assisted Code Reviews with Ounce Lab, September 2009 (date not set)===&lt;br /&gt;
*Speakers: Eric Dalci (Cigital) and Ounce Labs (TBD)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Ounce Labs static analysis tool:&lt;br /&gt;
* Ounce Labs will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Session 5: Customization Lab for Ounce Lab, August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 4&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI&lt;br /&gt;
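The data-flow source/sink item above (private data sent to a custom logger) can be sketched in hypothetical Java. Only getEmployeeSSN() comes from the agenda; the PrivacyDemo class, log(), and mask() are assumptions made up for illustration, not course material:

```java
// Hypothetical sketch of the data-flow pattern a custom source/sink rule targets:
// getEmployeeSSN() is a taint source, and the custom logger is a sensitive sink.
public class PrivacyDemo {
    // Taint source: returns private data (hypothetical sensitive API).
    static String getEmployeeSSN() {
        return "123-45-6789";
    }

    // Sensitive sink: a custom logger that should never receive raw SSNs.
    static String log(String message) {
        return "LOG: " + message;
    }

    // Masking all but the last four digits before the sink satisfies the rule.
    static String mask(String ssn) {
        return "***-**-" + ssn.substring(ssn.length() - 4);
    }

    public static void main(String[] args) {
        // A data-flow rule would flag log(getEmployeeSSN()) directly;
        // routing through mask() first keeps private data out of the logs.
        System.out.println(log(mask(getEmployeeSSN())));
    }
}
```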
&lt;br /&gt;
===Session 6: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integrating a static analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threat facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell these tools far short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66591</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66591"/>
				<updated>2009-07-27T00:33:43Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 3: Customization Lab (Fortify), August 27th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may enforce a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will fill out a short questionnaire before the session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not have the tools installed on their own machines; they will be piloting copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions about comparisons between the vendors present are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: Open. &lt;br /&gt;
This presentation will give a taste of what static analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speakers: Eric Dalci (Cigital) and Mike Mauro (Fortify)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab for Fortify SCA, August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g., getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integrating a static analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threat facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell these tools far short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66590</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66590"/>
				<updated>2009-07-27T00:33:20Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 2: Tool Assisted Code Reviews, August 13th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may enforce a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will fill out a short questionnaire before the session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not have the tools installed on their own machines; they will be piloting copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions about comparisons between the vendors present are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: Open. &lt;br /&gt;
This presentation will give a taste of what static analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews with Fortify SCA, August 13th 2009===&lt;br /&gt;
*Speakers: Eric Dalci (Cigital) and Mike Mauro (Fortify)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will hold an open discussion after the demo so students can ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g., getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integrating a static analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threat facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell these tools far short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66589</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66589"/>
				<updated>2009-07-27T00:32:40Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 3: Customization Lab (Fortify), August 13th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, since people may want to pair up, but we may have a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who show commitment and sign up for multiple sessions. Students will fill out a short questionnaire before each session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed.&lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what static analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, August 13th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital), Mike Mauro (Fortify)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 27th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc.&lt;br /&gt;
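The control-flow bullet above ("always call securityCheck() before downloadFile()") is worth a concrete illustration. The following hypothetical Java sketch shows the code shape such a rule targets; the method names come from the agenda, but the class, the field, and the runtime guard are invented here purely to make the ordering constraint visible. A real control-flow rule finds violating paths statically, without any runtime check.

```java
// Toy illustration of the control-flow constraint a custom rule enforces:
// downloadFile() must never execute on a path where securityCheck() has
// not succeeded first.
public class ControlFlowRuleSketch {
    private boolean checked = false;

    boolean securityCheck(String user) {
        // Stand-in for a real authorization decision.
        checked = "alice".equals(user);
        return checked;
    }

    String downloadFile(String name) {
        if (!checked) {
            // This is the pattern a control-flow rule flags statically:
            // a path reaching downloadFile() with no prior securityCheck().
            throw new IllegalStateException("securityCheck() not called");
        }
        return "contents of " + name;
    }

    public static void main(String[] args) {
        ControlFlowRuleSketch app = new ControlFlowRuleSketch();
        app.securityCheck("alice");               // compliant ordering
        System.out.println(app.downloadFile("report.pdf"));
    }
}
```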
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
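The data-flow bullets above (a private-data source, a custom-logger sink, and a cleanse on the path between them) can be sketched in a few lines of Java. Everything here is hypothetical illustration: mask() and CustomLogger are invented names, and the NNN-NN-NNNN format assumption is ours. A data-flow rule would mark getEmployeeSSN-style data as tainted, the logger as a sink, and mask() as a cleanse that makes the flow acceptable.

```java
// Toy source-to-sink flow with a cleanse step in between.
public class DataFlowCleanseSketch {

    // Cleanse: assumes the NNN-NN-NNNN format; keep only the last four digits.
    static String mask(String ssn) {
        return "***-**-" + ssn.substring(7);
    }

    static class CustomLogger {
        void info(String msg) { System.out.println(msg); }
    }

    public static void main(String[] args) {
        String ssn = "123-45-6789";               // source: private data
        CustomLogger log = new CustomLogger();    // sink: custom logger
        log.info("employee ssn: " + mask(ssn));   // cleansed flow: acceptable
    }
}
```

Without the cleanse rule, a tool would either miss this flow entirely (false negative) or flag the masked call as a leak (false positive), which is exactly the accuracy gap customization closes.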
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for an organization. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who assume that the value of static analysis lies predominantly in a tool's capabilities &amp;quot;out of the box&amp;quot; sell it dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66588</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66588"/>
				<updated>2009-07-27T00:32:17Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 2: Tool Assisted Code Reviews, August 13th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, since people may want to pair up, but we may have a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who show commitment and sign up for multiple sessions. Students will fill out a short questionnaire before each session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed.&lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what static analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, August 13th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital), Mike Mauro (Fortify)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on the Fortify SCA static analysis tool:&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. We will have an open discussion session after the demo for students to ask the vendor questions. After this course, students should be able to scan code on their own. Students should feel free to bring code to scan.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for an organization. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who assume that the value of static analysis lies predominantly in a tool's capabilities &amp;quot;out of the box&amp;quot; sell it dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66587</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66587"/>
				<updated>2009-07-27T00:28:00Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 2: Tool Assisted Code Reviews, August 6th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, since people may want to pair up, but we may have a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who show commitment and sign up for multiple sessions. Students will fill out a short questionnaire before each session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed.&lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what static analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, August 13th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital), Mike Mauro (Fortify)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: AOL, 22000 AOL Way, Dulles, VA 20166&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two static analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. Ounce Labs will demo its tool, but we may scan a different project, such as HacmeBank, to avoid tool comparison. We will have an open discussion session after each demo for students to ask the vendors questions. Vendors should not interfere with each other’s sessions. Questions comparing the tools will not be answered, since that is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long and how much adoption will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell these tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66586</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66586"/>
				<updated>2009-07-27T00:25:47Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Static Analysis Curriculum */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may enforce a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will have to complete a short questionnaire before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed.&lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their machines; they will pilot copies of each tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions related to tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download the slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, August 6th 2009===&lt;br /&gt;
*Speaker: Eric Dalci (Cigital), Bruce Mayhew (Ounce Labs), and Fortify (TBD)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. Ounce Labs will demo its tool, but may scan a different project, such as HacmeBank, to avoid tool comparison. We will hold an open discussion after each demo for students to ask the vendors questions. Vendors should not interfere with each other’s sessions. Questions related to tool comparison will not be answered, since that is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.)&lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long and how much adoption will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell these tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=File:Owasp_SAtrack_plan.png&amp;diff=66585</id>
		<title>File:Owasp SAtrack plan.png</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=File:Owasp_SAtrack_plan.png&amp;diff=66585"/>
				<updated>2009-07-27T00:25:03Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: uploaded a new version of &amp;quot;File:Owasp SAtrack plan.png&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66584</id>
		<title>Virginia</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66584"/>
				<updated>2009-07-27T00:22:48Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Static Analysis Curriculum */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==== About ====&lt;br /&gt;
[[Image:Owasp-nova.JPG|275px|right]]The '''OWASP Washington VA Local Chapter''' meetings are FREE and OPEN to anyone interested in learning more about application security. We encourage individuals to provide knowledge transfer via hands-on training and presentations of specific OWASP projects and research topics and sharing SDLC knowledge. &lt;br /&gt;
&lt;br /&gt;
We encourage vendor-agnostic presentations that use the OWASP PowerPoint template when applicable, and individual volunteerism to enable perpetual growth. As a 501(c)(3) non-profit association, we welcome donations of meeting space or refreshment sponsorship; simply contact the local chapter leaders listed on this page to discuss. Prior to participating with OWASP, please review the Chapter Rules.&lt;br /&gt;
&lt;br /&gt;
The original DC Chapter was founded in June 2004 by [mailto:jeff.williams@owasp.org Jeff Williams] and has had members from Virginia to Delaware. In April 2005 a new chapter, OWASP Washington VA Local Chapter, was formed and the DC Chapter was renamed to DC-Maryland. The two are sister chapters and include common members and shared discourse. The chapters meet in opposite halves of the month to facilitate this relationship.&lt;br /&gt;
&lt;br /&gt;
{{Chapter Template|chaptername=Virginia|extra=The chapter leader is [mailto:John.Steven@owasp.org John Steven]|mailinglistsite=http://lists.owasp.org/mailman/listinfo/owasp-CHANGEME|emailarchives=http://lists.owasp.org/pipermail/owasp-CHANGEME}}&lt;br /&gt;
* [http://lists.owasp.org/mailman/listinfo/owasp-wash_dc_va Click here to join local chapter mailing list]&lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Locations ====&lt;br /&gt;
'''If you plan to attend in person:'''&lt;br /&gt;
&lt;br /&gt;
Directions to Booz Allen's One Dulles facility:&lt;br /&gt;
&lt;br /&gt;
13200 Woodland Park Road&lt;br /&gt;
Herndon, VA 20171&lt;br /&gt;
&lt;br /&gt;
From Tyson's Corner:&lt;br /&gt;
&lt;br /&gt;
* Take LEESBURG PIKE / VA-7 WEST&lt;br /&gt;
* Merge onto VA-267 WEST / DULLES TOLL ROAD (Portions Toll)&lt;br /&gt;
* Take the VA-657 Exit (Exit Number 10 towards Herndon / Chantilly)&lt;br /&gt;
* Take the ramp toward CHANTILLY&lt;br /&gt;
* Turn Left onto CENTERVILLE ROAD (at end of ramp)&lt;br /&gt;
* Turn Left onto WOODLAND PARK ROAD (less than 1⁄2 mile)&lt;br /&gt;
* End at 13200 WOODLAND PARK ROAD&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;'''If you plan to attend via Webinar:'''&lt;br /&gt;
&lt;br /&gt;
You can attend through [[OWASPNoVA WebEx]] &lt;br /&gt;
&lt;br /&gt;
==== Schedule ====&lt;br /&gt;
'''Next Meeting'''&amp;lt;P&amp;gt;&lt;br /&gt;
Future speakers to include Gunnar Peterson, Dan Cornell, and more.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owasp SAtrack plan.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may enforce a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis. '''REGISTRATION IS OPEN!'''&lt;br /&gt;
&lt;br /&gt;
Please send an email to [mailto:John.Steven@owasp.org John Steven] with your skill level with Static Analysis tools, your motivation, and '''the dates''' you want to sign up for.&lt;br /&gt;
Students are required to bring their own laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptop.&lt;br /&gt;
&lt;br /&gt;
=== Past meetings ===&lt;br /&gt;
July 9th 6pm-9pm EST&amp;lt;br&amp;gt;&lt;br /&gt;
LOCATION: 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
TOPIC: &amp;quot;Ounce's 02&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
SPEAKER(S): Dinis Cruz, OWASP, Ounce Labs.&amp;lt;BR&amp;gt;&lt;br /&gt;
PANEL: TBD&amp;lt;BR&amp;gt;&lt;br /&gt;
INSTRUCTIONS: RSVP to Stan Wisseman (wisseman_stan@bah.com) with “OWASP RSVP” in the subject line.&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
DESCRIPTION: So what is O2?&lt;br /&gt;
&lt;br /&gt;
Well, in my mind O2 is a combination of advanced tools (Technology) which are designed to be used in a particular way (Process) by knowledgeable individuals (People).&lt;br /&gt;
&lt;br /&gt;
Think of it as a fighter jet that can go very fast, has tons of controls, needs to be piloted by somebody who knows what they are doing, and needs to have a purpose (i.e., a mission).&lt;br /&gt;
&lt;br /&gt;
Basically what I did with O2 was to automate the workflow that I have when I'm engaged on a source-code security review.&lt;br /&gt;
&lt;br /&gt;
Now, here is the catch: this version is NOT for the faint of heart. I designed this to suit my needs, which, although largely the same as those of most other security consultants, have their own particularities :)&lt;br /&gt;
&lt;br /&gt;
The whole model of O2 development is based around the concept of automating a security consultant’s brain, so I basically ensure that the main O2 developer (Dinis Cruz) has a very good understanding of the feature requirements of the targeted security consultant (Dinis Cruz) :). And this proved (even to my surprise) spectacularly productive, since suddenly I (i.e. the security consultant) didn't have to wait months for new features to be added to my toolkit. If there was something that needed to be added, it would just be added in days or hours.&lt;br /&gt;
&lt;br /&gt;
* View the OWASP NoVA Chapter [http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York Calendar ]&lt;br /&gt;
&lt;br /&gt;
* The next meeting is '''Thursday, July 9th, 2009.''' &lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Knowledge ====&lt;br /&gt;
&lt;br /&gt;
The Northern Virginia (NoVA) chapter is committed to compiling resources on interesting and valuable topic areas. We hope that this structure helps you access information pertinent to your tasks at hand as you move through a secure application development life cycle. Currently, our topic areas of focus include activities such as:&lt;br /&gt;
&lt;br /&gt;
* Threat Modeling&lt;br /&gt;
* [[Code Review and Static Analysis with tools]]&lt;br /&gt;
* Penetration Testing and Dynamic Analysis tools&lt;br /&gt;
* Monitoring/Dynamic patching (WAFs)&lt;br /&gt;
&lt;br /&gt;
Certain projects our members are involved in cross-cut these activities, providing value throughout. They include:&lt;br /&gt;
&lt;br /&gt;
* ASVS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Contributors and Sponsors ====&lt;br /&gt;
&lt;br /&gt;
'''Chapter Leader'''&lt;br /&gt;
&lt;br /&gt;
* [mailto:John.Steven@owasp.org John Steven], with assistance from [mailto:paco@cigital.com Paco Hope]&lt;br /&gt;
&lt;br /&gt;
'''Refreshment Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Cigital_OWASP.GIF]]&lt;br /&gt;
&lt;br /&gt;
'''Facility Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Bah-bw.JPG|215px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__NOTOC__&lt;br /&gt;
&amp;lt;headertabs/&amp;gt;&lt;br /&gt;
&amp;lt;paypal&amp;gt;Northern Virginia&amp;lt;/paypal&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Past Meetings ==&lt;br /&gt;
&lt;br /&gt;
===June 2009===&lt;br /&gt;
''Gary McGraw, Cigital Inc.'':''Building Security In Maturity Model''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, an interview:&lt;br /&gt;
''Jim Routh, formerly of DTCC'':''The Economic Advantages of a Resilient Supply Chain- Software Security&lt;br /&gt;
''&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Gary McGraw talked about the experience he, Sammy Migues, and Brian Chess gained conducting a survey of some of America's top software security groups. Study results are available under the [http://creativecommons.org/licenses/by-sa/3.0/ Creative Commons Share Alike license] at [http://www.bsi-mm.com www.bsi-mm.com]. Gary described the common structural elements and activities of successful software security programs, presented the maturity model that resulted from the survey data, and discussed lessons learned from listening to those leading these groups.&lt;br /&gt;
&lt;br /&gt;
Jim Routh gave an incredibly insightful interview regarding his own experiences crafting DTCC's security group.&lt;br /&gt;
&lt;br /&gt;
Download presentation notes at: [http://www.owasp.org/images/0/03/JMR-Economics_of_Security_Goups.ppt The Economic Advantages of a Resilient Supply Chain- Software Security]&lt;br /&gt;
&lt;br /&gt;
===May 2009 ===&lt;br /&gt;
''Eric Dalci, Cigital Inc.'':''Introduction to Static Analysis''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, a panel:&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Steven Lavenhar, Booz Allen Hamilton;&lt;br /&gt;
&amp;lt;LI&amp;gt;Eric Dalci, Cigital Inc.&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
Panel moderated by John Steven&lt;br /&gt;
&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This session is an introduction to Static Analysis. It presents the different types of analysis used by today's Static Analysis tools, with examples of their direct application to finding vulnerabilities (e.g., Data Flow Analysis, Semantic, Control Flow). Current limitations of Static Analysis will also be discussed. The session is tool-agnostic, but covers the approach taken by various leading commercial (as well as open-source) tools.&lt;br /&gt;
&lt;br /&gt;
Download: [http://www.owasp.org/images/e/ea/OWASP_Virginia_Edalci_May09.pdf Intro to Static Analysis]&lt;br /&gt;
&lt;br /&gt;
===April 2009 ===&lt;br /&gt;
''Jeremiah Grossman, Whitehat Security'': '''Top 10 Web Hacking Techniques 2008'''&amp;lt;br&amp;gt;&lt;br /&gt;
Jeremiah spoke on (what he and colleagues determined were the) top ten web hacking techniques of 2008. This talk was a preview of his RSA '09 talk.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Download http://www.whitehatsec.com/home/assets/presentations/09PPT/PPT_OWASPNoVA04082008.pdf&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Later,&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Nate Miller, Stratum Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Jeremiah Grossman, Whitehat Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Tom Brennan, Whitehat Security; and&lt;br /&gt;
&amp;lt;LI&amp;gt;Wade Woolwine, AOL&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
served as a penetration testing panel, answering questions posed and moderated by Ken Van Wyk.&lt;br /&gt;
&lt;br /&gt;
=== February 2009 ===&lt;br /&gt;
&lt;br /&gt;
''Ryan C. Barnett, Breach Security'': '''Patching Challenge: Securing WebGoat with ModSecurity'''&lt;br /&gt;
&lt;br /&gt;
Identification of web application vulnerabilities is only half the battle; remediation is the other half. Let's face the facts: there are many real-world business scenarios where it is not possible to update web application code in a timely manner, or at all. This is where the tactical use case of implementing a web application firewall to address identified issues proves its worth.&lt;br /&gt;
&lt;br /&gt;
This talk will provide an overview of recommended practices for using a web application firewall for virtual patching. After discussing the framework to use, we will present a very interesting OWASP Summer of Code project in which the challenge was to mitigate as many of the OWASP WebGoat vulnerabilities as possible using the open source ModSecurity web application firewall. During the talk, we will discuss both WebGoat and ModSecurity and provide in-depth walk-throughs of some of the complex fixes. Examples will include addressing not only attacks but the underlying vulnerabilities, using data persistence for multiple-step processes, content injection, and even examples of the new Lua API. The goal of this talk is both to highlight cutting-edge mitigation options using a web application firewall and to show how it can effectively be used by security consultants who traditionally could only offer source code fixes.&lt;br /&gt;
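As a rough, hypothetical illustration of the virtual-patching idea described above (not drawn from the talk itself), a ModSecurity rule can deny requests that carry SQL injection metacharacters in a known-vulnerable parameter; the URI fragment and parameter name here are invented:

```apache
# Hypothetical virtual patch for a WebGoat-style lesson: deny requests to a
# known-vulnerable URL when the "account_name" parameter contains SQL
# metacharacters. Both chained rules must match for the request to be blocked.
SecRule REQUEST_URI "@contains /SqlStringInjection" \
    "id:100001,phase:2,t:none,deny,status:403,msg:'Virtual patch: SQL injection',chain"
    SecRule ARGS:account_name "@rx ['\";]"
```

In ModSecurity 2.x the `chain` action joins the two rules, so the deny fires only when both the URI and the parameter match; this lets the patch target one vulnerable page without filtering the whole application.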
&lt;br /&gt;
Ryan C. Barnett is the Director of Application Security Research at Breach Security and leads Breach Security Labs. He is also a Faculty Member for the SANS Institute, Team Lead for the Center for Internet Security Apache Benchmark Project and a Member of the Web Application Security Consortium where he leads the Distributed Open Proxy Honeypot Project. Mr. Barnett has also authored a web security book for Addison/Wesley Publishing entitled &amp;quot;Preventing Web Attacks with Apache.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
(This talk is a preview of Ryan's talk at Blackhat Federal the following week - see https://www.blackhat.com/html/bh-dc-09/bh-dc-09-speakers.html#Barnett )&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Virtual_Patching_Ryan_Barnett_Blackhat_Federal_09.zip| WAF Virtual Patching Challenge: Securing WebGoat with ModSecurity]]&lt;br /&gt;
&lt;br /&gt;
''John Steven, Cigital'': '''Moving Beyond Top N Lists'''&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Moving_Beyond_Top_N_Lists.ppt.zip| Moving Beyond Top N Lists]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cigital published an article: The Top 11 Reasons Why Top 10 (or 25) Lists Don’t Work. Yet, these lists are a staple of conference abstracts, industry best practice lists, and the like. Are they good or bad? We’ll explore how to get beyond the Top 10 (or 25) list in making your software security effort real.&lt;br /&gt;
&lt;br /&gt;
John is Senior Director, Advanced Technology Consulting at Cigital. His experience includes research in static code analysis and hands-on architecture and implementation of high-performance, scalable Java EE systems. John has provided security consulting services to a broad variety of commercial clients, including two of the largest trading platforms in the world, and has advised America's largest internet provider in the Midwest on security and forensics. John led the development of Cigital's architectural analysis methodology and its approach to deploying enterprise software security frameworks. He has demonstrated success in building Cigital's intellectual property for providing cutting-edge security. He brings this experience and a track record of effective strategic innovation to clients seeking to change, whether to adopt more cutting-edge approaches or to solidify ROI. John currently chairs the SD Best Practices security track and co-edits the Building Security In department of IEEE Security and Privacy magazine. John has served on numerous conference panels regarding software security, wireless security, and Java EE system development. He holds a B.S. in Computer Engineering and an M.S. in Computer Science from Case Western Reserve University.&lt;br /&gt;
&lt;br /&gt;
=== January 2009 ===&lt;br /&gt;
&lt;br /&gt;
To kick off 2009, our January meeting featured a discussion of the relationship between application security and CMMI, and an overview of the OWASP ASVS project.&lt;br /&gt;
&lt;br /&gt;
''Michele Moss, Booz Allen Hamilton'': '''Evolutions In The Relationship Between Application Security And The CMMI'''&lt;br /&gt;
 &lt;br /&gt;
Addressing new and complex threats and IT security challenges requires repeatable, reliable, rapid, and cost-effective solutions.  To implement these solutions, organizations have begun to align their security improvement efforts with their system and software development practices.  During a “Birds of a Feather” session at the March 2007 SEPG, a group of industry representatives initiated an effort which led to the definition of assurance practices that can be applied in the context of the CMMI. This presentation will provide an understanding of how applying the assurance practices in the context of security contributes to the overall increased quality of products and services, illustrate how a focus on assurance in the context of CMMI practices relates to application security practices, and present an approach to evaluate and improve the repeatability and reliability of assurance practices. &lt;br /&gt;
 &lt;br /&gt;
Michele Moss, CISSP, is a security engineer with more than 12 years of experience in process improvement. She specializes in integrating assurance processes and practices into project lifecycles. Michele is the Co-Chair of the DHS Software Assurance Working Group on Processes &amp;amp; Practices. She has assisted numerous organizations with maturing their information technology, information assurance, project management, and support practices through the use of the capability maturity models including the CMMI, and the SSE-CMM. She is one of the key contributors in an effort to apply an assurance focus to CMMI.&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/Moss-AppSecurityAndCMMI.pdf Moss-AppSecurityAndCMMI.pdf]&lt;br /&gt;
&lt;br /&gt;
''Mike Boberski, Booz Allen Hamilton'': '''About OWASP ASVS'''&lt;br /&gt;
&lt;br /&gt;
The primary aim of the OWASP ASVS Project is to normalize the range&lt;br /&gt;
of coverage and level of rigor available in the market when it comes to&lt;br /&gt;
performing application-level security verification. The goal is to&lt;br /&gt;
create a set of commercially-workable open standards that are tailored&lt;br /&gt;
to specific web-based technologies.&lt;br /&gt;
&lt;br /&gt;
Mike Boberski works at Booz Allen Hamilton. He has a background in&lt;br /&gt;
application security and the use of cryptography by applications. He is&lt;br /&gt;
experienced in trusted product evaluation, security-related software&lt;br /&gt;
development and integration, and cryptomodule testing. For OWASP, he is&lt;br /&gt;
the project lead and a co-author of the  OWASP Application Security&lt;br /&gt;
Verification Standard, the first OWASP standard.&lt;br /&gt;
&lt;br /&gt;
Slides available: [https://www.owasp.org/images/5/52/About_OWASP_ASVS_Web_Edition.ppt About OWASP ASVS]&lt;br /&gt;
&lt;br /&gt;
=== November 2008 ===&lt;br /&gt;
For our November 2008 meeting, we had two great presentations on software assurance and security testing.&lt;br /&gt;
&lt;br /&gt;
''Nadya Bartol, Booz Allen Hamilton'': '''Framework for Software Assurance'''&lt;br /&gt;
&lt;br /&gt;
Nadya's presentation will provide an update on the Software Assurance&lt;br /&gt;
Forum efforts to establish a comprehensive framework for software&lt;br /&gt;
assurance (SwA) and security measurement.  The Framework addresses&lt;br /&gt;
measuring achievement of SwA goals and objectives within the context of&lt;br /&gt;
individual projects, programs, or enterprises.  It targets a variety of&lt;br /&gt;
audiences including executives, developers, vendors, suppliers, and&lt;br /&gt;
buyers.  The Framework leverages existing measurement methodologies,&lt;br /&gt;
including Practical Software and System Measurement (PSM); CMMI Goal,&lt;br /&gt;
Question, Indicator, Measure (GQ(I)M);  NIST SP 800-55 Rev1; and ISO/IEC&lt;br /&gt;
27004 and identifies commonalities among the methodologies to help&lt;br /&gt;
organizations integrate SwA measurement in their overall measurement&lt;br /&gt;
efforts cost-effectively and as seamlessly as possible, rather than&lt;br /&gt;
establish a standalone SwA measurement effort within an organization.&lt;br /&gt;
The presentation will provide an update on the SwA Forum Measurement&lt;br /&gt;
Working Group work, present the current version of the Framework and underlying measures&lt;br /&gt;
development and implementation processes, and propose example SwA&lt;br /&gt;
measures applicable to a variety of SwA stakeholders.  The presentation&lt;br /&gt;
will update the group on the latest NIST and ISO standards on&lt;br /&gt;
information security measurement that are being integrated into the&lt;br /&gt;
Framework as the standards are being developed.&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/Bartol-MeasurementForOWASP11-13-08.pdf Bartol-MeasurementForOWASP11-13-08.pdf]&lt;br /&gt;
&lt;br /&gt;
''Paco Hope, Cigital'': '''The Web Security Testing Cookbook'''&lt;br /&gt;
&lt;br /&gt;
The Web Security Testing Cookbook (O'Reilly &amp;amp; Associates, October 2008)&lt;br /&gt;
gives developers and testers the tools they need to make security&lt;br /&gt;
testing a regular part of their development lifecycle. Its recipe style&lt;br /&gt;
approach covers manual, exploratory testing as well as automated techniques&lt;br /&gt;
that you can make part of your unit tests or regression cycle. The&lt;br /&gt;
recipes range from the basics, like observing messages between clients and&lt;br /&gt;
servers, to multi-phase tests that script the login and execution of web&lt;br /&gt;
application features. This book complements many of the security texts&lt;br /&gt;
in the market that tell you what a vulnerability is, but not how to&lt;br /&gt;
systematically test it day in and day out. Leverage the recipes in this&lt;br /&gt;
book to add significant security coverage to your testing without adding&lt;br /&gt;
significant time and cost to your effort.&lt;br /&gt;
&lt;br /&gt;
Congratulations to Tim Bond who won an autographed copy of Paco's book.&lt;br /&gt;
Get your copy [http://www.amazon.com/Security-Testing-Cookbook-Paco-Hope/dp/0596514832 here]&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/PacoHope-WebSecCookbook.pdf PacoHope-WebSecCookbook.pdf]&lt;br /&gt;
&lt;br /&gt;
=== October 2008 ===&lt;br /&gt;
For our October 2008 meeting, we had two fascinating talks relating to forensics.&lt;br /&gt;
&lt;br /&gt;
''Dave Merkel, Mandiant'': '''Enterprise Grade Incident Management - Responding to Persistent Threats'''&lt;br /&gt;
&lt;br /&gt;
Dave Merkel is Vice President of Products at Mandiant, a leading provider of information security services, education and products. Mr. Merkel has worked in the information security and incident response industry for over 10 years. His background includes service as a federal agent in the US Air Force and over 7 years experience directing security operations at America Online. He currently oversees the product business at Mandiant, and is in charge of building Mandiant Intelligent Response - an enterprise incident response solution. But no, he won't be selling you anything today.&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/Mandiant-EnterpriseIRandAPTpresentation.pdf Mandiant-EnterpriseIRandAPTpresentation.pdf]&lt;br /&gt;
&lt;br /&gt;
''Inno Eroraha, NetSecurity'': '''Responding to the Digital Crime Scene: Gathering Volatile Data'''&lt;br /&gt;
&lt;br /&gt;
Inno Eroraha is the founder and chief strategist of NetSecurity Corporation, a company that provides digital forensics, hands-on security consulting, and Hands-on How-To® training solutions that are high-quality, timely, and customer-focused. In this role, Mr. Eroraha helps clients plan, formulate, and execute the best security and forensics strategy that aligns with their business goals and priorities. He has consulted with Fortune 500 companies, IRS, DHS, VA, DoD, and other entities.&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData-TechnoForensics-102908.pdf NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData]&lt;br /&gt;
&lt;br /&gt;
==Knowledge==&lt;br /&gt;
On the [[Knowledge]] page, you'll find links to this chapter's contributions organized by topic area.&lt;br /&gt;
 &lt;br /&gt;
[[Category:Virginia]]&lt;br /&gt;
[[Category:Washington, DC]]&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=File:Owasp_SAtrack_plan.png&amp;diff=66583</id>
		<title>File:Owasp SAtrack plan.png</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=File:Owasp_SAtrack_plan.png&amp;diff=66583"/>
				<updated>2009-07-27T00:22:05Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66509</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66509"/>
				<updated>2009-07-24T14:45:29Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owaspsa-track2.PNG|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required in all sessions, except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Tools will not be installed on the students' machines; students will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors' sessions or demos. Questions comparing the tools of the vendors present are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what static analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, August 6th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Bruce Mayhew (Ounce Lab) and Fortify (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. Ounce Labs will demo its tool, but we may scan a different project, such as HacmeBank, just to avoid tool comparison. We will have an open discussion session after each demo for students to ask the vendors questions. Vendors should not interfere with each other's session. Questions related to tool comparison will not be answered, since that is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
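To make the rule types in the agenda above concrete, the hypothetical Java fragment below shows the kinds of patterns a semantic rule (use of a sensitive API) and a control-flow rule (always call securityCheck() before downloadFile()) would be written to match. Every class and method name here is invented for illustration; this is not code from any tool or course material.

```java
// Hypothetical code illustrating patterns custom static-analysis rules
// could be written to match; all names are invented for illustration.
public class RuleTargets {

    // Semantic rule target: use of a sensitive API returning private data.
    static String getEmployeeSSN() { return "123-45-6789"; }

    // A control-flow rule would flag any path that reaches downloadFile()
    // without first calling securityCheck().
    static boolean securityCheck(String user) {
        if (user == null) return false;
        return !user.isEmpty();
    }

    static String downloadFile(String user, String name) {
        if (!securityCheck(user)) {          // compliant ordering
            throw new SecurityException("security check failed");
        }
        return "contents of " + name;
    }

    public static void main(String[] args) {
        System.out.println(downloadFile("alice", "report.pdf"));
    }
}
```

A control-flow rule would report the same program if downloadFile() were reachable on some path without the securityCheck() call.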
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
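As a concrete picture of the source-to-cleanse-to-sink flow that the data-flow rules in the agenda above describe, here is a minimal, hypothetical Java sketch; the SSN value is a dummy and every class and method name is invented for illustration:

```java
// Hypothetical sketch of a data-flow finding: private data (a source)
// flowing toward a log statement (a sink). All names are invented.
public class LoggerFlow {

    // Source rule target: returns private data (a dummy SSN).
    static String getEmployeeSSN() { return "123-45-6789"; }

    // Cleanse rule target: a masking routine the analyzer could be taught
    // to treat as neutralizing the taint before it reaches the sink.
    static String mask(String ssn) {
        int keep = 4;
        String tail = ssn.substring(ssn.length() - keep);
        return "***-**-" + tail;
    }

    public static void main(String[] args) {
        // Without mask(), a source-to-sink rule would report this line.
        System.out.println("Employee SSN: " + mask(getEmployeeSSN()));
    }
}
```

Teaching the tool that mask() cleanses the data is what turns this from a reported finding into a suppressed one, which is exactly the kind of customization the session covers.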
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools. The presentation discusses who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis lies predominantly in a tool's capabilities &amp;quot;out of the box&amp;quot; sell it incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66029</id>
		<title>Virginia</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66029"/>
				<updated>2009-07-15T18:34:31Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Student’s prerequisites */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==== About ====&lt;br /&gt;
[[Image:Owasp-nova.JPG|275px|right]]The '''OWASP Washington VA Local Chapter''' meetings are FREE and OPEN to anyone interested in learning more about application security. We encourage individuals to provide knowledge transfer via hands-on training and presentations of specific OWASP projects and research topics and sharing SDLC knowledge. &lt;br /&gt;
&lt;br /&gt;
We encourage vendor-agnostic presentations that utilize the OWASP PowerPoint template when applicable, and individual volunteerism to enable perpetual growth. As a 501(c)(3) non-profit association, donations of meeting space or refreshment sponsorship are encouraged; simply contact the local chapter leaders listed on this page to discuss. Prior to participating with OWASP, please review the Chapter Rules.&lt;br /&gt;
&lt;br /&gt;
The original DC Chapter was founded in June 2004 by [mailto:jeff.williams@owasp.org Jeff Williams] and has had members from Virginia to Delaware. In April 2005 a new chapter, OWASP Washington VA Local Chapter, was formed and the DC Chapter was renamed to DC-Maryland. The two are sister chapters and include common members and shared discourse. The chapters meet in opposite halves of the month to facilitate this relationship.&lt;br /&gt;
&lt;br /&gt;
{{Chapter Template|chaptername=Virginia|extra=The chapter leader is [mailto:John.Steven@owasp.org John Steven]|mailinglistsite=http://lists.owasp.org/mailman/listinfo/owasp-CHANGEME|emailarchives=http://lists.owasp.org/pipermail/owasp-CHANGEME}}&lt;br /&gt;
* [http://lists.owasp.org/mailman/listinfo/owasp-wash_dc_va Click here to join local chapter mailing list]&lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Locations ====&lt;br /&gt;
'''If you plan to attend in person:'''&lt;br /&gt;
&lt;br /&gt;
Directions to Booz Allen's One Dulles facility:&lt;br /&gt;
&lt;br /&gt;
13200 Woodland Park Road&lt;br /&gt;
Herndon, VA 20171&lt;br /&gt;
&lt;br /&gt;
From Tyson's Corner:&lt;br /&gt;
&lt;br /&gt;
* Take LEESBURG PIKE / VA-7 WEST&lt;br /&gt;
* Merge onto VA-267 WEST / DULLES TOLL ROAD (Portions Toll)&lt;br /&gt;
* Take the VA-657 Exit (Exit Number 10 towards Herndon / Chantilly)&lt;br /&gt;
* Take the ramp toward CHANTILLY&lt;br /&gt;
* Turn Left onto CENTERVILLE ROAD (at end of ramp)&lt;br /&gt;
* Turn Left onto WOODLAND PARK ROAD (less than 1⁄2 mile)&lt;br /&gt;
* End at 13200 WOODLAND PARK ROAD&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;'''If you plan to attend via Webinar:'''&lt;br /&gt;
&lt;br /&gt;
You can attend through [[OWASPNoVA WebEx]] &lt;br /&gt;
&lt;br /&gt;
==== Schedule ====&lt;br /&gt;
'''Next Meeting'''&amp;lt;P&amp;gt;&lt;br /&gt;
Future speakers to include Gunnar Peterson, Dan Cornell, and more.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owaspsa-track2.PNG|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis. '''REGISTRATION IS OPEN!'''&lt;br /&gt;
&lt;br /&gt;
Please send an email to [mailto:John.Steven@owasp.org John Steven] with your skill level with static analysis tools, your motivation, and '''the dates''' you want to sign up for. &lt;br /&gt;
Students are required to bring their own laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required in all sessions, except the last one.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops.&lt;br /&gt;
&lt;br /&gt;
=== Past meetings ===&lt;br /&gt;
July 9th 6pm-9pm EST&amp;lt;br&amp;gt;&lt;br /&gt;
LOCATION: 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
TOPIC: &amp;quot;Ounce's 02&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
SPEAKER(S): Dinis Cruz, OWASP, Ounce Labs.&amp;lt;BR&amp;gt;&lt;br /&gt;
PANEL: TBD&amp;lt;BR&amp;gt;&lt;br /&gt;
INSTRUCTIONS: RSVP through  Stan Wisseman wisseman_stan@bah.com with “OWASP RSVP” in the subject.&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
DESCRIPTION: So what is O2?&lt;br /&gt;
&lt;br /&gt;
Well, in my mind O2 is a combination of advanced tools (Technology) which are designed to be used in a particular way (Process) by knowledgeable individuals (People).&lt;br /&gt;
&lt;br /&gt;
Think of it as a fighter jet that can go very fast, has tons of controls, needs to be piloted by somebody who knows what they are doing, and needs to have a purpose (i.e. a mission).&lt;br /&gt;
&lt;br /&gt;
Basically, what I did with O2 was automate the workflow that I have when I'm engaged in a source-code security review.&lt;br /&gt;
&lt;br /&gt;
Now, here is the catch: this version is NOT for the faint of heart. I designed it to suit my needs, which, although they are the same as those of most other security consultants, have their own particularities :)&lt;br /&gt;
&lt;br /&gt;
The whole model of O2 development is based around the concept of automating a security consultant's brain, so I basically ensure that the main O2 developer (Dinis Cruz) has a very good understanding of the feature requirements of the targeted security consultant (Dinis Cruz) :) . And this proved (even to my surprise) spectacularly productive, since suddenly I (i.e. the security consultant) didn't have to wait months for new features to be added to my toolkit. If something needed to be added, it would be added in days or hours.&lt;br /&gt;
&lt;br /&gt;
* View the OWASP NoVA Chapter [http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York Calendar ]&lt;br /&gt;
&lt;br /&gt;
* The next meeting is '''Thursday, July 9th, 2009.''' &lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Knowledge ====&lt;br /&gt;
&lt;br /&gt;
The Northern Virginia (NoVA) chapter is committed to compiling resources on interesting and valuable topic areas. We hope that this structure helps you access information pertinent to your tasks at hand as you move through a secure application development life cycle. Currently, our topic areas of focus include activities such as:&lt;br /&gt;
&lt;br /&gt;
* Threat Modeling&lt;br /&gt;
* [[Code Review and Static Analysis with tools]]&lt;br /&gt;
* Penetration Testing and Dynamic Analysis tools&lt;br /&gt;
* Monitoring/Dynamic patching (WAFs)&lt;br /&gt;
&lt;br /&gt;
Certain projects our members are involved in cross-cut these activities, providing value throughout. They include:&lt;br /&gt;
&lt;br /&gt;
* ASVS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Contributors and Sponsors ====&lt;br /&gt;
&lt;br /&gt;
'''Chapter Leader'''&lt;br /&gt;
&lt;br /&gt;
* [mailto:John.Steven@owasp.org John Steven], with assistance from [mailto:paco@cigital.com Paco Hope]&lt;br /&gt;
&lt;br /&gt;
'''Refreshment Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Cigital_OWASP.GIF]]&lt;br /&gt;
&lt;br /&gt;
'''Facility Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Bah-bw.JPG|215px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__NOTOC__&lt;br /&gt;
&amp;lt;headertabs/&amp;gt;&lt;br /&gt;
&amp;lt;paypal&amp;gt;Northern Virginia&amp;lt;/paypal&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Past Meetings ==&lt;br /&gt;
&lt;br /&gt;
===June 2009===&lt;br /&gt;
''Gary McGraw, Cigital Inc.'':''Building Security In Maturity Model''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, an interview:&lt;br /&gt;
''Jim Routh, formerly of DTCC'':''The Economic Advantages of a Resilient Supply Chain- Software Security&lt;br /&gt;
''&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Gary McGraw talked about the experience he, Sammy Migues, and Brian Chess gained conducting a survey of some of America's top Software Security groups. Study results are available under the [http://creativecommons.org/licenses/by-sa/3.0/ Creative Commons Share Alike license] at [http://www.bsi-mm.com www.bsi-mm.com]. Gary described the common structural elements and activities of successful software security programs, presented the maturity model that resulted from the survey data, and discussed lessons learned from listening to those leading these groups. &lt;br /&gt;
&lt;br /&gt;
Jim Routh gave an incredibly insightful interview regarding his own experiences crafting his organization's security group. &lt;br /&gt;
&lt;br /&gt;
Download presentation notes at: [http://www.owasp.org/images/0/03/JMR-Economics_of_Security_Goups.ppt The Economic Advantages of a Resilient Supply Chain- Software Security]&lt;br /&gt;
&lt;br /&gt;
===May 2009 ===&lt;br /&gt;
''Eric Dalci, Cigital Inc.'':''Introduction to Static Analysis''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, a panel:&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Steven Lavenhar, Booz Allen Hamilton;&lt;br /&gt;
&amp;lt;LI&amp;gt;Eric Dalci, Cigital Inc.&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
Panel moderated by John Steven&lt;br /&gt;
&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This session is an introduction to Static Analysis. It presents the different types of analysis used by today's Static Analysis tools, with examples of their direct application to finding vulnerabilities (e.g. Data Flow Analysis, Semantic Analysis, Control Flow Analysis). Current limitations of Static Analysis will also be discussed. The session is tool agnostic, but covers the approaches taken by various leading commercial (as well as open-source) tools.&lt;br /&gt;
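The data-flow analysis mentioned in the session description can be illustrated with a toy taint-tracking sketch (a made-up model for this page, not any specific tool's algorithm): values from an untrusted source are marked, the mark propagates through operations, and a finding is reported when a marked value reaches a sensitive sink.

```python
# Minimal sketch of the data-flow (taint) idea behind static analysis
# tools. Hypothetical model for illustration only: mark values coming
# from an untrusted "source", propagate the mark through operations,
# and flag when a marked value reaches a sensitive "sink".

TAINTED = set()  # ids of values considered attacker-controlled

def source(value):
    """Model reading untrusted input (e.g. an HTTP parameter)."""
    TAINTED.add(id(value))
    return value

def concat(a, b):
    """String concatenation propagates taint to the result."""
    result = a + b
    if id(a) in TAINTED or id(b) in TAINTED:
        TAINTED.add(id(result))
    return result

def sink(query):
    """Model a sensitive operation (e.g. executing a SQL query)."""
    return id(query) in TAINTED  # True means: report a finding

user_input = source("Robert'); DROP TABLE Students;--")
query = concat("SELECT name FROM users WHERE id = '", user_input)
finding = sink(query)  # tainted data reached the sink
```

Real tools do this over program representations (control-flow graphs, call graphs) rather than live values, which is where the precision and the limitations discussed in the session come from.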
&lt;br /&gt;
Download: [http://www.owasp.org/images/e/ea/OWASP_Virginia_Edalci_May09.pdf Intro to Static Analysis]&lt;br /&gt;
&lt;br /&gt;
===April 2009 ===&lt;br /&gt;
''Jeremiah Grossman, Whitehat Security'': '''Top 10 Web Hacking Techniques 2008'''&amp;lt;br&amp;gt;&lt;br /&gt;
Jeremiah spoke on (what he and colleagues determined were the) top ten web hacking techniques of 2008. This talk was a preview of his RSA '09 talk.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Download http://www.whitehatsec.com/home/assets/presentations/09PPT/PPT_OWASPNoVA04082008.pdf&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Later,&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Nate Miller, Stratum Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Jeremiah Grossman, Whitehat Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Tom Brennan, Whitehat Security; and&lt;br /&gt;
&amp;lt;LI&amp;gt;Wade Woolwine, AOL&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
served as a penetration testing panel, answering questions posed and moderated by Ken Van Wyk.&lt;br /&gt;
&lt;br /&gt;
=== February 2009 ===&lt;br /&gt;
&lt;br /&gt;
''Ryan C. Barnett, Breach Security'': '''Patching Challenge: Securing WebGoat with ModSecurity'''&lt;br /&gt;
&lt;br /&gt;
Identification of web application vulnerabilities is only half the battle; remediation is the other half. Let's face the facts: there are many real-world business scenarios where it is not possible to update web application code in a timely manner, or at all. This is where the tactical use-case of implementing a web application firewall to address identified issues proves its worth.&lt;br /&gt;
&lt;br /&gt;
This talk will provide an overview of recommended practices for using a web application firewall for virtual patching. After discussing the framework to use, we will present a very interesting OWASP Summer of Code project whose challenge was to mitigate as many of the OWASP WebGoat vulnerabilities as possible using the open-source ModSecurity web application firewall. During the talk, we will discuss both WebGoat and ModSecurity and provide in-depth walk-throughs of some of the complex fixes. Examples will include addressing not only attacks but also the underlying vulnerabilities, using data persistence for multiple-step processes, content injection, and even examples of the new Lua programming language API. The goal of this talk is both to highlight cutting-edge mitigation options using a web application firewall and to show how it can effectively be used by security consultants who traditionally could only offer source code fixes.&lt;br /&gt;
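As a rough illustration of the virtual-patching idea (hypothetical rule and parameter names; a real deployment would express this as ModSecurity rules, not Python), a WAF-style filter inspects request parameters against known exploit patterns and blocks matches without touching application code:

```python
import re

# Conceptual sketch of WAF-style "virtual patching": block a known
# exploit pattern for a vulnerable parameter at the perimeter, without
# changing the application. Rule and parameter names are invented.

VIRTUAL_PATCHES = [
    # (parameter name, compiled pattern that indicates an attack)
    ("station", re.compile(r"('|--|;|\bunion\b|\bselect\b)", re.IGNORECASE)),
]

def inspect_request(params):
    """Return the list of matched patch rules; an empty list means allow."""
    hits = []
    for name, pattern in VIRTUAL_PATCHES:
        value = params.get(name, "")
        if pattern.search(value):
            hits.append(name)
    return hits

blocked = inspect_request({"station": "101 UNION SELECT password FROM users"})
allowed = inspect_request({"station": "101"})
```

The trade-off, as with any virtual patch, is that the underlying vulnerability remains in the code; the rule only narrows the window until a proper fix ships.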
&lt;br /&gt;
Ryan C. Barnett is the Director of Application Security Research at Breach Security and leads Breach Security Labs. He is also a Faculty Member for the SANS Institute, Team Lead for the Center for Internet Security Apache Benchmark Project and a Member of the Web Application Security Consortium where he leads the Distributed Open Proxy Honeypot Project. Mr. Barnett has also authored a web security book for Addison/Wesley Publishing entitled &amp;quot;Preventing Web Attacks with Apache.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
(This talk is a preview of Ryan's talk at Blackhat Federal the following week - see https://www.blackhat.com/html/bh-dc-09/bh-dc-09-speakers.html#Barnett )&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Virtual_Patching_Ryan_Barnett_Blackhat_Federal_09.zip| WAF Virtual Patching Challenge: Securing WebGoat with ModSecurity]]&lt;br /&gt;
&lt;br /&gt;
''John Steven, Cigital'': '''Moving Beyond Top N Lists'''&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Moving_Beyond_Top_N_Lists.ppt.zip| Moving Beyond Top N Lists]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cigital published an article: The Top 11 Reasons Why Top 10 (or 25) Lists Don’t Work. Yet, these lists are a staple of conference abstracts, industry best practice lists, and the like. Are they good or bad? We’ll explore how to get beyond the Top 10 (or 25) list in making your software security effort real.&lt;br /&gt;
&lt;br /&gt;
John is Senior Director, Advanced Technology Consulting at Cigital. His experience includes research in static code analysis and hands-on architecture and implementation of high-performance, scalable Java EE systems. John has provided security consulting services to a broad variety of commercial clients including two of the largest trading platforms in the world and has advised America's largest internet provider in the Midwest on security and forensics. John led the development of Cigital's architectural analysis methodology and its approach to deploying enterprise software security frameworks. He has demonstrated success in building Cigital's intellectual property for providing cutting-edge security. He brings this experience and a track record of effective strategic innovation to clients seeking to change, whether to adopt more cutting-edge approaches, or to solidify ROI. John currently chairs the SD Best Practices security track and co-edits the building security in department of IEEE's Security and Privacy magazine. John has served on numerous conference panels regarding software security, wireless security and Java EE system development. He holds a B.S. in Computer Engineering and an M.S. in Computer Science from Case Western Reserve University.&lt;br /&gt;
&lt;br /&gt;
=== January 2009 ===&lt;br /&gt;
&lt;br /&gt;
To kick off 2009, our January meeting featured a discussion of the relationship between application security and CMMI, and an overview of the OWASP ASVS project.&lt;br /&gt;
&lt;br /&gt;
''Michele Moss, Booz Allen Hamilton'': '''Evolutions In The Relationship Between Application Security And The CMMI'''&lt;br /&gt;
 &lt;br /&gt;
Addressing new and complex threats and IT security challenges requires repeatable, reliable, rapid, and cost effective solutions.  To implement these solutions, organizations have begun to align their security improvement efforts with their system and software development practices.  During a “Birds of a Feather” at the March 2007 SEPG, a group of industry representatives initiated an effort which led to the definition of assurance practices that can be applied in the context of the CMMI. This presentation will provide an understanding of how applying the assurance practices in the context of security contributes to the overall increased quality of products and services, illustrate how a focus on assurance in the context of CMMI practices is related to application security practices, and present an approach to evaluate and improve the repeatability and reliability of assurance practices. &lt;br /&gt;
 &lt;br /&gt;
Michele Moss, CISSP, is a security engineer with more than 12 years of experience in process improvement. She specializes in integrating assurance processes and practices into project lifecycles. Michele is the Co-Chair of the DHS Software Assurance Working Group on Processes &amp;amp; Practices. She has assisted numerous organizations with maturing their information technology, information assurance, project management, and support practices through the use of the capability maturity models including the CMMI, and the SSE-CMM. She is one of the key contributors in an effort to apply an assurance focus to CMMI.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Moss-AppSecurityAndCMMI.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Mike Boberski, Booz Allen Hamilton'': '''About OWASP ASVS'''&lt;br /&gt;
&lt;br /&gt;
The primary aim of the OWASP ASVS Project is to normalize the range&lt;br /&gt;
of coverage and level of rigor available in the market when it comes to&lt;br /&gt;
performing application-level security verification. The goal is to&lt;br /&gt;
create a set of commercially-workable open standards that are tailored&lt;br /&gt;
to specific web-based technologies.&lt;br /&gt;
&lt;br /&gt;
Mike Boberski works at Booz Allen Hamilton. He has a background in&lt;br /&gt;
application security and the use of cryptography by applications. He is&lt;br /&gt;
experienced in trusted product evaluation, security-related software&lt;br /&gt;
development and integration, and cryptomodule testing. For OWASP, he is&lt;br /&gt;
the project lead and a co-author of the  OWASP Application Security&lt;br /&gt;
Verification Standard, the first OWASP standard.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[https://www.owasp.org/images/5/52/About_OWASP_ASVS_Web_Edition.ppt]]&lt;br /&gt;
&lt;br /&gt;
=== November 2008 ===&lt;br /&gt;
For our November 2008 meeting, we had two great presentations on software assurance and security testing.&lt;br /&gt;
&lt;br /&gt;
''Nadya Bartol, Booz Allen Hamilton'': '''Framework for Software Assurance'''&lt;br /&gt;
&lt;br /&gt;
Nadya's presentation will provide an update on the Software Assurance&lt;br /&gt;
Forum efforts to establish a comprehensive framework for software&lt;br /&gt;
assurance (SwA) and security measurement.  The Framework addresses&lt;br /&gt;
measuring achievement of SwA goals and objectives within the context of&lt;br /&gt;
individual projects, programs, or enterprises.  It targets a variety of&lt;br /&gt;
audiences including executives, developers, vendors, suppliers, and&lt;br /&gt;
buyers.  The Framework leverages existing measurement methodologies,&lt;br /&gt;
including Practical Software and System Measurement (PSM); CMMI Goal,&lt;br /&gt;
Question, Indicator, Measure (GQ(I)M);  NIST SP 800-55 Rev1; and ISO/IEC&lt;br /&gt;
27004 and identifies commonalities among the methodologies to help&lt;br /&gt;
organizations integrate SwA measurement in their overall measurement&lt;br /&gt;
efforts cost-effectively and as seamlessly as possible, rather than&lt;br /&gt;
establish a standalone SwA measurement effort within an organization.&lt;br /&gt;
The presentation will provide an update on the SwA Forum Measurement&lt;br /&gt;
Working Group work, present the current version of the Framework and underlying measures&lt;br /&gt;
development and implementation processes, and propose example SwA&lt;br /&gt;
measures applicable to a variety of SwA stakeholders.  The presentation&lt;br /&gt;
will update the group on the latest NIST and ISO standards on&lt;br /&gt;
information security measurement that are being integrated into the&lt;br /&gt;
Framework as the standards are being developed.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Bartol-MeasurementForOWASP11-13-08.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Paco Hope, Cigital'': '''The Web Security Testing Cookbook'''&lt;br /&gt;
&lt;br /&gt;
The Web Security Testing Cookbook (O'Reilly &amp;amp; Associates, October 2008)&lt;br /&gt;
gives developers and testers the tools they need to make security&lt;br /&gt;
testing a regular part of their development lifecycle. Its recipe style&lt;br /&gt;
approach covers manual, exploratory testing as well as automated techniques&lt;br /&gt;
that you can make part of your unit tests or regression cycle. The&lt;br /&gt;
recipes range from the basics, like observing messages between clients and&lt;br /&gt;
servers, to multi-phase tests that script the login and execution of web&lt;br /&gt;
application features. This book complements many of the security texts&lt;br /&gt;
in the market that tell you what a vulnerability is, but not how to&lt;br /&gt;
systematically test it day in and day out. Leverage the recipes in this&lt;br /&gt;
book to add significant security coverage to your testing without adding&lt;br /&gt;
significant time and cost to your effort.&lt;br /&gt;
&lt;br /&gt;
Congratulations to Tim Bond who won an autographed copy of Paco's book.&lt;br /&gt;
Get your copy here [[http://www.amazon.com/Security-Testing-Cookbook-Paco-Hope/dp/0596514832]]&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/PacoHope-WebSecCookbook.pdf]]&lt;br /&gt;
&lt;br /&gt;
=== October 2008 ===&lt;br /&gt;
For our October 2008 meeting, we had two fascinating talks relating to forensics.&lt;br /&gt;
&lt;br /&gt;
''Dave Merkel, Mandiant'': '''Enterprise Grade Incident Management - Responding to Persistent Threats'''&lt;br /&gt;
&lt;br /&gt;
Dave Merkel is Vice President of Products at Mandiant, a leading provider of information security services, education and products. Mr. Merkel has worked in the information security and incident response industry for over 10 years. His background includes service as a federal agent in the US Air Force and over 7 years experience directing security operations at America Online. He currently oversees the product business at Mandiant, and is in charge of building Mandiant Intelligent Response - an enterprise incident response solution. But no, he won't be selling you anything today.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Mandiant-EnterpriseIRandAPTpresentation.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Inno Eroraha, NetSecurity'': '''Responding to the Digital Crime Scene: Gathering Volatile Data'''&lt;br /&gt;
&lt;br /&gt;
Inno Eroraha is the founder and chief strategist of NetSecurity Corporation, a company that provides digital forensics, hands-on security consulting, and Hands-on How-To® training solutions that are high-quality, timely, and customer-focused. In this role, Mr. Eroraha helps clients plan, formulate, and execute the best security and forensics strategy that aligns with their business goals and priorities. He has consulted with Fortune 500 companies, IRS, DHS, VA, DoD, and other entities.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData-TechnoForensics-102908.pdf]]&lt;br /&gt;
&lt;br /&gt;
==Knowledge==&lt;br /&gt;
On the [[Knowledge]] page, you'll find links to this chapter's contributions organized by topic area.&lt;br /&gt;
 &lt;br /&gt;
[[Category:Virginia]]&lt;br /&gt;
[[Category:Washington, DC]]&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66028</id>
		<title>Virginia</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66028"/>
				<updated>2009-07-15T18:32:36Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Registration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==== About ====&lt;br /&gt;
[[Image:Owasp-nova.JPG|275px|right]]The '''OWASP Washington VA Local Chapter''' meetings are FREE and OPEN to anyone interested in learning more about application security. We encourage individuals to provide knowledge transfer via hands-on training and presentations of specific OWASP projects and research topics and sharing SDLC knowledge. &lt;br /&gt;
&lt;br /&gt;
We encourage vendor-agnostic presentations that utilize the OWASP PowerPoint template when applicable, and individual volunteerism to enable perpetual growth. As a 501(c)(3) non-profit association, donations of meeting space or refreshment sponsorship are encouraged; simply contact the local chapter leaders listed on this page to discuss. Prior to participating with OWASP, please review the Chapter Rules.&lt;br /&gt;
&lt;br /&gt;
The original DC Chapter was founded in June 2004 by [mailto:jeff.williams@owasp.org Jeff Williams] and has had members from Virginia to Delaware. In April 2005 a new chapter, OWASP Washington VA Local Chapter, was formed and the DC Chapter was renamed to DC-Maryland. The two are sister chapters and include common members and shared discourse. The chapters meet in opposite halves of the month to facilitate this relationship.&lt;br /&gt;
&lt;br /&gt;
{{Chapter Template|chaptername=Virginia|extra=The chapter leader is [mailto:John.Steven@owasp.org John Steven]|mailinglistsite=http://lists.owasp.org/mailman/listinfo/owasp-CHANGEME|emailarchives=http://lists.owasp.org/pipermail/owasp-CHANGEME}}&lt;br /&gt;
* [http://lists.owasp.org/mailman/listinfo/owasp-wash_dc_va Click here to join local chapter mailing list]&lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Locations ====&lt;br /&gt;
'''If you plan to attend in person:'''&lt;br /&gt;
&lt;br /&gt;
Directions to Booz Allen's One Dulles facility:&lt;br /&gt;
&lt;br /&gt;
13200 Woodland Park Road&lt;br /&gt;
Herndon, VA 20171&lt;br /&gt;
&lt;br /&gt;
From Tyson's Corner:&lt;br /&gt;
&lt;br /&gt;
* Take LEESBURG PIKE / VA-7 WEST&lt;br /&gt;
* Merge onto VA-267 WEST / DULLES TOLL ROAD (Portions Toll)&lt;br /&gt;
* Take the VA-657 Exit (Exit Number 10 towards Herndon / Chantilly)&lt;br /&gt;
* Take the ramp toward CHANTILLY&lt;br /&gt;
* Turn Left onto CENTERVILLE ROAD (at end of ramp)&lt;br /&gt;
* Turn Left onto WOODLAND PARK ROAD (less than 1⁄2 mile)&lt;br /&gt;
* End at 13200 WOODLAND PARK ROAD&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;'''If you plan to attend via Webinar:'''&lt;br /&gt;
&lt;br /&gt;
You can attend through [[OWASPNoVA WebEx]] &lt;br /&gt;
&lt;br /&gt;
==== Schedule ====&lt;br /&gt;
'''Next Meeting'''&amp;lt;P&amp;gt;&lt;br /&gt;
Future speakers to include Gunnar Peterson, Dan Cornell, and more.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owaspsa-track2.PNG|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
The classroom's estimated capacity for hands-on sessions is 30 stations. The number of students can be larger, since people may pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis. '''REGISTRATION IS OPEN!'''&lt;br /&gt;
&lt;br /&gt;
Please send an email to [mailto:John.Steven@owasp.org John Steven] with your skill level with Static Analysis tools, your motivation, and '''the dates''' that you want to sign up for.  &lt;br /&gt;
Students are required to bring their own laptops to the hands-on sessions, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one.&lt;br /&gt;
&lt;br /&gt;
===Student prerequisites===&lt;br /&gt;
All students will need to bring their own laptop.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
July 9th 6pm-9pm EST&amp;lt;br&amp;gt;&lt;br /&gt;
LOCATION: 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
TOPIC: &amp;quot;Ounce's O2&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
SPEAKER(S): Dinis Cruz, OWASP, Ounce Labs.&amp;lt;BR&amp;gt;&lt;br /&gt;
PANEL: TBD&amp;lt;BR&amp;gt;&lt;br /&gt;
INSTRUCTIONS: RSVP to Stan Wisseman (wisseman_stan@bah.com) with “OWASP RSVP” in the subject.&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
DESCRIPTION: So what is O2?&lt;br /&gt;
&lt;br /&gt;
Well, in my mind, O2 is a combination of advanced tools (Technology), designed to be used in a particular way (Process) by knowledgeable individuals (People).&lt;br /&gt;
&lt;br /&gt;
Think of it as a fighter jet: it can go very fast, has tons of controls, needs to be piloted by somebody who knows what they are doing, and needs to have a purpose (i.e. a mission).&lt;br /&gt;
&lt;br /&gt;
Basically, what I did with O2 was automate the workflow that I follow when I'm engaged in a source-code security review.&lt;br /&gt;
&lt;br /&gt;
Now, here is the catch: this version is NOT for the faint-of-heart. I designed this to suit my needs, which, although broadly the same as those of most other security consultants, have their own particularities :)&lt;br /&gt;
&lt;br /&gt;
The whole model of O2 development is based around the concept of automating a security consultant’s brain, so I basically ensure that the main O2 developer (Dinis Cruz) has a very good understanding of the feature requirements of the targeted security consultant (Dinis Cruz) :) . And this proved (even to my surprise) spectacularly productive, since suddenly I (i.e. the security consultant) didn't have to wait months for new features to be added to my toolkit. If something needed to be added, it would just be added in days or hours.&lt;br /&gt;
&lt;br /&gt;
* View the OWASP NoVA Chapter [http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York Calendar ]&lt;br /&gt;
&lt;br /&gt;
* The next meeting is '''Thursday, July 9th, 2009.''' &lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Knowledge ====&lt;br /&gt;
&lt;br /&gt;
The Northern Virginia (NoVA) chapter is committed to compiling resources on interesting and valuable topic areas. We hope that this structure helps you access information pertinent to your tasks at hand as you move through a secure application development life cycle. Currently, our topic areas of focus include activities such as:&lt;br /&gt;
&lt;br /&gt;
* Threat Modeling&lt;br /&gt;
* [[Code Review and Static Analysis with tools]]&lt;br /&gt;
* Penetration Testing and Dynamic Analysis tools&lt;br /&gt;
* Monitoring/Dynamic patching (WAFs)&lt;br /&gt;
&lt;br /&gt;
Certain projects our members are involved in cross-cut these activities, providing value throughout. They include:&lt;br /&gt;
&lt;br /&gt;
* ASVS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Contributors and Sponsors ====&lt;br /&gt;
&lt;br /&gt;
'''Chapter Leader'''&lt;br /&gt;
&lt;br /&gt;
* [mailto:John.Steven@owasp.org John Steven], with assistance from [mailto:paco@cigital.com Paco Hope]&lt;br /&gt;
&lt;br /&gt;
'''Refreshment Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Cigital_OWASP.GIF]]&lt;br /&gt;
&lt;br /&gt;
'''Facility Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Bah-bw.JPG|215px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__NOTOC__&lt;br /&gt;
&amp;lt;headertabs/&amp;gt;&lt;br /&gt;
&amp;lt;paypal&amp;gt;Northern Virginia&amp;lt;/paypal&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Past Meetings ==&lt;br /&gt;
&lt;br /&gt;
===June 2009===&lt;br /&gt;
''Gary McGraw, Cigital Inc.'':''Building Security In Maturity Model''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, an interview:&lt;br /&gt;
''Jim Routh, formerly of DTCC'':''The Economic Advantages of a Resilient Supply Chain- Software Security&lt;br /&gt;
''&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Gary McGraw talked about the experience he, Sammy Migues, and Brian Chess gained conducting a survey of some of America's top Software Security groups. Study results are available under the [http://creativecommons.org/licenses/by-sa/3.0/ Creative Commons Share Alike license] at [http://www.bsi-mm.com www.bsi-mm.com]. Gary described the common structural elements and activities of successful software security programs, presented the maturity model that resulted from the survey data, and discussed lessons learned from listening to those leading these groups. &lt;br /&gt;
&lt;br /&gt;
Jim Routh gave an incredibly insightful interview regarding his own experiences crafting his organization's security group. &lt;br /&gt;
&lt;br /&gt;
Download presentation notes at: [http://www.owasp.org/images/0/03/JMR-Economics_of_Security_Goups.ppt The Economic Advantages of a Resilient Supply Chain- Software Security]&lt;br /&gt;
&lt;br /&gt;
===May 2009 ===&lt;br /&gt;
''Eric Dalci, Cigital Inc.'':''Introduction to Static Analysis''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, a panel:&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Steven Lavenhar, Booz Allen Hamilton;&lt;br /&gt;
&amp;lt;LI&amp;gt;Eric Dalci, Cigital Inc.&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
Panel moderated by John Steven&lt;br /&gt;
&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This session is an introduction to Static Analysis. It presents the different types of analysis used by today's Static Analysis tools, with examples of their direct application to finding vulnerabilities (e.g. Data Flow Analysis, Semantic Analysis, Control Flow Analysis). Current limitations of Static Analysis will also be discussed. The session is tool agnostic, but covers the approaches taken by various leading commercial (as well as open-source) tools.&lt;br /&gt;
&lt;br /&gt;
Download: [http://www.owasp.org/images/e/ea/OWASP_Virginia_Edalci_May09.pdf Intro to Static Analysis]&lt;br /&gt;
&lt;br /&gt;
===April 2009 ===&lt;br /&gt;
''Jeremiah Grossman, Whitehat Security'': '''Top 10 Web Hacking Techniques 2008'''&amp;lt;br&amp;gt;&lt;br /&gt;
Jeremiah spoke on (what he and colleagues determined were the) top ten web hacking techniques of 2008. This talk was a preview of his RSA '09 talk.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Download http://www.whitehatsec.com/home/assets/presentations/09PPT/PPT_OWASPNoVA04082008.pdf&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Later,&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Nate Miller, Stratum Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Jeremiah Grossman, Whitehat Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Tom Brennan, Whitehat Security; and&lt;br /&gt;
&amp;lt;LI&amp;gt;Wade Woolwine, AOL&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
served as a penetration testing panel, answering questions posed and moderated by Ken Van Wyk.&lt;br /&gt;
&lt;br /&gt;
=== February 2009 ===&lt;br /&gt;
&lt;br /&gt;
''Ryan C. Barnett, Breach Security'': '''Patching Challenge: Securing WebGoat with ModSecurity'''&lt;br /&gt;
&lt;br /&gt;
Identification of web application vulnerabilities is only half the battle; remediation is the other half. Let's face the facts: there are many real-world business scenarios where it is not possible to update web application code in a timely manner, or at all. This is where the tactical use-case of implementing a web application firewall to address identified issues proves its worth.&lt;br /&gt;
&lt;br /&gt;
This talk will provide an overview of recommended practices for using a web application firewall for virtual patching. After discussing the framework to use, we will present a very interesting OWASP Summer of Code project whose challenge was to mitigate as many of the OWASP WebGoat vulnerabilities as possible using the open-source ModSecurity web application firewall. During the talk, we will discuss both WebGoat and ModSecurity and provide in-depth walk-throughs of some of the complex fixes. Examples will include addressing not only attacks but also the underlying vulnerabilities, using data persistence for multiple-step processes, content injection, and even examples of the new Lua programming language API. The goal of this talk is both to highlight cutting-edge mitigation options using a web application firewall and to show how it can effectively be used by security consultants who traditionally could only offer source code fixes.&lt;br /&gt;
&lt;br /&gt;
Ryan C. Barnett is the Director of Application Security Research at Breach Security and leads Breach Security Labs. He is also a Faculty Member for the SANS Institute, Team Lead for the Center for Internet Security Apache Benchmark Project and a Member of the Web Application Security Consortium where he leads the Distributed Open Proxy Honeypot Project. Mr. Barnett has also authored a web security book for Addison/Wesley Publishing entitled &amp;quot;Preventing Web Attacks with Apache.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
(This talk is a preview of Ryan's talk at Blackhat Federal the following week - see https://www.blackhat.com/html/bh-dc-09/bh-dc-09-speakers.html#Barnett )&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Virtual_Patching_Ryan_Barnett_Blackhat_Federal_09.zip| WAF Virtual Patching Challenge: Securing WebGoat with ModSecurity]]&lt;br /&gt;
&lt;br /&gt;
''John Steven, Cigital'': '''Moving Beyond Top N Lists'''&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Moving_Beyond_Top_N_Lists.ppt.zip| Moving Beyond Top N Lists]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cigital published an article: The Top 11 Reasons Why Top 10 (or 25) Lists Don’t Work. Yet, these lists are a staple of conference abstracts, industry best practice lists, and the like. Are they good or bad? We’ll explore how to get beyond the Top 10 (or 25) list in making your software security effort real.&lt;br /&gt;
&lt;br /&gt;
John is Senior Director, Advanced Technology Consulting at Cigital. His experience includes research in static code analysis and hands-on architecture and implementation of high-performance, scalable Java EE systems. John has provided security consulting services to a broad variety of commercial clients, including two of the largest trading platforms in the world, and has advised America's largest internet provider in the Midwest on security and forensics. John led the development of Cigital's architectural analysis methodology and its approach to deploying enterprise software security frameworks. He has demonstrated success in building Cigital's intellectual property for providing cutting-edge security. He brings this experience and a track record of effective strategic innovation to clients seeking to change, whether to adopt more cutting-edge approaches or to solidify ROI. John currently chairs the SD Best Practices security track and co-edits the Building Security In department of IEEE Security and Privacy magazine. John has served on numerous conference panels regarding software security, wireless security, and Java EE system development. He holds a B.S. in Computer Engineering and an M.S. in Computer Science from Case Western Reserve University.&lt;br /&gt;
&lt;br /&gt;
=== January 2009 ===&lt;br /&gt;
&lt;br /&gt;
To kick off 2009, our January meeting featured a discussion of the relationship between application security and CMMI, and an overview of the OWASP ASVS project.&lt;br /&gt;
&lt;br /&gt;
''Michele Moss, Booz Allen Hamilton'': '''Evolutions In The Relationship Between Application Security And The CMMI'''&lt;br /&gt;
 &lt;br /&gt;
Addressing new and complex threats and IT security challenges requires repeatable, reliable, rapid, and cost-effective solutions.  To implement these solutions, organizations have begun to align their security improvement efforts with their system and software development practices.  During a “Birds of a Feather” session at the March 2007 SEPG, a group of industry representatives initiated an effort which led to the definition of assurance practices that can be applied in the context of the CMMI. This presentation will provide an understanding of how applying the assurance practices in the context of security contributes to the overall increased quality of products and services, illustrate how a focus on assurance in the context of CMMI practices is related to application security practices, and present an approach to evaluate and improve the repeatability and reliability of assurance practices. &lt;br /&gt;
 &lt;br /&gt;
Michele Moss, CISSP, is a security engineer with more than 12 years of experience in process improvement. She specializes in integrating assurance processes and practices into project lifecycles. Michele is the Co-Chair of the DHS Software Assurance Working Group on Processes &amp;amp; Practices. She has assisted numerous organizations with maturing their information technology, information assurance, project management, and support practices through the use of the capability maturity models including the CMMI, and the SSE-CMM. She is one of the key contributors in an effort to apply an assurance focus to CMMI.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Moss-AppSecurityAndCMMI.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Mike Boberski, Booz Allen Hamilton'': '''About OWASP ASVS'''&lt;br /&gt;
&lt;br /&gt;
The primary aim of the OWASP ASVS Project is to normalize the range&lt;br /&gt;
of coverage and level of rigor available in the market when it comes to&lt;br /&gt;
performing application-level security verification. The goal is to&lt;br /&gt;
create a set of commercially-workable open standards that are tailored&lt;br /&gt;
to specific web-based technologies.&lt;br /&gt;
&lt;br /&gt;
Mike Boberski works at Booz Allen Hamilton. He has a background in&lt;br /&gt;
application security and the use of cryptography by applications. He is&lt;br /&gt;
experienced in trusted product evaluation, security-related software&lt;br /&gt;
development and integration, and cryptomodule testing. For OWASP, he is&lt;br /&gt;
the project lead and a co-author of the  OWASP Application Security&lt;br /&gt;
Verification Standard, the first OWASP standard.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[https://www.owasp.org/images/5/52/About_OWASP_ASVS_Web_Edition.ppt]]&lt;br /&gt;
&lt;br /&gt;
=== November 2008 ===&lt;br /&gt;
For our November 2008 meeting, we had two great presentations on software assurance and security testing.&lt;br /&gt;
&lt;br /&gt;
''Nadya Bartol, Booz Allen Hamilton'': '''Framework for Software Assurance'''&lt;br /&gt;
&lt;br /&gt;
Nadya's presentation will provide an update on the Software Assurance&lt;br /&gt;
Forum efforts to establish a comprehensive framework for software&lt;br /&gt;
assurance (SwA) and security measurement.  The Framework addresses&lt;br /&gt;
measuring achievement of SwA goals and objectives within the context of&lt;br /&gt;
individual projects, programs, or enterprises.  It targets a variety of&lt;br /&gt;
audiences including executives, developers, vendors, suppliers, and&lt;br /&gt;
buyers.  The Framework leverages existing measurement methodologies,&lt;br /&gt;
including Practical Software and System Measurement (PSM); CMMI Goal,&lt;br /&gt;
Question, Indicator, Measure (GQ(I)M);  NIST SP 800-55 Rev1; and ISO/IEC&lt;br /&gt;
27004 and identifies commonalities among the methodologies to help&lt;br /&gt;
organizations integrate SwA measurement in their overall measurement&lt;br /&gt;
efforts cost-effectively and as seamlessly as possible, rather than&lt;br /&gt;
establish a standalone SwA measurement effort within an organization.&lt;br /&gt;
The presentation will provide an update on the SwA Forum Measurement&lt;br /&gt;
Working Group work, present the current version of the Framework and underlying measures&lt;br /&gt;
development and implementation processes, and propose example SwA&lt;br /&gt;
measures applicable to a variety of SwA stakeholders.  The presentation&lt;br /&gt;
will update the group on the latest NIST and ISO standards on&lt;br /&gt;
information security measurement that are being integrated into the&lt;br /&gt;
Framework as the standards are being developed.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Bartol-MeasurementForOWASP11-13-08.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Paco Hope, Cigital'': '''The Web Security Testing Cookbook'''&lt;br /&gt;
&lt;br /&gt;
The Web Security Testing Cookbook (O'Reilly &amp;amp; Associates, October 2008)&lt;br /&gt;
gives developers and testers the tools they need to make security&lt;br /&gt;
testing a regular part of their development lifecycle. Its recipe-style&lt;br /&gt;
approach covers manual, exploratory testing as well as automated techniques&lt;br /&gt;
that you can make part of your unit tests or regression cycle. The&lt;br /&gt;
recipes range from basics, like observing messages between clients and&lt;br /&gt;
servers, to multi-phase tests that script the login and execution of web&lt;br /&gt;
application features. This book complements many of the security texts&lt;br /&gt;
in the market that tell you what a vulnerability is, but not how to&lt;br /&gt;
systematically test it day in and day out. Leverage the recipes in this&lt;br /&gt;
book to add significant security coverage to your testing without adding&lt;br /&gt;
significant time and cost to your effort.&lt;br /&gt;
&lt;br /&gt;
Congratulations to Tim Bond who won an autographed copy of Paco's book.&lt;br /&gt;
Get your copy here [[http://www.amazon.com/Security-Testing-Cookbook-Paco-Hope/dp/0596514832]]&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/PacoHope-WebSecCookbook.pdf]]&lt;br /&gt;
&lt;br /&gt;
=== October 2008 ===&lt;br /&gt;
For our October 2008 meeting, we had two fascinating talks relating to forensics.&lt;br /&gt;
&lt;br /&gt;
''Dave Merkel, Mandiant'': '''Enterprise Grade Incident Management - Responding to Persistent Threats'''&lt;br /&gt;
&lt;br /&gt;
Dave Merkel is Vice President of Products at Mandiant, a leading provider of information security services, education and products. Mr. Merkel has worked in the information security and incident response industry for over 10 years. His background includes service as a federal agent in the US Air Force and over 7 years of experience directing security operations at America Online. He currently oversees the product business at Mandiant, and is in charge of building Mandiant Intelligent Response - an enterprise incident response solution. But no, he won't be selling you anything today.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Mandiant-EnterpriseIRandAPTpresentation.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Inno Eroraha, NetSecurity'': '''Responding to the Digital Crime Scene: Gathering Volatile Data'''&lt;br /&gt;
&lt;br /&gt;
Inno Eroraha is the founder and chief strategist of NetSecurity Corporation, a company that provides digital forensics, hands-on security consulting, and Hands-on How-To® training solutions that are high-quality, timely, and customer-focused. In this role, Mr. Eroraha helps clients plan, formulate, and execute the best security and forensics strategy that aligns with their business goals and priorities. He has consulted with Fortune 500 companies, IRS, DHS, VA, DoD, and other entities.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData-TechnoForensics-102908.pdf]]&lt;br /&gt;
&lt;br /&gt;
==Knowledge==&lt;br /&gt;
On the [[Knowledge]] page, you'll find links to this chapter's contributions organized by topic area.&lt;br /&gt;
 &lt;br /&gt;
[[Category:Virginia]]&lt;br /&gt;
[[Category:Washington, DC]]&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66027</id>
		<title>Virginia</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66027"/>
				<updated>2009-07-15T18:30:55Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==== About ====&lt;br /&gt;
[[Image:Owasp-nova.JPG|275px|right]]The '''OWASP Washington VA Local Chapter''' meetings are FREE and OPEN to anyone interested in learning more about application security. We encourage individuals to provide knowledge transfer via hands-on training, presentations of specific OWASP projects and research topics, and sharing of SDLC knowledge. &lt;br /&gt;
&lt;br /&gt;
We encourage vendor-agnostic presentations that utilize the OWASP PowerPoint template when applicable, and individual volunteerism to enable perpetual growth. As a 501(c)(3) non-profit association, we welcome donations of meeting space and refreshment sponsorship; simply contact the local chapter leaders listed on this page to discuss. Prior to participating with OWASP, please review the Chapter Rules.&lt;br /&gt;
&lt;br /&gt;
The original DC Chapter was founded in June 2004 by [mailto:jeff.williams@owasp.org Jeff Williams] and has had members from Virginia to Delaware. In April 2005 a new chapter, OWASP Washington VA Local Chapter, was formed and the DC Chapter was renamed to DC-Maryland. The two are sister chapters and include common members and shared discourse. The chapters meet in opposite halves of the month to facilitate this relationship.&lt;br /&gt;
&lt;br /&gt;
{{Chapter Template|chaptername=Virginia|extra=The chapter leader is [mailto:John.Steven@owasp.org John Steven]|mailinglistsite=http://lists.owasp.org/mailman/listinfo/owasp-CHANGEME|emailarchives=http://lists.owasp.org/pipermail/owasp-CHANGEME}}&lt;br /&gt;
* [http://lists.owasp.org/mailman/listinfo/owasp-wash_dc_va Click here to join local chapter mailing list]&lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Locations ====&lt;br /&gt;
'''If you plan to attend in person:'''&lt;br /&gt;
&lt;br /&gt;
Directions to Booz Allen's One Dulles facility:&lt;br /&gt;
&lt;br /&gt;
13200 Woodland Park Road&lt;br /&gt;
Herndon, VA 20171&lt;br /&gt;
&lt;br /&gt;
From Tyson's Corner:&lt;br /&gt;
&lt;br /&gt;
* Take LEESBURG PIKE / VA-7 WEST&lt;br /&gt;
* Merge onto VA-267 WEST / DULLES TOLL ROAD (Portions Toll)&lt;br /&gt;
* Take the VA-657 Exit (Exit Number 10 towards Herndon / Chantilly)&lt;br /&gt;
* Take the ramp toward CHANTILLY&lt;br /&gt;
* Turn Left onto CENTERVILLE ROAD (at end of ramp)&lt;br /&gt;
* Turn Left onto WOODLAND PARK ROAD (less than 1⁄2 mile)&lt;br /&gt;
* End at 13200 WOODLAND PARK ROAD&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;'''If you plan to attend via Webinar:'''&lt;br /&gt;
&lt;br /&gt;
You can attend through [[OWASPNoVA WebEx]] &lt;br /&gt;
&lt;br /&gt;
==== Schedule ====&lt;br /&gt;
'''Next Meeting'''&amp;lt;P&amp;gt;&lt;br /&gt;
Future speakers to include Gunnar Peterson, Dan Cornell, and more.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owaspsa-track2.PNG|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations maximum. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis. '''REGISTRATION IS OPEN!'''&lt;br /&gt;
&lt;br /&gt;
Please send an email to [mailto:John.Steven@owasp.org John Steven] with your skill level in Static Analysis and your motivation. Students are required to bring their own laptop to the hands-on session, with software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptop&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
July 9th 6pm-9pm EST&amp;lt;br&amp;gt;&lt;br /&gt;
LOCATION: 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
TOPIC: &amp;quot;Ounce's 02&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
SPEAKER(S): Dinis Cruz, OWASP, Ounce Labs.&amp;lt;BR&amp;gt;&lt;br /&gt;
PANEL: TBD&amp;lt;BR&amp;gt;&lt;br /&gt;
INSTRUCTIONS: RSVP through  Stan Wisseman wisseman_stan@bah.com with “OWASP RSVP” in the subject.&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
DESCRIPTION: So what is O2?&lt;br /&gt;
&lt;br /&gt;
Well, in my mind O2 is a combination of advanced tools (Technology) which are designed to be used in a particular way (Process) by knowledgeable individuals (People).&lt;br /&gt;
&lt;br /&gt;
Think of it as a fighter jet that can go very fast, has tons of controls, needs to be piloted by somebody who knows what they are doing, and needs to have a purpose (i.e., a mission).&lt;br /&gt;
&lt;br /&gt;
Basically, what I did with O2 was automate the workflow that I follow when I'm engaged in a source-code security review.&lt;br /&gt;
&lt;br /&gt;
Now, here is the catch: this version is NOT for the faint of heart. I designed it to suit my needs, which, although largely the same as those of most other security consultants, have their own particularities :)&lt;br /&gt;
&lt;br /&gt;
The whole model of O2 development is based on the concept of automating a security consultant’s brain, so I basically ensure that the main O2 developer (Dinis Cruz) has a very good understanding of the feature requirements of the targeted security consultant (Dinis Cruz) :). And this proved (even to my surprise) spectacularly productive, since suddenly I (i.e., the security consultant) didn't have to wait months for new features to be added to my toolkit. If something needed to be added, it would just be added in days or hours.&lt;br /&gt;
&lt;br /&gt;
* View the OWASP NoVA Chapter [http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York Calendar ]&lt;br /&gt;
&lt;br /&gt;
* The next meeting is '''Thursday, July 9th, 2009.''' &lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Knowledge ====&lt;br /&gt;
&lt;br /&gt;
The Northern Virginia (NoVA) chapter is committed to compiling resources on interesting and valuable topic areas. We hope that this structure helps you access information pertinent to your tasks at hand as you move through a secure application development life cycle. Currently, our topic areas of focus include activities such as:&lt;br /&gt;
&lt;br /&gt;
* Threat Modeling&lt;br /&gt;
* [[Code Review and Static Analysis with tools]]&lt;br /&gt;
* Penetration Testing and Dynamic Analysis tools&lt;br /&gt;
* Monitoring/Dynamic patching (WAFs)&lt;br /&gt;
&lt;br /&gt;
Certain projects our members are involved in cross-cut these activities, providing value throughout. They include:&lt;br /&gt;
&lt;br /&gt;
* ASVS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Contributors and Sponsors ====&lt;br /&gt;
&lt;br /&gt;
'''Chapter Leader'''&lt;br /&gt;
&lt;br /&gt;
* [mailto:John.Steven@owasp.org John Steven], with assistance from [mailto:paco@cigital.com Paco Hope]&lt;br /&gt;
&lt;br /&gt;
'''Refreshment Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Cigital_OWASP.GIF]]&lt;br /&gt;
&lt;br /&gt;
'''Facility Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Bah-bw.JPG|215px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__NOTOC__&lt;br /&gt;
&amp;lt;headertabs/&amp;gt;&lt;br /&gt;
&amp;lt;paypal&amp;gt;Northern Virginia&amp;lt;/paypal&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Past Meetings ==&lt;br /&gt;
&lt;br /&gt;
===June 2009===&lt;br /&gt;
''Gary McGraw, Cigital Inc.'':''Building Security In Maturity Model''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, an interview:&lt;br /&gt;
''Jim Routh, formerly of DTCC'': ''The Economic Advantages of a Resilient Supply Chain - Software Security''&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Gary McGraw talked about the experience he, Sammy Migues, and Brian Chess gained conducting a survey of some of America's top software security groups. Study results are available under the [http://creativecommons.org/licenses/by-sa/3.0/ Creative Commons Share Alike license] at [http://www.bsi-mm.com www.bsi-mm.com]. Gary described the common structural elements and activities of successful software security programs, presented the maturity model that resulted from the survey data, and discussed lessons learned from listening to those leading these groups. &lt;br /&gt;
&lt;br /&gt;
Jim Routh gave an incredibly insightful interview regarding his own experiences building a software security group. &lt;br /&gt;
&lt;br /&gt;
Download presentation notes at: [http://www.owasp.org/images/0/03/JMR-Economics_of_Security_Goups.ppt The Economic Advantages of a Resilient Supply Chain- Software Security]&lt;br /&gt;
&lt;br /&gt;
===May 2009 ===&lt;br /&gt;
''Eric Dalci, Cigital Inc.'':''Introduction to Static Analysis''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, a panel:&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Steven Lavenhar, Booz Allen Hamilton;&lt;br /&gt;
&amp;lt;LI&amp;gt;Eric Dalci, Cigital Inc.&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
Panel moderated by John Steven&lt;br /&gt;
&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This session is an introduction to static analysis. It presents the different types of analysis used by today's static analysis tools. Examples of their direct application to finding vulnerabilities will be shown (e.g., data flow analysis, semantic analysis, control flow analysis). Current limitations of static analysis will also be discussed. The session is tool-agnostic, but will cover the approach taken by various leading commercial (as well as open-source) tools.&lt;br /&gt;
&lt;br /&gt;
Download: [http://www.owasp.org/images/e/ea/OWASP_Virginia_Edalci_May09.pdf Intro to Static Analysis]&lt;br /&gt;
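To make the data-flow idea from the session concrete (a toy sketch, not tied to any particular tool; the source, sink, and sanitizer names are invented), a minimal taint analysis can propagate "taint" from an untrusted source through assignments until it reaches a sensitive sink:&lt;br /&gt;

```python
# Toy taint-style data flow analysis over a straight-line "program" of
# (target, func, arg) triples, each representing `target = func(arg)`.
SOURCES = {"request.get"}    # calls returning attacker-controlled data
SINKS = {"db.execute"}       # calls that must never receive tainted data
SANITIZERS = {"escape"}      # calls whose result is considered clean

def analyze(statements):
    """Return the (sink, argument) pairs reached by tainted data."""
    tainted = set()
    findings = []
    for target, func, arg in statements:
        if func in SOURCES:
            tainted.add(target)          # value originates at a source
        elif func in SANITIZERS:
            tainted.discard(target)      # sanitized result is clean
        elif arg in tainted:
            if func in SINKS:
                findings.append((func, arg))  # tainted data hits a sink
            else:
                tainted.add(target)      # taint propagates through the call
    return findings

program = [
    ("user", "request.get", "name"),   # user = request.get("name")  <- source
    ("query", "str.format", "user"),   # query = str.format(user)    <- propagates
    ("rows", "db.execute", "query"),   # rows = db.execute(query)    <- sink
]
print(analyze(program))  # -> [('db.execute', 'query')]
```

Real tools work on control-flow graphs rather than straight-line triples, but the core question is the same: does any path carry data from a source to a sink without passing through a sanitizer?&lt;br /&gt;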
&lt;br /&gt;
===April 2009 ===&lt;br /&gt;
''Jeremiah Grossman, Whitehat Security'': '''Top 10 Web Hacking Techniques 2008'''&amp;lt;br&amp;gt;&lt;br /&gt;
Jeremiah spoke on (what he and colleagues determined were the) top ten web hacking techniques of 2008. This talk was a preview of his RSA '09 talk.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Download http://www.whitehatsec.com/home/assets/presentations/09PPT/PPT_OWASPNoVA04082008.pdf&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Later,&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Nate Miller, Stratum Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Jeremiah Grossman, Whitehat Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Tom Brennan, Whitehat Security; and&lt;br /&gt;
&amp;lt;LI&amp;gt;Wade Woolwine, AOL&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
served as a penetration testing panel, answering questions posed and moderated by Ken Van Wyk.&lt;br /&gt;
&lt;br /&gt;
=== February 2009 ===&lt;br /&gt;
&lt;br /&gt;
''Ryan C. Barnett, Breach Security'': '''Patching Challenge: Securing WebGoat with ModSecurity'''&lt;br /&gt;
&lt;br /&gt;
Identification of web application vulnerabilities is only half the battle; remediation is the other half. Let's face facts: in many real-world business scenarios it is not possible to update web application code in a timely manner, or at all. This is where the tactical use case of deploying a web application firewall to address identified issues proves its worth.&lt;br /&gt;
&lt;br /&gt;
This talk will provide an overview of recommended practices for using a web application firewall for virtual patching. After discussing the framework to use, we will present a very interesting OWASP Summer of Code project whose challenge was to mitigate as many of the OWASP WebGoat vulnerabilities as possible using the open-source ModSecurity web application firewall. During the talk, we will discuss both WebGoat and ModSecurity and provide in-depth walk-throughs of some of the complex fixes. Examples will include addressing not only attacks but also the underlying vulnerabilities, using data persistence for multi-step processes, content injection, and even the new Lua scripting API. The goal of this talk is both to highlight cutting-edge mitigation options using a web application firewall and to show how it can be used effectively by security consultants who traditionally could only offer source code fixes.&lt;br /&gt;
&lt;br /&gt;
Ryan C. Barnett is the Director of Application Security Research at Breach Security and leads Breach Security Labs. He is also a Faculty Member for the SANS Institute, Team Lead for the Center for Internet Security Apache Benchmark Project and a Member of the Web Application Security Consortium where he leads the Distributed Open Proxy Honeypot Project. Mr. Barnett has also authored a web security book for Addison/Wesley Publishing entitled &amp;quot;Preventing Web Attacks with Apache.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
(This talk is a preview of Ryan's talk at Blackhat Federal the following week - see https://www.blackhat.com/html/bh-dc-09/bh-dc-09-speakers.html#Barnett )&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Virtual_Patching_Ryan_Barnett_Blackhat_Federal_09.zip| WAF Virtual Patching Challenge: Securing WebGoat with ModSecurity]]&lt;br /&gt;
&lt;br /&gt;
''John Steven, Cigital'': '''Moving Beyond Top N Lists'''&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Moving_Beyond_Top_N_Lists.ppt.zip| Moving Beyond Top N Lists]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cigital published an article: The Top 11 Reasons Why Top 10 (or 25) Lists Don’t Work. Yet, these lists are a staple of conference abstracts, industry best practice lists, and the like. Are they good or bad? We’ll explore how to get beyond the Top 10 (or 25) list in making your software security effort real.&lt;br /&gt;
&lt;br /&gt;
John is Senior Director, Advanced Technology Consulting at Cigital. His experience includes research in static code analysis and hands-on architecture and implementation of high-performance, scalable Java EE systems. John has provided security consulting services to a broad variety of commercial clients, including two of the largest trading platforms in the world, and has advised America's largest internet provider in the Midwest on security and forensics. John led the development of Cigital's architectural analysis methodology and its approach to deploying enterprise software security frameworks. He has demonstrated success in building Cigital's intellectual property for providing cutting-edge security. He brings this experience and a track record of effective strategic innovation to clients seeking to change, whether to adopt more cutting-edge approaches or to solidify ROI. John currently chairs the SD Best Practices security track and co-edits the Building Security In department of IEEE Security and Privacy magazine. John has served on numerous conference panels regarding software security, wireless security, and Java EE system development. He holds a B.S. in Computer Engineering and an M.S. in Computer Science from Case Western Reserve University.&lt;br /&gt;
&lt;br /&gt;
=== January 2009 ===&lt;br /&gt;
&lt;br /&gt;
To kick off 2009, our January meeting featured a discussion of the relationship between application security and CMMI, and an overview of the OWASP ASVS project.&lt;br /&gt;
&lt;br /&gt;
''Michele Moss, Booz Allen Hamilton'': '''Evolutions In The Relationship Between Application Security And The CMMI'''&lt;br /&gt;
 &lt;br /&gt;
Addressing new and complex threats and IT security challenges requires repeatable, reliable, rapid, and cost-effective solutions.  To implement these solutions, organizations have begun to align their security improvement efforts with their system and software development practices.  During a “Birds of a Feather” session at the March 2007 SEPG, a group of industry representatives initiated an effort which led to the definition of assurance practices that can be applied in the context of the CMMI. This presentation will provide an understanding of how applying the assurance practices in the context of security contributes to the overall increased quality of products and services, illustrate how a focus on assurance in the context of CMMI practices is related to application security practices, and present an approach to evaluate and improve the repeatability and reliability of assurance practices. &lt;br /&gt;
 &lt;br /&gt;
Michele Moss, CISSP, is a security engineer with more than 12 years of experience in process improvement. She specializes in integrating assurance processes and practices into project lifecycles. Michele is the Co-Chair of the DHS Software Assurance Working Group on Processes &amp;amp; Practices. She has assisted numerous organizations with maturing their information technology, information assurance, project management, and support practices through the use of the capability maturity models including the CMMI, and the SSE-CMM. She is one of the key contributors in an effort to apply an assurance focus to CMMI.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Moss-AppSecurityAndCMMI.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Mike Boberski, Booz Allen Hamilton'': '''About OWASP ASVS'''&lt;br /&gt;
&lt;br /&gt;
The primary aim of the OWASP ASVS Project is to normalize the range&lt;br /&gt;
of coverage and level of rigor available in the market when it comes to&lt;br /&gt;
performing application-level security verification. The goal is to&lt;br /&gt;
create a set of commercially-workable open standards that are tailored&lt;br /&gt;
to specific web-based technologies.&lt;br /&gt;
&lt;br /&gt;
Mike Boberski works at Booz Allen Hamilton. He has a background in&lt;br /&gt;
application security and the use of cryptography by applications. He is&lt;br /&gt;
experienced in trusted product evaluation, security-related software&lt;br /&gt;
development and integration, and cryptomodule testing. For OWASP, he is&lt;br /&gt;
the project lead and a co-author of the  OWASP Application Security&lt;br /&gt;
Verification Standard, the first OWASP standard.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[https://www.owasp.org/images/5/52/About_OWASP_ASVS_Web_Edition.ppt]]&lt;br /&gt;
&lt;br /&gt;
=== November 2008 ===&lt;br /&gt;
For our November 2008 meeting, we had two great presentations on software assurance and security testing.&lt;br /&gt;
&lt;br /&gt;
''Nadya Bartol, Booz Allen Hamilton'': '''Framework for Software Assurance'''&lt;br /&gt;
&lt;br /&gt;
Nadya's presentation will provide an update on the Software Assurance&lt;br /&gt;
Forum efforts to establish a comprehensive framework for software&lt;br /&gt;
assurance (SwA) and security measurement.  The Framework addresses&lt;br /&gt;
measuring achievement of SwA goals and objectives within the context of&lt;br /&gt;
individual projects, programs, or enterprises.  It targets a variety of&lt;br /&gt;
audiences including executives, developers, vendors, suppliers, and&lt;br /&gt;
buyers.  The Framework leverages existing measurement methodologies,&lt;br /&gt;
including Practical Software and System Measurement (PSM); CMMI Goal,&lt;br /&gt;
Question, Indicator, Measure (GQ(I)M);  NIST SP 800-55 Rev1; and ISO/IEC&lt;br /&gt;
27004 and identifies commonalities among the methodologies to help&lt;br /&gt;
organizations integrate SwA measurement in their overall measurement&lt;br /&gt;
efforts cost-effectively and as seamlessly as possible, rather than&lt;br /&gt;
establish a standalone SwA measurement effort within an organization.&lt;br /&gt;
The presentation will provide an update on the SwA Forum Measurement&lt;br /&gt;
Working Group work, present the current version of the Framework and underlying measures&lt;br /&gt;
development and implementation processes, and propose example SwA&lt;br /&gt;
measures applicable to a variety of SwA stakeholders.  The presentation&lt;br /&gt;
will update the group on the latest NIST and ISO standards on&lt;br /&gt;
information security measurement that are being integrated into the&lt;br /&gt;
Framework as the standards are being developed.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Bartol-MeasurementForOWASP11-13-08.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Paco Hope, Cigital'': '''The Web Security Testing Cookbook'''&lt;br /&gt;
&lt;br /&gt;
The Web Security Testing Cookbook (O'Reilly &amp;amp; Associates, October 2008)&lt;br /&gt;
gives developers and testers the tools they need to make security&lt;br /&gt;
testing a regular part of their development lifecycle. Its recipe-style&lt;br /&gt;
approach covers manual, exploratory testing as well as automated techniques&lt;br /&gt;
that you can make part of your unit tests or regression cycle. The&lt;br /&gt;
recipes range from basics, like observing messages between clients and&lt;br /&gt;
servers, to multi-phase tests that script the login and execution of web&lt;br /&gt;
application features. This book complements many of the security texts&lt;br /&gt;
in the market that tell you what a vulnerability is, but not how to&lt;br /&gt;
systematically test it day in and day out. Leverage the recipes in this&lt;br /&gt;
book to add significant security coverage to your testing without adding&lt;br /&gt;
significant time and cost to your effort.&lt;br /&gt;
&lt;br /&gt;
Congratulations to Tim Bond who won an autographed copy of Paco's book.&lt;br /&gt;
Get your copy here [[http://www.amazon.com/Security-Testing-Cookbook-Paco-Hope/dp/0596514832]]&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/PacoHope-WebSecCookbook.pdf]]&lt;br /&gt;
&lt;br /&gt;
=== October 2008 ===&lt;br /&gt;
For our October 2008 meeting, we had two fascinating talks relating to forensics.&lt;br /&gt;
&lt;br /&gt;
''Dave Merkel, Mandiant'': '''Enterprise Grade Incident Management - Responding to Persistent Threats'''&lt;br /&gt;
&lt;br /&gt;
Dave Merkel is Vice President of Products at Mandiant, a leading provider of information security services, education and products. Mr. Merkel has worked in the information security and incident response industry for over 10 years. His background includes service as a federal agent in the US Air Force and over 7 years experience directing security operations at America Online. He currently oversees the product business at Mandiant, and is in charge of building Mandiant Intelligent Response - an enterprise incident response solution. But no, he won't be selling you anything today.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Mandiant-EnterpriseIRandAPTpresentation.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Inno Eroraha, NetSecurity'': '''Responding to the Digital Crime Scene: Gathering Volatile Data'''&lt;br /&gt;
&lt;br /&gt;
Inno Eroraha is the founder and chief strategist of NetSecurity Corporation, a company that provides digital forensics, hands-on security consulting, and Hands-on How-To® training solutions that are high-quality, timely, and customer-focused. In this role, Mr. Eroraha helps clients plan, formulate, and execute the best security and forensics strategy that aligns with their business goals and priorities. He has consulted with Fortune 500 companies, IRS, DHS, VA, DoD, and other entities.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData-TechnoForensics-102908.pdf]]&lt;br /&gt;
&lt;br /&gt;
==Knowledge==&lt;br /&gt;
On the [[Knowledge]] page, you'll find links to this chapter's contributions organized by topic area.&lt;br /&gt;
 &lt;br /&gt;
[[Category:Virginia]]&lt;br /&gt;
[[Category:Washington, DC]]&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66021</id>
		<title>Virginia</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66021"/>
				<updated>2009-07-15T18:21:57Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==== About ====&lt;br /&gt;
[[Image:Owasp-nova.JPG|275px|right]]The '''OWASP Washington VA Local Chapter''' meetings are FREE and OPEN to anyone interested in learning more about application security. We encourage individuals to provide knowledge transfer via hands-on training and presentations of specific OWASP projects and research topics and sharing SDLC knowledge. &lt;br /&gt;
&lt;br /&gt;
We encourage vendor-agnostic presentations (using the OWASP PowerPoint template when applicable) and individual volunteerism to enable perpetual growth. As a 501(c)(3) non-profit association, we welcome donations of meeting space and refreshment sponsorship; simply contact the local chapter leaders listed on this page to discuss. Prior to participating with OWASP, please review the Chapter Rules.&lt;br /&gt;
&lt;br /&gt;
The original DC Chapter was founded in June 2004 by [mailto:jeff.williams@owasp.org Jeff Williams] and has had members from Virginia to Delaware. In April 2005 a new chapter, OWASP Washington VA Local Chapter, was formed and the DC Chapter was renamed to DC-Maryland. The two are sister chapters and include common members and shared discourse. The chapters meet in opposite halves of the month to facilitate this relationship.&lt;br /&gt;
&lt;br /&gt;
{{Chapter Template|chaptername=Virginia|extra=The chapter leader is [mailto:John.Steven@owasp.org John Steven]|mailinglistsite=http://lists.owasp.org/mailman/listinfo/owasp-CHANGEME|emailarchives=http://lists.owasp.org/pipermail/owasp-CHANGEME}}&lt;br /&gt;
* [http://lists.owasp.org/mailman/listinfo/owasp-wash_dc_va Click here to join local chapter mailing list]&lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Locations ====&lt;br /&gt;
'''If you plan to attend in person:'''&lt;br /&gt;
&lt;br /&gt;
Directions to Booz Allen's One Dulles facility:&lt;br /&gt;
&lt;br /&gt;
13200 Woodland Park Road&lt;br /&gt;
Herndon, VA 20171&lt;br /&gt;
&lt;br /&gt;
From Tyson's Corner:&lt;br /&gt;
&lt;br /&gt;
* Take LEESBURG PIKE / VA-7 WEST&lt;br /&gt;
* Merge onto VA-267 WEST / DULLES TOLL ROAD (Portions Toll)&lt;br /&gt;
* Take the VA-657 Exit (Exit Number 10 towards Herndon / Chantilly)&lt;br /&gt;
* Take the ramp toward CHANTILLY&lt;br /&gt;
* Turn Left onto CENTERVILLE ROAD (at end of ramp)&lt;br /&gt;
* Turn Left onto WOODLAND PARK ROAD (less than 1⁄2 mile)&lt;br /&gt;
* End at 13200 WOODLAND PARK ROAD&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;'''If you plan to attend via Webinar:'''&lt;br /&gt;
&lt;br /&gt;
You can attend through [[OWASPNoVA WebEx]] &lt;br /&gt;
&lt;br /&gt;
==== Schedule ====&lt;br /&gt;
'''Next Meeting'''&amp;lt;p&amp;gt;&lt;br /&gt;
Future speakers to include Gunnar Peterson, Dan Cornell, and more.&amp;lt;/p&amp;gt;&lt;br /&gt;
July 9th 6pm-9pm EST&amp;lt;br&amp;gt;&lt;br /&gt;
LOCATION: 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
TOPIC: &amp;quot;Ounce's O2&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
SPEAKER(S): Dinis Cruz, OWASP, Ounce Labs.&amp;lt;BR&amp;gt;&lt;br /&gt;
PANEL: TBD&amp;lt;BR&amp;gt;&lt;br /&gt;
INSTRUCTIONS: RSVP to Stan Wisseman (wisseman_stan@bah.com) with “OWASP RSVP” in the subject.&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
DESCRIPTION: So what is O2?&lt;br /&gt;
&lt;br /&gt;
Well, in my mind O2 is a combination of advanced tools (Technology) which are designed to be used in a particular way (Process) by knowledgeable individuals (People).&lt;br /&gt;
&lt;br /&gt;
Think of it as a fighter jet that is able to go very fast, has tons of controls, needs to be piloted by somebody who knows what they are doing, and needs to have a purpose (i.e. a mission).&lt;br /&gt;
&lt;br /&gt;
Basically, what I did with O2 was to automate the workflow that I have when I'm engaged in a source-code security review.&lt;br /&gt;
&lt;br /&gt;
Now, here is the catch: this version is NOT for the faint-of-heart. I designed this to suit my needs, which, although largely the same as those of most other security consultants, have their own particularities :)&lt;br /&gt;
&lt;br /&gt;
The whole model of O2 development is based around the concept of automating a security consultant’s brain, so I basically ensure that the main O2 developer (Dinis Cruz) has a very good understanding of the feature requirements of the targeted security consultant (Dinis Cruz) :). And this proved (even to my surprise) spectacularly productive, since suddenly I (i.e. the security consultant) didn't have to wait months for new features to be added to my toolkit. If there was something that needed to be added, it would just be added in days or hours.&lt;br /&gt;
&lt;br /&gt;
* View the OWASP NoVA Chapter [http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York Calendar ]&lt;br /&gt;
&lt;br /&gt;
* The next meeting is '''Thursday, July 9th, 2009.''' &lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Knowledge ====&lt;br /&gt;
&lt;br /&gt;
The Northern Virginia (NoVA) chapter is committed to compiling resources on interesting and valuable topic areas. We hope that this structure helps you access information pertinent to your tasks at hand as you move through a secure application development life cycle. Currently, our topic areas of focus include activities such as:&lt;br /&gt;
&lt;br /&gt;
* Threat Modeling&lt;br /&gt;
* [[Code Review and Static Analysis with tools]]&lt;br /&gt;
* Penetration Testing and Dynamic Analysis tools&lt;br /&gt;
* Monitoring/Dynamic patching (WAFs)&lt;br /&gt;
&lt;br /&gt;
Certain projects our members are involved in cross-cut these activities, providing value throughout. They include:&lt;br /&gt;
&lt;br /&gt;
* ASVS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Contributors and Sponsors ====&lt;br /&gt;
&lt;br /&gt;
'''Chapter Leader'''&lt;br /&gt;
&lt;br /&gt;
* [mailto:John.Steven@owasp.org John Steven], with assistance from [mailto:paco@cigital.com Paco Hope]&lt;br /&gt;
&lt;br /&gt;
'''Refreshment Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Cigital_OWASP.GIF]]&lt;br /&gt;
&lt;br /&gt;
'''Facility Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Bah-bw.JPG|215px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__NOTOC__&lt;br /&gt;
&amp;lt;headertabs/&amp;gt;&lt;br /&gt;
&amp;lt;paypal&amp;gt;Northern Virginia&amp;lt;/paypal&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Past Meetings ==&lt;br /&gt;
&lt;br /&gt;
===June 2009===&lt;br /&gt;
''Gary McGraw, Cigital Inc.'':''Building Security In Maturity Model''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, an interview:&lt;br /&gt;
''Jim Routh, formerly of DTCC'':''The Economic Advantages of a Resilient Supply Chain- Software Security&lt;br /&gt;
''&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Gary McGraw talked about the experience he, Sammy Migues, and Brian Chess gained conducting a survey of some of America's top software security groups. Study results are available under the [http://creativecommons.org/licenses/by-sa/3.0/ Creative Commons Share Alike license] at [http://www.bsi-mm.com www.bsi-mm.com]. Gary described the common structural elements and activities of successful software security programs, presented the maturity model that resulted from the survey data, and discussed lessons learned from listening to those leading these groups. &lt;br /&gt;
&lt;br /&gt;
Jim Routh gave an incredibly insightful interview regarding his own experiences crafting his organization's security group. &lt;br /&gt;
&lt;br /&gt;
Download presentation notes at: [http://www.owasp.org/images/0/03/JMR-Economics_of_Security_Goups.ppt The Economic Advantages of a Resilient Supply Chain- Software Security]&lt;br /&gt;
&lt;br /&gt;
===May 2009 ===&lt;br /&gt;
''Eric Dalci, Cigital Inc.'':''Introduction to Static Analysis''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, a panel:&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Steven Lavenhar, Booz Allen Hamilton;&lt;br /&gt;
&amp;lt;LI&amp;gt;Eric Dalci, Cigital Inc.&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
Panel moderated by John Steven&lt;br /&gt;
&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This session is an introduction to Static Analysis. It presents the different types of analysis used by today's Static Analysis tools. Examples of direct application to finding vulnerabilities will be shown (e.g., Data Flow Analysis, Semantic Analysis, Control Flow Analysis). Current limitations of Static Analysis will also be discussed. This session is tool-agnostic, but will cover the approach taken by various leading commercial (as well as open-source) tools.&lt;br /&gt;
&lt;br /&gt;
Download: [http://www.owasp.org/images/e/ea/OWASP_Virginia_Edalci_May09.pdf Intro to Static Analysis]&lt;br /&gt;
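To make the data-flow ("taint") analysis mentioned in the abstract concrete, here is a toy sketch in Python. It is not the algorithm of any particular commercial tool; the mini statement format, the source/sink names, and the rules are all invented for illustration.&lt;br /&gt;

```python
# Toy taint analysis: values returned by untrusted "sources" are tracked
# through assignments until they reach a security-sensitive "sink".
# Source and sink names below are hypothetical, for illustration only.
TAINT_SOURCES = {"request.getParameter"}  # returns attacker-controlled data
TAINT_SINKS = {"db.execute"}              # must not receive tainted data

def analyze(statements):
    """statements: list of ('assign', var, call_name) or ('call', call_name, arg_var)."""
    tainted = set()
    findings = []
    for stmt in statements:
        if stmt[0] == "assign":
            _, var, call = stmt
            if call in TAINT_SOURCES:
                tainted.add(var)          # taint enters at a source
        elif stmt[0] == "call":
            _, call, arg = stmt
            if call in TAINT_SINKS and arg in tainted:
                findings.append((call, arg))  # tainted data reaches a sink
    return findings

program = [
    ("assign", "name", "request.getParameter"),
    ("call", "db.execute", "name"),       # flagged: potential SQL injection
]
```

Real tools operate on parsed ASTs or intermediate representations and model propagation through variable-to-variable assignments, calls, and sanitizers; this sketch captures only the source-to-sink core of the idea.&lt;br /&gt;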
&lt;br /&gt;
===April 2009 ===&lt;br /&gt;
''Jeremiah Grossman, Whitehat Security'': '''Top 10 Web Hacking Techniques 2008'''&amp;lt;br&amp;gt;&lt;br /&gt;
Jeremiah spoke on (what he and colleagues determined were the) top ten web hacking techniques of 2008. This talk was a preview of his RSA '09 talk.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Download: http://www.whitehatsec.com/home/assets/presentations/09PPT/PPT_OWASPNoVA04082008.pdf&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Later,&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Nate Miller, Stratum Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Jeremiah Grossman, Whitehat Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Tom Brennan, Whitehat Security; and&lt;br /&gt;
&amp;lt;LI&amp;gt;Wade Woolwine, AOL&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
served as a penetration testing panel, answering questions posed and moderated by Ken Van Wyk.&lt;br /&gt;
&lt;br /&gt;
=== February 2009 ===&lt;br /&gt;
&lt;br /&gt;
''Ryan C. Barnett, Breach Security'': '''Patching Challenge: Securing WebGoat with ModSecurity'''&lt;br /&gt;
&lt;br /&gt;
Identification of web application vulnerabilities is only half the battle, with remediation being the other half. Let's face the facts: there are many real-world business scenarios in which it is not possible to update web application code in a timely manner, or at all. This is where the tactical use case of implementing a web application firewall to address identified issues proves its worth.&lt;br /&gt;
&lt;br /&gt;
This talk will provide an overview of the recommended practices for utilizing a web application firewall for virtual patching. After discussing the framework to use, we will then present a very interesting OWASP Summer of Code project in which the challenge was to mitigate as many of the OWASP WebGoat vulnerabilities as possible using the open-source ModSecurity web application firewall. During the talk, we will discuss both WebGoat and ModSecurity and provide in-depth walk-throughs of some of the complex fixes. Examples will include addressing not only attacks but also the underlying vulnerabilities, using data persistence for multiple-step processes, content injection, and even examples of the new Lua scripting API. The goal of this talk is both to highlight cutting-edge mitigation options using a web application firewall and to show how it can effectively be used by security consultants who traditionally could only offer source code fixes.&lt;br /&gt;
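As a hedged illustration of what such a virtual patch can look like (the rule id, URL path, parameter name, and pattern below are invented for this sketch, not taken from the talk or from WebGoat), a ModSecurity 2.x chained rule might enforce a numeric whitelist on one parameter of a vulnerable page:&lt;br /&gt;

```apache
# Illustrative virtual patch: deny requests to a vulnerable page whose
# "userid" parameter is not a short digit string, without changing app code.
SecRule REQUEST_URI "@beginsWith /app/vulnerable_page" \
    "chain,phase:2,t:none,deny,log,id:999001,msg:'Virtual patch: non-numeric userid'"
    SecRule ARGS:userid "!@rx ^[0-9]{1,8}$"
```

The positive-security (whitelist) pattern shown here blocks the underlying vulnerability class rather than individual attack strings, which is the approach the abstract contrasts with signature-only blocking.&lt;br /&gt;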
&lt;br /&gt;
Ryan C. Barnett is the Director of Application Security Research at Breach Security and leads Breach Security Labs. He is also a Faculty Member for the SANS Institute, Team Lead for the Center for Internet Security Apache Benchmark Project and a Member of the Web Application Security Consortium where he leads the Distributed Open Proxy Honeypot Project. Mr. Barnett has also authored a web security book for Addison/Wesley Publishing entitled &amp;quot;Preventing Web Attacks with Apache.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
(This talk is a preview of Ryan's talk at Blackhat Federal the following week - see https://www.blackhat.com/html/bh-dc-09/bh-dc-09-speakers.html#Barnett )&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Virtual_Patching_Ryan_Barnett_Blackhat_Federal_09.zip| WAF Virtual Patching Challenge: Securing WebGoat with ModSecurity]]&lt;br /&gt;
&lt;br /&gt;
''John Steven, Cigital'': '''Moving Beyond Top N Lists'''&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Moving_Beyond_Top_N_Lists.ppt.zip| Moving Beyond Top N Lists]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cigital published an article: The Top 11 Reasons Why Top 10 (or 25) Lists Don’t Work. Yet, these lists are a staple of conference abstracts, industry best practice lists, and the like. Are they good or bad? We’ll explore how to get beyond the Top 10 (or 25) list in making your software security effort real.&lt;br /&gt;
&lt;br /&gt;
John is Senior Director, Advanced Technology Consulting at Cigital. His experience includes research in static code analysis and hands-on architecture and implementation of high-performance, scalable Java EE systems. John has provided security consulting services to a broad variety of commercial clients including two of the largest trading platforms in the world and has advised America's largest internet provider in the Midwest on security and forensics. John led the development of Cigital's architectural analysis methodology and its approach to deploying enterprise software security frameworks. He has demonstrated success in building Cigital's intellectual property for providing cutting-edge security. He brings this experience and a track record of effective strategic innovation to clients seeking to change, whether to adopt more cutting-edge approaches, or to solidify ROI. John currently chairs the SD Best Practices security track and co-edits the building security in department of IEEE's Security and Privacy magazine. John has served on numerous conference panels regarding software security, wireless security and Java EE system development. He holds a B.S. in Computer Engineering and an M.S. in Computer Science from Case Western Reserve University.&lt;br /&gt;
&lt;br /&gt;
=== January 2009 ===&lt;br /&gt;
&lt;br /&gt;
To kick off 2009, our January meeting featured a discussion of the relationship between application security and CMMI, and an overview of the OWASP ASVS project.&lt;br /&gt;
&lt;br /&gt;
''Michele Moss, Booz Allen Hamilton'': '''Evolutions In The Relationship Between Application Security And The CMMI'''&lt;br /&gt;
 &lt;br /&gt;
Addressing new and complex threats and IT security challenges requires repeatable, reliable, rapid, and cost-effective solutions. To implement these solutions, organizations have begun to align their security improvement efforts with their system and software development practices. During a “Birds of a Feather” session at the March 2007 SEPG, a group of industry representatives initiated an effort that led to the definition of assurance practices that can be applied in the context of the CMMI. This presentation will provide an understanding of how applying these assurance practices contributes to the overall increased quality of products and services, illustrate how a focus on assurance in the context of CMMI practices relates to application security practices, and present an approach to evaluating and improving the repeatability and reliability of assurance practices. &lt;br /&gt;
 &lt;br /&gt;
Michele Moss, CISSP, is a security engineer with more than 12 years of experience in process improvement. She specializes in integrating assurance processes and practices into project lifecycles. Michele is the Co-Chair of the DHS Software Assurance Working Group on Processes &amp;amp; Practices. She has assisted numerous organizations with maturing their information technology, information assurance, project management, and support practices through the use of capability maturity models, including the CMMI and the SSE-CMM. She is one of the key contributors to an effort to apply an assurance focus to the CMMI.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Moss-AppSecurityAndCMMI.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Mike Boberski, Booz Allen Hamilton'': '''About OWASP ASVS'''&lt;br /&gt;
&lt;br /&gt;
The primary aim of the OWASP ASVS Project is to normalize the range&lt;br /&gt;
of coverage and level of rigor available in the market when it comes to&lt;br /&gt;
performing application-level security verification. The goal is to&lt;br /&gt;
create a set of commercially-workable open standards that are tailored&lt;br /&gt;
to specific web-based technologies.&lt;br /&gt;
&lt;br /&gt;
Mike Boberski works at Booz Allen Hamilton. He has a background in&lt;br /&gt;
application security and the use of cryptography by applications. He is&lt;br /&gt;
experienced in trusted product evaluation, security-related software&lt;br /&gt;
development and integration, and cryptomodule testing. For OWASP, he is&lt;br /&gt;
the project lead and a co-author of the  OWASP Application Security&lt;br /&gt;
Verification Standard, the first OWASP standard.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[https://www.owasp.org/images/5/52/About_OWASP_ASVS_Web_Edition.ppt]]&lt;br /&gt;
&lt;br /&gt;
=== November 2008 ===&lt;br /&gt;
For our November 2008 meeting, we had two great presentations on software assurance and security testing.&lt;br /&gt;
&lt;br /&gt;
''Nadya Bartol, Booz Allen Hamilton'': '''Framework for Software Assurance'''&lt;br /&gt;
&lt;br /&gt;
Nadya's presentation will provide an update on the Software Assurance&lt;br /&gt;
Forum efforts to establish a comprehensive framework for software&lt;br /&gt;
assurance (SwA) and security measurement.  The Framework addresses&lt;br /&gt;
measuring achievement of SwA goals and objectives within the context of&lt;br /&gt;
individual projects, programs, or enterprises.  It targets a variety of&lt;br /&gt;
audiences including executives, developers, vendors, suppliers, and&lt;br /&gt;
buyers.  The Framework leverages existing measurement methodologies,&lt;br /&gt;
including Practical Software and System Measurement (PSM); CMMI Goal,&lt;br /&gt;
Question, Indicator, Measure (GQ(I)M);  NIST SP 800-55 Rev1; and ISO/IEC&lt;br /&gt;
27004 and identifies commonalities among the methodologies to help&lt;br /&gt;
organizations integrate SwA measurement in their overall measurement&lt;br /&gt;
efforts cost-effectively and as seamlessly as possible, rather than&lt;br /&gt;
establish a standalone SwA measurement effort within an organization.&lt;br /&gt;
The presentation will provide an update on the SwA Forum Measurement&lt;br /&gt;
Working Group work, present the current version of the Framework and underlying measures&lt;br /&gt;
development and implementation processes, and propose example SwA&lt;br /&gt;
measures applicable to a variety of SwA stakeholders.  The presentation&lt;br /&gt;
will update the group on the latest NIST and ISO standards on&lt;br /&gt;
information security measurement that are being integrated into the&lt;br /&gt;
Framework as the standards are being developed.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Bartol-MeasurementForOWASP11-13-08.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Paco Hope, Cigital'': '''The Web Security Testing Cookbook'''&lt;br /&gt;
&lt;br /&gt;
The Web Security Testing Cookbook (O'Reilly &amp;amp; Associates, October 2008)&lt;br /&gt;
gives developers and testers the tools they need to make security&lt;br /&gt;
testing a regular part of their development lifecycle. Its recipe-style&lt;br /&gt;
approach covers manual, exploratory testing as well as automated techniques&lt;br /&gt;
that you can make part of your unit tests or regression cycle. The&lt;br /&gt;
recipes range from basics, like observing messages between clients and&lt;br /&gt;
servers, to multi-phase tests that script the login and execution of web&lt;br /&gt;
application features. This book complements many of the security texts&lt;br /&gt;
in the market that tell you what a vulnerability is, but not how to&lt;br /&gt;
systematically test it day in and day out. Leverage the recipes in this&lt;br /&gt;
book to add significant security coverage to your testing without adding&lt;br /&gt;
significant time and cost to your effort.&lt;br /&gt;
&lt;br /&gt;
Congratulations to Tim Bond who won an autographed copy of Paco's book.&lt;br /&gt;
Get your copy here [[http://www.amazon.com/Security-Testing-Cookbook-Paco-Hope/dp/0596514832]]&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/PacoHope-WebSecCookbook.pdf]]&lt;br /&gt;
&lt;br /&gt;
=== October 2008 ===&lt;br /&gt;
For our October 2008 meeting, we had two fascinating talks relating to forensics.&lt;br /&gt;
&lt;br /&gt;
''Dave Merkel, Mandiant'': '''Enterprise Grade Incident Management - Responding to Persistent Threats'''&lt;br /&gt;
&lt;br /&gt;
Dave Merkel is Vice President of Products at Mandiant, a leading provider of information security services, education and products. Mr. Merkel has worked in the information security and incident response industry for over 10 years. His background includes service as a federal agent in the US Air Force and over 7 years experience directing security operations at America Online. He currently oversees the product business at Mandiant, and is in charge of building Mandiant Intelligent Response - an enterprise incident response solution. But no, he won't be selling you anything today.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/Mandiant-EnterpriseIRandAPTpresentation.pdf]]&lt;br /&gt;
&lt;br /&gt;
''Inno Eroraha, NetSecurity'': '''Responding to the Digital Crime Scene: Gathering Volatile Data'''&lt;br /&gt;
&lt;br /&gt;
Inno Eroraha is the founder and chief strategist of NetSecurity Corporation, a company that provides digital forensics, hands-on security consulting, and Hands-on How-To® training solutions that are high-quality, timely, and customer-focused. In this role, Mr. Eroraha helps clients plan, formulate, and execute the best security and forensics strategy that aligns with their business goals and priorities. He has consulted with Fortune 500 companies, IRS, DHS, VA, DoD, and other entities.&lt;br /&gt;
&lt;br /&gt;
Slides available: [[http://www.epsteinmania.com/owasp/NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData-TechnoForensics-102908.pdf]]&lt;br /&gt;
&lt;br /&gt;
==Knowledge==&lt;br /&gt;
On the [[Knowledge]] page, you'll find links to this chapter's contributions organized by topic area.&lt;br /&gt;
 &lt;br /&gt;
[[Category:Virginia]]&lt;br /&gt;
[[Category:Washington, DC]]&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66019</id>
		<title>Virginia</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Virginia&amp;diff=66019"/>
				<updated>2009-07-15T18:20:50Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==== About ====&lt;br /&gt;
[[Image:Owasp-nova.JPG|275px|right]]The '''OWASP Washington VA Local Chapter''' meetings are FREE and OPEN to anyone interested in learning more about application security. We encourage individuals to provide knowledge transfer via hands-on training and presentations of specific OWASP projects and research topics and sharing SDLC knowledge. &lt;br /&gt;
&lt;br /&gt;
We encourage vendor-agnostic presentations (using the OWASP PowerPoint template when applicable) and individual volunteerism to enable perpetual growth. As a 501(c)(3) non-profit association, we welcome donations of meeting space and refreshment sponsorship; simply contact the local chapter leaders listed on this page to discuss. Prior to participating with OWASP, please review the Chapter Rules.&lt;br /&gt;
&lt;br /&gt;
The original DC Chapter was founded in June 2004 by [mailto:jeff.williams@owasp.org Jeff Williams] and has had members from Virginia to Delaware. In April 2005 a new chapter, OWASP Washington VA Local Chapter, was formed and the DC Chapter was renamed to DC-Maryland. The two are sister chapters and include common members and shared discourse. The chapters meet in opposite halves of the month to facilitate this relationship.&lt;br /&gt;
&lt;br /&gt;
{{Chapter Template|chaptername=Virginia|extra=The chapter leader is [mailto:John.Steven@owasp.org John Steven]|mailinglistsite=http://lists.owasp.org/mailman/listinfo/owasp-CHANGEME|emailarchives=http://lists.owasp.org/pipermail/owasp-CHANGEME}}&lt;br /&gt;
* [http://lists.owasp.org/mailman/listinfo/owasp-wash_dc_va Click here to join local chapter mailing list]&lt;br /&gt;
* Add July 9th to my [https://www.google.com/calendar/hosted/owasp.org/event?action=TEMPLATE&amp;amp;amp;tmeid=azV2NGU4Zjc0M2w1a3NxMG4zam84cXZyMmcgb3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGc&amp;amp;amp;tmsrc=b3dhc3Aub3JnXzFodDVvZWdrOGtkMGR0YXQ1Y2tvNzFlN2RjQGdyb3VwLmNhbGVuZGFyLmdvb2dsZS5jb20 Google Calendar], [https://cigital.webex.com/cigital/j.php?ED=124134682&amp;amp;UID=1012271922&amp;amp;ICS=MI&amp;amp;LD=1&amp;amp;RD=2&amp;amp;ST=1&amp;amp;SHA2=CWCN2pMk85HfkpUHyB/6CNfutu79JBWjL2LWsdybwEw= Exchange Calendar]&lt;br /&gt;
&lt;br /&gt;
==== Locations ====&lt;br /&gt;
'''If you plan to attend in person:'''&lt;br /&gt;
&lt;br /&gt;
Directions to Booz Allen's One Dulles facility:&lt;br /&gt;
&lt;br /&gt;
13200 Woodland Park Road&lt;br /&gt;
Herndon, VA 20171&lt;br /&gt;
&lt;br /&gt;
From Tyson's Corner:&lt;br /&gt;
&lt;br /&gt;
* Take LEESBURG PIKE / VA-7 WEST&lt;br /&gt;
* Merge onto VA-267 WEST / DULLES TOLL ROAD (Portions Toll)&lt;br /&gt;
* Take the VA-657 Exit (Exit Number 10 towards Herndon / Chantilly)&lt;br /&gt;
* Take the ramp toward CHANTILLY&lt;br /&gt;
* Turn Left onto CENTERVILLE ROAD (at end of ramp)&lt;br /&gt;
* Turn Left onto WOODLAND PARK ROAD (less than 1⁄2 mile)&lt;br /&gt;
* End at 13200 WOODLAND PARK ROAD&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;'''If you plan to attend via Webinar:'''&lt;br /&gt;
&lt;br /&gt;
You can attend through [[OWASPNoVA WebEx]] &lt;br /&gt;
&lt;br /&gt;
==== Schedule ====&lt;br /&gt;
'''Next Meeting'''&amp;lt;p&amp;gt;&lt;br /&gt;
Future speakers to include Gunnar Peterson, Dan Cornell, and more.&amp;lt;/p&amp;gt;&lt;br /&gt;
July 9th 6pm-9pm EST&amp;lt;br&amp;gt;&lt;br /&gt;
LOCATION: 13200 Woodland Park Road Herndon, VA 20171&amp;lt;BR&amp;gt;&lt;br /&gt;
TOPIC: &amp;quot;Ounce's O2&amp;quot;&amp;lt;BR&amp;gt;&lt;br /&gt;
SPEAKER(S): Dinis Cruz, OWASP, Ounce Labs.&amp;lt;BR&amp;gt;&lt;br /&gt;
PANEL: TBD&amp;lt;BR&amp;gt;&lt;br /&gt;
INSTRUCTIONS: RSVP through Stan Wisseman wisseman_stan@bah.com with “OWASP RSVP” in the subject.&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
DESCRIPTION: So what is O2?&lt;br /&gt;
&lt;br /&gt;
Well, in my mind O2 is a combination of advanced tools (Technology) which are designed to be used in a particular way (Process) by knowledgeable individuals (People).&lt;br /&gt;
&lt;br /&gt;
Think about it as a fighter jet that is able to go very fast, has tons of controls, needs to be piloted by somebody who knows what they are doing, and needs to have a purpose (i.e. a mission).&lt;br /&gt;
&lt;br /&gt;
Basically, what I did with O2 was automate the workflow that I have when I'm engaged in a source-code security review.&lt;br /&gt;
&lt;br /&gt;
Now, here is the catch: this version is NOT for the faint-of-heart. I designed this to suit my needs, which, although much the same as those of most other security consultants, have their own particularities :)&lt;br /&gt;
&lt;br /&gt;
The whole model of O2 development is based around the concept of automating a security consultant’s brain, so I basically ensure that the main O2 developer (Dinis Cruz) has a very good understanding of the feature requirements of the targeted security consultant (Dinis Cruz) :) . And this proved (even to my surprise) spectacularly productive, since suddenly I (i.e. the security consultant) didn't have to wait months for new features to be added to my toolkit. If something needed to be added, it would just be added in days or hours.&lt;br /&gt;
&lt;br /&gt;
* View the OWASP NoVA Chapter [http://www.google.com/calendar/hosted/owasp.org/embed?src=owasp.org_1ht5oegk8kd0dtat5cko71e7dc%40group.calendar.google.com&amp;amp;ctz=America/New_York Calendar ]&lt;br /&gt;
&lt;br /&gt;
* The next meeting is '''Thursday, July 9th, 2009.''' &lt;br /&gt;
&lt;br /&gt;
==== Knowledge ====&lt;br /&gt;
&lt;br /&gt;
==== Contributors and Sponsors ====&lt;br /&gt;
&lt;br /&gt;
'''Chapter Leader'''&lt;br /&gt;
&lt;br /&gt;
* [mailto:John.Steven@owasp.org John Steven], with assistance from [mailto:paco@cigital.com Paco Hope]&lt;br /&gt;
&lt;br /&gt;
'''Refreshment Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Cigital_OWASP.GIF]]&lt;br /&gt;
&lt;br /&gt;
'''Facility Sponsors'''&lt;br /&gt;
&lt;br /&gt;
[[Image:Bah-bw.JPG|215px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__NOTOC__&lt;br /&gt;
&amp;lt;headertabs/&amp;gt;&lt;br /&gt;
&amp;lt;paypal&amp;gt;Northern Virginia&amp;lt;/paypal&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Past Meetings ==&lt;br /&gt;
&lt;br /&gt;
===June 2009===&lt;br /&gt;
''Gary McGraw, Cigital Inc.'':''Building Security In Maturity Model''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, an interview:&lt;br /&gt;
''Jim Routh, formerly of DTCC'': ''The Economic Advantages of a Resilient Supply Chain - Software Security''&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Gary McGraw talked about the experience he, Sammy Migues, and Brian Chess gained conducting a survey of some of America's top software security groups. Study results are available under the [http://creativecommons.org/licenses/by-sa/3.0/ Creative Commons Share Alike license] at [http://www.bsi-mm.com www.bsi-mm.com]. Gary described the common structural elements and activities of successful software security programs, presented the maturity model that resulted from the survey data, and discussed lessons learned from listening to those leading these groups. &lt;br /&gt;
&lt;br /&gt;
Jim Routh gave an incredibly insightful interview regarding his own experiences crafting a security group. &lt;br /&gt;
&lt;br /&gt;
Download presentation notes at: [http://www.owasp.org/images/0/03/JMR-Economics_of_Security_Goups.ppt The Economic Advantages of a Resilient Supply Chain- Software Security]&lt;br /&gt;
&lt;br /&gt;
===May 2009 ===&lt;br /&gt;
''Eric Dalci, Cigital Inc.'':''Introduction to Static Analysis''&amp;lt;BR&amp;gt;&lt;br /&gt;
Later, a panel:&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Steven Lavenhar, Booz Allen Hamilton;&lt;br /&gt;
&amp;lt;LI&amp;gt;Eric Dalci, Cigital Inc.&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
Panel moderated by John Steven&lt;br /&gt;
&amp;lt;BR&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This session is an introduction to Static Analysis. It presents the different types of analysis used by today's Static Analysis tools. Examples of their direct application to finding vulnerabilities will be shown (e.g. Data Flow Analysis, Semantic, Control Flow, etc.). Current limitations of Static Analysis will also be discussed. This session is tool agnostic, but will cover the approach taken by various leading commercial (as well as open-source) tools.&lt;br /&gt;
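The source-to-sink reasoning behind data flow analysis can be sketched in a few lines. The toy below is purely illustrative (every function name — read_param, run_sql, html_escape — is hypothetical), working on a flat list of statements rather than the full ASTs and control-flow graphs real tools build:

```python
# Illustrative sketch of taint-style data flow analysis over a tiny
# three-address program. Real static analysis tools operate on full
# ASTs/CFGs; all function names here are hypothetical.

SOURCES = {"read_param"}      # functions that introduce untrusted data
SINKS = {"run_sql"}           # functions that must not receive tainted data
SANITIZERS = {"html_escape"}  # functions whose result is considered clean

def find_taint_flows(program):
    """Each statement is (target_var, function_name, arg_vars)."""
    tainted = set()
    findings = []
    for target, func, args in program:
        arg_tainted = any(a in tainted for a in args)
        if func in SINKS and arg_tainted:
            findings.append((func, args))     # tainted data reached a sink
        if func in SOURCES:
            tainted.add(target)
        elif func in SANITIZERS:
            tainted.discard(target)           # sanitizer output is clean
        elif arg_tainted:
            tainted.add(target)               # taint propagates via assignment
    return findings

program = [
    ("q", "read_param", []),    # q = read_param()   <- source
    ("s", "concat", ["q"]),     # s = concat(q)      <- propagates taint
    ("_", "run_sql", ["s"]),    # run_sql(s)         <- tainted sink!
]
print(find_taint_flows(program))  # [('run_sql', ['s'])]
```

Semantic, structural, and control flow analyses (covered later in the track) layer different kinds of rules on top of the same program model.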
&lt;br /&gt;
Download: [http://www.owasp.org/images/e/ea/OWASP_Virginia_Edalci_May09.pdf Intro to Static Analysis]&lt;br /&gt;
&lt;br /&gt;
===April 2009 ===&lt;br /&gt;
''Jeremiah Grossman, Whitehat Security'': '''Top 10 Web Hacking Techniques 2008'''&amp;lt;br&amp;gt;&lt;br /&gt;
Jeremiah spoke on (what he and colleagues determined were the) top ten web hacking techniques of 2008. This talk was a preview of his RSA '09 talk.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Download http://www.whitehatsec.com/home/assets/presentations/09PPT/PPT_OWASPNoVA04082008.pdf&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Later,&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Nate Miller, Stratum Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Jeremiah Grossman, Whitehat Security;&lt;br /&gt;
&amp;lt;LI&amp;gt;Tom Brennan, Whitehat Security; and&lt;br /&gt;
&amp;lt;LI&amp;gt;Wade Woolwine, AOL&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
served as a penetration testing panel answering questions posed and moderated by Ken Van Wyk.&lt;br /&gt;
&lt;br /&gt;
=== February 2009 ===&lt;br /&gt;
&lt;br /&gt;
''Ryan C. Barnett, Breach Security'': '''Patching Challenge: Securing WebGoat with ModSecurity'''&lt;br /&gt;
&lt;br /&gt;
Identification of web application vulnerabilities is only half the battle, with remediation being the other half. Let's face facts: there are many real-world business scenarios where it is not possible to update web application code in a timely manner, or at all. This is where the tactical use case of implementing a web application firewall to address identified issues proves its worth.&lt;br /&gt;
&lt;br /&gt;
This talk will provide an overview of the recommended practices for utilizing a web application firewall for virtual patching. After discussing the framework to use, we will then present a very interesting OWASP Summer of Code project where the challenge was to attempt to mitigate as many of the OWASP WebGoat vulnerabilities as possible using the open source ModSecurity web application firewall. During the talk, we will discuss both WebGoat and ModSecurity and provide in-depth walk-throughs of some of the complex fixes. Examples will include addressing not only attacks but the underlying vulnerabilities, using data persistence for multiple-step processes, content injection, and even examples of the new Lua programming language API. The goal of this talk is both to highlight cutting edge mitigation options using a web application firewall and to show how it can effectively be used by security consultants who traditionally could only offer source code fixes.&lt;br /&gt;
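The virtual-patching idea — filtering known attack input in front of code you cannot fix yet — can be sketched in plain Python. This is not ModSecurity rule syntax; the handler, parameter names, and deny rule below are all hypothetical:

```python
import re

# Illustrative sketch of virtual patching: a filter in front of an unfixed
# handler rejects requests matching a known attack pattern. A real WAF such
# as ModSecurity expresses rules declaratively; this toy uses plain Python.

VIRTUAL_PATCHES = [
    # Block classic SQL injection probes appearing in any parameter value.
    re.compile(r"('|--|\bUNION\b|\bOR\b\s+1=1)", re.IGNORECASE),
]

def protected(handler):
    """Wrap a request handler; deny requests matching any patch rule."""
    def wrapper(params):
        for value in params.values():
            if any(rule.search(value) for rule in VIRTUAL_PATCHES):
                return 403, "request blocked by virtual patch"
        return handler(params)
    return wrapper

@protected
def lookup_user(params):  # stands in for the still-vulnerable application code
    return 200, "user record for " + params["name"]

print(lookup_user({"name": "alice"}))         # passes through: status 200
print(lookup_user({"name": "x' OR 1=1 --"}))  # blocked: status 403
```

The application code itself is unchanged; the patch lives entirely in the wrapper, which is exactly the appeal when a timely code fix is impossible.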
&lt;br /&gt;
Ryan C. Barnett is the Director of Application Security Research at Breach Security and leads Breach Security Labs. He is also a Faculty Member for the SANS Institute, Team Lead for the Center for Internet Security Apache Benchmark Project, and a Member of the Web Application Security Consortium, where he leads the Distributed Open Proxy Honeypot Project. Mr. Barnett has also authored a web security book for Addison-Wesley entitled &amp;quot;Preventing Web Attacks with Apache.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
(This talk is a preview of Ryan's talk at Blackhat Federal the following week - see https://www.blackhat.com/html/bh-dc-09/bh-dc-09-speakers.html#Barnett )&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Virtual_Patching_Ryan_Barnett_Blackhat_Federal_09.zip| WAF Virtual Patching Challenge: Securing WebGoat with ModSecurity]]&lt;br /&gt;
&lt;br /&gt;
''John Steven, Cigital'': '''Moving Beyond Top N Lists'''&lt;br /&gt;
&lt;br /&gt;
Download [[Media:Moving_Beyond_Top_N_Lists.ppt.zip| Moving Beyond Top N Lists]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cigital published an article: The Top 11 Reasons Why Top 10 (or 25) Lists Don’t Work. Yet, these lists are a staple of conference abstracts, industry best practice lists, and the like. Are they good or bad? We’ll explore how to get beyond the Top 10 (or 25) list in making your software security effort real.&lt;br /&gt;
&lt;br /&gt;
John is Senior Director, Advanced Technology Consulting at Cigital. His experience includes research in static code analysis and hands-on architecture and implementation of high-performance, scalable Java EE systems. John has provided security consulting services to a broad variety of commercial clients including two of the largest trading platforms in the world and has advised America's largest internet provider in the Midwest on security and forensics. John led the development of Cigital's architectural analysis methodology and its approach to deploying enterprise software security frameworks. He has demonstrated success in building Cigital's intellectual property for providing cutting-edge security. He brings this experience and a track record of effective strategic innovation to clients seeking to change, whether to adopt more cutting-edge approaches, or to solidify ROI. John currently chairs the SD Best Practices security track and co-edits the building security in department of IEEE's Security and Privacy magazine. John has served on numerous conference panels regarding software security, wireless security and Java EE system development. He holds a B.S. in Computer Engineering and an M.S. in Computer Science from Case Western Reserve University.&lt;br /&gt;
&lt;br /&gt;
=== January 2009 ===&lt;br /&gt;
&lt;br /&gt;
To kick off 2009, our January meeting featured a discussion of the relationship between application security and CMMI, and an overview of the OWASP ASVS project.&lt;br /&gt;
&lt;br /&gt;
''Michele Moss, Booz Allen Hamilton'': '''Evolutions In The Relationship Between Application Security And The CMMI'''&lt;br /&gt;
 &lt;br /&gt;
Addressing new and complex threats and IT security challenges requires repeatable, reliable, rapid, and cost effective solutions.  To implement these solutions, organizations have begun to align their security improvement efforts with their system and software development practices.  During a “Birds of a Feather” at the March 2007 SEPG, a group of industry representatives initiated an effort which led to the definition of assurance practices that can be applied in the context of the CMMI. This presentation will provide an understanding of how applying the assurance practices in the context of security contributes to the overall increased quality of products and services, illustrate how a focus on assurance in the context of CMMI practices is related to application security practices, and present an approach to evaluating and improving the repeatability and reliability of assurance practices. &lt;br /&gt;
 &lt;br /&gt;
Michele Moss, CISSP, is a security engineer with more than 12 years of experience in process improvement. She specializes in integrating assurance processes and practices into project lifecycles. Michele is the Co-Chair of the DHS Software Assurance Working Group on Processes &amp;amp; Practices. She has assisted numerous organizations with maturing their information technology, information assurance, project management, and support practices through the use of capability maturity models, including the CMMI and the SSE-CMM. She is one of the key contributors in an effort to apply an assurance focus to CMMI.&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/Moss-AppSecurityAndCMMI.pdf]&lt;br /&gt;
&lt;br /&gt;
''Mike Boberski, Booz Allen Hamilton'': '''About OWASP ASVS'''&lt;br /&gt;
&lt;br /&gt;
The primary aim of the OWASP ASVS Project is to normalize the range&lt;br /&gt;
of coverage and level of rigor available in the market when it comes to&lt;br /&gt;
performing application-level security verification. The goal is to&lt;br /&gt;
create a set of commercially-workable open standards that are tailored&lt;br /&gt;
to specific web-based technologies.&lt;br /&gt;
&lt;br /&gt;
Mike Boberski works at Booz Allen Hamilton. He has a background in&lt;br /&gt;
application security and the use of cryptography by applications. He is&lt;br /&gt;
experienced in trusted product evaluation, security-related software&lt;br /&gt;
development and integration, and cryptomodule testing. For OWASP, he is&lt;br /&gt;
the project lead and a co-author of the  OWASP Application Security&lt;br /&gt;
Verification Standard, the first OWASP standard.&lt;br /&gt;
&lt;br /&gt;
Slides available: [https://www.owasp.org/images/5/52/About_OWASP_ASVS_Web_Edition.ppt]&lt;br /&gt;
&lt;br /&gt;
=== November 2008 ===&lt;br /&gt;
For our November 2008 meeting, we had two great presentations on software assurance and security testing.&lt;br /&gt;
&lt;br /&gt;
''Nadya Bartol, Booz Allen Hamilton'': '''Framework for Software Assurance'''&lt;br /&gt;
&lt;br /&gt;
Nadya's presentation will provide an update on the Software Assurance&lt;br /&gt;
Forum efforts to establish a comprehensive framework for software&lt;br /&gt;
assurance (SwA) and security measurement.  The Framework addresses&lt;br /&gt;
measuring achievement of SwA goals and objectives within the context of&lt;br /&gt;
individual projects, programs, or enterprises.  It targets a variety of&lt;br /&gt;
audiences including executives, developers, vendors, suppliers, and&lt;br /&gt;
buyers.  The Framework leverages existing measurement methodologies,&lt;br /&gt;
including Practical Software and System Measurement (PSM); CMMI Goal,&lt;br /&gt;
Question, Indicator, Measure (GQ(I)M);  NIST SP 800-55 Rev1; and ISO/IEC&lt;br /&gt;
27004 and identifies commonalities among the methodologies to help&lt;br /&gt;
organizations integrate SwA measurement in their overall measurement&lt;br /&gt;
efforts cost-effectively and as seamlessly as possible, rather than&lt;br /&gt;
establish a standalone SwA measurement effort within an organization.&lt;br /&gt;
The presentation will provide an update on the SwA Forum Measurement&lt;br /&gt;
Working Group work, present the current version of the Framework and underlying measures&lt;br /&gt;
development and implementation processes, and propose example SwA&lt;br /&gt;
measures applicable to a variety of SwA stakeholders.  The presentation&lt;br /&gt;
will update the group on the latest NIST and ISO standards on&lt;br /&gt;
information security measurement that are being integrated into the&lt;br /&gt;
Framework as the standards are being developed.&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/Bartol-MeasurementForOWASP11-13-08.pdf]&lt;br /&gt;
&lt;br /&gt;
''Paco Hope, Cigital'': '''The Web Security Testing Cookbook'''&lt;br /&gt;
&lt;br /&gt;
The Web Security Testing Cookbook (O'Reilly &amp;amp; Associates, October 2008)&lt;br /&gt;
gives developers and testers the tools they need to make security&lt;br /&gt;
testing a regular part of their development lifecycle. Its recipe style&lt;br /&gt;
approach covers manual, exploratory testing as well automated techniques&lt;br /&gt;
that you can make part of your unit tests or regression cycle. The&lt;br /&gt;
recipes cover the basics like observing messages between clients and&lt;br /&gt;
servers, to multi-phase tests that script the login and execution of web&lt;br /&gt;
application features. This book complements many of the security texts&lt;br /&gt;
in the market that tell you what a vulnerability is, but not how to&lt;br /&gt;
systematically test it day in and day out. Leverage the recipes in this&lt;br /&gt;
book to add significant security coverage to your testing without adding&lt;br /&gt;
significant time and cost to your effort.&lt;br /&gt;
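The multi-phase recipe style described above — script a login, exercise a feature as that session, then assert on a security property — can be sketched as a self-contained regression test. The tiny in-memory app below is a hypothetical stand-in for a real site; in practice each step would be an HTTP request from your unit or regression suite:

```python
import uuid

# Illustrative sketch of a cookbook-style multi-phase security test.
# FakeWebApp and its credential are hypothetical stand-ins for a real
# web application exercised over HTTP.

class FakeWebApp:
    def __init__(self):
        self.sessions = {}

    def login(self, user, password):
        if password == "s3cret":              # hypothetical fixture credential
            token = uuid.uuid4().hex
            self.sessions[token] = user
            return token
        return None

    def view_profile(self, token, target_user):
        user = self.sessions.get(token)
        if user is None:
            return "401 not logged in"
        if user != target_user:
            return "403 forbidden"            # the property under test
        return "profile of " + user

def test_cannot_view_other_users_profile():
    app = FakeWebApp()
    token = app.login("alice", "s3cret")                       # phase 1: log in
    assert app.view_profile(token, "alice").startswith("profile")
    assert app.view_profile(token, "bob") == "403 forbidden"   # phase 2: probe

test_cannot_view_other_users_profile()
print("access-control regression test passed")
```

Once scripted this way, the check runs on every build rather than only during a one-off manual review, which is the book's central argument.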
&lt;br /&gt;
Congratulations to Tim Bond who won an autographed copy of Paco's book.&lt;br /&gt;
Get your copy here: [http://www.amazon.com/Security-Testing-Cookbook-Paco-Hope/dp/0596514832]&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/PacoHope-WebSecCookbook.pdf]&lt;br /&gt;
&lt;br /&gt;
=== October 2008 ===&lt;br /&gt;
For our October 2008 meeting, we had two fascinating talks relating to forensics.&lt;br /&gt;
&lt;br /&gt;
''Dave Merkel, Mandiant'': '''Enterprise Grade Incident Management - Responding to Persistent Threats'''&lt;br /&gt;
&lt;br /&gt;
Dave Merkel is Vice President of Products at Mandiant, a leading provider of information security services, education and products. Mr. Merkel has worked in the information security and incident response industry for over 10 years. His background includes service as a federal agent in the US Air Force and over 7 years experience directing security operations at America Online. He currently oversees the product business at Mandiant, and is in charge of building Mandiant Intelligent Response - an enterprise incident response solution. But no, he won't be selling you anything today.&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/Mandiant-EnterpriseIRandAPTpresentation.pdf]&lt;br /&gt;
&lt;br /&gt;
''Inno Eroraha, NetSecurity'': '''Responding to the Digital Crime Scene: Gathering Volatile Data'''&lt;br /&gt;
&lt;br /&gt;
Inno Eroraha is the founder and chief strategist of NetSecurity Corporation, a company that provides digital forensics, hands-on security consulting, and Hands-on How-To® training solutions that are high-quality, timely, and customer-focused. In this role, Mr. Eroraha helps clients plan, formulate, and execute the best security and forensics strategy that aligns with their business goals and priorities. He has consulted with Fortune 500 companies, IRS, DHS, VA, DoD, and other entities.&lt;br /&gt;
&lt;br /&gt;
Slides available: [http://www.epsteinmania.com/owasp/NetSecurity-RespondingToTheDigitalCrimeScene-GatheringVolatileData-TechnoForensics-102908.pdf]&lt;br /&gt;
&lt;br /&gt;
==Knowledge==&lt;br /&gt;
On the [[Knowledge]] page, you'll find links to this chapter's contributions organized by topic area.&lt;br /&gt;
 &lt;br /&gt;
[[Category:Virginia]]&lt;br /&gt;
[[Category:Washington, DC]]&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66018</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=66018"/>
				<updated>2009-07-15T18:19:03Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Static Analysis Curriculum */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:Owaspsa-track2.PNG|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The physical number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will have to fill out a short questionnaire before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge about code is also required in all sessions except the last one. We will start registration by email mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions related to comparisons between the vendors present are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
*Download slides from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speakers: Eric Dalci (Cigital), Bruce Mayhew (Ounce Labs), and Fortify (TBD)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. Ounce Labs will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison. We will have an open discussion session after each demo for students to ask questions of the vendors. Vendors should not interfere with each other’s session. Questions related to tool comparison will not be answered, since this is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA)&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
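The control flow rule in this agenda ("always call securityCheck() before downloadFile()") is the easiest rule type to picture. The toy checker below is only a sketch: real Fortify custom rules are written declaratively over the analyzer's program model, whereas this works on hypothetical per-function call traces:

```python
# Illustrative sketch of a control flow rule: flag any function that calls
# the sensitive API before the required guard. Real SCA custom rules are
# declarative; the audited functions below are hypothetical.

def check_control_flow_rule(functions, guard="securityCheck",
                            sensitive="downloadFile"):
    """functions maps a function name to its ordered list of calls."""
    findings = []
    for name, calls in functions.items():
        guarded = False
        for call in calls:
            if call == guard:
                guarded = True
            elif call == sensitive and not guarded:
                findings.append(name)   # sensitive call with no prior guard
                break
    return findings

code_model = {
    "exportReport": ["securityCheck", "downloadFile"],  # compliant
    "quickFetch": ["downloadFile"],                     # violates the rule
}
print(check_control_flow_rule(code_model))  # ['quickFetch']
```

Structural and configuration rules from the agenda follow the same pattern: encode the corporate standard once, then let every scan enforce it.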
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
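The "data flow cleanse" item above is worth a concrete picture: HTML-encoding untrusted input before it reaches an HTML sink is exactly the kind of step a custom cleanse rule teaches the tool to recognize, so the flow is no longer reported as cross-site scripting. A minimal sketch, with a hypothetical render function, using Python's standard html module:

```python
import html

# Illustrative sketch of a cleanse step on a tainted data flow: the
# untrusted comment is HTML-encoded before reaching the HTML sink.
# render_comment is a hypothetical application function.

def render_comment(comment):
    # html.escape is the cleanse a custom rule would mark as sanitizing
    return "<p>" + html.escape(comment) + "</p>"

print(render_comment("<script>alert(1)</script>"))
# The payload comes out as inert text (&lt;script&gt;...), not executable markup.
```

Without the cleanse rule, the tool would keep flagging this flow as a false positive; with it, audit effort goes to the flows that really are unescaped.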
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to successfully adopt such tools. It covers who should adopt the tool, what steps they should take, whom they should involve, and how long adoption will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their core &amp;quot;out of the box&amp;quot; capabilities sell these tools dramatically short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now-defunct &amp;quot;CodeAssure&amp;quot; from what was then Secure Software.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=File:Owaspsa-track2.PNG&amp;diff=66017</id>
		<title>File:Owaspsa-track2.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=File:Owaspsa-track2.PNG&amp;diff=66017"/>
				<updated>2009-07-15T18:18:29Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62895</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62895"/>
				<updated>2009-05-28T07:59:58Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 2: Tool Assisted Code Reviews, July 30th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Students’ prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions about tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste about what Static Analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Bruce Mayhew (Ounce Lab) and Fortify (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat. Ounce Labs will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison. We will have an open discussion session after each demo for students to ask the vendors questions. Vendors should not interfere with each other’s sessions. Questions related to tool comparison will not be answered, since that is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA)&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
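To make the rule types above concrete, here is a minimal, hypothetical Java sketch of the pattern the control-flow rule targets (&amp;quot;always call securityCheck() before downloadFile()&amp;quot;). The method names come from the agenda; the class itself is invented for illustration and is not a real Fortify API:&lt;br /&gt;

```java
// Hypothetical sketch of the pattern a control-flow rule targets:
// "always call securityCheck() before downloadFile()". Both method
// names come from the agenda above; the class itself is invented.
public class Download {
    private boolean checked = false;

    public void securityCheck() { checked = true; }

    public String downloadFile(String name) {
        // A control-flow rule flags any path that reaches downloadFile()
        // without a preceding securityCheck() call on the same object.
        return (checked ? "OK: " : "VIOLATION: ") + name;
    }

    public static void main(String[] args) {
        Download bad = new Download();
        System.out.println(bad.downloadFile("report.pdf"));  // flagged path

        Download good = new Download();
        good.securityCheck();                                // required first step
        System.out.println(good.downloadFile("report.pdf")); // compliant path
    }
}
```

The tool reasons over call ordering statically, so both the flagged and the compliant sequence above would be reported without running the code.&lt;br /&gt;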
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
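As in the Fortify session, the data-flow items above describe a source-to-sink pattern. A minimal, hypothetical Java sketch follows: getEmployeeSSN() is the agenda's example source, while customLog() (the sink) and mask() (the cleanse function) are invented names for illustration:&lt;br /&gt;

```java
// Hypothetical sketch of the source -> sink pattern a data-flow rule targets.
// getEmployeeSSN() is the agenda's example source; customLog() (sink) and
// mask() (cleanse function) are invented names for illustration.
public class AuditExample {
    // Source rule: this method returns private data.
    static String getEmployeeSSN() { return "123-45-6789"; }

    // Sink rule: the argument of this custom logger receives private data.
    static String customLog(String msg) { return "LOG: " + msg; }

    // Cleanse rule: masking breaks the taint, so flows through mask()
    // are not flagged.
    static String mask(String ssn) { return ssn.replaceAll("[0-9]", "*"); }

    public static void main(String[] args) {
        String ssn = getEmployeeSSN();
        System.out.println(customLog(ssn));        // source -> sink: flagged
        System.out.println(customLog(mask(ssn)));  // cleansed: not flagged
    }
}
```

Teaching the tool that mask() is a cleanse function is exactly the kind of customization that cuts false positives without hiding the genuinely tainted flow.&lt;br /&gt;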
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&amp;lt;p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, who they should involve, how long it will take, and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62894</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62894"/>
				<updated>2009-05-28T07:58:30Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) Northern Virginia Chapter].&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Students’ prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions about tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste about what Static Analysis is.&lt;br /&gt;
*Download slide from [http://www.owasp.org/index.php/Virginia_(Northern_Virginia) here]&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Bruce Mayhew (Ounce Lab) and Fortify (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat.&lt;br /&gt;
* Ounce Labs will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion session after each demo for students to ask the vendors questions.&lt;br /&gt;
* Vendors should not interfere with each other’s sessions. Questions related to tool comparison will not be answered, since that is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA)&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&amp;lt;p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, who they should involve, how long it will take, and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62893</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62893"/>
				<updated>2009-05-28T07:56:49Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Estimated classroom size for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may have a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Students’ prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, MacOS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions about tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital)&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste about what Static Analysis is.&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Bruce Mayhew (Ounce Lab) and Fortify (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan WebGoat.&lt;br /&gt;
* Ounce Labs will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion session after each demo for students to ask the vendors questions.&lt;br /&gt;
* Vendors should not interfere with each other’s sessions. Questions related to tool comparison will not be answered, since that is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA)&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan (Cigital)&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands on setup as in logistic section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi (Cigital)&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation covers a comprehensive set of steps organizations can take to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell them incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62892</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62892"/>
				<updated>2009-05-28T07:55:54Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Session 2: Tool Assisted Code Reviews, July 30th 2009 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but there may be a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will have to complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required in all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed.&lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions related to tool comparison between the vendors present are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric (Cigital), Bruce Mayhew (Ounce Lab) and Fortify (tbd)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan Webgoat.&lt;br /&gt;
* Ounce Lab will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion session after each demo for students to ask questions to the vendors.&lt;br /&gt;
* Vendors should not interfere with each other’s session. Questions related to tool comparison will not be answered since this is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2)&lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation covers a comprehensive set of steps organizations can take to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell them incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62891</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62891"/>
				<updated>2009-05-28T07:54:58Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Contacts */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but there may be a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will have to complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required in all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed.&lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions related to tool comparison between the vendors present are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric, Bruce Mayhew (Ounce Lab) and Fortify (?)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introduction course to two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan Webgoat.&lt;br /&gt;
* Ounce Lab will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion session after each demo for students to ask questions to the vendors.&lt;br /&gt;
* Vendors should not interfere with each other’s session. Questions related to tool comparison will not be answered since this is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2)&lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation covers a comprehensive set of steps organizations can take to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
Those who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell them incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62890</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62890"/>
				<updated>2009-05-28T07:54:40Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Static Analysis Curriculum */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Contacts===&lt;br /&gt;
Questions related to this curriculum should be sent to [mailto:John.Steven@owasp.org John Steven], who is the Northern Virginia chapter leader.&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but there may be a hard limit of 40 students.&lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for many sessions. Students will have to complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required in all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, macOS, and Unix users. Laptops should have at least 2 GB of RAM and an SSH client installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions about comparisons between the present vendors’ tools are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric, Bruce Mayhew (Ounce Lab) and Fortify (?)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan Webgoat.&lt;br /&gt;
* Ounce Lab will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion session after each demo for students to ask questions to the vendors.&lt;br /&gt;
* Vendors should not interfere with each other’s session. Questions related to tool comparison will not be answered since this is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
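To make the rule types in the agenda above concrete, the following is a minimal sketch of what a data flow rule checks: untrusted or private data entering at a source must not reach a sink unless a cleanse call intervenes. This is purely illustrative Python, not Fortify rule syntax, and every function name in it is hypothetical.

```python
# Purely illustrative; not Fortify rule syntax. All names are hypothetical.
SOURCES = {"get_request_param", "get_employee_ssn"}  # taint enters here
SINKS = {"log_message"}                              # taint must not reach here
CLEANSERS = {"html_escape"}                          # calls that neutralize taint

def flags_flow(call_chain):
    """Return True when tainted data reaches a sink without a cleanse.

    call_chain is the ordered list of calls the data passes through.
    """
    tainted = False
    for call in call_chain:
        if call in SOURCES:
            tainted = True
        elif call in CLEANSERS:
            tainted = False
        elif call in SINKS and tainted:
            return True  # finding: uncleansed source-to-sink flow
    return False

print(flags_flow(["get_request_param", "log_message"]))                 # True
print(flags_flow(["get_request_param", "html_escape", "log_message"]))  # False
```

The cleanse and pass-through items in the agenda map onto the CLEANSERS step: a cleanse resets the taint, while a pass-through would simply propagate it unchanged.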
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
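As a rough illustration of the "semantic" rule type listed above: a semantic rule flags any use of a sensitive API regardless of where its data flows afterward. The sketch below simply scans source text for the agenda's example API, getEmployeeSSN(); it is illustrative Python, not the Ounce rule format.

```python
# Sketch of a "semantic" check: flag every call site of a sensitive API.
# Illustrative only; real tools match on the parsed program, not raw text.
import re

SENSITIVE_CALL = re.compile(r"\bgetEmployeeSSN\s*\(")

def findings(source_text):
    """Yield (line_number, line) for each use of the sensitive API."""
    for number, line in enumerate(source_text.splitlines(), start=1):
        if SENSITIVE_CALL.search(line):
            yield number, line.strip()

code = "\n".join([
    "String ssn = employee.getEmployeeSSN();",
    "log.info(ssn);",
])
print(list(findings(code)))  # reports line 1 only
```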
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
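One concrete shape the "continuously integrate" item above can take is a CI gate that reads the scanner's exported findings and fails the build when blocking priorities appear. The report format below is invented for illustration; Fortify and Ounce each have their own export formats.

```python
# Hypothetical CI gate: fail the build on high-priority scanner findings.
# The JSON report shape here is made up for the example.
import json

def build_should_fail(report_json, blocking=("P1", "P2")):
    """Return True if any finding carries a blocking priority."""
    findings = json.loads(report_json)
    return any(f["priority"] in blocking for f in findings)

report = json.dumps([
    {"rule": "sql-injection", "priority": "P1"},
    {"rule": "weak-hash", "priority": "P3"},
])
print(build_should_fail(report))  # True: the P1 finding blocks the build
```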
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&amp;lt;p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell the approach short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
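The accuracy claim above can be made tangible with made-up numbers: tuning rules to suppress noise and surface previously missed issues shifts precision substantially.

```python
# Made-up numbers, for illustration only: precision = TP / (TP + FP).
def precision(true_pos, false_pos):
    return true_pos / (true_pos + false_pos)

out_of_box = precision(40, 160)  # 40 real issues among 200 findings: 0.2
customized = precision(55, 45)   # noise suppressed, misses surfaced: 0.55
print(round(out_of_box, 2), round(customized, 2))
```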
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62889</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62889"/>
				<updated>2009-05-28T07:52:05Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we may impose a hard limit of 40 students. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, although we will give preference to people who attend regularly and sign up for multiple sessions. Students will have to complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH pre-installed. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, macOS, and Unix users. Laptops should have at least 2 GB of RAM and an SSH client installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor. Vendors are not allowed to interfere with other vendors’ sessions or demos. Questions about comparisons between the present vendors’ tools are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric, Bruce Mayhew (Ounce Lab) and Fortify (?)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan Webgoat.&lt;br /&gt;
* Ounce Lab will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion session after each demo for students to ask questions to the vendors.&lt;br /&gt;
* Vendors should not interfere with each other’s session. Questions related to tool comparison will not be answered since this is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&amp;lt;p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell the approach short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62888</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62888"/>
				<updated>2009-05-28T07:47:38Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Classroom size estimate for hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we will have to impose a hard limit of 40 attendees. &lt;br /&gt;
&lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, but we will give preference to people who attend regularly and sign up for multiple sessions. Students will also have to complete a short interview before the session so the instructors get to know their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH preinstalled. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, macOS, and Unix users. Laptops should have at least 2 GB of RAM and an SSH client installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of the tool in a remote VM image. This is NOT a competition. The purpose is NOT to compare tools; different source code will be picked for each vendor, which will mitigate side-by-side comparison of tools &amp;amp; findings. Vendors are not allowed to interfere with other vendors’ sessions or demos (note: we have only 2 vendors, Fortify &amp;amp; Ounce Lab). Questions about comparisons between the present vendors’ tools are out of scope. Vendors are free to present features and capabilities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speaker: Dalci, Eric, Bruce Mayhew (Ounce Lab) and Fortify (?)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan Webgoat.&lt;br /&gt;
* Ounce Lab will demo its tool, but we may scan a different project such as HacmeBank just to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion session after each demo for students to ask questions to the vendors.&lt;br /&gt;
* Vendors should not interfere with each other’s session. Questions related to tool comparison will not be answered since this is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with review depth, the threats facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&amp;lt;p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62887</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62887"/>
				<updated>2009-05-28T07:46:57Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
===Logistics Preparation for hands on===&lt;br /&gt;
&lt;br /&gt;
Classroom size estimate for the hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we will have to enforce a hard limit of 40 attendees. &lt;br /&gt;
&lt;br /&gt;
===Registration=== &lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, but we will give preference to people who attend regularly and sign up for multiple sessions. Students will also fill out a short questionnaire before the session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH preinstalled. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
===Student’s prerequisites===&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
===Tool license and Vendor IP===&lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of each tool in a remote VM image. This is NOT a competition, and the purpose is NOT to compare tools: different source code will be picked for each vendor, which will discourage side-by-side comparison of tools and findings. Vendors are not allowed to interfere with other vendors’ sessions or demos (note: we have only two vendors, Fortify and Ounce Labs). Questions related to tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
===Session 1: Intro To Static Analysis, May 7th 2009===&lt;br /&gt;
*Speaker: Eric Dalci&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
&lt;br /&gt;
===Session 2: Tool Assisted Code Reviews, July 30th 2009===&lt;br /&gt;
*Speakers: Eric Dalci, Bruce Mayhew (Ounce Labs), and Fortify (?)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan Webgoat.&lt;br /&gt;
* Ounce Labs will demo its tool, but may scan a different project, such as HacmeBank, to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion after each demo for students to ask the vendors questions.&lt;br /&gt;
* Vendors should not interfere with each other’s sessions. Questions related to tool comparison will not be answered, since that is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
===Session 3: Customization Lab (Fortify), August 13th 2009===&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
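The bracketed rule descriptions above can be made concrete with a short Java sketch. Every name below (RuleTargetsDemo, getEmployeeSSN, logDebug, securityCheck, downloadFile) is a hypothetical illustration of code that such custom rules would flag, not part of Fortify SCA or any real codebase:&lt;br /&gt;
&lt;br /&gt;

```java
// Hypothetical Java patterns that the custom rule types above would match.
// Every class and method name here is illustrative only.
public class RuleTargetsDemo {

    // Semantic rule target: any call to a sensitive API such as getEmployeeSSN().
    static String getEmployeeSSN(String employeeId) {
        return "123-45-6789"; // placeholder value
    }

    // Data-flow sink: a custom logger that private data should never reach.
    static void logDebug(String message) {
        System.out.println("DEBUG: " + message);
    }

    // Control-flow rule: securityCheck() must always be called before downloadFile().
    static boolean securityCheck(String user) {
        return "admin".equals(user);
    }

    static String downloadFile(String path) {
        return "contents of " + path;
    }

    public static void main(String[] args) {
        // Data-flow finding: private data (an SSN) flows into the logger.
        String ssn = getEmployeeSSN("E100");
        logDebug("ssn=" + ssn);

        // Control-flow finding: downloadFile() is reached without securityCheck().
        System.out.println(downloadFile("/reports/q3"));
    }
}
```

A source rule on getEmployeeSSN() plus a sink rule on logDebug() would report the first flow; a control-flow rule would report the unguarded downloadFile() call.&lt;br /&gt;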
&lt;br /&gt;
===Session 4: Customization Lab (Ounce Lab), August 27th 2009===&lt;br /&gt;
*Speaker: Nabil Hannan&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
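A data-flow &amp;quot;cleanse&amp;quot; rule tells the analyzer that a routine neutralizes tainted input, so flows passing through it are no longer reported. Below is a minimal Java sketch of such a routine; the name htmlEncode is hypothetical, and the Unicode escapes stand for the less-than, greater-than, and ampersand characters being encoded:&lt;br /&gt;
&lt;br /&gt;

```java
// Sketch of a sanitizer that a data-flow cleanse rule could be registered for.
// The name htmlEncode is hypothetical; \u003c, \u003e, and \u0026 are the
// Unicode escapes for the less-than, greater-than, and ampersand characters.
public class CleanseDemo {

    // Replaces HTML markup characters with their entity references, so
    // tainted input passed through this method is safe to echo into a page.
    static String htmlEncode(String input) {
        StringBuilder out = new StringBuilder();
        for (int i = 0; i != input.length(); i++) {
            char c = input.charAt(i);
            if (c == '\u0026') {
                out.append("\u0026amp;");
            } else if (c == '\u003c') {
                out.append("\u0026lt;");
            } else if (c == '\u003e') {
                out.append("\u0026gt;");
            } else {
                out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(htmlEncode("\u003cscript\u003ealert(1)\u003c/script\u003e"));
    }
}
```

Once htmlEncode() is registered as a cleanse point, flows from a taint source through it to an output sink would no longer be reported as cross-site scripting findings.&lt;br /&gt;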
&lt;br /&gt;
===Session 5: Tool Adoption and Deployment, September 17th 2009===&lt;br /&gt;
*Speaker: Shivang Trivedi&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62886</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62886"/>
				<updated>2009-05-28T07:45:33Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Logistics Preparation for hands on===&lt;br /&gt;
&lt;br /&gt;
Classroom size estimate for the hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we will have to enforce a hard limit of 40 attendees. &lt;br /&gt;
&lt;br /&gt;
====Registration==== &lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, but we will give preference to people who attend regularly and sign up for multiple sessions. Students will also fill out a short questionnaire before the session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH preinstalled. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
====Student’s prerequisites====&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
====Tool license and Vendor IP==== &lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of each tool in a remote VM image. This is NOT a competition, and the purpose is NOT to compare tools: different source code will be picked for each vendor, which will discourage side-by-side comparison of tools and findings. Vendors are not allowed to interfere with other vendors’ sessions or demos (note: we have only two vendors, Fortify and Ounce Labs). Questions related to tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
====Session 1: Intro To Static Analysis, May 7th 2009====&lt;br /&gt;
*Speaker: Eric Dalci&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
This presentation will give a taste of what Static Analysis is.&lt;br /&gt;
&lt;br /&gt;
====Session 2: Tool Assisted Code Reviews, July 30th 2009====&lt;br /&gt;
*Speakers: Eric Dalci, Bruce Mayhew (Ounce Labs), and Fortify (?)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
&lt;br /&gt;
This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
* Fortify will demo its tool and scan Webgoat.&lt;br /&gt;
* Ounce Labs will demo its tool, but may scan a different project, such as HacmeBank, to avoid tool comparison.&lt;br /&gt;
* We will have an open discussion after each demo for students to ask the vendors questions.&lt;br /&gt;
* Vendors should not interfere with each other’s sessions. Questions related to tool comparison will not be answered, since that is not the goal of this session.&lt;br /&gt;
&lt;br /&gt;
====Session 3: Customization Lab (Fortify), August 13th 2009====&lt;br /&gt;
*Speaker: Mike Ware&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (2 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
*	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
*	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
*	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
*	Configuration [properties: data.encryption = off]&lt;br /&gt;
*	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Priority filters (e.g., P1, P2, etc.) &lt;br /&gt;
**	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
*	Authentication&lt;br /&gt;
*	Authorization&lt;br /&gt;
*	Data validation&lt;br /&gt;
*	Session management&lt;br /&gt;
*	Etc. &lt;br /&gt;
&lt;br /&gt;
====Session 4: Customization Lab (Ounce Lab), August 27th 2009====&lt;br /&gt;
*Speaker: Nabil Hannan&lt;br /&gt;
*Time: 3 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
*Prerequisite: Attended session 2&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
*	Custom rules (1.5 hours)&lt;br /&gt;
**	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
*	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
*	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
*	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
*	Filtering (.5 hours)&lt;br /&gt;
**	Prioritizing remediation efforts&lt;br /&gt;
*	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
*	Modifying finding severity/category &lt;br /&gt;
**	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
*	Input Validation&lt;br /&gt;
*	SQL Injection&lt;br /&gt;
*	Cross-Site Scripting&lt;br /&gt;
*	Etc.&lt;br /&gt;
*	Reporting (.5 hours)&lt;br /&gt;
**	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
*	OWASP Top 10&lt;br /&gt;
*	PCI &lt;br /&gt;
&lt;br /&gt;
====Session 5: Tool Adoption and Deployment, September 17th 2009====&lt;br /&gt;
*Speaker: Shivang Trivedi&lt;br /&gt;
*Time: 2 hours&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
*Classroom size: Open&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
*	Tool Selection&lt;br /&gt;
**	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
**	Coverage&lt;br /&gt;
**	Enterprise Support&lt;br /&gt;
**	Quality of Security Findings &lt;br /&gt;
*	Phases of Integration&lt;br /&gt;
**	Pre-requisites&lt;br /&gt;
**	Goals and Challenges&lt;br /&gt;
**	Distribution of Roles and Responsibilities&lt;br /&gt;
**	Considering LOE &lt;br /&gt;
*	Model Per Activity&lt;br /&gt;
**	Activity Flow&lt;br /&gt;
**	Phase Transition &lt;br /&gt;
*	Deployment Model&lt;br /&gt;
**	Advantages&lt;br /&gt;
**	Disadvantages &lt;br /&gt;
*	Free and Handy Tools to&lt;br /&gt;
**	Continuously Integrate&lt;br /&gt;
**	Join activity flow &lt;br /&gt;
*	Improvements and Lessons Learned&lt;br /&gt;
**	Effective use of tool’s capabilities&lt;br /&gt;
**	Expanding Coverage&lt;br /&gt;
**	Analysis Techniques&lt;br /&gt;
**	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, threat facing application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for organizations. The following presentation discusses a comprehensive set of steps organizations can undertake to successfully adopt such tools: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools), the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomical cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=File:OWASP_roadmap.png&amp;diff=62885</id>
		<title>File:OWASP roadmap.png</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=File:OWASP_roadmap.png&amp;diff=62885"/>
				<updated>2009-05-28T07:38:31Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: uploaded a new version of &amp;quot;Image:OWASP roadmap.png&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62884</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62884"/>
				<updated>2009-05-28T07:38:08Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Static Analysis Curriculum */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap for the Northern Virginia Chapter.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Logistics Preparation for hands on===&lt;br /&gt;
&lt;br /&gt;
Classroom size estimate for the hands-on sessions: 30 stations max. The number of students can be larger, as people may want to pair up, but we will have to enforce a hard limit of 40 attendees. &lt;br /&gt;
&lt;br /&gt;
====Registration==== &lt;br /&gt;
Registration for sessions will be on a first-come, first-served basis, but we will give preference to people who attend regularly and sign up for multiple sessions. Students will also fill out a short questionnaire before the session so the instructors can gauge their skill level and motivation. Students are required to meet the prerequisites for the sessions they sign up for. We ask students to bring their laptops to the hands-on sessions and to have software such as SSH preinstalled. Basic knowledge of code is also required for all sessions except the last one. We will start registration by email in mid-June or earlier.&lt;br /&gt;
&lt;br /&gt;
====Student’s prerequisites====&lt;br /&gt;
All students will need to bring their own laptops and use them as clients to connect to the host machines; we will support Windows, Mac OS, and Unix users. Laptops should have at least 2 GB of RAM and a version of SSH installed. &lt;br /&gt;
&lt;br /&gt;
====Tool license and Vendor IP==== &lt;br /&gt;
Vendors will need to provide tool licenses for the hands-on sessions. Students will not install the tools on their own machines; they will pilot copies of each tool in a remote VM image. This is NOT a competition, and the purpose is NOT to compare tools: different source code will be picked for each vendor, which will discourage side-by-side comparison of tools and findings. Vendors are not allowed to interfere with other vendors’ sessions or demos (note: we have only two vendors, Fortify and Ounce Labs). Questions related to tool comparison between the present vendors are out of scope. Vendors are free to present features and particularities exclusive to their tools.&lt;br /&gt;
&lt;br /&gt;
====Session 1: Intro To Static Analysis, July 16th 2009====&lt;br /&gt;
*Speaker: Eric Dalci&lt;br /&gt;
*Time: 2 hours + open discussion&lt;br /&gt;
*Logistics: Slideware and projector&lt;br /&gt;
*Location: &amp;lt;TBD&amp;gt;: AOL, Booz ?&lt;br /&gt;
*Classroom size: This is open. &lt;br /&gt;
The last couple of meetings have averaged about 40 attendees. This presentation will give a taste of what Static Analysis is. We will also try to raise interest in signing up for the subsequent sessions. People may want to sign up early for sessions 2 and 3, since those will have limited seats. &lt;br /&gt;
Outcome: Sign-up sheet with preregistered students for other sessions &lt;br /&gt;
To-dos: &lt;br /&gt;
*	Keep students engaged and mail them the prerequisites for the following sessions.&lt;br /&gt;
*	Continue to prepare logistics for the other sessions&lt;br /&gt;
====Session 2: Tool Assisted Code Reviews, July 30th 2009====&lt;br /&gt;
*Speakers: Eric Dalci, Bruce Mayhew (Ounce Labs), and Fortify (?)&lt;br /&gt;
*Time: 2.5 hours&lt;br /&gt;
*Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
*Location: TBD&lt;br /&gt;
*Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
Outcome: Sign-up sheet with preregistered students for other sessions &lt;br /&gt;
To-dos: &lt;br /&gt;
*	Keep students engaged and mail them the prerequisites for the following sessions.&lt;br /&gt;
*	Continue to prepare logistics for the other sessions&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Eric will introduce the session and the vendors. This is an introductory course on two Static Analysis tools (Fortify SCA and Ounce Labs 6):&lt;br /&gt;
•	Fortify will demo its tool and scan WebGoat.&lt;br /&gt;
•	Ounce Labs will demo its tool, but we may scan a different project such as HacmeBank to avoid tool comparison.&lt;br /&gt;
•	We will hold an open discussion after each demo so that students can ask the vendors questions.&lt;br /&gt;
•	Vendors should not interfere with each other’s sessions. Questions about tool comparison will not be answered, since that is not the goal of this session.&lt;br /&gt;
====Session 3: Customization Lab (Fortify), August 13th 2009====&lt;br /&gt;
Speaker: Mike Ware&lt;br /&gt;
Time: 3 hours&lt;br /&gt;
Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
Location: TBD&lt;br /&gt;
Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
Prerequisite: Attended session 2&lt;br /&gt;
Outcome: Sign up sheet with preregistered students for other sessions &lt;br /&gt;
To-dos: &lt;br /&gt;
•	Keep students engaged and mail them the prerequisites for the following sessions.&lt;br /&gt;
•	Continue to prepare logistics for other sessions&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Mike will train the students on how to customize the Fortify Source Code Analyzer (SCA).&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
•	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
•	Custom rules (2 hours)&lt;br /&gt;
o	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
•	Data flow sources and sinks [private data sent to custom logger, turn web service entry points into data flow rules?]&lt;br /&gt;
•	Data flow cleanse and pass through [cleanse: HTML escaping, pass through: third party library]&lt;br /&gt;
•	Semantic [use of a sensitive API, e.g. getEmployeeSSN()]&lt;br /&gt;
•	Structural [all Struts ActionForms must extend custom base ActionForm]&lt;br /&gt;
•	Configuration [properties: data.encryption = off]&lt;br /&gt;
•	Control flow [always call securityCheck() before downloadFile()] &lt;br /&gt;
•	Filtering (.5 hours)&lt;br /&gt;
o	Prioritizing remediation efforts&lt;br /&gt;
•	Priority filters (e.g., P1, P2, etc) &lt;br /&gt;
o	Isolating findings (&amp;quot;security controls&amp;quot; example)&lt;br /&gt;
•	Authentication&lt;br /&gt;
•	Authorization&lt;br /&gt;
•	Data validation&lt;br /&gt;
•	Session management&lt;br /&gt;
•	Etc. &lt;br /&gt;
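The rule types in the agenda above can be made concrete with a sketch. This is a toy illustration of the "semantic" rule idea only (flag every use of a sensitive API, here the hypothetical getEmployeeSSN() named above); real engines such as Fortify SCA match against a parsed model of the code, not raw text as done here:

```java
import java.util.ArrayList;
import java.util.List;

// Toy illustration of a "semantic" rule: report every call to a
// sensitive API. getEmployeeSSN() is the hypothetical accessor named
// in the agenda; the plain textual match is a deliberate simplification.
public class SemanticRuleSketch {
    static final String SENSITIVE_CALL = "getEmployeeSSN(";

    // Return the 1-based line numbers on which the sensitive API appears.
    static List<Integer> findSensitiveCalls(String source) {
        List<Integer> hits = new ArrayList<>();
        String[] lines = source.split("\n", -1);
        for (int i = 0; i < lines.length; i++) {
            if (lines[i].contains(SENSITIVE_CALL)) {
                hits.add(i + 1);
            }
        }
        return hits;
    }

    public static void main(String[] args) {
        String code = String.join("\n",
                "String name = emp.getName();",
                "String ssn = emp.getEmployeeSSN();",
                "log.info(name);");
        System.out.println(findSensitiveCalls(code)); // [2]
    }
}
```

A real custom rule would also carry metadata (severity, category) so that findings land in the right buckets during the filtering and auditing steps described above.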
&lt;br /&gt;
====Session 4: Customization Lab (Ounce Labs), August 27th 2009====&lt;br /&gt;
Speaker: Nabil Hannan&lt;br /&gt;
Time: 3 hours&lt;br /&gt;
Logistics: Hands-on setup as described in the logistics section.&lt;br /&gt;
Location: TBD&lt;br /&gt;
Classroom size: 30 stations, 40 attendees max&lt;br /&gt;
Prerequisite: Attended session 2&lt;br /&gt;
Outcome: Sign up sheet with preregistered students for other sessions &lt;br /&gt;
To-dos: &lt;br /&gt;
•	Keep students engaged and mail them the prerequisites for the following sessions.&lt;br /&gt;
•	Continue to prepare logistics for other sessions&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Nabil will train the students on how to customize the Ounce Labs 6 tool.&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
•	Approach to auditing scan results to determine true positives and false positives (.5 hours)&lt;br /&gt;
•	Custom rules (1.5 hours)&lt;br /&gt;
o	Hands on examples of different rule types applied to code that resembles real business logic&lt;br /&gt;
•	Data flow sources and sinks [private data sent to custom logger]&lt;br /&gt;
•	Data flow cleanse [cleanse: HTML encoding]&lt;br /&gt;
•	Semantic [use of a sensitive API e.g. getEmployeeSSN()] &lt;br /&gt;
•	Filtering (.5 hours)&lt;br /&gt;
o	Prioritizing remediation efforts&lt;br /&gt;
•	Understanding the Ounce Vulnerability Matrix&lt;br /&gt;
•	Modifying finding severity/category &lt;br /&gt;
o	Isolating findings (using &amp;quot;bundles&amp;quot;)&lt;br /&gt;
•	Input Validation&lt;br /&gt;
•	SQL Injection&lt;br /&gt;
•	Cross-Site Scripting&lt;br /&gt;
•	Etc.&lt;br /&gt;
•	Reporting (.5 hours)&lt;br /&gt;
o	Demonstrate compliance with industry regulations and best practices&lt;br /&gt;
•	OWASP Top 10&lt;br /&gt;
•	PCI &lt;br /&gt;
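The data-flow "cleanse" item in the agenda above refers to a sanitizer the tool is taught to trust: data that passes through it is no longer considered tainted. A minimal sketch of such an HTML-encoding sanitizer follows; the method name htmlEncode is illustrative, and how Ounce Labs actually registers cleanse functions is not shown here:

```java
// Toy illustration of a sanitizer a custom "cleanse" rule would be
// taught to recognize. It encodes the characters that enable HTML
// markup injection; the name htmlEncode is a hypothetical example.
public class CleanseRuleSketch {
    static String htmlEncode(String input) {
        StringBuilder out = new StringBuilder(input.length());
        for (char c : input.toCharArray()) {
            switch (c) {
                case '&':  out.append("&amp;");  break;
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#39;");  break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // prints &lt;script&gt;alert(&#39;x&#39;)&lt;/script&gt;
        System.out.println(htmlEncode("<script>alert('x')</script>"));
    }
}
```

Once registered as a cleanse, paths from a tainted source through htmlEncode() to an HTML output sink would stop appearing as Cross-Site Scripting findings.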
&lt;br /&gt;
====Session 5: Tool Adoption and Deployment, September 10th 2009====&lt;br /&gt;
Speaker: Shivang Trivedi&lt;br /&gt;
Time: 2 hours&lt;br /&gt;
Logistics: Slideware and projector&lt;br /&gt;
Location: TBD&lt;br /&gt;
Prerequisite: Preferably attended session 2, but not mandatory &lt;br /&gt;
Classroom size: Open&lt;br /&gt;
Outcome: Sign up sheet with preregistered students for other sessions &lt;br /&gt;
To-dos: &lt;br /&gt;
•	Keep students engaged and mail them the prerequisites for the following sessions.&lt;br /&gt;
•	Continue to prepare logistics for other sessions&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Shivang will talk about integration of a Static Analysis tool into the SDLC.&lt;br /&gt;
&lt;br /&gt;
Agenda (draft):&lt;br /&gt;
&lt;br /&gt;
•	Tool Selection&lt;br /&gt;
o	Flexible with Static Analysis and/or Penetration Testing&lt;br /&gt;
o	Coverage&lt;br /&gt;
o	Enterprise Support&lt;br /&gt;
o	Quality of Security Findings &lt;br /&gt;
•	Phases of Integration&lt;br /&gt;
o	Pre-requisites&lt;br /&gt;
o	Goals and Challenges&lt;br /&gt;
o	Distribution of Roles and Responsibilities&lt;br /&gt;
o	Considering LOE (level of effort) &lt;br /&gt;
•	Model Per Activity&lt;br /&gt;
o	Activity Flow&lt;br /&gt;
o	Phase Transition &lt;br /&gt;
•	Deployment Model&lt;br /&gt;
o	Advantages&lt;br /&gt;
o	Disadvantages &lt;br /&gt;
•	Free and Handy Tools to&lt;br /&gt;
o	Continuously Integrate&lt;br /&gt;
o	Join activity flow &lt;br /&gt;
•	Improvements and Lessons Learned&lt;br /&gt;
o	Effective use of tool’s capabilities&lt;br /&gt;
o	Expanding Coverage&lt;br /&gt;
o	Analysis Techniques&lt;br /&gt;
o	Improving Results Accuracy&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, the threat facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for an organization. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62882</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62882"/>
				<updated>2009-05-28T07:27:18Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Static Analysis Curriculum */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, the threat facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for an organization. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=File:OWASP_roadmap.png&amp;diff=62881</id>
		<title>File:OWASP roadmap.png</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=File:OWASP_roadmap.png&amp;diff=62881"/>
				<updated>2009-05-28T07:26:18Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: uploaded a new version of &amp;quot;Image:OWASP roadmap.png&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62880</id>
		<title>Code Review and Static Analysis with tools</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=Code_Review_and_Static_Analysis_with_tools&amp;diff=62880"/>
				<updated>2009-05-28T07:25:37Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: /* Static Analysis Curriculum */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Chapter: [[Virginia_(Northern_Virginia)#OWASP_Washington_VA_Local_Chapter | OWASP NoVA]] &amp;gt;&amp;gt; [[Virginia_(Northern_Virginia)#Knowledge | Knowledge]]&lt;br /&gt;
&lt;br /&gt;
== Static Analysis Curriculum ==&lt;br /&gt;
&lt;br /&gt;
* For an introduction to the OWASP Static Analysis (SA) Track goals, objectives, and session roadmap, please see [http://www.owasp.org/index.php/Image:OWASP_NoVA_SA_Track_Final_20090408.ppt this presentation].&lt;br /&gt;
&lt;br /&gt;
The following is the agenda of the OWASP Static Analysis track roadmap.&lt;br /&gt;
&lt;br /&gt;
[[Image:OWASP roadmap.png|800px|OWASP Static Analysis Roadmap - Northern Virginia Chapter 2009]]&lt;br /&gt;
&lt;br /&gt;
== Code Review and Static Analysis with tools ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What: [[Security_Code_Review_in_the_SDLC | Secure Code Review]]&lt;br /&gt;
&amp;lt;LI&amp;gt;Who: Performed by Security Analysts&lt;br /&gt;
&amp;lt;LI&amp;gt;Where it fits: [http://bsi-mm.com/ssf/ssdl/cr/ BSIMM Secure Code Review]&lt;br /&gt;
&amp;lt;LI&amp;gt;Cost: Scales with depth, the threat facing the application, and application size/complexity&lt;br /&gt;
&amp;lt;/UL&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This article will answer the following questions about secure code review and use of static analysis tools:&lt;br /&gt;
&amp;lt;OL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;What are static analysis tools and how do I use them?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I select a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I customize a static analysis tool?&lt;br /&gt;
&amp;lt;LI&amp;gt;How do I scale my assessment practices with secure code review?&lt;br /&gt;
&amp;lt;/OL&amp;gt; &lt;br /&gt;
&lt;br /&gt;
== Organizational == &lt;br /&gt;
How do I scale my assessment practices with secure code review?&lt;br /&gt;
&lt;br /&gt;
Implementing a static analysis tool goes a long way toward providing a force multiplier for an organization. The following presentation discusses a comprehensive set of steps organizations can take to adopt such tools successfully: who should adopt the tool, what steps they should take, whom they should involve, and how long it will take and how much it will cost.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:Cigital_-_Fortify_Implementation_Preso.ppt|Implementing a Static Analysis Tool.ppt]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For those with existing assessment practices involving secure code review (whether or not those practices leverage tools) the question often becomes, &amp;quot;I can review an application, but how do I scale the practice to my entire organization without astronomic cost?&amp;quot; The following presentation addresses this question:&lt;br /&gt;
&lt;br /&gt;
[[Maturing_Software_Assessment_Through_Static_Analysis | Maturing Assessment Through Static Analysis]]&lt;br /&gt;
&lt;br /&gt;
== Customization ==&lt;br /&gt;
People who believe that the value of static analysis tools lies predominantly in their &amp;quot;out of the box&amp;quot; capabilities sell those tools incredibly short. By customizing your chosen tool you can expect:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;UL&amp;gt;&lt;br /&gt;
&amp;lt;LI&amp;gt;Dramatically better accuracy (increased true positives, decreased false positives, and decreased false negatives)&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for corporate security standards&lt;br /&gt;
&amp;lt;LI&amp;gt;Automated scanning for an organization's top problems&lt;br /&gt;
&amp;lt;LI&amp;gt;Visibility into adherence to (or inclusion of) sanctioned toolkits&lt;br /&gt;
&amp;lt;/UL&amp;gt;   &lt;br /&gt;
&lt;br /&gt;
The following presentation was given at the NoVA chapter in '06 and discusses deployment and customization:&lt;br /&gt;
&lt;br /&gt;
[[Media:OWASP_Adopting_a_Static_Analysis_Tool.ppt|Adopting a Static Analysis Tool]]&lt;br /&gt;
&lt;br /&gt;
Warning: this presentation is old and gives examples using the now defunct &amp;quot;CodeAssure&amp;quot; from what was then SecureSoftware.&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	<entry>
		<id>https://wiki.owasp.org/index.php?title=File:OWASP_roadmap.png&amp;diff=62879</id>
		<title>File:OWASP roadmap.png</title>
		<link rel="alternate" type="text/html" href="https://wiki.owasp.org/index.php?title=File:OWASP_roadmap.png&amp;diff=62879"/>
				<updated>2009-05-28T07:19:07Z</updated>
		
		<summary type="html">&lt;p&gt;Edalci: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Edalci</name></author>	</entry>

	</feed>