Static Analysis Tool Evaluation Criteria Working


Note: This is a working document, do not link to or reference content within this URL.

 

Criteria Second Draft

 

1. Platform support

 

1.1. Tool installation and support: Installation documentation availability and accuracy, support offered for installation and setup processes

1.2. Platform support: Operating systems supported 

1.3. Scalability support: Features offered for scalability, such as the ability to chain machines, support for multi-core machines, etc.

 

 

 

2. Technology Support

 

2.1 Programming Languages Supported: What languages does the tool support (Java, C++, Ruby, etc.)?

2.2 Cross-language Analysis Support: Ability of the tool to carry its analysis across languages, for example from Java into SQL and back to Java

2.3 Mobile Technology Support: Whether the tool supports specific mobile technologies such as Objective-C, J2ME, etc.

2.4 Configuration Aided Analysis: The ability of the tool to scan configuration files, and to be configured to recognize non-standard configuration files (.properties, .ini, etc.)

2.5 Frameworks Supported: The ability of the tool to understand frameworks (Struts, Spring, jQuery, etc.), including custom inputs, data flow, and framework-specific vulnerabilities

2.6 Extension re-Configuration: The ability to map additional file extensions to a built-in file type.

2.7 Industry Standards Aided Analysis: The ability of the tool to restrict or prioritize results based on the WASC TC, OWASP Top 10, or SANS Top 25

 

 

3. Scan, Command and Control Support

 

3.1 Command line support: Whether the tool offers a command-line interface and what options are available (see the sketch after this list)

3.2 IDE integration support: Which IDEs the tool integrates with, and how

3.3 Build system integration support: Whether the tool supports integration into build systems

3.4 Support for custom rules: Whether the tool supports adding custom rules

3.5 Support for editing core rules: Whether the tool supports editing core rules

3.6 Scan Configuration Capabilities: The ability to configure the scan properties: output file name, memory used, rules used, etc.

3.7 Findings Prioritization Configuration: The ability to configure the findings prioritization scheme, for example the number of buckets and which finding categories go into which bucket.
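
As an illustration of 3.1, 3.3, 3.4, and 3.6, the sketch below drives a hypothetical scanner ("sastool") from a build step via its command line. The command name and every flag shown are assumptions for illustration only; a real integration would use the vendor's documented CLI options.

    # Minimal sketch: invoke a hypothetical "sastool" CLI from a build step.
    # The executable name and flags are illustrative assumptions, not a real product's options.
    import subprocess
    import sys

    def run_scan(project_dir, report_path):
        cmd = [
            "sastool", "scan", project_dir,      # hypothetical CLI entry point (3.1)
            "--rules", "custom-rules.xml",       # 3.4: include a custom rule pack
            "--max-memory", "4096m",             # 3.6: memory allotted to the scan
            "--output", report_path,             # 3.6: output file name
            "--format", "xml",
        ]
        result = subprocess.run(cmd)
        return result.returncode                 # a non-zero exit code can fail the build (3.3)

    if __name__ == "__main__":
        sys.exit(run_scan("./src", "scan-report.xml"))

A build system (Ant, Maven, Jenkins, etc.) could call a script like this as a post-compile step and fail the build when the scan reports new findings.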

 

 

4. Product Signatures Update

 

4.1 Frequency of signature update: Whether the vendor has a schedule for signature updates

4.2 Relevance of signatures to evolving threats: Whether signature updates are based on zero-day vulnerabilities in underlying frameworks and on research done by the vendor, and how that research is documented

 

 

5. Reporting Capabilities

 

5.1 Support for Role-based Reports: Whether the tool supports reports customized to different roles: developer, line of business, C-level, etc. (horizontal granularity)

5.2 Finding-level reporting information: The granularity by which the report is generated (vertical granularity). 

5.3 Support for different report formats: What formats does the tool support: PDF, HTML, XLS, etc.

5.4 Support for template based reports: Whether the tool supports template creation (users create templates where they can customize look and feel, layout, content, etc.)

 

 

6. Triage and Remediation Support

 

6.1 Findings Data: The information provided around a finding (explanation of the vulnerability, recommendations, accuracy level) and the relevance to the actual finding

6.2 Ability to Merge Assessments: The ability of the tool to merge two assessments, for example an assessment done on the application last year versus one done today

6.3 Ability to Diff Assessments: The ability to compare two assessments done on the same application and find the differences (a minimal sketch follows this list)

6.4 Remediation Advice Customization: Whether the tool supports customizing the remediation advice presented for each vulnerability.
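
To make 6.2 and 6.3 concrete, here is a minimal sketch of diffing two assessment exports, assuming the tool can export each assessment as a JSON list of findings and that every finding carries a stable fingerprint; the file layout and field names are assumptions, not any vendor's actual format.

    # Minimal sketch of diffing two assessments (6.2/6.3), assuming each assessment
    # exports as a JSON list of findings with a stable "fingerprint" per finding.
    import json

    def load_findings(path):
        with open(path) as f:
            return {item["fingerprint"]: item for item in json.load(f)}

    def diff_assessments(old_path, new_path):
        old, new = load_findings(old_path), load_findings(new_path)
        introduced = [new[k] for k in new.keys() - old.keys()]   # findings new in this assessment
        fixed      = [old[k] for k in old.keys() - new.keys()]   # findings no longer reported
        carried    = [new[k] for k in new.keys() & old.keys()]   # findings present in both
        return introduced, fixed, carried

    if __name__ == "__main__":
        introduced, fixed, carried = diff_assessments("assessment_last_year.json", "assessment_today.json")
        print(len(introduced), "introduced,", len(fixed), "fixed,", len(carried), "carried over")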

 

 

 

7. Enterprise Level Support

 

When choosing a static analysis tool in the enterprise, an important consideration is support for integration into various systems at the enterprise level. These systems include bug tracking systems, systems for reporting on the risk posture of various applications, and systems that mine the data for evaluating trending patterns.

 

7.1   Support for Integration into Bug Trackers: Whether there is support to integrate with the internal bug tracking system so that findings can be converted to bugs automatically or with minimal effort.

 

Static analysis tools are used to scan code over and over again as existing applications are modified or new applications are implemented. These tools should be capable of recording the findings and surfacing them for consumption by the organization in a meaningful way. Typically organizations will resort to tracking bugs in a bug tracking system of their choice, either commercial or open source. These systems provide important metrics on the health of the application from a quality perspective. Therefore, it is important to evaluate the static analysis tool for its integration capabilities into the organization's defect tracker.

 

Static analysis tools should be able to support:

 

7.1.1   Automatic integration into the organization's bug tracking system:

            This allows security vulnerabilities found by static analysis scans to be filed into the bug tracking system automatically.

 

7.1.2   Manual filing of security vulnerabilities found by the static analysis tool into a bug tracking system by the code reviewer.

            Manual filing will allow the person filing the bug to customize the bug as necessary.

 

7.1.3   Ability to capture the details of security vulnerabilities found by the static analysis tool into the organization's bug tracking system. This should allow for capturing basic defect tracking information such as:

 

             7.1.3.1   The security vulnerability title and description along with attack vectors

 

             7.1.3.2   Defect classification: defect category (OWASP Top 10, CVE or CWE ID, etc.)

 

             7.1.3.3   Defect priority / severity: Defect rating in terms of ease of exploitation, importance of fixing.

 

             7.1.3.4   Recommendations to fix: Default recommendations by the static analysis tool as well as the ability to add customized recommendations based on in-house organization capabilities

 

7.1.4   Ability to map existing fields in the static analysis tool to that of the defect tracker.

 

7.1.5   Addition of custom fields in the static analysis tool for mapping to existing or custom fields in a bug tracking system. Examples of custom fields in an organization's bug tracker could include: bug SLA, bug domain, bug assignee.

 

7.1.6   Ability to track defect states (Open, In progress, Closed) across the two systems and in the case of automated synchronization, the ability to synchronize states.

  

7.1.7   Ability to baseline code so that duplicate vulnerabilities are not reported into the defect tracking system. This is important for tracking the key metric of how many security vulnerabilities are being introduced by a new feature, new code, or new version.

 

7.1.8   Ability to exclude certain findings from being reported into the bug tracker. This would include findings that are marked as false positives or that meet certain organizational criteria that deem them accepted risks.

 

7.1.9   APIs: The static analysis tool should provide APIs that allow vulnerabilities to be synchronized programmatically between the two systems. The APIs should allow for easy integration with the organization's bug tracking system (see the sketch after this list).

7.1.10 Actionable information should be available to developers, including the code changes needed to fix the security issues. This should remain available later to be referred to by other project teams as an example.

 

7.1.11 Bugs should also be tracked from regular build data. Findings from the automated scan run during the build should be added to the bug tracker, and automatic emails should be triggered so that the appropriate people are notified about bugs found during the automated build.
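
A minimal sketch of 7.1.1, 7.1.3, 7.1.4, 7.1.8, and 7.1.9 follows, pushing findings from a scanner's API into a bug tracker's API over HTTP. Both endpoints, the field names, and the status values are hypothetical; a real integration would use the vendor's and the tracker's documented APIs and authentication.

    # Minimal sketch: synchronize scanner findings into a bug tracker (7.1.1, 7.1.3, 7.1.4, 7.1.8, 7.1.9).
    # The URLs, field names, and status values are illustrative assumptions.
    import json
    import urllib.request

    SCANNER_API = "https://sastool.example.com/api/findings?scan=latest"   # hypothetical endpoint
    TRACKER_API = "https://bugs.example.com/api/issues"                    # hypothetical endpoint

    def to_issue(finding):
        # 7.1.4: map scanner fields onto the defect tracker's fields
        return {
            "title":       finding["title"],                   # 7.1.3.1
            "description": finding["description"],
            "labels":      [finding["category"]],              # 7.1.3.2, e.g. OWASP Top 10 / CWE ID
            "priority":    finding["severity"],                # 7.1.3.3
            "remediation": finding.get("recommendation", ""),  # 7.1.3.4
        }

    def sync():
        with urllib.request.urlopen(SCANNER_API) as resp:
            findings = json.load(resp)
        for finding in findings:
            if finding.get("status") in ("false_positive", "accepted_risk"):
                continue                                       # 7.1.8: do not file these
            body = json.dumps(to_issue(finding)).encode()
            req = urllib.request.Request(TRACKER_API, data=body,
                                         headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)                        # 7.1.1: file the bug automatically

    if __name__ == "__main__":
        sync()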

 

 

7.2   Data Mining Capabilities Reports: Whether there are capabilities to present trends in vulnerabilities found and issues fixed.

 

It is an important goal of any security team to be able to understand the security trends of an organization’s applications. To meet this goal, static analysis tools should provide the user with the ability to mine the vulnerability data, present trends and build intelligence from it.

 

7.2.1   Trend information can include:

 

7.2.1.1   Summary of vulnerabilities found versus fixed over time

 

7.2.1.2   Summary of vulnerabilities in different states (Open, In Progress, Closed)

 

7.2.1.3   Classification of vulnerabilities based on category type (OWASP Top 10, CVE or CWE ID, etc.)

 

7.2.1.4   Classification of vulnerabilities based on severity / priority / threat levels.

 

7.2.1.5   Summary of trend information based on an application / feature / domain that characterizes the security posture of one against the others.

 

7.2.1.6   Summary of assignees to different vulnerabilities. 

  

7.2.1.7   Historical trends: Historical information that compares how well an application is doing over time

 

7.2.1.8   Average time to fix a vulnerability plus other statistical trends.

 

 7.2.2  This trend information can be extracted and / or presented in a number of ways.

 

7.2.2.1   Via a built-in dashboard within the static analysis tool. The static analysis tool could present different dashboards: a default dashboard for all users, a dashboard that can be customized by individuals, or a dashboard shared between groups.

 

7.2.2.2   Drag and drop widgets or gadgets to quickly put together visuals on trends.

 

7.2.2.3   Provision by the tool to present the information using different charting features or formats. Examples include line graphs, scatter plots, pie charts, and the ability to export data to other tools such as Excel.

  

7.2.2.4   Via a web interface that allows ad hoc queries: the ability to query the vulnerability data using a simple query syntax and export the results to different formats (CSV, XML, etc.).

 

7.2.2.5   Provide users the ability to define custom templates for depicting trend data. 

7.2.2.6   Provide users the ability to use APIs to query data and extract trend information.

 

7.2.2.7   By running batch jobs or individual tasks from the command line that query for data.

 

7.2.2.8   By directly accessing the vulnerability database and using standard data mining techniques.
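
As one way to realize 7.2.2.7 and 7.2.2.8, the sketch below mines a findings export for a few of the trends listed in 7.2.1, assuming the tool can export findings as CSV with columns for state, severity, and open/close dates; those column names are assumptions rather than any vendor's schema.

    # Minimal sketch of mining an exported findings CSV for trend data (7.2.1.1, 7.2.1.4, 7.2.1.8).
    # The column names ("state", "severity", "opened", "closed") are assumed, not a vendor format.
    import csv
    from collections import Counter
    from datetime import date

    def load(path):
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def trends(rows):
        by_state    = Counter(r["state"] for r in rows)        # Open / In Progress / Closed
        by_severity = Counter(r["severity"] for r in rows)     # severity / priority breakdown
        fix_days = [
            (date.fromisoformat(r["closed"]) - date.fromisoformat(r["opened"])).days
            for r in rows if r["closed"]
        ]
        avg_fix = sum(fix_days) / len(fix_days) if fix_days else None   # average time to fix
        return by_state, by_severity, avg_fix

    if __name__ == "__main__":
        by_state, by_severity, avg_fix = trends(load("findings_export.csv"))
        print(by_state, by_severity, "average days to fix:", avg_fix)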

 

7.3   Integration into Enterprise Level Risk Management System: Whether the tool supports integration into an enterprise-level risk management system.

 

Information security teams and organizations need to present an accurate view of the risk posture of their applications and systems at all times. Factors that should be considered:

 

7.3.1   What are the different risk management systems in my organization and can the static analysis tool be integrated easily with these systems?

 

7.3.2   Does each of the systems have APIs that can be leveraged to programmatically export vulnerability data from one and represent it in the other? How easily can customized mappings of data between the static analysis tool and the risk management systems be created in order to accurately capture and represent the risk?

 

7.3.3   Can data from the static analysis tool be easily exported into a format that can be easily imported into the organization's risk management system? (A minimal sketch follows this list.)

 

7.3.4   What would it take to keep the systems in sync so that there is an accurate, real-time representation of the static analysis system's findings in the risk management system?
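
To illustrate 7.3.3, here is a minimal sketch that flattens a findings export into a CSV a risk management system could import; the field names and the severity-to-risk mapping are illustrative assumptions rather than any product's import format.

    # Minimal sketch: export findings as a flat risk-register CSV for import into a
    # risk management system (7.3.3). Field names and the risk mapping are assumptions.
    import csv
    import json

    SEVERITY_TO_RISK = {"critical": "High", "high": "High", "medium": "Medium", "low": "Low"}

    def export_risk_register(findings_path, csv_path):
        with open(findings_path) as f:
            findings = json.load(f)
        with open(csv_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["Application", "Finding", "Category", "Risk", "Status"])
            for finding in findings:
                writer.writerow([
                    finding["application"],
                    finding["title"],
                    finding["category"],
                    SEVERITY_TO_RISK.get(finding["severity"], "Medium"),
                    finding["status"],
                ])

    if __name__ == "__main__":
        export_risk_register("findings_export.json", "risk_register.csv")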

 

7.4   Ability to Aggregate Projects: Whether the tool provides the ability to require metadata to be added with a new scan. This data could be used to aggregate and classify projects, which in turn could provide intelligence to management; for example, which programming languages seem to generate more vulnerabilities? (See the sketch after this list.)

 

Projects in organizations are built using a certain set of technologies and/or frameworks. These can be commercial, open source, or built in-house. Certain projects may tend to have more security flaws than others based on the technology or framework used, or based on certain coding styles. Static analysis tools could be used to configure similar projects with additional metadata to detect these patterns. This builds intelligence that helps identify which application components have more security vulnerabilities and why.

 

When configuring projects for static analysis, the following capabilities should be considered:

 

7.4.1   Ability to add metadata to a project to classify, tag or label it with each new scan.

 

7.4.2   Ability to allow the user to tag together projects that are based on similar technologies / frameworks / coding styles.

 

7.4.3   Ability to roll up similar projects under one group.

 

7.4.4   Ability to graph aggregated projects for trending and reporting purposes.
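
The sketch below illustrates 7.4.2 through 7.4.4 by aggregating finding counts across projects tagged with language and framework metadata; the scan records are invented examples, and in practice they would come from the tool's export or API.

    # Minimal sketch of aggregating projects by metadata tags (7.4.2-7.4.4).
    # The scan records below are invented; real data would come from the tool's API or export.
    from collections import defaultdict

    scans = [
        {"project": "billing",    "language": "Java", "framework": "Struts", "findings": 42},
        {"project": "storefront", "language": "Java", "framework": "Spring", "findings": 17},
        {"project": "reports",    "language": "Ruby", "framework": "Rails",  "findings": 8},
    ]

    def aggregate(records, key):
        totals = defaultdict(int)
        for record in records:
            totals[record[key]] += record["findings"]
        return dict(totals)

    if __name__ == "__main__":
        print("findings by language: ", aggregate(scans, "language"))    # which languages generate more findings
        print("findings by framework:", aggregate(scans, "framework"))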

 

7.5   Licensing Scheme: An important criterion to consider when selecting a static analysis tool, one which ties in directly with its cost, is its licensing model. When evaluating, take into account the following:

 

7.5.1   What are the different licensing schemes provided by the tool? 

 

7.5.2   What is the annual cost for ongoing support for the tool? What type of support is provided? (phone, chat, community support, web portal, etc.)

 

7.5.3   What is the cost for tool maintenance? How will updates be provided to the tool on an on-going basis? What is the frequency of updates?

 

7.5.4   As new technologies come in, how will the organization get added support for these?

 

7.5.5   What are the different user classifications available with the tool, and what level of permissions does each have? Does the tool differentiate between users who can configure scans, users who can run configured scans, users who can audit results, and users who can only view reports? How many users in each role can use the tool at a time?

 

7.5.6 Is the tool licensed by the number of applications covered?

 

7.5.7   How many parallel scans and / or assessments can be run at a time?

 

7.5.8   What are the hardware requirements for the tool? What operating systems and databases are supported?

 

7.5.9   Any other licensing restrictions?

 

7.5.10 Is user training and awareness part of the license agreement or the support agreement?

 

7.5.11 Is user training material provided, with a detailed phased approach for implementing the SCSA tool in the organization (first a pilot project, then multiple critical projects, and so on)?

 

7.5.12 How many support personnel are required to support the static code analyzer?

 

 

 

 

 

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Criteria First Draft

 

1. Tool Setup and Installation

1.1 Time required to perform initial installation

1.2 Skills required to perform initial installation

1.3 Privileges required to perform initial installation

1.4 Documentation setup accuracy

1.5 Platform Support

 

2. Performing a Scan

2.1 Time required to perform a scan

2.2 Number of steps required to perform a scan

2.3 Skills required to perform a scan

 

3. Tool Coverage:

 3.1 Languages supported by the tool

3.2 Support for Semantic Analysis

3.3 Support for Syntactic Analysis 

3.4 Ability of the tool to understand different components of a project (.sql, .xml, .xsd, .properties, etc.)

3.5 Coverage of Industry Standard Vulnerability Categories (OWASP Top 10, SANS Top 25, etc.)

 

4. Detection Accuracy

4.1 Number of false positives

4.2 Number of true negatives

4.3 Accuracy % 

 

5. Triage and Remediation Process

5.1 Average time to triage a finding

5.2 Quality of data surrounding a finding (explanation, tracing, trust level, etc.)

5.3 Ability to mark findings as false positive

5.4 Ability to “diff” assessments

5.5 Ability to merge assessments

5.6 Correctness of remediation advice

5.7 Completeness of remediation advice

5.8 Does the tool automatically prioritize defects

 

6. UI Simplicity and Intuitiveness

6.1 Quality of triage interface (need a way to measure this)

6.2 Quality of remediation interface (need a way to measure this)

6.3 Support for IDE plug-ins both out of the box and on-demand

6.4 Quality of the tool's out-of-the-box plugin UI

 

7. Product Update Process

7.1 Frequency of signature update

7.2 Relevance of signatures to evolving threats

7.3 Reactiveness to evolving threats

 

8. Product Maturity and Scalability

8.1 Peak memory usage

8.2 Number of scans done before a crash or serious degradation in performance

8.3 Maximum lines of code the tool can scan per project

8.4 What languages does the tool support?

 

9. Enterprise Offerings

9.1 Ability to integrate with major bug tracking systems

9.2 Ability to integrate with enterprise software configuration management

 

10. Reporting Capabilities

10.1 Quality of reports

10.2 Availability of role-based reports

10.3 Availability of report customization

 

11. Tool Customization and Automation

11.1 Can custom rules be added?

11.2 Do custom rules require learning a new language/script?

11.3 Can the tool be scripted? (e.g. integrated into ANT build script or other build script)

11.4 Can documentation be customized (installation instructions, remediation advice, finding explanations, etc.)?

11.5 Can the defect prioritization scheme be customized?

11.6 Can the tool be extended so that custom plugins could be developed for other IDEs?

          

         

Voting on the categories took place via the mailing list, and the results are below:

 

1. Tool Setup and Installation

KEEP


2. Configuration and Project Setup 

KEEP, although I think there is a good amount of confusion about what this category covers; we might change the name to better reflect its function.

 

To Do: Come up with a new name to better reflect the goal (Sherif)


3. Scan Coverage and Accuracy 

  • Suggestion (From GP): "and performance (time to feedback)"

           SK: This should actually be covered under #2, Configuration and Project Setup

  • Suggestion (From BG): "Should be splitted, coverage and accuracy are 2 main concerns, why joining them, this will be a huge section".

           SK: Agreed, I think there is a value in splitting them

  • Suggestion (From AS): "Vulnerability Coverage and Detection Accuracy."

SK: The goal for scan coverage is to identify how many languages the tool covers (would the tool look at SQL files, for example?). Now that we are splitting this category into two, it might be adequate to include the former in addition to vulnerability coverage, including industry standards. I am not sure, though, that "Vulnerability Coverage" would be the most suitable name, given that we are talking about horizontal coverage (i.e. languages, file types, etc.) and vertical coverage (industry standards and types of vulnerabilities covered). How about "Tool Coverage"? I like "Detection Accuracy" instead of "Scan Accuracy".

  • Suggestion (From SR): "this could have sub categories like languages supported, syntactic level, semantic level, library scanning, weakness captured etc."

SK: Yes, this is the intention of this category

  • Suggestion (From MA): "False positives and false negatives need to be detailed out, Further this should be scanning the security issues based on the OWASP top 10, SANS 20 and well known industrial standards, Where would PCI requirements fall ? This section is the crux i feel, having sub divisions would help"

SK: Absolutely. False positive and true negative ratios are important and should be covered in "Detection Accuracy", which is the new name for "Scan Accuracy". Industry standards are definitely important and should be covered under Scan Accuracy (or whatever new name we come up with).

 

Action Items:

  • Split this category into two categories


4. Triage and Remediation Process

  • Suggestion (From BG): "KEEP but I would merge it with #9 this is a very short section"

SK: I respectfully disagree. The triage process is where security consultants spend most of their time, and how flexible and sophisticated the tool is could save A LOT of time. Please have a look at http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working and let me know if you still think it is a small section.


5. UI Simplicity and Intuitiveness

  • Suggestion (From SR): "How do we measure UI Simplicity and Intuitiviness"

SK: Good point. Actually, if you have a look at http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working, you will see that I had the same concern. Any ideas, guys?

 

Action Item: How can we measure or assess UI simplicity and intuitiveness?


6. Product Update Process

  • Suggestion (From AS) "‘Quality’ doesn’t seem to capture the important points  about the updates: ability to perform updates from different sources (update server, network share, downloadable update file), ability to update the program components and detection rules separately, etc. I think “Update Process” is more generic."

           SK: Makes sense. I will change it.

 

Action Item: Change name to the one suggested

7. Product Maturity and Scalability

  • Suggestion (AW): "under this could we also add time to scan an app? One of the SaaS vendors I reviewed took days vs another which took hours."

          SK: This is actually covered under "2. Configuration Setup"


8. Enterprise Offerings:

  • Suggestion (From GP): "and integration with most common IDE"

           SK: This is covered under 5. UI Simplicity and Intuitiveness

  • Suggestion (From AW): "Keep,  I'm not sure about the name would Application Integration be better?"

SK: Application integration to me sounds like integration with the application being scanned. I would like to hear what everybody else thinks.

  • Suggestion (From BG): "KEEP but I would merge it with #9 this is a very short section"

SK: I agree that this is a short section; however, I am not sure about merging it with #9 because they measure two totally different criteria of the tool. What do you think?


9. Reporting Capabilities

KEEP

10. Tool Customization and Automation

  • Suggestion (From AW): "Keep,  for the sub-category not limit to just QA tools but also vulnerability management tools like Conduit. "

SK: Makes sense.

 

Action Item: Keep other tools like Vulnerability Management tools in mind while constructing the sub-categories
