David A. Wheeler edited this page Jun 16, 2018 · 34 revisions

Here are some of the OSS projects that have made improvements so that they could get a badge. Since one of the purposes of the badge is to encourage improvements to projects if they are lacking, this is fantastic, and is evidence that the badge is helping.

We're very grateful to the projects that were willing to make those changes and tell us about them. This is not a complete list; projects aren't required to tell us what they changed, and we might not have recorded the changes here. We also know that some projects, such as OpenStack, didn't need to make any changes to get a badge. See the website for the current full list of projects that have badges; example projects that have received a badge include the Linux kernel, Node.js, GitLab, curl, OpenSSL, GnuPG, NTPsec, Apache Syncope, OpenStack, Blender, and LibreOffice.

In any case, with those caveats, here is a list of some projects that made improvements due to the badge, and what those improvements were.


OWASP ZAP

OWASP ZAP is a widely-used tool for scanning web applications to look for security vulnerabilities. We used it to help develop our BadgeApp application.

ZAP refers to the badge on its GitHub page and has tweeted about it as well. They currently can't add the badge reference to the OWASP wiki page, as it doesn't allow external images, but they've asked if this can be changed.

The OWASP ZAP project lead, Simon Bennetts (a.k.a. Psiinon), had some really nice things to say (quoted with permission): "I can definitely confirm that the badging project has helped us improve ZAP quality. It allowed us to see where we were doing well and where we were falling short, and that has helped us focus on the areas that needed most improvement. For us it has definitely not been a 'box ticking' exercise. We want to follow the best practices, and have made sure that we have changed our development processes so that we are doing all we can to make ZAP into a high quality project. I'm a big fan of the badging project, and will be very happy to be quoted as being a strong supporter of it :)."

For example, before they started pursuing the badge, the project had relatively limited automated testing. Limited testing turns out to be a widespread problem for this kind of tool. Users of these tools are looking for problems they don't know about, and these kinds of tools use a lot of heuristics, so users typically don't notice when a tool fails to detect what it should detect. Naturally, without user feedback about failures, it's easy to skip creating automated tests (users aren't complaining!). This isn't a guess; Shay Chen's "WAVSEP Web Application Scanner Benchmark 2014" reported, when benchmarking these kinds of tools, that "More than a few tools that got high results in the previous benchmarks categories, got lesser results in this one – in the same categories, although nothing in the test environment has changed... The overall problem is related to product testing and maintenance... software bugs may cause a variety of crucial features not to function for long periods of time, without anyone being aware of them."

I'm delighted to report that the ZAP folks have made great strides in their automated testing, and that was the last criterion they needed to meet to get a badge. This is exactly the sort of thing that a best practices badge can point out: it can identify things that should be done, even if they're not immediately obvious to users.


league/commonmark

league/commonmark (a CommonMark implementation) got a badge.

Colin O'Dell reported, "thank you for your work on the CII Best Practices program! Having a concrete list of best practices was a huge help in finding and fixing the gaps in my project. I'm looking forward to seeing more and more projects get their badges."

In a private email to David A. Wheeler on 2016-08-11, Colin O'Dell reported that, "The PHP League (of which league/commonmark is a member project) already had some strict guidelines and best practices in the project template which aligned very well with your best practices. If I recall correctly, the only additional things I had to implement were:

  • TLS for the website (and all links from the repository to the website)
  • Publishing the process for reporting vulnerabilities."

"Even though the changes were relatively minimal in my case, I do think having that master checklist was very beneficial - we were able to resolve a couple gaps and see that our other best practices were aligned with the broader open-source community."


OPNFV

OPNFV posted a blog post about their experience getting a badge. They said:

The requirements to get the badge are quite extensive, so we realized we needed to do some work in order to comply. One example is that we replaced crypto algorithms that are no longer considered secure (MD5). Another example is that we updated the OPNFV wiki pages with more specific and clear instructions on how to report security incidents. We believe that security can never be achieved by an isolated security group. The work needs to involve everybody in the project, from developers to management.
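The crypto replacement OPNFV describes is usually a small, mechanical change. A minimal Python sketch, assuming the digests were computed with the standard hashlib module (the function names here are illustrative, not OPNFV's actual code):

```python
import hashlib

def old_digest(data: bytes) -> str:
    # MD5 is fast, but it is no longer considered collision-resistant.
    return hashlib.md5(data).hexdigest()

def new_digest(data: bytes) -> str:
    # SHA-256 is a widely accepted replacement for MD5 in new code.
    return hashlib.sha256(data).hexdigest()
```

The harder part in practice is finding every call site and migrating any stored digests, which is why a criteria checklist that prompts the audit is valuable.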

See also: "How OPNFV Earned Its Security Stripes and Received a CII Best Practices Badge".


vim-metamath

vim-metamath has a badge.

To get a badge, the project made several changes:

  • An automated test suite was added.
  • Text was added to meet "The information on how to contribute SHOULD include the requirements for acceptable contributions (e.g., a reference to any required coding standard). (URL required) [contribution_requirements]"
  • It was modified to meet, "The project MUST have a unique version number for each release intended to be used by users. [version_unique]"
  • It was modified to meet, "The project MUST provide, in each release, release notes that are a human-readable summary of major changes in that release. (URL required) [release_notes]"

The Assimilation Project

Alan Robertson reports that he did change some things in the Assimilation project in response to the badging project criteria:

  • Set up coding guidelines.
  • Set up a process for vulnerability reports; he plans to have a dedicated email address and document it.

The "general information" website (mostly source documentation) didn't support TLS; he is investigating how to fix that.

JSON for Modern C++

JSON for Modern C++ got a badge. The author stated that "I really appreciate some formalized quality assurance which even hobby projects can follow."

To get the badge the project made two changes:

  • Explicitly mentioned how to privately report errors.
  • Added a static analysis check to the continuous integration script.

"Apart from that, the project was already set, and it was nice to see that nothing fundamental was missing."
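A static analysis gate in CI, like the check described above, typically just runs the analyzer and fails the build on a nonzero exit code. A hedged Python sketch (the helper name and the command passed to it are illustrative, not the project's actual configuration):

```python
import subprocess
import sys

def run_static_analysis(cmd):
    """Run an analyzer command; return True only if it exits cleanly.

    In CI, a False result would be turned into a failed build step.
    """
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.returncode == 0
```

In a real CI script, `cmd` would be the chosen analyzer invocation (for a C++ project, something like a cppcheck or clang analyzer run), and the job would fail whenever the function returns False.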


BRL-CAD

BRL-CAD posted the following comment on 2016-08-22:

Happy to be #8 in the list and 28th to get to 100%. Here’s a retrospective with feedback. In all, it took about 3 interrupted hours total to gather, fact check, and write up responses for all fields. Probably would have taken an hour uninterrupted. Getting to 100% passing was relatively easy for BRL-CAD with only one MUST item arguably being unmet beforehand (our website certificate didn’t match our domain, fixed). The rest was mostly a matter of documentation and elaboration.

POCO C++ Libraries

POCO C++ Libraries has earned a badge. Günter Obiltschnig said on 2016-09-13:

... thank you for setting up the best practices site. It was really helpful for me in assessing the status of, and making the necessary changes and additions.

Some of the changes they made were:

  • updated the file to include a statement on reporting security issues via email, and added a link to that file.
  • updated the instructions for preparing a release in the Wiki to include running clang-analyzer.
  • enabled HTTPS for the project website (using a Let’s Encrypt certificate and certbot-auto), which was actually most of the work (including fixing all links, etc.).
  • ran clang-analyzer on the code base for peace of mind ;-)

CII Census

The CII census project predates the badging project, so when the census project was first created there were no badging criteria and no specific effort to get the badge. To get the badge the CII census project had to do several things:

  • Create an automated test suite, and document the expectation that major changes must extend the test suite to cover that new functionality.
  • Add a static analysis tool (and fix the problems it identified).
  • Document how to report vulnerabilities.
  • Document the coding conventions used.

The CII census project now has a badge.
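The first item above, an automated test suite, can start very small. A minimal sketch using Python's built-in unittest module, with a toy function standing in for real project code:

```python
import unittest

def normalize(name):
    """Toy function standing in for real project code under test."""
    return name.strip().lower()

class TestNormalize(unittest.TestCase):
    def test_strips_and_lowercases(self):
        self.assertEqual(normalize("  OpenStack "), "openstack")

    def test_empty_string(self):
        self.assertEqual(normalize(""), "")
```

Once even a tiny suite like this exists, the documented expectation that new functionality must come with tests keeps it growing with the project.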


EPICS

Experimental Physics and Industrial Control System (EPICS) is "a set of Open Source software tools, libraries and applications developed collaboratively and used worldwide to create distributed soft real-time control systems for scientific instruments such as particle accelerators, telescopes and other large scientific experiments." EPICS has worked on getting a best practices badge. Some of the issues they worked on to get a badge were:

  • static code analysis.

GNU Make

  • HTTPS. They depend on Savannah as their hosting system, which at the time did not support HTTPS for repositories (even though it supported HTTPS for project home pages). We reported this problem to the Savannah maintainers, who modified their hosting system to support HTTPS for repositories. This gives greater confidence that the software downloaded by users is what was posted on the repository.


Flawfinder

Flawfinder is "a simple program that examines C/C++ source code and reports possible security weaknesses (“flaws”) sorted by risk level. It’s very useful for quickly finding and removing at least some potential security problems before a program is widely released to the public."

This is an old project started by the CII Badging lead many years ago, before more-modern OSS development practices became common. As a result, several changes had to be made to the flawfinder program, which improved it. These include:

  • Adding the use of a static analysis tool on the software. This not only found a number of style problems that reduced readability, but it also found several bugs in the software that were repaired.
  • Adding a file to the project that clearly explains contribution requirements.
  • Adding information on how to report vulnerabilities.
  • Adding support for an additional package management system (pip), which made it easier for many users to install.

tuf (The Update Framework)

Vladimir Diaz reported the following from tuf:

We resolved the remaining “Warnings fixed” practice! We got it down to less than 1 warning per 100 lines of code.

The Best Practices program has been great so far. It’s been helpful in improving quality and keeping track of our project’s progress. ... we're adopting it for many of our other projects too. More to come! (Emails from 2017-11-21)
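The threshold tuf mentions is just a ratio of analyzer warnings to code size. An illustration with made-up counts (not tuf's actual numbers):

```python
# Made-up counts purely to illustrate the "warnings per 100 lines" metric.
warnings = 120
lines_of_code = 15000
warnings_per_100_lines = warnings * 100 / lines_of_code  # 0.8 here
```

A project meeting the criterion keeps this value below 1 across the whole code base.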


"Thank you. The program overall has been very helpful in training up a security team from scratch and giving a new project a checklist to aim at. [The] CII's badge program has been used to create a best practice here." (Kate Stewart, 2018-06-16)

Potential future entries

Comments from "Open Source Security" podcast

The Open Source Security podcast episode 14 - David A Wheeler: CII Badges had a nice quote: "This is a fantastic project... I think it is one of the most important security things going on today without question... folks go get your badges and make the world a better place...".
