
Adjust requirements for the OpenSSF Badge at the Adopted Stage #556

Merged: 8 commits into main on Jun 26, 2024

Conversation

jmertic (Contributor) commented Nov 30, 2023

Based on the discussion during the 2023-11-29 TAC Meeting (agenda item #502), this addresses the two changes discussed:

  1. Change from requiring achievement of the OpenSSF Best Practices badge at the Gold level to having a large portion of the badge completed. I put in 75% for now, but I expect the group to refine that number.
  2. Require that the OpenSSF Best Practices badge at the Gold level be completed by the next Annual Review after the project reaches the Adopted Stage; again, the TAC can determine whether one year is enough.

Separately, I will work on a table that outlines how to complete the OpenSSF Best Practices badge requirements.

Comment away, everyone :-)

Signed-off-by: John Mertic <jmertic@linuxfoundation.org>
jfpanisset (Contributor) commented Dec 1, 2023

As a point of discussion, here's a spreadsheet which aggregates the current badging state for ASWF projects:

https://docs.google.com/spreadsheets/d/1n8xEdbJ77fVk5YxtuqjC7KZywi0W7ZfXlGf0YjVZI9Q/edit#gid=1274673236

  • No project has yet achieved Silver, with MaterialX and OpenEXR being the closest
  • OpenEXR, OpenColorIO, MaterialX, OpenVDB, OpenImageIO, OpenCue, OSL, and OpenAssetIO have achieved Passing
  • OpenTimelineIO is pretty close to achieving Passing
  • Some requirements are present at more than one badge level; for instance, a requirement might be a "SHOULD" for Passing but a "MUST" for Silver. The API only returns a single result for a requirement and doesn't indicate which badge it applies to. There are only a few such "cross badge" requirements (4 or 5), so that shouldn't skew the results too much.
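
For anyone who wants to reproduce or extend that aggregation, here is a minimal sketch of querying one project's badge data. It assumes the BadgeApp JSON endpoint at https://www.bestpractices.dev/projects/&lt;id&gt;.json and percentage fields named badge_percentage_0/1/2; both should be verified against the BadgeApp API documentation.

```python
# Minimal sketch: fetch per-level badge completion for one project from the BadgeApp.
# The endpoint path and field names are assumptions; verify against the API docs.
import json
import urllib.request

def badge_percentages(project_id: int) -> dict:
    url = f"https://www.bestpractices.dev/projects/{project_id}.json"
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    return {
        "passing": data.get("badge_percentage_0"),
        "silver": data.get("badge_percentage_1"),
        "gold": data.get("badge_percentage_2"),
    }

print(badge_percentages(2468))  # hypothetical project id
```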

Signed-off-by: John Mertic <jmertic@linuxfoundation.org>
Signed-off-by: John Mertic <jmertic@linuxfoundation.org>
jmertic (Contributor, author) commented Mar 16, 2024

For discussion at the next TAC meeting, we have outlined seven badge requirements, based on projects' experience completing the badge, that should be temporarily omitted until there are clear paths to address them.

The project MUST have performed a security review within the last 5 years. This review MUST consider the security requirements and security boundary.

Rationale: A preferred approach is to start with doing a threat model analysis, which could help projects improve their security posture before doing a security audit in the future. Work on this to be tracked at #615

The project MUST provide an assurance case that justifies why its security requirements are met. The assurance case MUST include: a description of the threat model, clear identification of trust boundaries, an argument that secure design principles have been applied, and an argument that common implementation security weaknesses have been countered.

The project MUST implement secure design principles where applicable. If the project is not producing software select "not applicable".

Rationale: This would best come together with the threat model analysis discussed above. Work on this to be tracked at #615

The project MUST have FLOSS automated test suite(s) that provide at least 90% statement coverage if there is at least one FLOSS tool that can measure this criterion in the selected language.

The project MUST have FLOSS automated test suite(s) that provide at least 80% branch coverage if there is at least one FLOSS tool that can measure this criterion in the selected language.

Rationale: A plurality of the projects have complex hardware needs that make reaching this level of coverage difficult. Work to be done on how to address this.
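
For reference, here is a minimal sketch of what measuring these criteria with a FLOSS tool could look like for a Python project, using coverage.py; the mypackage.tests module and its run_all() function are hypothetical placeholders for an actual test suite.

```python
# Minimal sketch: statement and branch coverage with coverage.py (a FLOSS tool).
# "mypackage.tests" and run_all() are hypothetical placeholders.
import coverage

cov = coverage.Coverage(branch=True)  # measure branches as well as statements
cov.start()

import mypackage.tests
mypackage.tests.run_all()

cov.stop()
cov.save()

total = cov.report()  # prints a per-file table and returns the combined percentage
print(f"total coverage: {total:.1f}%")
```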

The project MUST have a reproducible build. If no building occurs (e.g., scripting languages where the source code is used directly instead of being compiled), select "not applicable".

Rationale: More analysis to be done. If the project is not distributing binaries, this requirement is technically N/A, but we would like to see that at least the common build patterns are reproducible.

The project website, repository (if accessible via the web), and download site (if separate) MUST include key hardening headers with nonpermissive values.

Rationale: GitHub Pages and Read the Docs seem to have some issues here; work is needed to fix this (see coreinfrastructure/best-practices-badge#1878).
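
For reference, a minimal sketch of how a project could spot-check its site for the headers this criterion asks about, using only the Python standard library; the URL is a placeholder, and the header list is an assumption based on common hardening guidance rather than the badge tooling's exact check.

```python
# Minimal sketch: report which common hardening headers a site serves.
# The URL is a placeholder; the header list is an assumption, not the badge tool's exact check.
import urllib.request

HARDENING_HEADERS = (
    "Content-Security-Policy",
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "X-Frame-Options",
)

def check_hardening_headers(url: str) -> dict:
    """Return hardening header -> value (None if the header is missing)."""
    with urllib.request.urlopen(url) as response:
        return {name: response.headers.get(name) for name in HARDENING_HEADERS}

for name, value in check_hardening_headers("https://example.org/").items():
    print(f"{name}: {value if value is not None else 'MISSING'}")
```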

ACTION: TAC to review and approve.

The current analysis across all projects is at https://docs.google.com/spreadsheets/d/1bEacUNFizeT8QtfsvqiRNNgvty8_tweHjassHko6OhQ/edit?usp=sharing

jmertic added the 4-tac-meeting-short label (Short agenda item for the TAC meeting, 5 minutes or less) on Mar 16, 2024
jfpanisset (Contributor) commented:

For the 2 points about test coverage:

  • the wording about "FLOSS tool" may be a bit too restrictive: a lot of interesting tools have a "free for open source" offering but aren't FLOSS themselves, for instance SonarCloud, which is used by a number of projects
  • rather than setting arbitrary branch and instruction coverage values, maybe restate this as "a commitment to improving branch and instruction coverage over time"?

For "reproducible builds": we should separate the steps of having a "reproducible build process" (which most projects provide through their CI) from the more stringent "reproducible build" requirement of being able to produce a bit for bit exact artifact which none of our projects are able to achieve (and may not be possible in many cases). It may be sufficient to clearly define the scope of what is meant by "reproducible builds" to allow most projects to meet that requirement by having a CI in place.

For the project website: could adding a clause about "widely used web hosting infrastructures such as GitHub Pages or ReadTheDocs" allow projects to tick off that slightly modified requirement?

jmertic (Contributor, author) commented Mar 18, 2024

Good points @jfpanisset - comments on two of your points...

For the 2 points about test coverage:

  • the wording about "FLOSS tool" may be a bit too restrictive: a lot of interesting tools have a "free for open source" offering but aren't FLOSS themselves, for instance SonarCloud, which is used by a number of projects

I think the intention here is to not burden a project with having to purchase or depend on commercial tools, which, even if open source pricing is available, can still be prohibitive. If anything, the restriction is probably more a relief than a burden to the projects; if there aren't FLOSS tools available for the language, code in that language won't count towards any coverage requirements.

For the project website: could adding a clause about "widely used web hosting infrastructures such as GitHub Pages or ReadTheDocs" allow projects to tick off that slightly modified requirement?

I think something of this nature is being added to the additional description section of that requirement at some point, based on the discussion in the issue thread I mentioned.

JeanChristopheMorinPerso (Member) commented Mar 18, 2024

the wording about "FLOSS tool" may be a bit too restrictive: a lot of interesting tools have a "free for open source" offering but aren't FLOSS themselves, for instance SonarCloud, which is used by a number of projects

The requirements are about FLOSS test suites, not FLOSS coverage tools. In other words, projects should not rely on a closed source/proprietary test runner for example. SonarCloud doesn't generate the coverage data, it's just used to store and display the coverage results.

For "reproducible builds": we should separate the steps of having a "reproducible build process" (which most projects provide through their CI) from the more stringent "reproducible build" requirement of being able to produce a bit for bit exact artifact which none of our projects are able to achieve (and may not be possible in many cases). It may be sufficient to clearly define the scope of what is meant by "reproducible builds" to allow most projects to meet that requirement by having a CI in place.

A project is either reproducible or it's not; providing CI should not count as reproducible. As for bit-for-bit, I wouldn't say that our projects are not able to achieve it. I have not heard of a project that has tried to see whether its builds were reproducible. And don't forget that Python projects also need to be reproducible (because yes, that's a thing in Python too).

In general, I think I like @jmertic's statement:

Rationale: More analysis to be done. If the project is not distributing binaries, technically, this requirement is N/A, but it is desired to see how at least the common build patterns are reproducible.

So before relaxing it, mark it as "needs investigation". I would change "If the project is not distributing binaries, technically, this requirement is N/A" to say that a zip file or a tar file (for example, packages on PyPI) does count as a binary, because those also have to be reproducible. If they are not, Linux distributions will complain when our Python projects get distributed by them.
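
To make the point about PyPI artifacts concrete, here is a minimal sketch of checking whether a Python sdist builds reproducibly; it assumes the FLOSS build package is installed and that the build backend honors SOURCE_DATE_EPOCH, and the epoch value is arbitrary.

```python
# Minimal sketch: build an sdist twice with pinned timestamps and compare checksums.
# Assumes the FLOSS 'build' package; a hash mismatch points at non-determinism to fix.
import hashlib
import os
import subprocess
import sys
import tempfile
from pathlib import Path

def build_sdist(source_dir: str, out_dir: str) -> Path:
    env = dict(os.environ, SOURCE_DATE_EPOCH="315532800")  # arbitrary fixed timestamp
    subprocess.run(
        [sys.executable, "-m", "build", "--sdist", "--outdir", out_dir, source_dir],
        check=True,
        env=env,
    )
    return next(Path(out_dir).glob("*.tar.gz"))

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

with tempfile.TemporaryDirectory() as a, tempfile.TemporaryDirectory() as b:
    same = sha256(build_sdist(".", a)) == sha256(build_sdist(".", b))
    print("reproducible" if same else "NOT reproducible")
```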

jmertic (Contributor, author) commented Apr 1, 2024

Based on TAC feedback, we should refine the specific requirements for the Adopted Stage so that they do not require the OpenSSF Best Practices badge at the Gold level. Instead, we can pull in the most common requirements and then encourage projects to achieve the badge. The action is for a group to help review/refine.

lgritz (Contributor) commented Jun 26, 2024

Just a few additional items (besides the ones listed above) that, re-reading today, I'm still unsure exactly how to satisfy. I'm not saying they should be dropped, but at least we should give some explicit guidance about what to do, or an ASWF policy about what exactly constitutes satisfying them (ideally with a working example in one project):

  • The project (both project sites and project results) SHOULD follow accessibility best practices so that persons with disabilities can still participate in the project and use the project results where it is reasonable to do so.

  • The project MUST list external dependencies in a computer-processable way.

  • The project results MUST check all inputs from potentially untrusted sources to ensure they are valid (an allowlist), and reject invalid inputs, if there are any restrictions on the data at all.
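
As a tiny illustration of what an allowlist check for the last item could look like in practice (the accepted values here are purely hypothetical):

```python
# Minimal sketch: validate untrusted input against an explicit allowlist and
# reject everything else. The accepted values are hypothetical.
ALLOWED_COLOR_SPACES = {"ACES2065-1", "ACEScg", "sRGB"}

def parse_color_space(value: str) -> str:
    """Accept only known color space names; reject anything else."""
    if value not in ALLOWED_COLOR_SPACES:
        raise ValueError(f"unsupported color space: {value!r}")
    return value

print(parse_color_space("ACEScg"))      # ok
# parse_color_space("../etc/passwd")    # would raise ValueError
```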

jmertic (Contributor, author) commented Jun 26, 2024

@lgritz - those all seem reasonable to exclude. Note that the first one is a SHOULD, so whether we include it or not probably isn't a big deal either way.

I think for this one...

The project MUST list external dependencies in a computer-processable way.

Any Python project using requirements.txt or the like would be fine here. For C/C++ projects, I believe CMake has some way to address external dependencies, IIRC.
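
For illustration, a minimal sketch showing that a plain requirements.txt is already computer-processable; it assumes the FLOSS packaging library and a file containing only simple name/version lines.

```python
# Minimal sketch: parse a requirements.txt into structured requirements.
# Assumes the FLOSS 'packaging' library and simple "name<specifier>" lines only.
from packaging.requirements import Requirement

with open("requirements.txt") as fh:
    for line in fh:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        req = Requirement(line)
        print(req.name, str(req.specifier) or "(any version)")
```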

Signed-off-by: John Mertic <jmertic@linuxfoundation.org>
jmertic (Contributor, author) commented Jun 26, 2024

Approved during the TAC meeting on 2024-06-26; we will have it approved by the GB via email.

jmertic merged commit 90da158 into main on Jun 26, 2024
4 checks passed
jmertic deleted the jmertic-patch-2 branch on June 26, 2024 at 20:31