[WIP] Harden CI Pipeline #4590
Conversation
Force-pushed from ca75d22 to 0d54fd3
@Kriechi: This is ready for review and comments for someone with a fresh mind! In a nutshell, we're now making sure that we pin all our dependencies in the build process. It's massively painful, and I now see why no one else is doing it. Anyhow, lots of comments below.
Also, @briansmith has done some pretty interesting investigations into the default security of GitHub Actions at briansmith/untrusted#50 (thanks!). I've tried taking the main points into account here, in particular I've also switched our $GITHUB_TOKENs to be read-only.
- uses: TrueBrain/actions-flake8@v1.4.1
- uses: actions/checkout@v2
  with:
    persist-credentials: false
Without setting this, every action below has implicit access to $GITHUB_TOKEN. That's not necessary.
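For illustration, a minimal sketch of how these two settings fit together; the job name is hypothetical, and `permissions` is the workflow-level knob that makes the token read-only:

```yaml
# Make $GITHUB_TOKEN read-only for every job in this workflow.
permissions:
  contents: read

jobs:
  lint:  # hypothetical job name
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          # Don't persist the token into .git/config,
          # so later steps can't pick it up implicitly.
          persist-credentials: false
```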
.github/workflows/main.yml
Outdated
with:
  file: ./web/coverage/coverage-final.json
  name: web

docs:
  if: github.repository != 'mitmproxy/mitmproxy' || github.ref != 'refs/heads/main'
  # mirrored below >>>
This situation here is a bit unfortunate: we only want regular commits on the main branch to have access to deployment secrets, but when we set environment for a job, the job does not run at all unless that environment exists. So we effectively duplicate this job and have one version with the environment and one without.
Alternatively, we could call actions/upload-artifact here and then download the artifacts in a separate job. That also adds a lot of complexity (e.g. we currently grab all artifacts in the deploy step; we'd then need to list them manually), so the current way is probably more pragmatic.
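A sketch of the resulting duplication; the job and environment names here are illustrative, not taken from the actual workflow:

```yaml
docs:
  # Runs everywhere *except* mainline commits; no deployment secrets available.
  if: github.repository != 'mitmproxy/mitmproxy' || github.ref != 'refs/heads/main'
  # ... build steps ...

docs-deploy:
  # Mirrors the job above, but only for mainline commits.
  if: github.repository == 'mitmproxy/mitmproxy' && github.ref == 'refs/heads/main'
  environment: deploy  # illustrative environment name holding the secrets
  # ... identical build steps, plus deployment ...
```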
.github/workflows/main.yml
Outdated
CI_BUILD_WHEEL: ${{ matrix.platform == 'linux' }}
CI_BUILD_PYINSTALLER: 1
CI_BUILD_WININSTALLER: ${{ matrix.platform == 'windows' }}
CI_BUILD_KEY: ${{ secrets.CI_BUILD_KEY }}
I've merged the build-binaries and build-wheels jobs, with the linux build creating the wheel now as well. That makes things easier to handle overall.
release/docker/Dockerfile
Outdated
COPY requirements.txt /home/mitmproxy/
RUN pip3 install --no-cache-dir --require-hashes -r /home/mitmproxy/requirements.txt \
    && pip3 install --no-cache-dir --no-deps /home/mitmproxy/${MITMPROXY_WHEEL} \
    && rm /home/mitmproxy/${MITMPROXY_WHEEL} /home/mitmproxy/requirements.txt
Our Docker images now also only install pinned dependencies.
Should we pin the container itself as well?
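One way to do that would be pinning the base image to an immutable digest instead of a mutable tag; a sketch, where `<digest>` is a placeholder and not a real value:

```dockerfile
# Pin the base image by digest rather than tag, so that a tag being
# re-pointed upstream cannot silently change what we build on.
# <digest> is a placeholder; one way to look it up:
#   docker inspect --format='{{index .RepoDigests 0}}' python:3.9-slim
FROM python:3.9-slim@sha256:<digest>
```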
release/pins/make.py
Outdated
out = outfile.read_bytes()
out = re.sub(b"-e .+", b"# --- line removed by make.py ---", out)
out = re.sub(b"pip-compile --.+", b"./make.py (on the same platform)", out)
outfile.write_bytes(out)
Now we come to the ugly part of generating our pinned requirements.txt files.
- We need a custom requirements.txt for each OS and each Python version, because the dependencies are slightly different in each case. As we only build binaries on one version (3.9 at the moment), we only need to concern ourselves with different OSes.
- We use pip-tools to generate requirements.txt files that include hashes. For some fabulous reason we then must remove mitmproxy itself from the generated file, because editable requirements cannot be installed at the same time as pinned dependencies with hashes (we don't have a hash for the editable...).
The most annoying part is that we need to update requirements.txt on three OSes whenever we want to update dependencies. It's going to be massively painful, but I don't see a way around it. At least we can copy those files from previous CI runs...
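To make the post-processing above concrete, here is a self-contained toy run of the same two substitutions on a hypothetical pip-compile header (the file content is made up for illustration):

```python
import re

# Hypothetical tail of a pip-compile-generated requirements.txt.
generated = (
    b"# This file is autogenerated by pip-compile\n"
    b"# To update, run:\n"
    b"#    pip-compile --generate-hashes setup.py\n"
    b"-e file:///home/user/mitmproxy\n"
)

# Same substitutions as make.py: drop the editable install of mitmproxy
# itself, and point the "how to regenerate" comment at our wrapper script.
out = re.sub(b"-e .+", b"# --- line removed by make.py ---", generated)
out = re.sub(b"pip-compile --.+", b"./make.py (on the same platform)", out)

print(out.decode())
```

The editable line is gone and the regeneration instructions now reference `make.py`, which is exactly the doctoring described above.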
Why do we install ourselves as editable here? Shouldn't we build the wheel, then install it + deps, and then take the requirements.txt?
Well we can only pin our dependencies, we can't pin mitmproxy itself. That hash changes with every commit. 😄 So in either case we need to doctor ourselves out of the generated file.
The alternative would be to have all requirements in a separate requirements.txt file and load that file in setup.py, then we could ingest that requirements file here. But that means ugly file processing in setup.py for environment markers. I don't think this would be much better?
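For reference, a rough sketch of what that "ugly file processing" in setup.py could look like; the helper function and the sample input are hypothetical, not part of this PR:

```python
# Hypothetical helper a setup.py could use to ingest a shared
# requirements file: it has to strip comments, skip editables and
# pip flags (including --hash continuation lines), while keeping
# environment markers inline.
def parse_requirements(text: str) -> list[str]:
    reqs = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()       # drop comments
        if not line or line.startswith(("-e", "--")):
            continue                               # skip editables and pip flags
        line = line.split("\\")[0].strip()         # drop hash line continuations
        reqs.append(line)                          # markers like ; sys_platform stay
    return reqs


print(parse_requirements(
    "# generated\n"
    "certifi==2020.12.5 \\\n"
    "    --hash=sha256:abc\n"
    'colorama==0.4.4; sys_platform == "win32"\n'
    "-e .\n"
))
# → ['certifi==2020.12.5', 'colorama==0.4.4; sys_platform == "win32"']
```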
release/pins/pip-install-pinned.py
Outdated
]

check_call([*pip_install, "--require-hashes", "-r", f"requirements-{platform.system().lower()}.txt"], cwd=here)
check_call([*pip_install, "--no-deps", "-e", "../.."], cwd=here)
So... when we want to install mitmproxy with pinned dependencies, we first install all pinned dependencies, and then in a second step install mitmproxy as editable, making sure that no additional dependencies are sneaking in. 🙃
setup.py
Outdated
"pytest-asyncio>=0.10.0,<0.14,!=0.14",
"pytest-cov>=2.7.1,<3",
"pytest-timeout>=1.3.3,<2",
"pytest-xdist>=2.1.0,<3",
"pytest>=6.1.0,<7",
"requests>=2.9.1,<3",
"tox>=3.5,<4",
"twine>=3.4.1,<4",
"wheel>=0.36.2,<0.37",
]
These dependencies were specified in tox.ini, I've just moved them all in here for simplicity. Otherwise we need to maintain different pinned requirements files etc, it's all pain.
tox.ini
Outdated
[testenv:wheeltest]
recreate = True
deps =
commands =
    pip install {posargs}
    pip install --require-hashes -r release/pins/requirements-linux.txt
    pip install --no-deps {posargs}
Here was another pitfall: cibuild calls tox -e wheeltest, which if unchanged would again introduce unpinned dependencies here. 🤷
I suppose we only really test a small percentage of our dependencies here - just enough to start the CLI and print the version...
This reverts commit 30cfb7a.
@@ -126,56 +113,129 @@ jobs:
      run: yarn
    - working-directory: ./web
      run: npm test
-   - uses: codecov/codecov-action@v1
+   - uses: codecov/codecov-action@a1ed4b322b4b38cb846afb5a0ebfa17086917d27
Pinning is nice - but who is checking for updates if they fix a critical security issue?
For Python deps we have requires.io/dependabot; I'm not aware of similar tooling for GitHub Actions steps. (Next startup idea?)
There is none. I believe the security benefit of not being prone to supply chain attacks is much higher than a potential issue in these actions. I mean, how would you exploit a bug in here? It's only ever run in trusted contexts if we merge in the exploit code, at which point all hope is lost anyway.
The exception would be an action talking to some servers (like codecov did before the release pinned here...), but I don't think that's happening with any of our actions now.
Dependabot apparently will notify you if you have it enabled. I haven't tested it.
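Dependabot does have a `github-actions` package ecosystem for exactly this; enabling it is a small config file (a sketch, with an assumed weekly schedule):

```yaml
# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "github-actions"  # watches uses: pins in workflows
    directory: "/"
    schedule:
      interval: "weekly"
```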
release/pins/requirements-linux.txt
Outdated
@@ -0,0 +1,811 @@
#
When during our dev cycle do we check in these files? And when do we update + verify + test them?
When: Never.
Update: Manually.
Test: Well the CI jobs do that.
But yeah. 🤷 I really don't know what we can do to address that. We could conceivably have some kind of additional CI job which gets triggered when setup.py is changed, then runs make.py on all three OSes and creates a PR with the fix. Or updates the existing one (with a token that requires write access). All of this is just a massive PITA.
It's really a lot of maintenance work for security benefits invisible to users. The fact that we support three OSes and want to be installable from pip makes things massively complex. Ostrich algorithm is always an alternative, of course.
I mean maybe we just don't want to pin. In a sense, all dependencies are frozen in time when we generate binaries anyways. We'd just need to make sure that all jobs that have upload access are protected.
This PR has some attempts to harden our CI pipeline, in particular by strictly pinning dependency versions.