At regular intervals, we should collect all of our external dependencies on the main branch and upload them to our own caching server.
This is particularly necessary for RPM dependencies, since mirrors usually aggressively clean up old package versions.
Tricky parts:
Surprise! There's actually no way to get Bazel to tell us about all dependencies. You can use a resolved file and bazel sync, but this doesn't actually intercept the repository_ctx.download[_and_extract] calls - it only evaluates the pure parts, so you won't get the URLs for custom repository rules like Go or Rust. It would appear that we actually have to fully run all repository rules and use an HTTP proxy or similar trickery to intercept all downloads. Ugh!
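For reference, the resolved-file workflow mentioned above looks roughly like this (a sketch; the flag was experimental, and the output only records what the pure evaluation can see):

```shell
# Evaluate all repository rules and write the resolvable attributes to a file.
# Note: this does NOT record URLs fetched imperatively via
# repository_ctx.download[_and_extract] inside custom rules.
bazel sync --experimental_repository_resolved_file=resolved.bzl
```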
git_repository cannot be cached. But we need to kill those anyway, given that Bazel can't even cache them locally - the repository cache only works for HTTP downloads.
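Killing a git_repository usually means pointing http_archive at a forge-generated tarball instead. A sketch, with hypothetical repository name, URL, and a placeholder hash:

```starlark
# WORKSPACE - hypothetical example of replacing a git_repository.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "some_dep",  # hypothetical name
    # Forges serve tarballs for tags/commits over plain HTTP, so this goes
    # through Bazel's repository cache (keyed on sha256), unlike git clones.
    urls = ["https://github.com/example/some_dep/archive/refs/tags/v1.2.3.tar.gz"],
    strip_prefix = "some_dep-1.2.3",
    sha256 = "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
)
```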
There also isn't a good way of telling Bazel "hey, check this remote cache first". Repository rules do not necessarily evaluate to something that's content-addressable (see above), and --distdir only works for local paths and matches on the filename, it seems? We could either:

- set HTTP_PROXY in tools/bazel and point it at our cache server (very easy, but affects everything and breaks in environments that want to set their own proxy), or
- programmatically modify WORKSPACE to add extra URLs to the rules we want to cache (more fine-grained, but does not work for all kinds of rules and requires modifying files).
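The WORKSPACE-rewriting option would amount to prepending our cache server to each rule's urls list, something like the following (hypothetical cache host; Bazel tries the URLs in order and falls back on failure, and every hit is still verified against the pinned sha256):

```starlark
http_archive(
    name = "glib",
    urls = [
        # Our cache server first (hypothetical host).
        "https://deps-cache.example.com/glib-2.67.5.tar.gz",
        # Upstream stays as the fallback / source of truth.
        "https://gitlab.gnome.org/GNOME/glib/-/archive/2.67.5/glib-2.67.5.tar.gz",
    ],
    sha256 = "...",  # unchanged from the original rule
)
```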
ERROR: /home/leoluk/.cache/bazel/_bazel_leoluk/ea09550584ad64933c457c8fa53a71e0/external/qemu/BUILD:906:10: @qemu//:qemu-x86_64-softmmu depends on @glib//glib:glib in repository @glib which failed to fetch. no such package '@glib//glib': java.io.IOException: Error downloading [https://gitlab.gnome.org/GNOME/glib/-/archive/2.67.5/glib-2.67.5.tar.gz] to /home/leoluk/.cache/bazel/_bazel_leoluk/ea09550584ad64933c457c8fa53a71e0/external/glib/temp10012729359771221538/glib-2.67.5.tar.gz: GET returned 500 Internal Server Error
ERROR: Analysis of target '//metropolis/vm/smoketest:smoketest_container' failed;
Ref: https://github.com/bazelbuild/bazel/blob/a523c9bae3c0e490813a7b76cb9fea661a83605f/src/main/java/com/google/devtools/build/lib/bazel/repository/downloader/DownloadManager.java#L54
Ref: rmohr/bazeldnf#35