MCR proxy cache does not remember pulled tags #15591

Closed
Rast1234 opened this issue Sep 14, 2021 · 20 comments


Rast1234 commented Sep 14, 2021

Expected behavior and actual behavior:
I have several proxy projects set up. The proxy for Docker Hub works as expected. However, the Microsoft Container Registry proxy does not list the tags I have pulled.

Steps to reproduce the problem:

  1. docker pull something via the Docker Hub proxy
  2. find this artifact in the Harbor UI and see that a tag is listed
  3. docker pull something via the MCR proxy
  4. find this artifact in the Harbor UI and see that there are no tags at all; API calls like /v2/.../tags/list and /v2.0/.../artifacts/.../tags also return no tags (see the sketch after this list)
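
For concreteness, a minimal reproduction sketch (harbor.example.com, the credentials, and the proxy project names are placeholders for the real setup):

# Pull through the Docker Hub proxy project: the tag shows up in the UI.
docker pull harbor.example.com/dockerhub-proxy/library/alpine:3.14

# Pull through the MCR proxy project: the artifact is cached, but no tag is recorded.
docker pull harbor.example.com/mcr-proxy/dotnet/aspnet:6.0

# The tag list of the MCR artifact comes back empty:
curl -s -u user:pass "https://harbor.example.com/v2/mcr-proxy/dotnet/aspnet/tags/list"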

Versions:

  • harbor version: 2.3.2
  • docker engine version: 20.10.8

stonezdj (Contributor) commented Sep 27, 2021

Yes. The behavior of the Docker 20.10.x client changed: there is a local cache that stores the tag and digest, so docker pull <image>:<tag> may be converted into a pull by digest, docker pull <image>@sha256:xxxxxx. In that case the Harbor proxy cache only receives the request for docker pull <image>@sha256:xxxxxx, so it cannot create the tag in the proxy cache.
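
To illustrate the difference (a sketch; the proxy path is a placeholder, and the digest is the busybox digest from the logs further down this thread):

# First pull by tag: the client records the digest locally.
docker pull harbor.example.com/dockerhub-proxy/library/busybox:latest
docker image inspect --format '{{index .RepoDigests 0}}' harbor.example.com/dockerhub-proxy/library/busybox:latest

# A pull by digest is what the proxy cache may actually receive;
# the request carries no tag, so none can be recorded:
docker pull harbor.example.com/dockerhub-proxy/library/busybox@sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee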


yogeek commented Oct 22, 2021

@stonezdj thank you for this explanation. What solution do you see to avoid this issue?


yogeek commented Dec 16, 2021

@stonezdj this issue is quite disruptive for some of our production images. Have you had a chance to think about a potential solution?


jehof commented Dec 30, 2021

I noticed this behavior today too. dotnet/aspnet:6.0.0 is not listed even though it was pulled through the proxy cache.


vizv commented Jan 23, 2022

By the way, GC cleaning up untagged artifacts makes this issue even worse 😢

This still happens with v2.4.1-c4b06d79: the tag appears on the first pull and disappears the next day...


vizv commented Jan 23, 2022

I found it is deleted by harbor-jobservice, and the harbor-core log shows:

2022-01-22T15:45:22Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'InternalArtifact' on topic 'PUSH_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:45:22
2022-01-22T15:45:22Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'P2PPreheat' on topic 'PUSH_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:45:22
2022-01-22T15:45:22Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'InternalArtifact' on topic 'PULL_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-admin OccurAt-2022-01-22 15:45:22
2022-01-22T15:45:22Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'ArtifactWebhook' on topic 'PUSH_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:45:22
2022-01-22T15:45:22Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'Replication' on topic 'PUSH_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:45:22
2022-01-22T15:45:22Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'ArtifactWebhook' on topic 'PULL_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-admin OccurAt-2022-01-22 15:45:22
2022-01-22T15:45:22Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'AuditLog' on topic 'PUSH_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:45:22
2022-01-22T15:45:22Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'AuditLog' on topic 'PULL_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-admin OccurAt-2022-01-22 15:45:22
2022-01-22T15:45:41Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'InternalArtifact' on topic 'PULL_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-robot$docker.io+Trivy-56edd68e-7b9a-11ec-9268-92e5fd85d255 OccurAt-2022-01-22 15:45:41
2022-01-22T15:45:41Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'ArtifactWebhook' on topic 'PULL_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-robot$docker.io+Trivy-56edd68e-7b9a-11ec-9268-92e5fd85d255 OccurAt-2022-01-22 15:45:41
2022-01-22T15:45:41Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'AuditLog' on topic 'PULL_ARTIFACT': ID-92, Repository-docker.io/library/busybox Tags-[] Digest-sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee Operator-robot$docker.io+Trivy-56edd68e-7b9a-11ec-9268-92e5fd85d255 OccurAt-2022-01-22 15:45:41
2022-01-22T15:51:39Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'P2PPreheat' on topic 'PUSH_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:51:39
2022-01-22T15:51:39Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'InternalArtifact' on topic 'PUSH_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:51:39
2022-01-22T15:51:39Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'ArtifactWebhook' on topic 'PUSH_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:51:39
2022-01-22T15:51:39Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'AuditLog' on topic 'PUSH_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:51:39
2022-01-22T15:51:39Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'Replication' on topic 'PUSH_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor#proxy-cache-service OccurAt-2022-01-22 15:51:39
2022-01-23T00:00:02Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'Replication' on topic 'DELETE_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[latest] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor-jobservice OccurAt-2022-01-23 00:00:02
2022-01-23T00:00:02Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'ArtifactWebhook' on topic 'DELETE_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[latest] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor-jobservice OccurAt-2022-01-23 00:00:02
2022-01-23T00:00:02Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'DeleteArtifactWebhook' on topic 'DELETE_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[latest] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor-jobservice OccurAt-2022-01-23 00:00:02
2022-01-23T00:00:02Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'AuditLog' on topic 'DELETE_ARTIFACT': ID-93, Repository-docker.io/library/busybox Tags-[latest] Digest-sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db Operator-harbor-jobservice OccurAt-2022-01-23 00:00:02

Note that I pulled docker.io/library/busybox:latest at 2022-01-22T15:45:22Z and harbor-jobservice deleted it (from the database only? I saw nothing happen in harbor-registry) at 2022-01-23T00:00:02Z.


vizv commented Jan 23, 2022

And this as well:

2022-01-23T00:00:02Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'RetentionWebhook' on topic 'TAG_RETENTION': TaskID-33 Status-SUCCESS Deleted-[docker.io:library/busybox:[latest]] OccurAt-2022-01-23 00:00:02


vizv commented Jan 23, 2022

I found that the default retention policy (see below) for the proxy cache project doesn't work as expected...

For the repositories matching **, retain the artifacts pulled within the last 7 days with tags matching **, with untagged

In a dry run, it tries to delete the artifact pulled 5 minutes ago and tagged latest.


vizv commented Jan 23, 2022

More feedback: for some reason, the artifact has an incorrect pull_time, even if I pull it again:

"pull_time": "0001-01-01T00:00:00.000Z"

which makes the dayspl evaluator drop this artifact on the next scheduled run.
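
For anyone checking the same thing, the zeroed pull_time is visible through the Harbor API (a sketch; host, project name, and credentials are placeholders, and a repository name containing a slash has to be double URL-encoded):

curl -s -u admin:Harbor12345 \
  "https://harbor.example.com/api/v2.0/projects/dockerhub-proxy/repositories/library%252Fbusybox/artifacts/sha256:a9ab0640d76ced5846659fb1f4efbc1bfb3ca68514a9b6282a6c0a1efa6f13db" \
  | jq '{pull_time, tags}'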


vizv commented Jan 23, 2022

Okay, my issue is actually #16230. However, the artifact has an empty tags field; is this expected?

m-yosefpor commented

We are replicating our proxy-cached images to different clusters, but right now the tag is not replicated, as only the digest exists in the proxy cache project. So we cannot fetch images from the replicated projects elsewhere because of this issue.


m-yosefpor commented Mar 31, 2022

I think one workaround is to allow replication to proxy-cache projects (related to what is described in #15155), but instead of skipping those tags (#16286), somehow allow the blobs to be replicated (for better caching). Right now pushing to a proxy cache project is denied, but an option in the project settings to allow it would be useful for such scenarios.


creker commented Apr 11, 2022

This happens to me too, but with hub.docker.com images. It looks like it is indeed limited to images pulled through library/. It is easily reproduced by pulling nginx and bitnami/nginx: only the latter is properly tagged.


creker commented Apr 12, 2022

Looking at the nginx logs, the only difference I see is that, for some reason, the Docker client makes an additional GET request for nginx:

nginx

"GET /v2/hub.docker.com/library/nginx/manifests/1.21.6 HTTP/1.1" 200 1862 "-" "docker/19.03.13 go/go1.13.15 git-commit/4484c46d9d kernel/4.15.0-151-generic os/linux arch/amd64 UpstreamClient(Docker-Client/20.10.7 \x5C(linux\x5C))" 2.607 2.607 .
"GET /v2/hub.docker.com/library/nginx/manifests/sha256:83d487b625d8c7818044c04f1b48aabccd3f51c3341fc300926846bca0c439e6 HTTP/1.1" 200 1570 "-" "docker/19.03.13 go/go1.13.15 git-commit/4484c46d9d kernel/4.15.0-151-generic os/linux arch/amd64 UpstreamClient(Docker-Client/20.10.7 \x5C(linux\x5C))" 0.015 0.015 .

bitnami/nginx

"GET /v2/hub.docker.com/bitnami/nginx/manifests/1.21.6 HTTP/1.1" 200 2826 "-" "docker/19.03.13 go/go1.13.15 git-commit/4484c46d9d kernel/4.15.0-151-generic os/linux arch/amd64 UpstreamClient(Docker-Client/20.10.7 \x5C(linux\x5C))" 2.061 2.061 .

Looks like this is due to multi-architecture images. Requesting /v2/hub.docker.com/library/nginx/manifests/1.21.6 returns the media type application/vnd.docker.distribution.manifest.list.v2+json, which consists of a list of manifest digests for different architectures. Requesting /v2/hub.docker.com/bitnami/nginx/manifests/1.21.6 returns a manifest itself. Maybe the fact that the Docker client then requests the manifest by its digest throws Harbor off, so it can't recognize which tag is being pulled?

UPDATE: Yep, pulling the multi-arch ubuntu/nginx produces the same behaviour: no tags are recorded. So this has nothing to do with pulling through library/; it just happens that many images there are multi-arch.
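
The manifest-list behaviour can be checked directly against Docker Hub with the standard anonymous token flow (a sketch using curl and jq):

# Anonymous pull token for library/nginx:
TOKEN=$(curl -s "https://auth.docker.io/token?service=registry.docker.io&scope=repository:library/nginx:pull" | jq -r .token)

# Advertising manifest-list support returns the multi-arch index
# (application/vnd.docker.distribution.manifest.list.v2+json), which forces
# the client to make a second, digest-only GET for the per-platform manifest,
# exactly the extra request in the nginx access log above:
curl -s -H "Authorization: Bearer $TOKEN" \
     -H "Accept: application/vnd.docker.distribution.manifest.list.v2+json" \
     "https://registry-1.docker.io/v2/library/nginx/manifests/1.21.6" | jq .mediaType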


creker commented Apr 13, 2022

I found another interesting case. If you pull a multi-arch image that Harbor has never seen before, for some reason it triggers a routine that periodically logs [DEBUG] [/controller/proxy/manifestcache.go:128]: waiting for the manifest ready, repo hub.docker.com/library/alpine, tag:3.12. This happens exactly 20 times, matching maxManifestListWait tries. After that the image tag finally appears in the UI, and I can see this in the logs:

2022-04-13T09:10:19Z [DEBUG] [/controller/proxy/manifestcache.go:146]: The manifest list payload: {
   "schemaVersion": 2,
   "mediaType": "application/vnd.docker.distribution.manifest.list.v2+json",
   "manifests": [
      {
         "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
         "size": 528,
         "digest": "sha256:a777c9c66ba177ccfea23f2a216ff6721e78a662cd17019488c417135299cd89",
         "platform": {
            "architecture": "amd64",
            "os": "linux"
         }
      }
   ]
}
2022-04-13T09:10:19Z [DEBUG] [/controller/proxy/manifestcache.go:98]: Saved key:trimmedmanifestlist:, value:sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944
2022-04-13T09:10:19Z [DEBUG] [/server/middleware/log/log.go:30]: attach request id 0d573fe9-a072-4187-9c1f-33dbf60f44dd to the logger for the request PUT /v2/hub.docker.com/library/alpine/manifests/3.15
2022-04-13T09:10:19Z [DEBUG] [/server/middleware/artifactinfo/artifact_info.go:53]: In artifact info middleware, url: /v2/hub.docker.com/library/alpine/manifests/3.15
2022-04-13T09:10:19Z [DEBUG] [/server/middleware/security/proxy_cache_secret.go:40][requestID="0d573fe9-a072-4187-9c1f-33dbf60f44dd"]: a proxy cache secret security context generated for request PUT /v2/hub.docker.com/library/alpine/manifests/3.15
2022-04-13T09:10:19Z [DEBUG] [/server/middleware/immutable/pushmf.go:51]: failed to list artifact, artifact hub.docker.com/library/alpine:3.15 not found
2022-04-13T09:10:20Z [DEBUG] [/server/middleware/quota/quota.go:137][action="request" middleware="quota" requestID="0d573fe9-a072-4187-9c1f-33dbf60f44dd" url="/v2/hub.docker.com/library/alpine/manifests/3.15"]: not warning resources found
2022-04-13T09:10:20Z [DEBUG] [/pkg/notifier/event/event.go:94]: event PUSH_ARTIFACT published
2022-04-13T09:10:20Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'InternalArtifact' on topic 'PUSH_ARTIFACT': ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator-harbor#proxy-cache-service OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [DEBUG] [/controller/event/handler/p2p/preheat.go:71]: preheat: artifact pushed hub.docker.com/library/alpine:[3.15]@sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944
2022-04-13T09:10:20Z [INFO] [/controller/event/handler/webhook/artifact/artifact.go:75]: []
2022-04-13T09:10:20Z [DEBUG] [/controller/event/handler/webhook/artifact/artifact.go:77]: cannot find policy for PUSH_ARTIFACT event: ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator-harbor#proxy-cache-service OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'ArtifactWebhook' on topic 'PUSH_ARTIFACT': ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator-harbor#proxy-cache-service OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [DEBUG] [/controller/event/handler/replication/event/handler.go:51]: no policy found for the event &{artifact_push 0xc00106fd10}, do nothing
2022-04-13T09:10:20Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'Replication' on topic 'PUSH_ARTIFACT': ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator-harbor#proxy-cache-service OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [DEBUG] [/pkg/notifier/event/event.go:94]: event PULL_ARTIFACT published
2022-04-13T09:10:20Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'InternalArtifact' on topic 'PULL_ARTIFACT': ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator- OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [INFO] [/controller/event/handler/webhook/artifact/artifact.go:75]: []
2022-04-13T09:10:20Z [DEBUG] [/controller/event/handler/webhook/artifact/artifact.go:77]: cannot find policy for PULL_ARTIFACT event: ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator- OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'ArtifactWebhook' on topic 'PULL_ARTIFACT': ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator- OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'AuditLog' on topic 'PULL_ARTIFACT': ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator- OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'AuditLog' on topic 'PUSH_ARTIFACT': ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator-harbor#proxy-cache-service OccurAt-2022-04-13 09:10:20
2022-04-13T09:10:20Z [DEBUG] [/pkg/allowlist/manager.go:75]: No CVE allowlist found for project 0, returning empty list.
2022-04-13T09:10:20Z [DEBUG] [/controller/p2p/preheat/enforcer.go:304]: No preheat policy matched for the artifact hub.docker.com/library/alpine@sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944
2022-04-13T09:10:20Z [INFO] [/pkg/notifier/notifier.go:205]: Handle notification with Handler 'P2PPreheat' on topic 'PUSH_ARTIFACT': ID-62, Repository-hub.docker.com/library/alpine Tags-[3.15] Digest-sha256:b58571851d35c36e89621602f36a0ddb0066b3130aef9d8dcb1cd2c590fcf944 Operator-harbor#proxy-cache-service OccurAt-2022-04-13 09:10:20

If I then remove that image and pull it again, this no longer happens and the image tag never appears.


creker commented Apr 14, 2022

If anyone is interested, I found a workaround that fixes the issue completely: comment out this line and rebuild harbor-core.

manifestlist.MediaTypeManifestList,

Ideally we would add a small check here to ensure that acceptedMediaTypes doesn't contain that media type, but I don't see this argument being used anywhere, so for now that's not required.

func (c *client) PullManifest(repository, reference string, acceptedMediaTypes ...string) (

This tells the remote registry that Harbor doesn't support application/vnd.docker.distribution.manifest.list.v2+json, so the remote registry returns the amd64 manifest directly. The downside is that you can no longer pull containers for other architectures. I don't care about that, so I'm completely satisfied with this workaround and will simply use custom Harbor builds until this is fixed properly.
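
If you try this workaround, one way to verify it (host, project name, and credentials are placeholders) is to pull a multi-arch image through the proxy cache after rebuilding and confirm the tag is recorded:

docker pull harbor.example.com/dockerhub-proxy/library/alpine:3.15

# With the patched harbor-core, the tag should now be listed:
curl -s -u user:pass "https://harbor.example.com/v2/dockerhub-proxy/library/alpine/tags/list"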


github-actions bot commented Jul 5, 2022

This issue is being marked stale due to a period of inactivity. If this issue is still relevant, please comment or remove the stale label. Otherwise, this issue will close in 30 days.

github-actions bot added the Stale label Jul 5, 2022

github-actions bot commented Aug 5, 2022

This issue was closed because it has been stalled for 30 days with no activity. If this issue is still relevant, please re-open a new issue.

github-actions bot closed this as completed Aug 5, 2022

creker commented Sep 7, 2022

Still relevant. @stonezdj could you please reopen this issue?


yogeek commented Mar 30, 2023

@stonezdj is it possible to reopen this issue, please? And could we get information about a potential fix?
