Recently, I was trying to automate Docker builds and was experimenting with different Dagger features.
Trying to assign the whole built image to a variable led to an unexpected error. For example:
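The actual script isn't shown here, but the failing pattern was roughly the following (a minimal sketch with the Dagger Python SDK; the function name and build context are illustrative, and the exact Connection API may differ between SDK versions):

```python
async def build_to_string(context_dir: str) -> str:
    """Build an image and try to read the whole tarball into memory --
    this is the call that hits the engine's file-size limit."""
    import dagger  # deferred import so the sketch can be read without the SDK

    async with dagger.Connection(dagger.Config()) as client:
        src = client.host().directory(context_dir)
        tarball = client.container().build(src).as_tarball()
        # File.contents() pulls the entire file back through the GraphQL API,
        # which is where the "exceeds limit 134217728" error is raised.
        return await tarball.contents()
```

Running this against a build context that produces a tarball larger than 128 MiB yields the traceback below.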
ERROR:root:Unexpected dagger error
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/dagger/client/_core.py", line 139, in execute
result = await self.conn.session.execute(query)
File "/usr/local/lib/python3.10/dist-packages/dagger/client/_session.py", line 129, in execute
return await (await self.get_session()).execute(query)
File "/usr/local/lib/python3.10/dist-packages/gql/client.py", line 1639, in execute
raise TransportQueryError(
gql.transport.exceptions.TransportQueryError: {'message': 'file size 219969536 exceeds limit 134217728', 'path': ['container', 'build', 'asTarball', 'contents']}
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/root/generate.py", line 44, in dbt_image
a = await val.contents()
File "<@beartype(dagger.client.gen.File.contents) at 0x7f6191977b50>", line 10, in contents
File "/usr/local/lib/python3.10/dist-packages/dagger/client/gen.py", line 2713, in contents
return await _ctx.execute(str)
File "/usr/local/lib/python3.10/dist-packages/dagger/client/_core.py", line 164, in execute
raise error from e
dagger.QueryError: file size 219969536 exceeds limit 134217728
After researching a bit, I found that https://github.com/dagger/dagger/blob/main/engine/buildkit/ref.go#L45 contains a constant limit of 128 MiB, but I don't understand why it is set to such a low value. Is it a hard cap imposed by BuildKit, or just an arbitrarily chosen value that could be raised without affecting the underlying file-processing logic? I'd be grateful if anyone could explain this. Thanks in advance.
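For what it's worth, the numbers in the error message line up exactly with that constant (a quick sanity check in Python):

```python
# The limit from engine/buildkit/ref.go and the file size from the error.
LIMIT = 134_217_728
FILE_SIZE = 219_969_536

# The constant is exactly 128 MiB.
assert LIMIT == 128 * 1024 * 1024

# The tarball that triggered the error is roughly 210 MiB.
print(f"tarball: {FILE_SIZE / (1 << 20):.0f} MiB, limit: {LIMIT / (1 << 20):.0f} MiB")
```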
The issue is that this would probably result in an absolutely massive GraphQL request - I think now that #6772 is merged, this could technically work (and maybe we could lift the restriction)? But I'd still discourage passing raw file contents over base64; a better approach would be to Export the file to disk and read it from there.
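To illustrate the suggested workaround: instead of reading the tarball back through the API with contents(), export it to the host and read it from disk. A rough sketch with the Python SDK (function and path names are mine, and the exact Connection API may differ between SDK versions):

```python
async def build_and_export(context_dir: str, out_path: str) -> str:
    """Build an image and write the tarball to the host filesystem
    instead of holding it in memory."""
    import dagger  # deferred import so the sketch can be read without the SDK

    async with dagger.Connection(dagger.Config()) as client:
        src = client.host().directory(context_dir)
        tarball = client.container().build(src).as_tarball()
        # File.export writes to the host rather than returning the bytes
        # through the GraphQL API, so the 128 MiB cap doesn't apply.
        await tarball.export(out_path)
    return out_path
```

The exported tar can then be opened with the standard library (tarfile) or loaded with docker load, without the file ever passing through a GraphQL response.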
@vito @sipsma is there any reason we couldn't lift this limit? Or maybe it's worth keeping, since Contents is less efficient than exporting/reading.
Yeah I think we need some kind of limit here, just as a matter of principle. If you built a 1GB image for example these errors should act like guardrails to get you on the right track for handling that amount of data, rather than trying to hold 1GB in memory and pass it around like any other value (e.g. as a function arg 😱).
(But maybe we could have a more instructive guardrail, the error message is decent but pretty low level.)