Job disk resource limits are not honoured #3653
This task should fail because the requested URL download is larger than 100KB, but it succeeds.

Comments
Also, the [...]

I did a bit of digging for this. When a [...]. Ideally the [...]. @wdbaruni, what do you think?
I'd recommend exploring the StorageProvider implementation used for fetching URL storage sources: https://github.com/bacalhau-project/bacalhau/blob/main/pkg/storage/url/urldownload/storage.go#L79. It's worth noting that the size of the content may not always be readily available or accurately reported by the server, as the comment at that link suggests. One way we can try to mitigate this is by aborting downloads that exceed the content length specified by the server.
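To make the suggested mitigation concrete, here is a minimal sketch in Go of aborting a download once it exceeds the server-advertised `Content-Length`. This is not Bacalhau's actual code; `fetchWithSizeGuard` and the destination path are illustrative assumptions:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
)

// fetchWithSizeGuard downloads url into dst, but aborts once more bytes
// arrive than the server-advertised Content-Length. Servers can omit or
// misreport Content-Length; a negative value means "unknown", in which
// case the guard is skipped.
func fetchWithSizeGuard(url, dst string) error {
	resp, err := http.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	out, err := os.Create(dst)
	if err != nil {
		return err
	}
	defer out.Close()

	if resp.ContentLength < 0 {
		// Unknown size: nothing to check against.
		_, err = io.Copy(out, resp.Body)
		return err
	}

	// Allow exactly ContentLength bytes; receiving even one extra byte
	// means the server sent more than it advertised, so abort.
	limited := io.LimitReader(resp.Body, resp.ContentLength+1)
	n, err := io.Copy(out, limited)
	if err != nil {
		return err
	}
	if n > resp.ContentLength {
		return fmt.Errorf("download exceeded advertised Content-Length of %d bytes", resp.ContentLength)
	}
	return nil
}

func main() {
	// Hypothetical usage: the URL and path are placeholders.
	if err := fetchWithSizeGuard("https://example.com/file", "/tmp/file"); err != nil {
		fmt.Println("download aborted:", err)
	}
}
```

Note this only defends against a server sending more than it claims; it does nothing when `Content-Length` is absent, which is the harder case raised below.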
I completely agree with you. However, what I was pointing to was that there are no checks overall. Let's say you replace the [...] I may be wrong about this, but I don't see any such checks. If you feel there are such checks, could you please point me to them? Also, for S3 we are successfully fetching the size in all scenarios.
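For context on why the S3 case is easier: a HEAD request against S3 returns the object size without transferring the body. Below is a rough sketch using the AWS SDK for Go v2; the bucket and key are placeholders, this is not Bacalhau's actual S3 provider, and the `ContentLength` field type (`*int64` here) has varied slightly across SDK versions:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

// objectSize returns the size of an S3 object without downloading it.
// Unlike an arbitrary HTTP server, S3 reliably reports the object size
// for HeadObject, which is why size checks for S3 sources work in all
// scenarios.
func objectSize(ctx context.Context, client *s3.Client, bucket, key string) (int64, error) {
	out, err := client.HeadObject(ctx, &s3.HeadObjectInput{
		Bucket: aws.String(bucket),
		Key:    aws.String(key),
	})
	if err != nil {
		return 0, err
	}
	// aws.ToInt64 safely dereferences the *int64 field.
	return aws.ToInt64(out.ContentLength), nil
}

func main() {
	ctx := context.Background()
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}
	size, err := objectSize(ctx, s3.NewFromConfig(cfg), "my-bucket", "my-key")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("object is %d bytes\n", size)
}
```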
Thank you for clarifying offline. As we discussed, the checks are happening via the [...] interface. Will update once I have an approach for it.
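To illustrate where such an interface-level check could sit, here is a simplified sketch. `InputSource`, `StorageProvider`, and `GetVolumeSize` are stand-ins assumed for illustration, not Bacalhau's actual definitions:

```go
package main

import (
	"context"
	"fmt"
)

// InputSource and StorageProvider are trimmed-down stand-ins for the real
// types in pkg/storage; the names and method shape are assumptions.
type InputSource struct {
	URL string
}

type StorageProvider interface {
	GetVolumeSize(ctx context.Context, source InputSource) (uint64, error)
}

// checkDiskLimit shows where a pre-execution check could live: sum the
// reported sizes of all inputs and reject the job when they exceed its
// disk resource limit. A provider that cannot determine a size (e.g. a
// URL whose server omits Content-Length) might report 0, in which case
// this check alone cannot enforce the limit.
func checkDiskLimit(ctx context.Context, p StorageProvider, inputs []InputSource, diskLimit uint64) error {
	var total uint64
	for _, in := range inputs {
		size, err := p.GetVolumeSize(ctx, in)
		if err != nil {
			return err
		}
		total += size
	}
	if total > diskLimit {
		return fmt.Errorf("inputs require %d bytes but the job disk limit is %d bytes", total, diskLimit)
	}
	return nil
}

// fixedSizeProvider is a stub used only to exercise checkDiskLimit.
type fixedSizeProvider uint64

func (f fixedSizeProvider) GetVolumeSize(_ context.Context, _ InputSource) (uint64, error) {
	return uint64(f), nil
}

func main() {
	inputs := []InputSource{{URL: "https://example.com/big-file"}}
	// A 150000-byte input against a 100KB (100000-byte) limit: rejected.
	err := checkDiskLimit(context.Background(), fixedSizeProvider(150000), inputs, 100000)
	fmt.Println(err)
}
```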