large file download fails with OverflowError #30
This uses the google-api-python-client under the hood, and that is where the bug is. However, I am really sorry about this: dumping the entire file into memory without streaming is appalling forethought.
You might want to have a look at #27
I don't have a 32-bit system handy for testing, but could you report whether replacing

with

works (you'll probably need a `from apiclient.http import MediaIoBaseDownload` somewhere)? Inasmuch as it seems to download a 4 GB file of random data, without any serious memory use, on my machine, I posit the dreaded "works on my machine", but that is a 64-bit one. If it does work, I think I can cook up a way to let PyDrive decide to do this for files over a certain size, but then I shall want to open a feature request to solicit responses as to what that limit should be, as well as whether the limit should be the chunk size.
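The fix being proposed is to read the download in fixed-size chunks and write each chunk to disk as it arrives, instead of buffering the whole response in memory. The real code would use apiclient's `MediaIoBaseDownload` and its `next_chunk()` loop; the sketch below shows the same constant-memory pattern using only the standard library (the `stream_to_disk` helper and `CHUNK_SIZE` value are illustrative names, not PyDrive's API):

```python
import io

CHUNK_SIZE = 1024 * 1024  # 1 MiB per read; memory use stays flat regardless of file size


def stream_to_disk(source, dest, chunk_size=CHUNK_SIZE):
    """Copy `source` (any file-like object) to `dest` in fixed-size chunks.

    Returns the total number of bytes copied. Only one chunk is ever
    held in memory at a time, which is what avoids the OverflowError
    on 32-bit systems for files larger than 2 GB.
    """
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:  # empty read signals end of stream
            break
        dest.write(chunk)
        total += len(chunk)
    return total


# Demo with in-memory buffers standing in for the HTTP response and the output file.
src = io.BytesIO(b"x" * (3 * CHUNK_SIZE + 123))
dst = io.BytesIO()
copied = stream_to_disk(src, dst)
print(copied)  # → 3145851
```

With `MediaIoBaseDownload`, the loop shape is the same, except `next_chunk()` drives each read and also reports progress.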
Thanks, this works! (I upped the chunk size by a factor of 10 to save time. Otherwise it was …)
@rupertlevene This should be resolved now. Post here if you are still encountering this issue.
Thank you for the solution.
@smichaud btw, …
On my 32-bit Linux machine, files over 2 GB fail to download with an OverflowError. Memory usage while my test script runs gets very high, which suggests the entire download is being buffered in memory; I think the download should be streamed to disk instead.
To use the script, upload a large file called bigvid.avi to Google Drive and put client_secrets.json in the working directory.
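A minimal reproduction along these lines (the `download_bigvid` helper is hypothetical, and the PyDrive calls assume a valid client_secrets.json in the working directory) might look roughly like:

```python
def download_bigvid(local_path="bigvid.avi"):
    """Hypothetical repro: fetch a large file named bigvid.avi from Google Drive.

    Requires the pydrive package and a client_secrets.json in the
    working directory; on a 32-bit machine and a file over 2 GB this
    was where the OverflowError surfaced.
    """
    # Imported inside the function so the module loads even without pydrive installed.
    from pydrive.auth import GoogleAuth
    from pydrive.drive import GoogleDrive

    gauth = GoogleAuth()
    gauth.LocalWebserverAuth()  # opens a browser for OAuth consent
    drive = GoogleDrive(gauth)

    matches = drive.ListFile({'q': "title='bigvid.avi' and trashed=false"}).GetList()
    matches[0].GetContentFile(local_path)  # this call buffered the whole file in memory
```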