
Use cached download to reduce google drive load #31

Closed
wants to merge 1 commit into from

Conversation

@aleneum commented Jun 23, 2020

run.py and PULSE.py attempt to download files from Google Drive every time they are executed, which quickly leads to OSError: Google Drive quota exceeded. Checking the cache directory first for synthesis.pt, mapping.pt, or shape_predictor_68_face_landmarks.dat reduces server queries and also shortens initialization time.

A small remark: while the automatic download failed every time, I was able to copy the URL into my browser and download all the files manually as a workaround.
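The check described above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual diff; the `cache` directory name and the `fetch` helper are assumptions for the example.

```python
from pathlib import Path

def fetch(name, cache_dir, download_fn):
    """Return the cached file path, downloading only on a cache miss.

    name        -- file name, e.g. "synthesis.pt" (from the PR description)
    cache_dir   -- local cache directory (hypothetical layout)
    download_fn -- callable that fetches the file from Google Drive
    """
    path = Path(cache_dir) / name
    if path.exists():
        # Cache hit: skip the Google Drive query entirely.
        return path
    path.parent.mkdir(parents=True, exist_ok=True)
    # Cache miss: query Google Drive exactly once for this file.
    download_fn(path)
    return path
```

With this shape, repeated runs of run.py or PULSE.py would only hit Google Drive on the first execution, which is what reduces both the quota pressure and the startup time.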

@aleneum commented Jun 23, 2020

I had a closer look at drive.open_url and realized that caching is already in place, but it prefixes the file name with a hash of the URL. So I guess the repeated download attempts only occur when a file cannot be downloaded automatically. I am closing this, since caching is already solved in a more flexible manner, and manually downloaded files would be detected if they were prefixed with the md5 hash of the URL.
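The naming scheme described above can be sketched as follows, so a manual download can be placed where drive.open_url would find it. The exact separator and cache layout are assumptions; only the "md5 hash of the URL as a file-name prefix" part comes from the comment.

```python
import hashlib
from pathlib import Path

def cached_path(url, filename, cache_dir="cache"):
    """Guess the cache path drive.open_url-style caching would use.

    The cached copy keeps the original file name but is prefixed with
    the md5 hash of the download URL (separator is an assumption here).
    """
    prefix = hashlib.md5(url.encode("utf-8")).hexdigest()
    return Path(cache_dir) / f"{prefix}_{filename}"
```

A file downloaded manually in the browser would then need to be saved under this hashed name for the existing cache lookup to pick it up instead of retrying Google Drive.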

@aleneum closed this Jun 23, 2020