Is there some way to download the images for CelebA-HQ? #21
Comments
The authors released the dataset; please refer to this repo.
@nashory It seems that is only the pretrained model, not the source images.
@petergerten Aren't they stored in the .dat files in the …
Hi @nperraud! Thanks for sharing the scripts. I am trying to download the deltas from Google Drive, but it gives me a "download quota exceeded for this file" error for all files, of course. I guess there is a limit to the number of downloads per day for Google Drive. Do you have any idea if I can find the files stored somewhere else?
I generated the files and stored them as convenient zip files. You can download them from Google Drive or generate them using the pre-built Docker image. Visit suvojit-0x55aa/celebA-HQ-dataset-download
@suvojit-0x55aa There should be 200k images in CelebA-HQ, but your share contains only 30k of them.
Sorry for my careless check, you're right.
Another question: how could I use the …
Got it, there is an image list file, https://raw.githubusercontent.com/nperraud/download-celebA-HQ/master/image_list.txt, which relates the CelebA and CelebA-HQ files.
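For anyone wanting to use that mapping programmatically, here is a minimal sketch of parsing it. This assumes image_list.txt is a whitespace-separated file with a header row; the column names in the sample below (`idx`, `orig_idx`, `orig_file`) are assumptions for illustration, not confirmed against the actual file.

```python
def parse_image_list(text):
    """Parse a whitespace-separated table with a header row into
    a list of dicts, one per data row, keyed by the header names."""
    lines = [ln for ln in text.strip().splitlines() if ln.strip()]
    header = lines[0].split()
    return [dict(zip(header, ln.split())) for ln in lines[1:]]

# Hypothetical sample mimicking the expected layout of image_list.txt:
sample = """idx orig_idx orig_file
0 69001 069001.jpg
1 99307 099307.jpg
"""

rows = parse_image_list(sample)
print(rows[0]["orig_file"])  # original CelebA filename for HQ index 0
```

In practice you would read the downloaded image_list.txt from disk instead of the inline sample, and use the resulting dicts to look up which original CelebA image each CelebA-HQ index came from.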
Is it possible to share the labels with the dataset? Thanks!
@suvojit-0x55aa On your GitHub page you said "The size of the final dataset is 89G". But in Google Drive, you have far less than 89 GB. How come?
@aciobanusebi As far as I remember, that's the disk space needed for the raw dataset, the deltas, and the generated dataset combined.
Or, if you can make them available to me, I can host them somewhere for people to download. I'd really like the dataset but don't particularly want to go through the process of generating it myself.