Unexpected output? #5
`time nvidia-docker run -it --rm infogan-torch th -e 'require "cudnn"'` [EDIT: For reference, I see
So I think there's probably something suspect going on in your particular setup. What graphics card are you using? Which version of CUDA is installed?
Your `docker run` command (from above) is just running. I'm just on a GTX 1060. For reference:
I think your best bet would be to try using a CUDA 8.0 base image for the Docker container, since it seems like the new Pascal cards do not play well with CUDA 7.5. Unfortunately this does mean rebuilding the Docker image for a third time. Edit the first line of the Dockerfile like so:

```diff
- FROM nvidia/cuda:7.5-cudnn5-devel
+ FROM nvidia/cuda:8.0-cudnn5-devel
```

Then rebuild the image:

```shell
nvidia-docker build -t infogan-torch .
```
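For context, a minimal sketch of what the top of the edited Dockerfile would look like (the trailing comment stands in for the rest of the file; the actual contents below the `FROM` line are unchanged and not reproduced here):

```dockerfile
# Base image switched from CUDA 7.5 to 8.0 so Pascal-generation cards work
FROM nvidia/cuda:8.0-cudnn5-devel

# ... remainder of the original Dockerfile (Torch install, etc.) unchanged
```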
Okay, all good now with
Fantastic! I've added a note to the readme (7b51a16) so that future users don't run into this problem.
A couple of strange things:

1. It pauses/hangs for a very long time before starting training.
2. Output images (continuous variable) are 280x140 (I expected square 280x280), and the reproduction is certainly not approaching parity with the training data (this is image 50).

Maybe 50 epochs (the default) is too few? But still, the shape seems odd... It seems like there's still a bug somewhere.

[EDIT: Sorry, I just noticed that the images on your git page are actually 5x10 digits, so the output is probably correct. I guess it's just a question of whether 50 is too few epochs.]
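The 280x140 size does check out if the output is a 5x10 grid of 28x28 MNIST digits — a quick sanity check of the arithmetic (assuming the standard 28x28 MNIST digit size):

```shell
# 10 columns and 5 rows of 28x28-pixel MNIST digits
width=$((10 * 28))    # 280
height=$((5 * 28))    # 140
echo "${width}x${height}"   # prints 280x140
```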