Make unnecessary computations optional #368
Conversation
Original (20 epochs LeNet on MNIST): 30s

And now I'm seeing extra output in the log. I thought this PR turned OFF confusion matrices?
Perhaps our datasets aren't the same size? My MNIST dataset has 45k training samples and 15k validation samples. So you're saying the patch doesn't provide any speedup? I'll double-check on my end.
Oh, that was careless of me. I misnamed my dataset and didn't notice. Whoops!
I'm definitely picking up the new code because I'm seeing the confusion matrix in my log.
It looks like I need to review my patch!
Closing, as the patch needs to be revisited. May re-open later.
Training accuracy is not displayed in DIGITS (nor is it for Caffe), so it is not necessary to compute the training accuracy and confusion matrix. Disabling those computations speeds training up:

- LeNet (MNIST, 30 epochs): 1m54s -> 1m40s
- AlexNet (CIFAR10, 2 epochs): 5m14s -> 4m38s
- GoogLeNet (reduced CIFAR10, 1 epoch): 2m4s -> 2m2s
Force-pushed from 899da0b to 575255c
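The gist of the change is to guard the accuracy and confusion-matrix work behind an opt-in flag, so the per-batch loop does no extra bookkeeping when the results would never be displayed. Here is a minimal Python sketch of that pattern; the function name, flag name, and NumPy-based metrics are illustrative assumptions, not the PR's actual code:

```python
import numpy as np


def evaluate_batch(logits, labels, compute_metrics=False):
    """Return optional metrics for one batch.

    Hypothetical sketch of the PR's idea: when `compute_metrics` is
    False (the default, matching what DIGITS displays for training),
    skip the accuracy and confusion-matrix computation entirely.
    """
    result = {}
    if compute_metrics:
        preds = logits.argmax(axis=1)          # predicted class per sample
        num_classes = logits.shape[1]
        confusion = np.zeros((num_classes, num_classes), dtype=int)
        for truth, pred in zip(labels, preds):
            confusion[truth, pred] += 1        # rows: truth, cols: prediction
        result["accuracy"] = float((preds == labels).mean())
        result["confusion"] = confusion
    return result
```

With the flag off, the function returns immediately, which is where the reported per-epoch savings would come from; validation passes can still request the full metrics by passing `compute_metrics=True`.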
Re-opening with a new patch. These are the numbers I get:
I've verified similar results on my machine. LGTM!
Original (20 epochs LeNet on MNIST): 187s
Now: 142s
Helps with bug #339