
Fix batch accumulation #1262

Merged: 2 commits merged into NVIDIA:master from fix-batch-accumulation on Nov 16, 2016

Conversation

lukeyeager (Member)

Fix #1240

kostinalexey commented Nov 14, 2016

You need to modify the following lines to fix the test and display parameters for Caffe:

Lines 542-544:

```diff
-            solver.test_iter.append(int(math.ceil(float(self.dataset.get_entry_count(constants.VAL_DB)) / val_data_layer.data_param.batch_size)))
+            solver.test_iter.append(int(math.ceil(float(self.dataset.get_entry_count(constants.VAL_DB)) / (val_data_layer.data_param.batch_size * self.batch_accumulation))))
```

Line 603:

```diff
-                int(math.ceil(5000.0 / train_data_layer.data_param.batch_size))
+                int(math.ceil(5000.0 / (train_data_layer.data_param.batch_size * self.batch_accumulation)))
```

Lines 772-773:

```diff
-            solver.test_iter.append(int(math.ceil(float(self.dataset.get_entry_count(constants.VAL_DB)) / val_image_data_layer.data_param.batch_size)))
+            solver.test_iter.append(int(math.ceil(float(self.dataset.get_entry_count(constants.VAL_DB)) / (val_image_data_layer.data_param.batch_size * self.batch_accumulation))))
```

Line 828:

```diff
-                int(math.ceil(5000.0 / train_image_data_layer.data_param.batch_size))
+                int(math.ceil(5000.0 / (train_image_data_layer.data_param.batch_size * self.batch_accumulation)))
```
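For context on the arithmetic behind these suggestions, here is a minimal sketch with assumed numbers (not DIGITS code): Caffe's `iter_size`, which DIGITS exposes as `batch_accumulation`, makes each solver iteration accumulate gradients over several forward passes, so one iteration consumes `batch_size * batch_accumulation` images, and any interval meant to cover roughly 5000 images should divide by that product.

```python
import math

def iterations_for_images(num_images, batch_size, batch_accumulation):
    """Solver iterations needed to see `num_images` images when each
    iteration accumulates gradients over `batch_accumulation` batches
    of `batch_size` images (values here are assumptions for illustration)."""
    effective_batch = batch_size * batch_accumulation
    return int(math.ceil(float(num_images) / effective_batch))

# Display every ~5000 images: with batch_size=64 and no accumulation this is
# 79 iterations; with batch_accumulation=4 it should be 20, not 79.
print(iterations_for_images(5000.0, 64, 1))  # 79
print(iterations_for_images(5000.0, 64, 4))  # 20
```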

@lukeyeager (Member, Author)

@kostinalexey as far as I can tell, batch accumulation has no effect during the TEST phase. Do you have proof to the contrary?

You're right about the display bug though - thanks! Let me fix that...

@kostinalexey

You are right, the test phase is not affected by batch_accumulation. My mistake.
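To illustrate why the TEST phase is unaffected, here is a hedged sketch with assumed numbers (not DIGITS code): Caffe only applies `iter_size` when accumulating TRAIN-phase gradients before a weight update, while each TEST pass simply runs `test_iter` forward passes over the validation batches, so `test_iter` should be computed from the raw batch size.

```python
import math

num_val = 10000          # assumed validation-set size
val_batch_size = 25      # assumed TEST-phase batch size
batch_accumulation = 4   # Caffe iter_size; only groups TRAIN batches

# One forward pass per test iteration, so accumulation plays no role here.
test_iter = int(math.ceil(float(num_val) / val_batch_size))
print(test_iter)  # 400 iterations to cover the validation set once
```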

@lukeyeager (Member, Author)

No problem at all - thanks for double-checking my work!

@lukeyeager lukeyeager merged commit 80c08ed into NVIDIA:master Nov 16, 2016
@lukeyeager lukeyeager deleted the fix-batch-accumulation branch November 16, 2016 21:37
ethantang95 pushed a commit to ethantang95/DIGITS that referenced this pull request Jul 10, 2017