Correct 3x3 convolutions to 5x5 convolutions #2555

Open · wants to merge 1 commit into base: master

@JoshVarty

JoshVarty commented Oct 19, 2017

Fixes #2545

This PR corrects the 3x3 convolutions to 5x5 convolutions in Inception v1.

I have run the tests in inception_v1_test.py and ensured that they pass following these changes.

@tensorflow-jenkins

Collaborator

tensorflow-jenkins commented Oct 19, 2017

Can one of the admins verify this patch?

@googlebot googlebot added the cla: yes label Oct 19, 2017

@@ -225,7 +225,7 @@ def inception_v1_base(inputs,
        branch_1 = slim.conv2d(branch_1, 320, [3, 3], scope='Conv2d_0b_3x3')
      with tf.variable_scope('Branch_2'):
        branch_2 = slim.conv2d(net, 32, [1, 1], scope='Conv2d_0a_1x1')
-       branch_2 = slim.conv2d(branch_2, 128, [3, 3], scope='Conv2d_0a_3x3')
+       branch_2 = slim.conv2d(branch_2, 128, [5, 5], scope='Conv2d_0b_5x5')

@JoshVarty

JoshVarty Oct 19, 2017

There was also a scope name change here. (0a -> 0b)
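The kernel-size change also changes this layer's parameter count, which is why the expected total in the test below must be updated. A minimal pure-Python sketch using the channel sizes from the hunk above (`conv2d_params` is a hypothetical helper for illustration, not part of slim):

```python
def conv2d_params(in_ch, out_ch, kh, kw):
    """Weights plus one bias per output channel for a 2D convolution.
    (Hypothetical helper, not a slim API.)"""
    return in_ch * kh * kw * out_ch + out_ch

# Branch_2 of the block shown above: 32 input channels -> 128 output channels.
old = conv2d_params(32, 128, 3, 3)  # 3x3 kernel
new = conv2d_params(32, 128, 5, 5)  # 5x5 kernel
print(old, new, new - old)  # → 36992 102528 65536
```

Each corrected branch contributes a similar delta, which together account for the change in the total parameter count asserted in `inception_v1_test.py`.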

@@ -114,7 +114,7 @@ def testModelHasExpectedNumberOfParameters(self):
        inception.inception_v1_base(inputs)
        total_params, _ = slim.model_analyzer.analyze_vars(
            slim.get_model_variables())
-       self.assertAlmostEqual(5607184, total_params)
+       self.assertAlmostEqual(5988112, total_params)

@JoshVarty

JoshVarty Oct 19, 2017

I've replaced the old value with the value returned by slim.model_analyzer.analyze_vars().

@JoshVarty

JoshVarty commented Oct 19, 2017

This PR doesn't address the missing LocalResponseNorm operations or the redundant 1x1 Conv layer at the head of the network. If you think these are worth addressing, let me know and I can add them too.
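For context on the LocalResponseNorm operations mentioned above: LRN normalizes each activation by a sum of squared activations in a window of neighboring channels. A minimal pure-Python sketch of the operation over a 1-D list of channel activations (the default hyperparameters are assumptions following the AlexNet formulation, not values taken from this PR):

```python
def lrn(acts, depth_radius=2, bias=1.0, alpha=1e-4, beta=0.75):
    """Local response normalization across the channel axis.
    Sketch of the AlexNet-style formulation, not TensorFlow's kernel:
    out[i] = acts[i] / (bias + alpha * sum of acts[j]^2 over the window)^beta
    """
    n = len(acts)
    out = []
    for i in range(n):
        lo = max(0, i - depth_radius)
        hi = min(n - 1, i + depth_radius)
        sq_sum = sum(acts[j] ** 2 for j in range(lo, hi + 1))
        out.append(acts[i] / (bias + alpha * sq_sum) ** beta)
    return out

print(lrn([1.0, 2.0, 3.0]))
```

In TensorFlow the same operation is available as `tf.nn.local_response_normalization`.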

@nealwu

Member

nealwu commented Oct 19, 2017

Nice, let me loop in a few people for review.

@nealwu nealwu requested review from derekjchow and sguada Oct 19, 2017

@JoshVarty

JoshVarty commented Oct 30, 2017

Let me know if you guys need anything more for this PR. I could try generating the new checkpoint if it looks good to you.

@sguada

Member

sguada commented Nov 2, 2017

The problem with this change is that it would break all the previous checkpoints.

@JoshVarty

JoshVarty commented Nov 2, 2017

Yes, it would break them, but I believe they're incorrect as they stand. Would it be a problem to regenerate them and release the new checkpoints?
