
support max pooling with padding #2543

Merged · 14 commits merged into Theano:master on Mar 3, 2015

Conversation

@yaoli (Contributor) commented Feb 27, 2015

The grad is not right yet... working on it.

@@ -19,7 +19,7 @@ def max_pool2D(*args, **kwargs):
return max_pool_2d(*args, **kwargs)


-def max_pool_2d(input, ds, ignore_border=False, st=None):
+def max_pool_2d(input, ds, ignore_border=False, st=None, padding=(0,0)):
Member

PEP8: (0, 0) with a space after the comma

Member

Also, add documentation for the padding parameter in the docstring.
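To make the new `padding` parameter concrete, here is a rough numpy reference sketch of padded max pooling (function name and details are illustrative, not the PR's actual implementation):

```python
import numpy as np

def max_pool_2d_ref(x, ds, st=None, padding=(0, 0)):
    """Reference max pooling over the last two axes of a 4-D array.

    ds = (pool_h, pool_w) pooling window, st = stride (defaults to ds),
    padding = (pad_h, pad_w): pad_h rows are added on top and on bottom,
    pad_w columns on the left and on the right, before pooling.
    """
    if st is None:
        st = ds
    pad_h, pad_w = padding
    # Pad with -inf so padded cells can never win the max.
    y = np.pad(x.astype(float),
               ((0, 0), (0, 0), (pad_h, pad_h), (pad_w, pad_w)),
               constant_values=-np.inf)
    n, c, h, w = y.shape
    out_h = (h - ds[0]) // st[0] + 1
    out_w = (w - ds[1]) // st[1] + 1
    out = np.empty((n, c, out_h, out_w), dtype=y.dtype)
    for r in range(out_h):
        for q in range(out_w):
            out[:, :, r, q] = y[:, :,
                                r * st[0]:r * st[0] + ds[0],
                                q * st[1]:q * st[1] + ds[1]].max(axis=(2, 3))
    return out
```

With `padding=(0, 0)` this reduces to the old behavior; with non-zero padding the output grows accordingly.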

@lamblin (Member) commented Feb 27, 2015

The gradient of the gradient is also missing, but I'm not sure there is a cuDNN op for that.
In that case, the grad method of DownsampleFactorMaxGrad should return grad_not_implemented for these cases.

gx[n, k, row_ind, col_ind] += gz[n, k, r, c]
import ipdb; ipdb.set_trace()
Member

That should not be here...
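The quoted line (minus the debugger call) is the accumulation step of the backward pass: each output gradient is routed to the input position that won the max. A minimal numpy sketch of that idea, ignoring padding and assuming a non-overlapping window (names hypothetical):

```python
import numpy as np

def max_pool_grad_ref(x, gz, ds):
    """Gradient of non-overlapping max pooling: route each gz value
    back to the argmax position of its pooling window."""
    n, c, h, w = x.shape
    gx = np.zeros_like(x)
    for i in range(n):
        for k in range(c):
            for r in range(gz.shape[2]):
                for q in range(gz.shape[3]):
                    window = x[i, k, r * ds[0]:(r + 1) * ds[0],
                               q * ds[1]:(q + 1) * ds[1]]
                    # Position of the max inside the window.
                    dr, dq = np.unravel_index(window.argmax(), window.shape)
                    # Same accumulation as the gx[...] += gz[...] line above.
                    gx[i, k, r * ds[0] + dr, q * ds[1] + dq] += gz[i, k, r, q]
    return gx
```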

@yaoli (Contributor, Author) commented Mar 2, 2015

Addressed some comments.

The gradient check still fails in "def test_DownsampleFactorMaxPaddingStride_grad(self)" in tests/test_downsample.py

@yaoli (Contributor, Author) commented Mar 2, 2015

grad is still not working....

@@ -320,16 +334,17 @@ def c_code_cache_version(self):
class DownsampleFactorMaxGrad(Op):
__props__ = ('ds', 'ignore_border', 'st')
Member

'padding' should be in there too

@yaoli (Contributor, Author) commented Mar 2, 2015

Just simplified some minor stuff; the grad test is fast now, but still broken.


# pad the image
fill = x.min()-1
y = numpy.zeros((x.shape[0], x.shape[1], img_rows, img_cols)) + fill
Member

You have to give y the same dtype as x, otherwise the comparison later will not work.

@@ -101,6 +100,9 @@ def out_shape(imgshape, ds, ignore_border=False, st=None):
extra row/col of partial downsampling (False) or ignore it (True).
:type ignore_border: bool

:param padding: pad zeros on four borders of the images
:type padding: tuple of two ints
Member

Explain that the first int will be the padding both on top and on bottom, and the second int is the padding to the left and to the right.
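In other words, pad_h rows are added both above and below the image, and pad_w columns on each side, before the pooling window is applied. A sketch of the resulting output-shape arithmetic with ignore_border=True (formula reconstructed for illustration, not copied from the PR):

```python
def pooled_out_shape(img_shape, ds, st=None, padding=(0, 0)):
    """Output (rows, cols) of max pooling with ignore_border=True."""
    if st is None:
        st = ds
    pad_h, pad_w = padding
    rows = img_shape[0] + 2 * pad_h  # pad_h on top AND on bottom
    cols = img_shape[1] + 2 * pad_w  # pad_w on the left AND on the right
    out_r = (rows - ds[0]) // st[0] + 1
    out_c = (cols - ds[1]) // st[1] + 1
    return out_r, out_c
```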

@lamblin (Member) commented Mar 2, 2015

You also have to pass the padding to the Grad Op in the grad function.

@yaoli (Contributor, Author) commented Mar 2, 2015

I've tried to fix as many PEP8 issues as I could.
The grad is good now.


@lamblin (Member) commented Mar 2, 2015

OK, I'll make another pass.

@@ -101,6 +101,10 @@ def out_shape(imgshape, ds, ignore_border=False, st=None):
extra row/col of partial downsampling (False) or ignore it (True).
:type ignore_border: bool

:param padding: (pad_h, pad_w), pad zeros on four borders
of the images, pad_h for padding rows, and pad_w for columns.
Member

That is a bit ambiguous: "padding rows" could mean that you pad each row so the rows are longer, but actually it adds new rows on top and bottom. Can you say that pad_h is the size of the top and bottom borders, and pad_w is the size of the left and right borders instead?


def test_DownsampleFactorMaxPaddingStride_grad(self):
rng = numpy.random.RandomState(utt.fetch_seed())
imval = rng.rand(1, 1, 10, 10) * 10.0
Member

Also try with a non-square image.

@lamblin (Member) commented Mar 3, 2015

So, to summarize, the things that are still needed are:

  • define pad_h and pad_w before using them, both in the forward op and the grad
  • clarification of pad_h and pad_w in the documentation
  • test gradient with non-square image and non-square padding
  • my sub-PR about pep8

@yaoli (Contributor, Author) commented Mar 3, 2015

I addressed the first three. What shall I do with the sub-PR?


@yaoli (Contributor, Author) commented Mar 3, 2015

I've just pulled it.


@lamblin (Member) commented Mar 3, 2015

Thanks, I think it's correct.
I'll merge when Travis passes.

@yaoli (Contributor, Author) commented Mar 3, 2015

Awesome, thank you so much for the help!


lamblin added a commit that referenced this pull request Mar 3, 2015
support max pooling with padding
@lamblin lamblin merged commit 471a171 into Theano:master Mar 3, 2015