
DownsampleFactorMax support strides: issue #2196 #2222

Merged
merged 15 commits into from
Dec 19, 2014

Conversation

SinaHonari
Contributor

No description provided.

continue
for j in xrange(ds1):
col_ind = col_st + j
if col_ind >= img_cols:
Member

You can remove the `if` by using `for j in xrange(...)`, where the argument is the maximum number of iterations. The upper bound is excluded.
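A small self-contained sketch of this suggestion (plain Python 3, with `range` standing in for Python 2's `xrange`; the concrete bound `min(ds1, img_cols - col_st)` is my guess at the reviewer's elided argument, not a line from the patch):

```python
# Assumed example values, reusing the names from the snippet above.
img_cols = 10
ds1 = 4
col_st = 8

# Original pattern: iterate ds1 times, skipping out-of-range indices.
with_guard = []
for j in range(ds1):            # xrange in Python 2
    col_ind = col_st + j
    if col_ind >= img_cols:
        continue
    with_guard.append(col_ind)

# Suggested pattern: cap the iteration count instead, so the index can
# never go out of range. range's upper bound is excluded, as the reviewer
# notes; the min(...) bound here is an assumption.
without_guard = []
for j in range(min(ds1, img_cols - col_st)):
    col_ind = col_st + j
    without_guard.append(col_ind)

print(with_guard == without_guard)  # True: both give [8, 9]
```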

@nouiz
Member

nouiz commented Oct 29, 2014

I get this error:

======================================================================
ERROR: test_DownsampleFactorMax_hessian (theano.tensor.signal.tests.test_downsample.TestDownsampleFactorMax)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/u/bastienf/repos/Theano.Clean/theano/tensor/signal/tests/test_downsample.py", line 116, in test_DownsampleFactorMax_hessian
    fn_hess = function(inputs=[x_vec], outputs=grad_hess)
  File "/u/bastienf/repos/Theano.Clean/theano/compile/function.py", line 265, in function
    profile=profile)
  File "/u/bastienf/repos/Theano.Clean/theano/compile/pfunc.py", line 511, in pfunc
    on_unused_input=on_unused_input)
  File "/u/bastienf/repos/Theano.Clean/theano/compile/function_module.py", line 1540, in orig_function
    defaults)
  File "/u/bastienf/repos/Theano.Clean/theano/compile/function_module.py", line 1403, in create
    _fn, _i, _o = self.linker.make_thunk(input_storage=input_storage_lists)
  File "/u/bastienf/repos/Theano.Clean/theano/gof/link.py", line 490, in make_thunk
    output_storage=output_storage)[:3]
  File "/u/bastienf/repos/Theano.Clean/theano/gof/vm.py", line 891, in make_all
    no_recycling))
  File "/u/bastienf/repos/Theano.Clean/theano/gof/op.py", line 726, in make_thunk
    output_storage=node_output_storage)
  File "/u/bastienf/repos/Theano.Clean/theano/gof/cc.py", line 1023, in make_thunk
    keep_lock=keep_lock)
  File "/u/bastienf/repos/Theano.Clean/theano/gof/cc.py", line 965, in __compile__
    keep_lock=keep_lock)
  File "/u/bastienf/repos/Theano.Clean/theano/gof/cc.py", line 1403, in cthunk_factory
    key=key, fn=self.compile_cmodule_by_step, keep_lock=keep_lock)
  File "/u/bastienf/repos/Theano.Clean/theano/gof/cmodule.py", line 950, in module_from_key
    src_code = next(compile_steps)
  File "/u/bastienf/repos/Theano.Clean/theano/gof/cc.py", line 1287, in compile_cmodule_by_step
    mod = self.build_dynamic_module()
  File "/u/bastienf/repos/Theano.Clean/theano/gof/cc.py", line 1331, in build_dynamic_module
    self.code_gen()
  File "/u/bastienf/repos/Theano.Clean/theano/gof/cc.py", line 701, in code_gen
    raise NotImplementedError("%s cannot produce C code" % op)
  File "/u/bastienf/repos/Theano.Clean/theano/tensor/signal/downsample.py", line 148, in __str__
    self.ds, self.st, self.ignore_border)
TypeError: <unprintable TypeError object>

@nouiz nouiz closed this Oct 29, 2014
@nouiz nouiz reopened this Oct 29, 2014

    def __str__(self):
        return '%s{%s,%s}' % (self.__class__.__name__,
-                             self.ds, self.ignore_border)
+                             self.ds, self.st, self.ignore_border)
Member

You forgot to modify the format string on line 147. This causes a crash.

Member

In fact, I would take this opportunity to make this op use the new `__props__ = (...)` attribute.

Remove `__hash__`, `__eq__` and `__str__` from that op, and add the attribute `__props__ = ('ds', 'st', 'ignore_border')` to the class. This will cause the 3 methods to be generated automatically.
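An illustrative toy re-creation of what a `__props__`-style attribute buys you. This is not Theano's actual implementation; with Theano's real `gof.Op`, setting `__props__` on the class is all that is needed, and the base class supplies the generated methods:

```python
# Toy sketch of the __props__ mechanism (assumed names, not Theano code):
# a base class that derives __eq__, __hash__ and __str__ from the
# attributes listed in __props__.
class PropsOp(object):
    __props__ = ()

    def _props(self):
        return tuple(getattr(self, p) for p in self.__props__)

    def __eq__(self, other):
        return type(self) == type(other) and self._props() == other._props()

    def __hash__(self):
        return hash((type(self), self._props()))

    def __str__(self):
        kv = ', '.join('%s=%r' % (p, getattr(self, p)) for p in self.__props__)
        return '%s{%s}' % (self.__class__.__name__, kv)

class DownsampleFactorMax(PropsOp):
    __props__ = ('ds', 'st', 'ignore_border')

    def __init__(self, ds, st=None, ignore_border=False):
        self.ds = ds
        self.st = ds if st is None else st
        self.ignore_border = ignore_border

a = DownsampleFactorMax((2, 2))
b = DownsampleFactorMax((2, 2))
print(a == b)   # True: equality follows from the listed props
print(str(a))   # DownsampleFactorMax{ds=(2, 2), st=(2, 2), ignore_border=False}
```

The point of the pattern is that equality, hashing and printing can never drift out of sync with the op's parameters, which is exactly the bug the `__str__` crash above exposed.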

@nouiz changed the title from "initial changes for issue #2196" to "DownsampleFactorMax support strides: issue #2196" on Oct 29, 2014
@nouiz
Member

nouiz commented Oct 29, 2014

I wasn't able to reproduce the error about c_code that you wrote me in an email, so I guess it was the `__str__` problem. If not, send me the error message.

thanks

:type ds: list or tuple of two ints

:param st: the stride size
Member

Add that this is the distance between pooling regions.
Also add that it can be None; in that case the value provided as ds will be used, which corresponds to adjacent pooling regions.
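A minimal sketch of the behaviour the docstring should describe (the function name and `ignore_border`-style truncation are my assumptions for illustration, not the patch's code):

```python
# Hypothetical helper: starting indices of pooling regions of width `ds`
# along an axis of length `n`, separated by stride `st`. When st is None
# it defaults to ds, i.e. adjacent, non-overlapping regions.
def pool_region_starts(n, ds, st=None):
    if st is None:
        st = ds
    starts = []
    i = 0
    # Drop partial regions at the border (ignore_border=True behaviour).
    while i + ds <= n:
        starts.append(i)
        i += st
    return starts

print(pool_region_starts(8, 2))        # [0, 2, 4, 6]  adjacent (st defaults to ds)
print(pool_region_starts(8, 3, st=2))  # [0, 2, 4]     overlapping regions
```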

@lamblin
Member

lamblin commented Nov 19, 2014

OK, I did not see your comment about my formula before I reviewed, we should discuss what the right answer should be before trying to implement it.

nr = 0
nc = 0
if isinstance(r, theano.Variable):
    nr = tensor.switch(tensor.ge(r - ds[0], 0), out_r, 0)
Member

This can be simplified to `nr = tensor.maximum(out_r, 0)` if r is a Variable, and `nr = numpy.maximum(out_r, 0)` otherwise.
Same for nc. And you don't have to assign them 0 beforehand.
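A sketch of the simplification on the non-symbolic path. The size formula `out_r = (r - ds) // st + 1` is my assumption of the usual strided-pooling arithmetic, not a line quoted from the patch:

```python
import numpy

# Hypothetical helper: output length along one axis for strided pooling,
# clamped to zero as the review suggests.
def out_shape(r, ds, st):
    out_r = (r - ds) // st + 1
    # numpy.maximum (elementwise max against 0) replaces the
    # switch/ge construct and the separate `nr = 0` initialisation.
    return numpy.maximum(out_r, 0)

print(out_shape(7, 3, 2))  # (7 - 3) // 2 + 1 = 3
print(out_shape(0, 3, 2))  # formula gives -1, clamped to 0
```

On the symbolic path the same one-liner works with `tensor.maximum`, so the two branches stay structurally identical.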

@lamblin
Member

lamblin commented Nov 29, 2014

Otherwise, it looks good, thanks for completing that part.

@lamblin
Member

lamblin commented Dec 4, 2014

You will need to rebase and force-push so that Travis can run.

@@ -183,10 +232,13 @@ def grad(self, inp, grads):
         gz, = grads
         maxout = self(x)
         return [DownsampleFactorMaxGrad(self.ds,
-                                        ignore_border=self.ignore_border)(
+                                        ignore_border=self.ignore_border,
+                                        st=self.st)(
             x, maxout, gz)]
Member

Before it is tested, can you return theano.gradient.grad_not_implemented when self.st != self.ds? That way, we are sure not to give wrong results.


-    def c_code(self, node, name, inp, out, sub):
+    def c_code_tmp(self, node, name, inp, out, sub):
Member

Same as the forward op, re-enable the C code when st == ds.

@SinaHonari
Contributor Author

This is for ticket #2196.

@@ -322,10 +386,14 @@ def infer_shape(self, node, in_shapes):
     def grad(self, inp, grads):
         x, maxout, gz = inp
         ggx, = grads
+        if self.st != self.ds:
+            return [theano.gradient.grad_not_implemented(self, 0, x),
+                    theano.gradient.grad_not_implemented(self, 1, maxout)
Member

There is a missing comma at the end of the last line; this is a syntax error.

lamblin added a commit that referenced this pull request Dec 19, 2014
DownsampleFactorMax support strides: issue #2196
@lamblin lamblin merged commit c2895dc into Theano:master Dec 19, 2014
@lamblin
Member

lamblin commented Dec 19, 2014

Thanks for completing that part!

:param ds: factor by which to downscale (vertical ds, horizontal ds).
(2,2) will halve the image in each dimension.
Member

The method max_pool_2d needs an st parameter. I'll fix it.

Also, the GPU optimization that moves this op to the GPU does not check the new parameter! This introduces a bug. I'll fix it too.
