Hide FunctionNode classes from chainer.functions namespace #4421
Conversation
Rebased on the current master. Need to fix newly introduced FunctionNodes.

Fixed the above-mentioned 3 FunctionNodes.

Ready for review, and very open for discussions.

I'll fix the broken commit history.

@kmaehashi Rebased and fixed the commit history. Could you please take a look?
Please check comments.
Could you add this to the upgrade guide?
chainer/functions/__init__.py (outdated)

```python
from chainer.functions.math.trigonometric import tan  # NOQA
from chainer.functions.math.trigonometric import Tan  # NOQA

from chainer.functions.noise.dropout import dropout  # NOQA
from chainer.functions.noise.dropout import Dropout  # NOQA
```
Should it be removed? (Dropout, Gaussian, MaxPooling2D, MaxPoolingND are left)
Those are left on purpose, as they're stateful. For Dropout and Gaussian we sometimes want to reuse their random state/noise. As for the pooling functions, we similarly want to extract the resulting indices.
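As a rough illustration of that statefulness (a pure-Python sketch, not Chainer's actual Dropout implementation), reusing the same random mask across two forward passes looks like this:

```python
import random

def make_dropout_mask(n, ratio, rng):
    # Sample the binary mask once; a stateful Dropout object would
    # store this on the instance so it can be applied again later.
    return [0.0 if rng.random() < ratio else 1.0 / (1.0 - ratio)
            for _ in range(n)]

rng = random.Random(0)
mask = make_dropout_mask(4, 0.5, rng)

x1 = [1.0, 2.0, 3.0, 4.0]
x2 = [4.0, 3.0, 2.0, 1.0]
y1 = [v * m for v, m in zip(x1, mask)]  # first application
y2 = [v * m for v, m in zip(x2, mask)]  # reuses the exact same mask

# The same positions are zeroed out in both outputs.
assert [m == 0.0 for m in mask] == [v == 0.0 for v in y1]
```

If the mask were resampled inside a plain function on every call, there would be no way to guarantee the two passes drop the same units, which is why the class stays public.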
```diff
@@ -188,7 +188,8 @@ def backward(self, indexes, grad_outputs):

 def average_pooling_2d(x, ksize, stride=None, pad=0):
     """Spatial average pooling function.

-    This function acts similarly to :class:`~functions.Convolution2D`, but
+    This function acts similarly to
+    :class:`~functions.connection.convolution_2d.Convolution2D`, but
```
It should point to :func:`~chainer.functions.convolution_2d`.
Same for the other pooling functions.
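For concreteness, the corrected docstring line would presumably read something like this (a sketch; the exact cross-reference target depends on what the `chainer.functions` namespace re-exports after this PR):

```rst
This function acts similarly to :func:`~chainer.functions.convolution_2d`, but
```

Pointing at the public function rather than the implementation class keeps the docs consistent with the namespace cleanup this PR performs.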
Thanks, fixed.
```diff
-    It is because :attr:`~chainer.functions.MaxPooling2D.indexes` is never
-    created and stored in the :attr:`~chainer.functions.MaxPooling2D`
+    It is because
+    :attr:`~chainer.functions.pooling.max_pooling_2d.MaxPooling2D.indexes`
```
Oh, I didn't know this fact. So users should directly work with MaxPooling2D to use F.upsampling_2d...
Yes, unfortunately... By the way, PyTorch's max_pool** takes a boolean flag and returns either the output or a pair of (output, indices) depending on the flag. That'd be another way to do it.
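That flag-based design can be sketched in plain Python (hypothetical 1-D helpers for illustration, not the actual PyTorch or Chainer API):

```python
def max_pool_1d(x, k, return_indices=False):
    # Non-overlapping windows of size k; track the argmax per window.
    out, idx = [], []
    for start in range(0, len(x) - k + 1, k):
        window = x[start:start + k]
        j = max(range(k), key=lambda i: window[i])
        out.append(window[j])
        idx.append(start + j)
    return (out, idx) if return_indices else out

def upsample_1d(pooled, indices, size):
    # Scatter pooled values back to their original positions,
    # the way an unpooling/upsampling step consumes the indices.
    y = [0.0] * size
    for v, i in zip(pooled, indices):
        y[i] = v
    return y

x = [1.0, 3.0, 2.0, 0.0]
pooled, idx = max_pool_1d(x, 2, return_indices=True)  # ([3.0, 2.0], [1, 2])
y = upsample_1d(pooled, idx, len(x))                  # [0.0, 3.0, 2.0, 0.0]
```

With such a flag, the caller never has to hold on to a stateful pooling object just to recover the indices.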
```diff
@@ -246,19 +246,19 @@ def test_double_backward_negative_multi_axis_invert_gpu(self):

     def test_invalid_axis_type(self):
         with self.assertRaises(TypeError):
-            functions.Sum([0])
+            functions.logsumexp(self.x, [0])
```
😅
```diff
@@ -190,11 +190,6 @@ def check_invalid_poolings(self):
         with self.assertRaises(ValueError):
             functions.spatial_pyramid_pooling_2d(self.v, 3, pooling='avg')

-        with testing.assert_warns(DeprecationWarning), \
```
I think it makes sense to leave this test.
Added the test back.
Hm, that's what I thought. In that case I don't think it makes a significant difference whether we remove it in this major release or in the next. Please correct me if I am wrong.

After the v5 release, users of v4 may feel "F.MaxPooling2D suddenly disappeared although I was using a documented API." How about updating the documents here in a separate PR and then backporting it to v4?

Thanks for the suggestion, but this PR is not going to be backported?

I mean:

Sorry, I now see what you mean. I think that's an acceptable solution. I'll create a PR to fix the documentation once #4890 is merged.

jenkins, test this please.

jenkins, test this please.

Jenkins CI test (for commit 67c49aa) failed with status FAILURE.

jenkins, test this please.

Jenkins CI test (for commit aeded94) succeeded without errors!

@hvy Could you resolve the conflict? (Sorry, I had to merge this quickly...)

@kmaehashi Done!

jenkins, test this please.

Thanks! The Travis failure seems not related to this PR.

Jenkins CI test (for commit 4caca0c) failed with status FAILURE.

LGTM! Test failures are not related to this PR.
Hide FunctionNode classes from chainer.functions namespace

This PR hides all FunctionNode implementations from functions/__init__.py, since these are considered implementation details and should not "be visible" to the user. Some exceptions are FunctionNodes with state that is exposed through APIs, such as the indices for the max pooling classes and the random states in dropout and gaussian noise. Tests are modified accordingly to not rely on these classes directly; there are a few exceptions, including the ones above.

- [x] Hide FunctionNode classes from functions/__init__
- [ ] Remove FunctionNode's __init__'s default arguments (maybe better done in a separate PR)

Note: Reviewing commit by commit is probably much easier than looking at the total diff.