
Use numbers for input check in roi_{average|max}_{pooling|align}_2d.py #5636

Merged
merged 17 commits into chainer:master on Aug 11, 2019

Conversation

knorth55
Contributor

@knorth55 knorth55 commented Nov 7, 2018

Merge after #5634 and #5635.

Use numpy.issubdtype instead of isinstance to check function args.
numpy.issubdtype supports int and float as well as numpy.integer and numpy.floating.
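A minimal sketch of the kind of check the PR description proposes (the helper name `is_integer_arg` is hypothetical, not the PR's actual code): numpy.issubdtype accepts a type and matches both Python built-ins and NumPy scalar types.

```python
import numpy as np


def is_integer_arg(value):
    """Return True if ``value`` is an integer-like scalar.

    Sketch only: np.issubdtype(type(x), np.integer) is True for
    Python int and for numpy integer scalars such as np.int32.
    """
    return np.issubdtype(type(value), np.integer)


print(is_integer_arg(1))           # True
print(is_integer_arg(np.int32(1))) # True
print(is_integer_arg(1.0))         # False
```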

@beam2d beam2d added the st:blocked-by-another-pr State indicating that another ticket is preventing this ticket from being closed/merged. label Nov 21, 2018
@stale

stale bot commented Feb 19, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale Not updated for a longer period of time. label Feb 19, 2019
@knorth55
Contributor Author

waiting for #5635, not stale

@stale stale bot removed the stale Not updated for a longer period of time. label Feb 19, 2019
@mitmul
Member

mitmul commented Feb 21, 2019

@knorth55 Could you fix the conflicts?

@knorth55
Contributor Author

I merged the master branch and Travis passed.

@mitmul
Member

mitmul commented Feb 28, 2019

Jenkins, test this please

@chainer-ci
Member

Jenkins CI test (for commit 3cd5e48, target branch master) failed with status FAILURE.
(For contributors, please wait until the reviewer confirms the details of the error.)

@chainer-ci
Member

Jenkins CI test (for commit ae7d9ca, target branch master) succeeded!

@knorth55
Contributor Author

knorth55 commented Apr 9, 2019

@mitmul Can you review this again?

@knorth55
Contributor Author

@Hakuyume Kindly ping.

@Hakuyume
Member

Hakuyume commented May 7, 2019

numbers.Integral can do a similar thing, and I think it is simpler.

import numbers
import numpy as np

isinstance(int(0), numbers.Integral)  # True
isinstance(np.int32(0), numbers.Integral)  # True
isinstance(float(0), numbers.Integral)  # False

@knorth55 knorth55 changed the title use numpy.issubdtype in roi_{average|max}_{pooling|align}_2d.py use numbers for input check in roi_{average|max}_{pooling|align}_2d.py May 7, 2019
@knorth55
Contributor Author

knorth55 commented May 7, 2019

@Hakuyume I updated it to use numbers.

@chainer-ci
Member

Jenkins CI test (for commit 095a94d, target branch master) failed with status FAILURE.

@mitmul mitmul added st:test-and-merge State indicating that pull request is approved by a reviewer and can be merged after CI passes. and removed st:blocked-by-another-pr State indicating that another ticket is preventing this ticket from being closed/merged. labels Jul 16, 2019
@mitmul
Member

mitmul commented Jul 16, 2019

Jenkins, test this please

@chainer-ci
Member

Jenkins CI test (for commit 095a94d, target branch master) succeeded!

@mitmul
Member

mitmul commented Jul 17, 2019

@knorth55 I'm very sorry for the late reaction to your PRs... Could you fix the conflicts? Then I'll re-run the tests and merge this.

@knorth55
Contributor Author

@mitmul No problem. I resolved the conflict.

@chainer-ci
Member

@mitmul This pull-request is marked as st:test-and-merge, but there were no activities for the last 3 days. Could you check?

@mitmul
Member

mitmul commented Jul 21, 2019

Jenkins, test this please

@chainer-ci
Member

Jenkins CI test (for commit ccc6293, target branch master) failed with status FAILURE.

@knorth55
Contributor Author

knorth55 commented Jul 22, 2019

The test exceeds the timeout limit in Python 2.7.
Should I make the test lighter?

@chainer-ci
Member

@mitmul This pull-request is marked as st:test-and-merge, but there were no activities for the last 3 days. Could you check?


@toslunar
Member

toslunar commented Aug 5, 2019

I'll rerun the tests because the last two commits LGTM.

Jenkins, test this please.

@chainer-ci
Member

Jenkins CI test (for commit 89e90b4, target branch master) failed with status FAILURE.

@knorth55
Contributor Author

knorth55 commented Aug 6, 2019

The Jenkins test failed, but the failure is not related to this PR.

17:35:21 =================================== FAILURES ===================================
17:35:21 __________ test_TrilTriu_param_4_{k=1, shape=(4, 3)}[float16-cuda:0] ___________
17:35:21 
17:35:21 device = cuda:0, args = (), kwargs = {'float_dtype': 'float16'}
17:35:21 backend_config = <BackendConfig use_chainerx=True chainerx_device='cuda:0' use_cuda=False cuda_device=None use_cudnn='never' cudnn_deterministic=False autotune=False cudnn_fast_batch_normalization=False use_ideep='never'>
17:35:21 obj = <chainer.testing._bundle.TestTrilTriu_param_4_{k=1, shape=(4, 3)} object at 0x7f53f8b31908>
17:35:21 
17:35:21     @pytest.mark.parametrize_device(devices)
17:35:21     def entry_func(device, *args, **kwargs):
17:35:21         backend_config = _make_backend_config(device.name)
17:35:21     
17:35:21         # Forward test
17:35:21         obj = cls()
17:35:21         try:
17:35:21             obj.setup(*args, **kwargs)
17:35:21             obj.run_test_forward(backend_config)
17:35:21         finally:
17:35:21             obj.teardown()
17:35:21     
17:35:21         # If this is a NumpyOpTest instance, skip backward/double-backward
17:35:21         # tests if the forward test succeeds with acceptable errors.
17:35:21         if isinstance(obj, NumpyOpTest):
17:35:21             if obj.is_forward_successful_with_accept_errors:
17:35:21                 return  # success with expected errors
17:35:21     
17:35:21         # Backward test
17:35:21         obj = cls()
17:35:21         try:
17:35:21             obj.setup(*args, **kwargs)
17:35:21 >           obj.run_test_backward(backend_config)
17:35:21 
17:35:21 args       = ()
17:35:21 backend_config = <BackendConfig use_chainerx=True chainerx_device='cuda:0' use_cuda=False cuda_device=None use_cudnn='never' cudnn_deterministic=False autotune=False cudnn_fast_batch_normalization=False use_ideep='never'>
17:35:21 cls        = <class 'chainer.testing._bundle.TestTrilTriu_param_4_{k=1, shape=(4, 3)}'>
17:35:21 device     = cuda:0
17:35:21 kwargs     = {'float_dtype': 'float16'}
17:35:21 obj        = <chainer.testing._bundle.TestTrilTriu_param_4_{k=1, shape=(4, 3)} object at 0x7f53f8b31908>
17:35:21 
17:35:21 /repo/tests/chainerx_tests/op_utils.py:366: 
17:35:21 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
17:35:21 /repo/tests/chainerx_tests/op_utils.py:95: in run_test_backward
17:35:21     super(OpTest, self).run_test_backward(backend_config)
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:255: in run_test_backward
17:35:21     do_check()
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:244: in do_check
17:35:21     **self.check_backward_options)
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/contextlib.py:99: in __exit__
17:35:21     self.gen.throw(type, value, traceback)
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:36: in raise_if_fail
17:35:21     cls.fail(message, e)
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:27: in fail
17:35:21     utils._raise_from(cls, message, exc)
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/utils/__init__.py:106: in _raise_from
17:35:21     six.raise_from(new_exc.with_traceback(orig_exc.__traceback__), None)
17:35:21 <string>:3: in raise_from
17:35:21     ???
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:34: in raise_if_fail
17:35:21     yield
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:244: in do_check
17:35:21     **self.check_backward_options)
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:900: in check_backward
17:35:21     detect_nondifferentiable, is_immutable_params=False
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:464: in run
17:35:21     self._run()
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:507: in _run
17:35:21     self._compare_gradients(gx_numeric, gx_backward, directions)
17:35:21 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
17:35:21 
17:35:21 self = <chainer.gradient_check._CheckBackward object at 0x7f53f8b31748>
17:35:21 gx_numeric = array(-0.00447911, shape=(), dtype=float64, device='cuda:0')
17:35:21 gx_backward = array(-0.00457345, shape=(), dtype=float64, device='cuda:0')
17:35:21 directions = [array([[-0.19084183, -0.13525285, 0.24186938],
17:35:21        [0.38141085, -0.54487473, -0.3863876 ],
17:35:21        [-0.17815417, 0.04524124, -0.48183729],
17:35:21        [0.11480988, -0.03716954, 0.12087778]], shape=(4, 3), dtype=float64, device='cuda:0')]
17:35:21 
17:35:21     def _compare_gradients(self, gx_numeric, gx_backward, directions):
17:35:21         atol = self.atol
17:35:21         rtol = self.rtol
17:35:21         # Compare the gradients
17:35:21         try:
17:35:21             testing.assert_allclose(
17:35:21                 gx_numeric, gx_backward, atol=atol, rtol=rtol)
17:35:21         except AssertionError as e:
17:35:21             eps = self.eps
17:35:21             xs = self.xs
17:35:21             gys = self.gys
17:35:21             f = six.StringIO()
17:35:21             f.write('check_backward failed (eps={} atol={} rtol={})\n'.format(
17:35:21                 eps, atol, rtol))
17:35:21             for i, x in enumerate(xs):
17:35:21                 f.write('inputs[{}]:\n'.format(i))
17:35:21                 f.write('{}\n'.format(x))
17:35:21             for i, gy in enumerate(gys):
17:35:21                 f.write('grad_outputs[{}]:\n'.format(i))
17:35:21                 f.write('{}\n'.format(gy))
17:35:21             for i, d in enumerate(directions):
17:35:21                 f.write('directions[{}]:\n'.format(i))
17:35:21                 f.write('{}\n'.format(d))
17:35:21             f.write('gradients (numeric):  {}\n'.format(gx_numeric))
17:35:21             f.write('gradients (backward): {}\n'.format(gx_backward))
17:35:21             f.write('\n')
17:35:21             f.write('x: numeric gradient, y: backward gradient')
17:35:21             f.write(str(e))
17:35:21 >           raise AssertionError(f.getvalue())
17:35:21 E           chainer.testing.function_link.FunctionTestError: backward is not implemented correctly
17:35:21 E           
17:35:21 E           (caused by)
17:35:21 E           AssertionError: check_backward failed (eps=0.001 atol=1e-05 rtol=0.005)
17:35:21 E           inputs[0]:
17:35:21 E           array([[0.9765625 , 0.35913086, 0.16003418],
17:35:21 E                  [0.87304688, 0.41284180, 0.34838867],
17:35:21 E                  [0.61523438, 0.55126953, 0.71191406],
17:35:21 E                  [0.40600586, 0.0482788 , 0.73046875]], shape=(4, 3), dtype=float16, device='cuda:0')
17:35:21 E           grad_outputs[0]:
17:35:21 E           array([[-0.29565430, 0.82910156, -0.86621094],
17:35:21 E                  [0.45214844, -0.57568359, 0.51074219],
17:35:21 E                  [-0.91455078, -0.56347656, 0.04678345],
17:35:21 E                  [-0.18273926, 0.54541016, -0.71386719]], shape=(4, 3), dtype=float16, device='cuda:0')
17:35:21 E           grad_outputs[1]:
17:35:21 E           array([[-0.82617188, -0.45629883, -0.77978516],
17:35:21 E                  [0.63574219, -0.13049316, 0.2536621 ],
17:35:21 E                  [-0.75195312, -0.43554688, -0.35913086],
17:35:21 E                  [0.80859375, 0.69677734, 0.96728516]], shape=(4, 3), dtype=float16, device='cuda:0')
17:35:21 E           directions[0]:
17:35:21 E           array([[-0.19084183, -0.13525285, 0.24186938],
17:35:21 E                  [0.38141085, -0.54487473, -0.3863876 ],
17:35:21 E                  [-0.17815417, 0.04524124, -0.48183729],
17:35:21 E                  [0.11480988, -0.03716954, 0.12087778]], shape=(4, 3), dtype=float64, device='cuda:0')
17:35:21 E           gradients (numeric):  array(-0.00447911, shape=(), dtype=float64, device='cuda:0')
17:35:21 E           gradients (backward): array(-0.00457345, shape=(), dtype=float64, device='cuda:0')
17:35:21 E           
17:35:21 E           x: numeric gradient, y: backward gradient
17:35:21 E           Not equal to tolerance rtol=0.005, atol=1e-05
17:35:21 E           
17:35:21 E           Mismatch: 100%
17:35:21 E           Max absolute difference: 9.43329122e-05
17:35:21 E           Max relative difference: 0.02062621
17:35:21 E            x: array(-0.004479)
17:35:21 E            y: array(-0.004573)
17:35:21 E           
17:35:21 E           assert_allclose failed: 
17:35:21 E             shape: () ()
17:35:21 E             dtype: float64 float64
17:35:21 E             i: (0,)
17:35:21 E             x[i]: -0.004479114874565315
17:35:21 E             y[i]: -0.0045734477868034085
17:35:21 E             relative error[i]: 0.020626213884039273
17:35:21 E             absolute error[i]: 9.433291223809315e-05
17:35:21 E             relative tolerance * |y[i]|: 2.286723893401704e-05
17:35:21 E             absolute tolerance: 1e-05
17:35:21 E             total tolerance: 3.2867238934017044e-05
17:35:21 E           x: -0.00447911
17:35:21 E           y: -0.00457345
17:35:21 
17:35:21 atol       = 1e-05
17:35:21 d          = array([[-0.19084183, -0.13525285, 0.24186938],
17:35:21        [0.38141085, -0.54487473, -0.3863876 ],
17:35:21        [-0.17815417, 0.04524124, -0.48183729],
17:35:21        [0.11480988, -0.03716954, 0.12087778]], shape=(4, 3), dtype=float64, device='cuda:0')
17:35:21 directions = [array([[-0.19084183, -0.13525285, 0.24186938],
17:35:21        [0.38141085, -0.54487473, -0.3863876 ],
17:35:21        [-0.17815417, 0.04524124, -0.48183729],
17:35:21        [0.11480988, -0.03716954, 0.12087778]], shape=(4, 3), dtype=float64, device='cuda:0')]
17:35:21 eps        = 0.001
17:35:21 f          = <_io.StringIO object at 0x7f53f8b3c168>
17:35:21 gx_backward = array(-0.00457345, shape=(), dtype=float64, device='cuda:0')
17:35:21 gx_numeric = array(-0.00447911, shape=(), dtype=float64, device='cuda:0')
17:35:21 gy         = array([[-0.82617188, -0.45629883, -0.77978516],
17:35:21        [0.63574219, -0.13049316, 0.2536621 ],
17:35:21        [-0.75195312, -0.43554688, -0.35913086],
17:35:21        [0.80859375, 0.69677734, 0.96728516]], shape=(4, 3), dtype=float16, device='cuda:0')
17:35:21 gys        = (array([[-0.29565430, 0.82910156, -0.86621094],
17:35:21        [0.45214844, -0.57568359, 0.51074219],
17:35:21        [-0.91455078, -0....-0.43554688, -0.35913086],
17:35:21        [0.80859375, 0.69677734, 0.96728516]], shape=(4, 3), dtype=float16, device='cuda:0'))
17:35:21 i          = 0
17:35:21 rtol       = 0.005
17:35:21 self       = <chainer.gradient_check._CheckBackward object at 0x7f53f8b31748>
17:35:21 x          = array([[0.9765625 , 0.35913086, 0.16003418],
17:35:21        [0.87304688, 0.41284180, 0.34838867],
17:35:21        [0.61523438, 0.55126953, 0.71191406],
17:35:21        [0.40600586, 0.0482788 , 0.73046875]], shape=(4, 3), dtype=float16, device='cuda:0')
17:35:21 xs         = (array([[0.9765625 , 0.35913086, 0.16003418],
17:35:21        [0.87304688, 0.41284180, 0.34838867],
17:35:21        [0.61523438, 0.55126953, 0.71191406],
17:35:21        [0.40600586, 0.0482788 , 0.73046875]], shape=(4, 3), dtype=float16, device='cuda:0'),)
17:35:21 
17:35:21 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:537: FunctionTestError
17:35:22 =============================== warnings summary ===============================
17:35:22 tests/chainerx_tests/unit_tests/routines_tests/test_explog.py::test_Erf_param_0_{shape=(2, 2), in_dtypes=('float16',), out_dtype='float16'}[float16-native:0]
17:35:22   /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/functions/math/erf.py:36: PerformanceWarning: SciPy is not available. Forward computation of erf in CPU can be slow without SciPy.
17:35:22     chainer.warnings.PerformanceWarning)
17:35:22 
17:35:22 -- Docs: https://docs.pytest.org/en/latest/warnings.html
17:35:22 =========================== short test summary info ============================
17:35:22 FAIL ../../../repo/tests/chainerx_tests/unit_tests/routines_tests/test_creation.py::test_TrilTriu_param_4_{k=1, shape=(4, 3)}[float16-cuda:0]
17:35:22 = 1 failed, 165337 passed, 5907 skipped, 41 xfailed, 1 warnings in 2079.43 seconds =
17:35:38 Build step 'Execute shell' marked build as failure
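The tolerance arithmetic in the failure above follows the usual allclose rule: the check fails when |x - y| > atol + rtol * |y|. A minimal reproduction using the values from the log:

```python
# Values taken from the failure log above (TrilTriu float16 backward check).
x = -0.004479114874565315    # numeric gradient
y = -0.0045734477868034085   # backward gradient
atol = 1e-05
rtol = 0.005

total_tolerance = atol + rtol * abs(y)   # ~3.29e-05, as reported
absolute_error = abs(x - y)              # ~9.43e-05, as reported

print(absolute_error > total_tolerance)  # True -> assert_allclose fails
```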

@chainer-ci
Member

@mitmul This pull-request is marked as st:test-and-merge, but there were no activities for the last 3 days. Could you check?

@toslunar
Member

Jenkins, test this please.

@chainer-ci
Member

Jenkins CI test (for commit 89e90b4, target branch master) succeeded!

@mergify mergify bot merged commit f6dbcd4 into chainer:master Aug 11, 2019
@knorth55 knorth55 deleted the useissubdtype branch August 12, 2019 00:51
@toslunar toslunar added this to the v7.0.0b3 milestone Aug 12, 2019
@kmaehashi kmaehashi changed the title use numbers for input check in roi_{average|max}_{pooling|align}_2d.py Use numbers for input check in roi_{average|max}_{pooling|align}_2d.py Aug 22, 2019
@kmaehashi kmaehashi added the cat:enhancement Implementation that does not break interfaces. label Aug 22, 2019
@chainer-ci
Member

Jenkins CI test (for commit 89e90b4, target branch master) failed with status FAILURE.

Labels
cat:enhancement Implementation that does not break interfaces. st:test-and-merge State indicating that pull request is approved by a reviewer and can be merged after CI passes.