
add decorator skip_check_grad_ci #21836

Merged 3 commits into PaddlePaddle:develop on Dec 20, 2019

Conversation

zhangting2020
Contributor

Add a decorator to skip the check_grad CI check:
check_grad is required for Op test cases. However, there are special cases that do not need check_grad. This decorator is used to avoid check_grad failures for those cases.

Note: execution of the unit test is not skipped. The decorator only skips the check_grad check in the tearDownClass method by setting a no_need_check_grad flag.
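
A minimal sketch of how such a flag-setting decorator and the corresponding tearDownClass check might look (a hypothetical simplification based on the description above, not the exact code merged in this PR):

import unittest

def skip_check_grad_ci(cls):
    # Mark the test class so that the check_grad verification in
    # tearDownClass can be skipped; the test methods themselves still run.
    cls.no_need_check_grad = True
    return cls

class OpTestSketch(unittest.TestCase):
    # OpTestSketch is a stand-in name for the real OpTest base class.
    @classmethod
    def tearDownClass(cls):
        if getattr(cls, "no_need_check_grad", False):
            return  # the decorated case opted out of check_grad
        # ... otherwise verify that check_grad was called for this Op ...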

@@ -142,6 +142,25 @@ def __set_elem__(tensor, i, e):
return gradient_flat.reshape(tensor_to_check.shape())


def skip_check_grad_ci(func):
"""Decorator to skip check_grd CI.
Contributor

Typo: check_grd should be check_grad.

Contributor Author

Done.

def skip_check_grad_ci(func):
"""Decorator to skip check_grd CI.

Check_grad is required for Op test cases. However, there are specical cases
Contributor

"specical" is a typo; this should read: there are some special cases that do not need to do check_grad.

Contributor Author

Done.

"""Decorator to skip check_grd CI.

Check_grad is required for Op test cases. However, there are specical cases
that do not need to check_grad. The decorator is used to avoid failures of
Contributor

This decorator is used to skip the check_grad of the above cases.

Contributor Author

Done.

'TestLookupTableOpWithPadding',
'TestLookupTableOpWithTensorIdsAndPadding',
'TestSeqMaxPool2DInference',
Contributor

Will the other cases in NO_NEED_CHECK_GRAD_CASES use the skip_check_grad_ci wrapper later?

Contributor Author

Fixed.

@@ -64,6 +64,7 @@ def test_check_output(self):
self.attrs = {'padding_idx': int(padding_idx)}
self.check_output()

@skip_check_grad_ci
Contributor

How about the following, i.e. removing lines 68-71?

@skip_check_grad_ci(note)
class TestLookupTableOpWithPadding(TestLookupTableOp):
    def setUp(self):
        xxx
    def test_check_output(self):
        xxx

Contributor Author

Done.
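
A hedged sketch of the parameterized form suggested above, where the decorator takes a note describing why check_grad is skipped (hypothetical names and reason text; the merged code may differ):

def skip_check_grad_ci(reason):
    # Decorator factory: the reason string documents why check_grad
    # is skipped for the decorated Op test case.
    if not isinstance(reason, str):
        raise AssertionError("The reason for skipping check_grad should be a string.")

    def wrapper(cls):
        cls.no_need_check_grad = True
        return cls

    return wrapper

@skip_check_grad_ci(reason="explain here why check_grad is not needed for this case")
class TestLookupTableOpWithPadding(TestLookupTableOp):
    def setUp(self):
        ...

    def test_check_output(self):
        ...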

@zhangting2020 force-pushed the skip_check_grad_ci branch 4 times, most recently from dc73883 to 18e6245 on December 19, 2019 12:17
luotao1 previously approved these changes on Dec 19, 2019
@luotao1 left a comment

LGTM! Nice job! Please monitor skip_check_grad_ci in the next PR!

@luotao1 left a comment

LGTM

@luotao1 luotao1 merged commit 538c848 into PaddlePaddle:develop Dec 20, 2019