
Fully deprecate variadic inputs of checkpoint_sequential #25985

Closed

Conversation

@sublee (Contributor) commented Sep 11, 2019

Support for variadic inputs to checkpoint_sequential was deprecated in #21006. In PyTorch 1.2 such calls emit a DeprecationWarning; from PyTorch 1.3 on they should simply fail with a TypeError. This patch removes the DeprecationWarning path that was introduced for PyTorch 1.2.
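The two phases of the deprecation can be sketched in plain Python (hypothetical function names; this is not the actual torch.utils.checkpoint source): the PyTorch 1.2 shim still accepted `*inputs` and warned, while the post-patch signature takes exactly one input, so variadic calls fail with a TypeError at the call site with no explicit check needed.

```python
import warnings


def checkpoint_sequential_1_2(functions, segments, *inputs):
    """PyTorch 1.2 behavior (sketch): variadic inputs work but warn."""
    if len(inputs) > 1:
        warnings.warn(
            "Giving multiple inputs to checkpoint_sequential is deprecated; "
            "pass a single input tensor instead.",
            DeprecationWarning,
        )
    return inputs


def checkpoint_sequential_1_3(functions, segments, input):
    """Post-patch behavior (sketch): exactly one positional input,
    so a variadic call raises TypeError naturally."""
    return input
```

Removing the warning path means the signature itself enforces the contract, which is why the patch mostly deletes code rather than adding checks.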

@soumith (Member) commented Sep 24, 2019

@pytorchbot rebase this please

@sublee (Contributor, Author) commented Sep 24, 2019

@soumith Thank you for reviewing it.

@yf225 (Contributor) commented Sep 25, 2019

@sublee We can't land it yet because a test is failing:

Sep 24 20:52:17 ======================================================================
Sep 24 20:52:17 ERROR: test_checkpoint_module_list_multiple_args (__main__.TestCheckpoint)
Sep 24 20:52:17 ----------------------------------------------------------------------
Sep 24 20:52:17 Traceback (most recent call last):
Sep 24 20:52:17   File "test_utils.py", line 188, in test_checkpoint_module_list_multiple_args
Sep 24 20:52:17     torch.randn(1, 60, requires_grad=True)
Sep 24 20:52:17   File "test_utils.py", line 69, in _check_checkpoint_sequential
Sep 24 20:52:17     out = checkpoint_sequential(model_to_compare, num_chunks, *detached_inputs)
Sep 24 20:52:17   File "/opt/conda/lib/python3.6/site-packages/torch/utils/checkpoint.py", line 217, in checkpoint_sequential
Sep 24 20:52:17     preserve_rng_state=preserve_rng_state)
Sep 24 20:52:17   File "/opt/conda/lib/python3.6/site-packages/torch/utils/checkpoint.py", line 155, in checkpoint
Sep 24 20:52:17     return CheckpointFunction.apply(function, preserve, *args)
Sep 24 20:52:17   File "/opt/conda/lib/python3.6/site-packages/torch/utils/checkpoint.py", line 62, in forward
Sep 24 20:52:17     if preserve_rng_state:
Sep 24 20:52:17 RuntimeError: bool value of Tensor with more than one value is ambiguous
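For context, the RuntimeError above is PyTorch refusing to coerce a multi-element tensor to bool: with the variadic path removed, an extra positional input tensor apparently slid into the preserve_rng_state slot, so `if preserve_rng_state:` was evaluated on a tensor. A dependency-free sketch of that argument-shift hazard (hypothetical signature, plain lists standing in for tensors):

```python
def checkpoint(function, preserve_rng_state, *args):
    # Sketch only: if a caller forwards one input too many positionally,
    # the first extra value lands in preserve_rng_state, not in *args.
    return preserve_rng_state, args


# Intended call: flag in the second slot, inputs collected in *args.
flag, inputs = checkpoint(len, True, [1.0, 2.0])

# Buggy call: a second "input" displaces the flag.
flag, inputs = checkpoint(len, [1.0, 2.0], [3.0, 4.0])
# flag is now the list [1.0, 2.0]; with a real multi-element tensor there,
# truth-testing it raises "bool value of Tensor ... is ambiguous".
```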
@soumith (Member) left a comment

please fix test

@sublee (Contributor, Author) commented Sep 26, 2019

@yf225, @soumith Sorry, my mistake: I ran only the test cases I added in #21006, which was not enough. I have now fixed those failures. Please review again.

@vincentqb added the triaged label Sep 26, 2019
@sublee (Contributor, Author) commented Oct 9, 2019

@soumith I think you may have missed the latest update. Would you please review this PR again?

@sublee force-pushed the sublee:deprecate-checkpoint-sequential-varargs branch from aa20b4a to a07700f Oct 11, 2019
@sublee (Contributor, Author) commented Oct 11, 2019

This patch was intended to ship with v1.3.0, but PyTorch v1.3.0 was released today. I have updated the version in the deprecation documentation to v1.4 and rebased this branch onto the current master.

@albanD (Contributor) approved these changes Dec 3, 2019

A few details in the test could be tidied up to make it look slightly better; then I'll merge.

Thanks for the PR!

(review comments on test/test_utils.py, resolved)
sublee added a commit to sublee/pytorch that referenced this pull request Dec 4, 2019
@albanD force-pushed the sublee:deprecate-checkpoint-sequential-varargs branch from 01bf464 to e61b020 Dec 4, 2019
@albanD (Contributor) commented Dec 4, 2019

Just rebased on top of master for CI.

And again as I rebased on top of a broken master...

albanD added a commit to sublee/pytorch that referenced this pull request Dec 4, 2019
@facebook-github-bot (Contributor) left a comment

@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@albanD force-pushed the sublee:deprecate-checkpoint-sequential-varargs branch from e61b020 to 5829119 Dec 4, 2019
@soumith approved these changes Dec 4, 2019
@facebook-github-bot (Contributor) left a comment

@albanD is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot commented Dec 10, 2019

@albanD merged this pull request in fa251cf.
