This repository has been archived by the owner on Nov 22, 2022. It is now read-only.
Conversation
Erica-Liu pushed commits to Erica-Liu/pytext referencing this pull request between Jul 23 and Jul 30, 2019, force-pushing the branch through the following revisions:

cd1f8d2 → deea7e4 → 5a6d563 → 1efbb5a → 58a3d47 → 96ca5f1 → 213fcad → 0a6e87d → 4ee8177 → 23a8b68 → bc8deca → e1885a0 → b161866 → 35af0b8

Each push carried the same commit summary — "Pull Request resolved: facebookresearch#782. Write the optimizer wrapper in pytext supporting mixed precision training without amp. Differential Revision: D16276949" — with only the fbshipit-source-id differing between pushes; the final pushes add "Reviewed By: chenyangyu1988".
This pull request has been merged in 56036ff.
Summary: Write an optimizer wrapper in PyText that supports mixed-precision training without relying on amp.
Differential Revision: D16276949
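The summary above describes an optimizer wrapper that handles mixed precision itself rather than delegating to NVIDIA's amp. The standard technique such wrappers use is float32 master weights plus loss scaling. The sketch below illustrates that technique in NumPy under stated assumptions; the class name, API, and constants are hypothetical, not the actual PyText implementation:

```python
import numpy as np

class FP16SGDSketch:
    """Toy mixed-precision SGD wrapper (illustrative only): the model
    holds float16 weights, the optimizer updates float32 master copies,
    and the loss (hence every gradient) is pre-multiplied by a scale
    factor so small gradients do not underflow in float16."""

    def __init__(self, fp16_weights, lr=0.1, loss_scale=1024.0):
        self.w16 = fp16_weights                                  # model weights
        self.w32 = [w.astype(np.float32) for w in fp16_weights]  # master copies
        self.lr = lr
        self.loss_scale = loss_scale

    def step(self, scaled_grads16):
        # Unscale each float16 gradient in float32, then update the masters.
        for w32, g16 in zip(self.w32, scaled_grads16):
            g32 = g16.astype(np.float32) / self.loss_scale
            w32 -= self.lr * g32
        # Cast the updated masters back into the float16 model weights.
        for w16, w32 in zip(self.w16, self.w32):
            w16[...] = w32.astype(np.float16)
```

Why scale the loss? A true gradient of 2e-8 underflows to zero in float16, but multiplied by 1024 it survives as roughly 2e-5; dividing by the scale in float32 recovers the original value, and the float32 master can accumulate updates too small to register in the float16 weights.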