
Remove FutureMessage from sender TensorPipeAgent #50024

Closed. Wants to merge 4 commits into gh/mrshenli/276/base from gh/mrshenli/276/head.

Conversation

mrshenli (Contributor) commented on Jan 3, 2021:

Stack from ghstack:

Differential Revision: D25753386

facebook-github-bot (Contributor) commented on Jan 3, 2021:

💊 CI failures summary and remediations

As of commit d485bb7 (more details on the Dr. CI page):


  • 1/1 failures possibly* introduced in this PR
    • 1/1 non-CircleCI failure(s)

This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.


Comment on lines +277 to +278:

```cpp
std::shared_ptr<JitFuture> jitFuture =
    std::make_shared<JitFuture>(at::AnyClassType::get());
```
A reviewer (Contributor) commented:

This change (replacing a value with a shared_ptr to a value) undoes a slight optimization I had added. Instead of two nested shared_ptrs (hence two allocations, two separate control blocks, and two reference counts), the code used a single allocation (for the AtomicFutureMessage) with a single control block, and relied on the shared_ptr "aliasing" constructor to extract a shared_ptr to the FutureMessage field that still shared the original control block.

The merits of this optimization are arguable, and it also has a downside (it keeps the atomic_flag alive even once it is no longer needed), so I'm fine with removing it; I just wanted to make sure this was intentional.
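For readers unfamiliar with the pattern, here is a minimal sketch of the aliasing-constructor trick described above. The struct definitions are hypothetical stand-ins for the real types in the TensorPipe agent, and the field layout is assumed for illustration only:

```cpp
#include <atomic>
#include <memory>

// Hypothetical stand-ins for the real RPC types (layout assumed).
struct FutureMessage { /* payload, callbacks, ... */ };
struct AtomicFutureMessage {
  std::atomic_flag flag = ATOMIC_FLAG_INIT;
  FutureMessage futMsg;
};

int main() {
  // Single allocation and single control block for the whole struct.
  auto whole = std::make_shared<AtomicFutureMessage>();

  // Aliasing constructor: `fut` points at the futMsg field but shares
  // ownership (and the reference count) of `whole`, so no second
  // allocation or control block is created.
  std::shared_ptr<FutureMessage> fut(whole, &whole->futMsg);

  // The downside mentioned above: as long as `fut` is alive, it keeps
  // the entire AtomicFutureMessage, including the atomic_flag, alive.
  return 0;
}
```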

mrshenli (Contributor, Author) replied:

This wasn't intentional. Let me add it to my to-do list and see if I can bring it back in a follow-up PR.

codecov bot commented on Jan 7, 2021:

Codecov Report

Merging #50024 (d485bb7) into gh/mrshenli/276/base (392450d) will increase coverage by 10.21%.
The diff coverage is 100.00%.

@@                    Coverage Diff                    @@
##           gh/mrshenli/276/base   #50024       +/-   ##
=========================================================
+ Coverage                 70.46%   80.68%   +10.21%     
=========================================================
  Files                      1900     1900               
  Lines                    206321   206322        +1     
=========================================================
+ Hits                     145384   166463    +21079     
+ Misses                    60937    39859    -21078     

facebook-github-bot commented: @mrshenli merged this pull request in 0684d07.

facebook-github-bot deleted the gh/mrshenli/276/head branch on January 11, 2021 at 15:17.
hwangdeyu pushed a commit to hwangdeyu/pytorch that referenced this pull request on Jan 14, 2021:
Summary: Pull Request resolved: pytorch#50024

Test Plan: Imported from OSS

Reviewed By: lw

Differential Revision: D25753386

Pulled By: mrshenli

fbshipit-source-id: fdca051b805762a2c88f965ceb3edf1c25d40a56
Labels: cla signed, Merged, oncall: distributed