core: Introduce ThreadPoolBuilder::use_current_thread. #1063
Conversation
r? @cuviper
Force-pushed from b9836fa to cd65b76.
Fixed the formatting of the tests, whoops.
@cuviper review ping? Modulo the naming issue discussed above (happy to tweak as needed), I think this should be in decent shape?
Also, ideas on how to test the error (deadlocking) case of calling this would be appreciated.
Hi, sorry -- I'm going to be unavailable the rest of this week, but I'll try to get to this early next week.
No worries, I was just making sure this didn't fall through the cracks, thanks! (And enjoy your time off, if that's the reason for your unavailability :)).
Gentle ping? |
Force-pushed from cd65b76 to b95fe39.
Force-pushed from 327626a to 38c411b.
…r=smaug,jnicol,supply-chain-reviewers This applies rayon-rs/rayon#1063 to our rayon-core. I'm hopeful it can be merged upstream soon, but meanwhile this seems worth having on its own. Differential Revision: https://phabricator.services.mozilla.com/D186722
Force-pushed from 38c411b to 8c840bd.
Force-pushed from d343448 to 10f1ee6.
…oop. This was originally done for rayon-rs#1063, in order to reuse this to allow cleaning up the TLS data allocated by use_current_thread. We ended up not using that, but this refactoring seems useful on its own.
See discussion in rayon-rs#1052. Closes rayon-rs#1052.
…thread. As per suggestion in rayon-rs#1052.
…e_current_thread.
…thread. Ideas for testing the "call cleanup function from a job" case would be great.
Force-pushed from 10f1ee6 to 40b59c0.
OK, I think we're good to go -- thanks for your persistence! :) bors r+
Build succeeded! The publicly hosted instance of bors-ng is deprecated and will go away soon. If you want to self-host your own instance, instructions are here. If you want to switch to GitHub's built-in merge queue, visit their help page.
Thank you!
1087: core: registry: Factor out "wait till out of work" part of the main loop. r=cuviper a=emilio This was originally done for #1063, in order to reuse this to allow cleaning up the TLS data allocated by use_current_thread. We ended up not using that, but this refactoring seems useful on its own, perhaps. Co-authored-by: Emilio Cobos Álvarez <emilio@crisal.io>
This generalizes the approach used by targets that don't support threading like
wasm, allowing the builder thread to be part of a new thread-pool.
This PR:
Feedback welcome.
clean_up_use_current_thread is not a great name, but I think it's descriptive, and maybe good enough given it's a rather niche API for non-global pools?