
[scheduler][autoscaler] Report placement resources for actor creation tasks #26813

Merged

Conversation

Contributor

@wuisawesome wuisawesome commented Jul 21, 2022

Why are these changes needed?

This change makes us report placement resources for actor creation tasks. Essentially, the resource model here treats an actor creation task's placement resources as those of a task that runs very quickly.
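A minimal sketch of the resource model described above (illustrative code, not part of this PR), assuming Ray's default actor resources: the actor creation task needs 1 CPU to be placed, while the running actor holds 0 CPUs. Reporting that short-lived placement demand, rather than the empty runtime demand, is what lets the autoscaler react to a backlog of pending actors.

import ray

ray.init(num_cpus=1)

# Default actor resources: the creation task requires {"CPU": 1} for placement,
# but the actor holds {"CPU": 0} once it is running.
@ray.remote
class Pinger:
    def ping(self):
        return "pong"

# Each pending creation task briefly demands 1 CPU. With this change the
# autoscaler is shown that placement demand, so many pending actors can
# trigger scale-up even though started actors consume no CPUs.
actors = [Pinger.remote() for _ in range(10)]
print(ray.get(actors[0].ping.remote()))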

Related issue number

Closes #26806

Checks

  • I've run scripts/format.sh to lint the changes in this PR.
  • I've included any doc changes needed for https://docs.ray.io/en/master/.
  • I've made sure the tests are passing. Note that there might be a few flaky tests, see the recent failures at https://flakey-tests.ray.io/
  • Testing Strategy
    • Unit tests
    • Release tests
    • This PR is not tested :(

@wuisawesome wuisawesome changed the title [draft][scheduler][autoscaler] Report placement resources for actor creation tasks [scheduler][autoscaler] Report placement resources for actor creation tasks Aug 5, 2022
@wuisawesome
Contributor Author

cc @yiranwang52 @DmitriGekhtman

Contributor

@ericl ericl left a comment


Lgtm

@ericl ericl added the @author-action-required The PR author is responsible for the next step. Remove tag to send back to the reviewer. label Aug 5, 2022
@@ -31,7 +31,7 @@ def test_warning_for_too_many_actors(shutdown_only):

     p = init_error_pubsub()

-    @ray.remote
+    @ray.remote(num_cpus=0)
Contributor Author


It appears that this test needed to change: with the fixed scheduling class, these actors are now getting hit by worker capping.
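For context, here is a hedged, self-contained sketch of what the changed test now looks like; the real test lives in Ray's test suite, and the helper import path and actor count below are assumptions for illustration.

import ray
from ray._private.test_utils import init_error_pubsub  # assumed helper location

ray.init(num_cpus=1)
p = init_error_pubsub()

# Per the review comment above, with the fixed scheduling class the
# default-resource version of this actor is now hit by worker capping;
# num_cpus=0 avoids that so the test can create many actors at once.
@ray.remote(num_cpus=0)
class Foo:
    def ping(self):
        return "pong"

actors = [Foo.remote() for _ in range(20)]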

@wuisawesome wuisawesome removed the @author-action-required The PR author is responsible for the next step. Remove tag to send back to the reviewer. label Aug 6, 2022
@wuisawesome
Contributor Author

Doc failures don't look related; merging.

@wuisawesome wuisawesome merged commit 50e278f into ray-project:master Aug 6, 2022
gramhagen pushed a commit to gramhagen/ray that referenced this pull request Aug 15, 2022
… tasks (ray-project#26813)

This change makes us report placement resources for actor creation tasks. Essentially, the resource model here is that a placement resource/actor creation task is a task that runs very quickly.

Closes ray-project#26806

Co-authored-by: Alex <alex@anyscale.com>
Signed-off-by: Scott Graham <scgraham@microsoft.com>
Stefan-1313 pushed a commit to Stefan-1313/ray_mod that referenced this pull request Aug 18, 2022
… tasks (ray-project#26813)

This change makes us report placement resources for actor creation tasks. Essentially, the resource model here is that a placement resource/actor creation task is a task that runs very quickly.

Closes ray-project#26806

Co-authored-by: Alex <alex@anyscale.com>
Signed-off-by: Stefan van der Kleij <s.vanderkleij@viroteq.com>
matthewdeng pushed a commit that referenced this pull request Sep 28, 2022
… tasks (#26813)

This change makes us report placement resources for actor creation tasks. Essentially, the resource model here is that a placement resource/actor creation task is a task that runs very quickly.

Closes #26806

Co-authored-by: Alex <alex@anyscale.com>
Development

Successfully merging this pull request may close these issues.

[GCS] Should report placement and not runtime resources to the autoscaler for scaling up
3 participants