
[RLlib] SAC on new API stack (w/ EnvRunner and ConnectorV2): SACLearner and SACTorchLearner classes. #42570

Merged
merged 5 commits into ray-project:master on Jan 25, 2024

Conversation

@simonsays1980 (Collaborator) commented Jan 22, 2024

This PR is developed alongside #42568 and implements the learner logic needed to transfer SAC over to our new stack.

Why are these changes needed?

Our major RL algorithms should be transferred to our new stack, which offers higher modularity and customizability to users.

Related issue number

Closes #37778

Checks

  • I've signed off every commit (by using the -s flag, i.e., git commit -s) in this PR.
  • I've run scripts/format.sh to lint the changes in this PR.
  • I've included any doc changes needed for https://docs.ray.io/en/master/.
    • I've added any new APIs to the API Reference. For example, if I added a
      method in Tune, I've added it in doc/source/tune/api/ under the
      corresponding .rst file.
  • I've made sure the tests are passing. Note that there might be a few flaky tests, see the recent failures at https://flakey-tests.ray.io/
  • Testing Strategy
    • Unit tests
    • Release tests
    • This PR is not tested :(

…ality to 'TorchLearner' to build a trainable 'nn.Parameter' needed for the temperature parameter in 'SAC'.

Signed-off-by: Simon Zehnder <simon.zehnder@gmail.com>
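
For context, here is a minimal sketch (assumed for illustration, not taken from this PR's diff) of what a trainable temperature parameter for SAC looks like as a PyTorch nn.Parameter; SAC commonly learns log(alpha) for numerical stability and exponentiates it where the entropy term is needed:

```python
import torch
import torch.nn as nn

# Hypothetical illustration: SAC's entropy temperature as a trainable
# nn.Parameter. Learning log(alpha) keeps alpha positive after exp().
log_alpha = nn.Parameter(torch.zeros(1), requires_grad=True)
alpha_optim = torch.optim.Adam([log_alpha], lr=3e-4)

# In a training step, the current temperature enters the entropy bonus:
alpha = log_alpha.exp()
```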
@sven1977 changed the title from "SACLearner and SACTorchLearner" to "[RLlib] SAC on new API stack (w/ EnvRunner and ConnectorV2): SACLearner and SACTorchLearner classes." on Jan 23, 2024
@sven1977 marked this pull request as ready for review on January 23, 2024, 14:47
@@ -290,6 +290,9 @@ def build(self) -> None:
flags, so that `_make_module()` can place the created module on the correct
device. After running super() it will wrap the module in a TorchDDPRLModule
if `_distributed` is True.
Note, in inherited classes it is advisable to call the parent's `build()` only after all trainable variables have been defined, since `build()` triggers `configure_optimizers_for_module()`, which needs these variables.
@sven1977 (Contributor):

Btw, I'm not sure why we don't move the entire code before the super().build() call below into the c'tor already. All we do here is determine the GPU device, if any, based on config settings, which are all already available at c'tor time.

Then we don't need this comment here and we should also add the @OverrideToImplementCustomLogic_CallToSuperRecommended decorator on top of this method.
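
For reference, a small sketch of what that decorator usage could look like (only the decorator import is taken from RLlib's `ray.rllib.utils.annotations` module; the class here is a toy stand-in, not the PR's actual diff):

```python
from ray.rllib.utils.annotations import (
    OverrideToImplementCustomLogic_CallToSuperRecommended,
)

class ToyTorchLearner:  # toy stand-in for RLlib's TorchLearner
    @OverrideToImplementCustomLogic_CallToSuperRecommended
    def build(self) -> None:
        # The decorator documents that subclasses overriding build()
        # should call super().build() as part of their override.
        ...
```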

@simonsays1980 (Collaborator, Author):

I am with you on this.

However, I added the comment because calling build() at the beginning of an inherited build() method causes configure_optimizers_for_module() to be called before the subclass's variables are defined. Example: the SACLearner defines variables in its build(); if super().build() were called before defining them, Learner.build() would invoke all the configure_optimizers_for_module() methods overridden by the SACTorchLearner, which need the variables defined in the SACLearner, and those would not yet exist. So: first define the variables, then call super().build(). I added the comment to preserve that knowledge.

@sven1977 (Contributor):

Ok, let's test this (and change only, if possible) before we merge ...

@sven1977 (Contributor):

That makes sense. So it's not about whether to call super, but about when to call it.

The call to super must come after(!) all parameters are available, so the optimizers know about them.

@simonsays1980 (Collaborator, Author):

Exactly! Let's keep this in mind for documentation! I am sure users who derive from this class will run into it.
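
To make the ordering concrete, here is a minimal, self-contained sketch (toy classes, not RLlib's actual code) of the pattern agreed on above: define all trainable variables in the subclass's build() first, then call super().build(), which triggers configure_optimizers_for_module():

```python
import torch
import torch.nn as nn

class ToyLearner:
    """Toy stand-in for RLlib's Learner: build() configures optimizers."""
    def build(self) -> None:
        self.configure_optimizers_for_module()

    def configure_optimizers_for_module(self) -> None:
        pass

class ToySACLearner(ToyLearner):
    def build(self) -> None:
        # Define all trainable variables BEFORE calling super().build(),
        # because the parent's build() invokes
        # configure_optimizers_for_module(), which must already see them.
        self.curr_log_alpha = nn.Parameter(torch.zeros(1))
        super().build()

    def configure_optimizers_for_module(self) -> None:
        # Would fail with AttributeError if super().build() were called
        # before self.curr_log_alpha was defined.
        self.alpha_optim = torch.optim.Adam([self.curr_log_alpha], lr=3e-4)

ToySACLearner().build()  # works; swapping the order in build() would break it
```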

@sven1977 (Contributor) left a review comment:

LGTM! Thanks @simonsays1980 :)

@sven1977 merged commit 622f4d3 into ray-project:master on Jan 25, 2024
9 checks passed
Development

Successfully merging this pull request may close these issues:

[RLlib] Enabling RLModule by default on SAC