
[Algorithm] Online Decision Transformer #1149

Merged: 136 commits merged into pytorch:main on Aug 30, 2023
Conversation

@BY571 (Contributor) commented on May 12, 2023

Description

Implements the Online Decision Transformer paper (Zheng et al., 2022).
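For context, the core idea of ODT is to fine-tune a decision transformer online by training a stochastic action head with a maximum-likelihood loss under an entropy constraint, enforced through a learnable temperature. Below is a minimal sketch of that objective in PyTorch; the names (odt_losses, action_dist, log_alpha, target_entropy) are illustrative and are not TorchRL's actual API.

    import torch

    def odt_losses(action_dist, actions, log_alpha, target_entropy):
        # `action_dist` is the policy's predicted action distribution
        # (e.g. torch.distributions.Normal); `actions` are the observed ones.
        nll = -action_dist.log_prob(actions).mean()
        entropy = action_dist.entropy().mean()

        # Policy loss: action NLL minus a temperature-weighted entropy bonus.
        policy_loss = nll - log_alpha.exp().detach() * entropy

        # Dual (temperature) loss: nudges the policy entropy toward
        # `target_entropy` by adapting the temperature, SAC-style.
        alpha_loss = log_alpha.exp() * (entropy.detach() - target_entropy)
        return policy_loss, alpha_loss

Here log_alpha would be a learnable scalar (e.g. nn.Parameter(torch.zeros(()))) updated by its own optimizer.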

Motivation and Context

Why is this change required? What problem does it solve?
If it fixes an open issue, please link to the issue here.
You can use the syntax close #15213 if this solves issue #15213.

  • I have raised an issue to propose this change (required for new features and bug fixes)

Types of changes

What types of changes does your code introduce? Remove all that do not apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • [x] Example (update in the examples folder)

Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide (required)
  • [x] My change requires a change to the documentation.
  • I have updated the tests accordingly (required for a bug fix or a new feature).
  • I have updated the documentation accordingly.

@facebook-github-bot added the CLA Signed label (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on May 12, 2023.
Review thread on "class ModifiedGPT2Model(GPT2Model):"

@BY571 (Contributor, Author):
Wrapper class to remove the wpe (positional-embedding) layer from transformers' GPT2Model. Maybe we can compress this even more?
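One simple way to neutralize wpe without overriding GPT2Model.forward is to zero out and freeze its weights; a minimal sketch under that assumption follows (the PR's actual implementation may differ):

    from torch import nn
    from transformers import GPT2Config, GPT2Model

    class ModifiedGPT2Model(GPT2Model):
        """GPT2Model with the learned positional-embedding term neutralized."""

        def __init__(self, config: GPT2Config):
            super().__init__(config)
            # `wpe` is GPT-2's learned positional nn.Embedding. Zeroing and
            # freezing its weights makes wpe(position_ids) contribute nothing
            # to the input sum, without touching the forward pass.
            nn.init.zeros_(self.wpe.weight)
            self.wpe.weight.requires_grad_(False)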

@vmoens (Contributor):
  1. This should run even if transformers isn't installed.
  2. Do we have dedicated tests?
  3. Is it integrated in the docs?
  4. The docstring is a bit cryptic for someone who doesn't know what this is all about.
     I wish transformers had more modular code... What is the signature of wpe? In some cases we can simply replace the layer with nn.Identity()...

@BY571 (Contributor, Author):

I tried using the identity but got some shape issues. I also found that, with all the fixes I made, it now converges even with the wpe layer in place. For comparison, I ran a test where I replaced the wpe layer with a custom ZeroPosEmbeddingLayer that returns only zeros. In the graph you can see the runs with wpe and with the zeroed wpe.
Let me know what you think. For now I took the ZeroPosEmbeddingLayer out, since training converges without it, but I can add it as well.

[Attached image: training curves comparing the run with the wpe layer against the run with the zeroed wpe layer]
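For context: in transformers, GPT2Model.wpe is an nn.Embedding(config.max_position_embeddings, config.n_embd), i.e. it maps integer position ids to float embedding vectors, which is why a bare nn.Identity() (which would return the position ids themselves) runs into shape issues. A zero replacement has to keep the embedding-like signature. A minimal sketch of such a layer, with details assumed from the comment above:

    import torch
    from torch import nn

    class ZeroPosEmbeddingLayer(nn.Module):
        """Drop-in replacement for GPT-2's wpe that returns only zeros."""

        def __init__(self, embed_dim: int):
            super().__init__()
            self.embed_dim = embed_dim

        def forward(self, position_ids: torch.Tensor) -> torch.Tensor:
            # Match nn.Embedding's output shape (*position_ids.shape, embed_dim)
            # so the positional term adds nothing to the token embeddings.
            return torch.zeros(
                *position_ids.shape, self.embed_dim, device=position_ids.device
            )

It would be swapped in as model.wpe = ZeroPosEmbeddingLayer(model.config.n_embd).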

@vmoens (Contributor):

Oh, but at this point let's get rid of that class altogether, no?

@BY571 (Contributor, Author):

Yes, I already removed it all. If you can have a final look, I think it should be ready now.

@vmoens (Contributor) left a review:

LGTM! A couple of last edits and we can ship this!! 🚀💪🏻

examples/decision_transformer/lamb.py (outdated, resolved)
examples/decision_transformer/utils.py (resolved)
torchrl/modules/models/decision_transformer.py (outdated, resolved)
torchrl/modules/models/models.py (outdated, resolved)
@vmoens merged commit b444007 into pytorch:main on Aug 30, 2023 (39 of 54 checks passed).
@vmoens (Contributor) commented on Aug 30, 2023:

A million thanks for this feature @BY571!
Amazing stuff

vmoens added a commit to hyerra/rl that referenced this pull request on Oct 10, 2023:
Co-authored-by: vmoens <vincentmoens@gmail.com>
Co-authored-by: Mateusz Guzek <matguzek@meta.com>
Labels: CLA Signed (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed), new algo (new algorithm request or PR)
4 participants