More robust Online DPO changes for RL update #1664
Open
pluesclues wants to merge 16 commits into unslothai:main from pluesclues:main
Changes from 1 commit
7445628
Update llama.py making set and reset functions in order to properly u…
pluesclues 389b98f
Update fast_lora.py, added mixed precision PyTorch autocasting
pluesclues 4705906
Update llama.py did not include rotary embeddings in the reset funct…
pluesclues 9b5b8a1
Merge branch 'unslothai:main' into main
pluesclues 9d43808
Update rl.py: correct get reward model added as well as the eval step…
pluesclues 7503c71
Update rl.py removed function that did not need to be patched
pluesclues d219f25
Update llama.py: kept reset functions and made their names generic
pluesclues 52f4301
Update fast_lora.py
pluesclues 454b30d
Merge branch 'unslothai:main' into main
pluesclues 9ad2ea3
Update rl.py, try except
pluesclues f5dbcb7
Merge branch 'main' into main
pluesclues 06e694f
Update fast_lora.py, removing downcasting stuff
pluesclues c424ba8
Merge branch 'unslothai:main' into main
pluesclues 24ab05e
Update llama.py removed deprecated LlamaLinearScalingRotaryEmbedding
pluesclues 8799a5c
Merge branch 'unslothai:main' into main
pluesclues 48e9dfb
Merge branch 'unslothai:main' into main
pluesclues
Update rl.py removed function that did not need to be patched
commit 7503c716911dc5bd1a09b156625e16d84b2b6f99
Are these primarily for Amazon Sagemaker?
This is just for Amazon SageMaker. I wanted to keep the old functionality of the function itself; sadly it sits very deep inside transformers (`transformers.trainer`), and this is exclusively in the saving steps. The function had to be overridden since it performs evaluation outside of the trainer's code directly. We could theoretically remove it if you want to; I just did not want to strip it out if we did not have to. There are a lot of SMP forward passes in the function, which is why I decided to keep it and tried to make it compatible.
https://github.com/unslothai/unsloth/blob/main/unsloth/models/llama.py#L1956-L1961
Wasn't unsloth already setting this to False? Or were we missing something so far?
So actually yes, they do that. This is more about patching the trainer function itself: when the Hugging Face trainer wants to perform saving at, say, 500 steps, it calls this function, and if we do not include the Amazon SageMaker handling here it errors out due to dependency issues. I wanted to keep most of the features of the function itself intact.
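The guard being described can be sketched as a try/except around the SageMaker import, so the patched trainer path degrades gracefully when the dependency is absent. This is a hypothetical illustration, not the PR's actual code; `forward_pass` and the flag name are invented for the sketch.

```python
# Hypothetical sketch of the guard described above: import SageMaker's
# model-parallel library (smdistributed) only if it is installed, so the
# patched Trainer save/eval path does not crash outside SageMaker.

try:
    import smdistributed.modelparallel.torch as smp  # present only on SageMaker images
    IS_SAGEMAKER_MP_ENABLED = True
except ImportError:
    smp = None
    IS_SAGEMAKER_MP_ENABLED = False

def forward_pass(model, inputs):
    """Run a forward pass, dispatching to the SMP path only when available.

    `model` is any callable taking keyword arguments (illustrative).
    """
    if IS_SAGEMAKER_MP_ENABLED:
        # On SageMaker, the real patched function would route through an
        # SMP forward helper; that wiring is omitted in this sketch.
        raise NotImplementedError("SMP path not sketched here")
    return model(**inputs)
```

Outside SageMaker the import fails, the flag stays `False`, and the plain-PyTorch branch runs, which is presumably what the "try except" commit above refers to.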
Ah cool. So no more force-disabling SageMaker after these changes, I presume?
So SageMaker is disabled by default; it's just that this function needs to check whether SageMaker is present in order to also pull in some of the other dependencies the function needs. I did not really want to touch too much of the internal transformers code beyond what unsloth needed to patch to get working.