
deps: Update transformers to latest and skip broken prompt tuning tests #246

Open · wants to merge 5 commits into base: main

Conversation

Abhishek-TAMU

Description of the change

  • Set the lower limit of transformers to 4.41.0, which uses the TrainingArguments eval_strategy argument instead of evaluation_strategy (by @anhuong).
  • Set the upper limit of transformers to <5.0, thus pulling in newer versions of transformers (by @anhuong).
  • Skip the broken prompt tuning test cases.
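The two bounds above combine into a single version specifier. Assuming the project declares its dependency in a standard requirements or pyproject file (the exact file is not shown in this PR), the pin would look roughly like:

```
transformers>=4.41.0,<5.0
```

With this range, pip resolves the newest 4.x release while refusing any future 5.0 release that could break the API again.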

Related issue number

Previously we had seen degraded performance in v4.41 of transformers, as described in issue #201. After further evaluation described in that issue, the throughput degradation on llama3-8B was only 17%, occurred only on 2 A100 GPUs (and not at all on 4 A100s), and memory utilization was significantly improved by moving to 4.41.

How to verify the PR

Was the PR tested

  • I have added >=1 unit test(s) for every new method I have added.
  • I have ensured all unit tests pass

@anhuong anhuong changed the title Changes by @anhuong: Update to transformers latest version and commenting of prompt tuning test cases deps: Update to transformers latest version and skip broken prompt tuning tests Jul 17, 2024
@@ -168,108 +168,108 @@ def test_parse_arguments_peft_method(job_config):
############################# Prompt Tuning Tests #############################


def test_run_causallm_pt_and_inference():
Collaborator

Instead of commenting out these tests, let's add a pytest skip instead, like @pytest.mark.skip(reason="currently crashes before validation is done") as seen here, and then provide the reason and error.
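The reviewer's suggestion can be sketched as follows; the test name matches the one in the diff above, but the body is illustrative, not taken from the repository:

```python
import pytest

# Mark the broken test as skipped with an explanatory reason instead of
# commenting it out: pytest still collects and reports it, so the skip
# (and its reason) stays visible in test output until the crash is fixed.
@pytest.mark.skip(reason="currently crashes before validation is done")
def test_run_causallm_pt_and_inference():
    # This body never executes while the skip marker is present.
    raise AssertionError("should be skipped")
```

Unlike a commented-out test, a skipped test shows up as `s` in the pytest summary, making the outstanding breakage easy to audit.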

Author

Thank you for reviewing the PR, Anh. I have added this to all 3 test cases.

@anhuong
Collaborator

anhuong commented Jul 17, 2024

In addition, please address the linter error and run tox -e lint to ensure the error is addressed.

Signed-off-by: Abhishek <maurya.abhishek@ibm.com>
@anhuong anhuong changed the title deps: Update to transformers latest version and skip broken prompt tuning tests deps: Update transformers to latest and skip broken prompt tuning tests Jul 17, 2024
@@ -322,6 +322,7 @@ def train(
# this validation, we just drop the things that aren't part of the SFT Config and build one
# from our object directly. In the future, we should consider renaming this class and / or
# not adding things that are not directly used by the trainer instance to it.
train_args.use_cpu=True
Collaborator

Please leave this change out of the PR, although it's a useful note that transformers 4.42 requires this change to run unit tests on Macs!

Author

Sure thanks!

tox.ini Outdated
@@ -5,6 +5,7 @@ envlist = py, lint, fmt
description = run unit tests
deps =
pytest>=7
setuptools
Collaborator

Is setuptools needed to run the unit tests?

Author

On my PC it is required, but I am not sure if it's required in the build. Hence I am removing it for now.

Signed-off-by: Abhishek <maurya.abhishek@ibm.com>
Collaborator

@anhuong anhuong left a comment

LGTM, thanks Abhishek. Before committing, please run the linters and ensure they pass: tox -e fmt,lint. The formatter check is currently failing.

Signed-off-by: Abhishek <maurya.abhishek@ibm.com>
Signed-off-by: Abhishek <maurya.abhishek@ibm.com>