
[CI] More Fast GPU Test Fixes #9346


Merged · 6 commits merged into main on Sep 3, 2024

Conversation

@DN6 (Collaborator) commented on Sep 2, 2024

What does this PR do?

  • Add a uses_custom_attn_processor attribute to model test classes so that test_set_attn_processor_for_determinism can be skipped for models that rely on custom attention processors (see the sketch after this list).
  • Add a slow tag to a LoRA test that needs to download a checkpoint.
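
A minimal sketch of how these two changes could be consumed, assuming the skip lives inside the shared determinism test. The attribute and test names come from this PR; the mixin structure, the slow import path, and the LoRA test class/name are illustrative assumptions, not the exact diffusers code:

import unittest

from diffusers.utils.testing_utils import slow  # assumed import path for the slow decorator


class ModelTesterMixin:
    def test_set_attn_processor_for_determinism(self):
        # Test classes for models that ship their own attention processor set
        # uses_custom_attn_processor = True, which makes this check a no-op for them.
        if getattr(self, "uses_custom_attn_processor", False):
            self.skipTest("Model uses a custom attention processor")
        # ... actual determinism check over default vs. explicitly set processors ...


class SomeLoRATests(unittest.TestCase):  # hypothetical test class
    @slow  # downloads a real checkpoint, so it only runs on the slow/nightly CI
    def test_lora_with_pretrained_checkpoint(self):
        ...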

Fixes # (issue)

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@DN6 changed the title from [CI] More Fast Test Fixes to [CI] More Fast GPU Test Fixes on Sep 2, 2024
@DN6 requested a review from sayakpaul on September 3, 2024, 05:40
@sayakpaul (Member) left a comment


Just a single comment. Not merge blocking.

@@ -34,6 +34,7 @@
class DiTTransformer2DModelTests(ModelTesterMixin, unittest.TestCase):
model_class = DiTTransformer2DModel
main_input_name = "hidden_states"
uses_custom_attn_processor = False

sayakpaul (Member) commented on the diff:

We could just default it to False in ModelTesterMixin , no?
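
A sketch of the suggested follow-up, assuming the default is declared once on ModelTesterMixin so that only test classes for models with custom attention processors need to override it (the second class name is hypothetical):

import unittest


class ModelTesterMixin:
    uses_custom_attn_processor = False  # sensible default: most models use the standard processors


class CustomAttnProcessorModelTests(ModelTesterMixin, unittest.TestCase):  # hypothetical class name
    uses_custom_attn_processor = True  # opts this model out of test_set_attn_processor_for_determinism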

@DN6 (Collaborator, Author) replied:

Good point. I'll update.

@DN6 merged commit f6f16a0 into main on Sep 3, 2024
14 checks passed
sayakpaul pushed a commit that referenced this pull request Dec 23, 2024
* update

* update

* update

* update