
Conversation

sayakpaul (Member)

What does this PR do?

Hopefully, the last ones.

assert deprecated_warning_msg in str(cap_logger), "Deprecation warning not found in logs"

@pytest.mark.xfail(condition=is_transformers_version(">", "4.56.2"), reason="Some import error", strict=True)
def test_download_safetensors_only_variant_exists_for_model(self):
Collaborator

Perhaps I am missing something, but it appears that this test (and test_download_safetensors_variant_does_not_exist_for_model below) is still marked as failed rather than xfailed in the CI:

FAILED tests/pipelines/test_pipelines.py::DownloadTests::test_download_safetensors_only_variant_exists_for_model
FAILED tests/pipelines/test_pipelines.py::DownloadTests::test_download_safetensors_variant_does_not_exist_for_model

Is this expected?

sayakpaul (Member Author)

Yeah, this is not expected and I'm seeing it for the first time. Let me check.

dg845 (Collaborator) left a comment:

It looks like the following tests are still failing on the CI:

  • tests/pipelines/test_pipelines.py::DownloadTests::test_download_bin_only_variant_exists_for_model
  • tests/pipelines/test_pipelines.py::DownloadTests::test_download_bin_variant_does_not_exist_for_model

These tests also previously failed on the (unrelated) PR #12463, see #12463 (comment). So these tests should probably also be xfail-ed?
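
A minimal sketch of what that could look like, mirroring the marker already applied to the safetensors tests; the import path of is_transformers_version and the reuse of the same condition/reason string are assumptions here, not the actual diff:

import pytest

from diffusers.utils import is_transformers_version  # assumed import path


class DownloadTests:
    # Assumed: same condition and reason string as the existing safetensors markers.
    @pytest.mark.xfail(
        condition=is_transformers_version(">", "4.56.2"),
        reason="Some import error",
        strict=True,
    )
    def test_download_bin_only_variant_exists_for_model(self):
        ...  # existing test body unchanged

    # ... and similarly for test_download_bin_variant_does_not_exist_for_model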

sayakpaul (Member Author)

Hmm, I am not sure why the Transformers version is showing up as 4.57.0-dev here: https://github.com/huggingface/diffusers/actions/runs/18487464650/job/52673652920?pr=12455#step:5:19, whereas it should have been 5.0.0-dev. My plate is a little full at the moment, so I can't dive deeper into it right now.

So, I have made the xfail non-strict, which should hopefully unblock us. But I am open to other suggestions.
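
For reference, a minimal sketch of the non-strict marker, assuming the existing condition and reason stay unchanged; with strict=False an unexpected pass is reported as XPASS instead of failing the job:

import pytest

from diffusers.utils import is_transformers_version  # assumed import path


class DownloadTests:
    @pytest.mark.xfail(
        condition=is_transformers_version(">", "4.56.2"),
        reason="Some import error",
        strict=False,  # non-strict: an unexpected pass (XPASS) no longer fails the run
    )
    def test_download_safetensors_only_variant_exists_for_model(self):
        ...  # test body unchanged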

To double-check, I tried to replicate the exact same CI environment:

  1. docker run -it diffusers/diffusers-pytorch-cpu
  2. Clone diffusers and cd into it.
  3. Check out this PR branch.
  4. Then run these:
    uv pip install -e ".[quality]"
    uv pip uninstall transformers huggingface_hub && uv pip install --prerelease allow -U transformers@git+https://github.com/huggingface/transformers.git
    uv pip uninstall accelerate && uv pip install -U accelerate@git+https://github.com/huggingface/accelerate.git --no-deps
  5. Then run pytest <corresponding_test>.
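
While in that container, one quick sanity check could be to compare the installed transformers version against the marker's threshold directly (a sketch using packaging, assuming is_transformers_version does a comparison along these lines):

from packaging.version import parse

import transformers

installed = parse(transformers.__version__)
threshold = parse("4.56.2")  # threshold used in the xfail condition in this PR

# If this prints True, the xfail condition should be active and the tests are
# expected to be collected as xfail/xpass rather than plain failures.
print(transformers.__version__, installed > threshold)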

