
Make sure DDPM and diffusers can be used without Transformers #5668

Merged: 7 commits merged into main from loaders-import-bug on Nov 7, 2023

Conversation

sayakpaul (Member)

Fixes #5661.

I think we need to make a patch release for this one.

While reviewing, could y'all please go over this comment too and verify things on your ends?
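
For context, a rough repro of #5661, assuming an environment with torch installed but transformers absent (google/ddpm-cat-256 is just an example checkpoint):

```python
# Environment: diffusers 0.22.1 with torch installed, transformers NOT installed.
# DDPM has no Transformers dependency, so this should work, but before this fix
# it raised: NameError: name 'PreTrainedTokenizer' is not defined
from diffusers import DDPMPipeline

pipe = DDPMPipeline.from_pretrained("google/ddpm-cat-256")
image = pipe(num_inference_steps=25).images[0]
```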

@sayakpaul sayakpaul changed the title fix: import bug [WIP] fix: import bug Nov 7, 2023
@sayakpaul sayakpaul marked this pull request as draft November 7, 2023 03:18
HuggingFaceDocBuilderDev commented Nov 7, 2023

The documentation is not available anymore as the PR was closed or merged.

DN6 (Collaborator) commented Nov 7, 2023

@sayakpaul pip install -e ".[test]" installs transformers, which is why tests/others/test_dependencies.py didn't catch it.

sayakpaul (Member, Author)

But I am doing pip install -e . here:

DN6 (Collaborator) commented Nov 7, 2023

Hmm that should have been caught then. Checking the container.

DN6 (Collaborator) commented Nov 7, 2023

Do we even need to import PreTrainedModel and PreTrainedTokenizer if it's just a string that's being used for type hints?

sayakpaul (Member, Author)

> Do we even need to import PreTrainedModel and PreTrainedTokenizer if it's just a string that's being used for type hints?

I think the import is only there for the type hints.
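
For reference, the idiom this PR moves toward keeps the real classes visible to type checkers while never importing transformers at runtime; a minimal sketch (encode_prompt is just an illustrative name):

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only evaluated by static type checkers (mypy, pyright), never at runtime,
    # so a missing transformers installation cannot break this import.
    from transformers import PreTrainedModel, PreTrainedTokenizer


def encode_prompt(text_encoder: "PreTrainedModel", tokenizer: "PreTrainedTokenizer"):
    # The string annotations resolve against the TYPE_CHECKING imports above.
    ...
```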

DN6 (Collaborator) commented Nov 7, 2023

Hmm, so the tests pass because torch isn't installed either, so only dummy objects are being created:
https://github.com/huggingface/diffusers/actions/runs/6780411004/job/18429004601?pr=5671#step:5:20
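
For context, diffusers replaces classes whose backends are missing with auto-generated dummy objects, so a bare import always succeeds and an import-only dependency test passes vacuously. A simplified sketch of the pattern (the generated dummies live in files like utils/dummy_pt_objects.py; details abridged):

```python
# Simplified dummy object: importing the name succeeds even without torch;
# only instantiating it raises, which an import-only test never does.
class DDPMPipeline:
    _backends = ["torch"]

    def __init__(self, *args, **kwargs):
        raise ImportError("DDPMPipeline requires torch, which was not found in your environment.")
```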

sayakpaul (Member, Author)

I wonder if there is a more robust way to test the dependencies here. @DN6, what would you suggest?

DN6 (Collaborator) commented Nov 7, 2023

I think there are a couple of issues:

Models are all imported if torch is available (otherwise we get dummies), but the Mixins can introduce additional dependencies on transformers or other soft dependencies. (Technically, torch is optional too.)

So we should be careful about how we add Mixins to model classes if they are only supposed to depend on a single dependency (torch or jax).

We should also add dedicated workflows for the soft dependencies; one possible shape for such a check is sketched below.
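
A hypothetical sketch of such a dedicated check, assuming a CI environment where diffusers is installed with pip install -e . and no extras (so transformers is absent): run the import in a fresh subprocess so cached modules and dummy objects can't mask a failure.

```python
import subprocess
import sys


def test_diffusers_imports_without_transformers():
    # Assumes torch is installed but transformers is deliberately absent
    # in this environment (pip install -e . with no extras).
    result = subprocess.run(
        [sys.executable, "-c", "from diffusers import DDPMPipeline"],
        capture_output=True,
        text=True,
    )
    assert result.returncode == 0, f"import failed:\n{result.stderr}"
```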

@DN6 DN6 mentioned this pull request Nov 7, 2023
@patrickvonplaten patrickvonplaten changed the title [WIP] fix: import bug Make sure diffusers can be used without Transformers Nov 7, 2023
@patrickvonplaten patrickvonplaten changed the title Make sure diffusers can be used without Transformers Make sure DDPM and diffusers can be used without Transformers Nov 7, 2023
@@ -18,14 +18,15 @@
import torch.nn.functional as F
from torch import nn

from ..loaders import PatchedLoraProjection, text_encoder_attn_modules, text_encoder_mlp_modules

Contributor:

Why do we need this change? I don't think it's necessary, no?

@@ -2390,7 +2390,7 @@ def unfuse_text_encoder_lora(text_encoder):
def set_adapters_for_text_encoder(

Contributor:

Nice! Going forward, let's make sure we always use strings for Transformers type hints.
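
As a minimal illustration of that convention (signature abridged, parameter list simplified), the hunk above amounts to something like:

```python
from typing import TYPE_CHECKING, List, Optional, Union

if TYPE_CHECKING:
    from transformers import PreTrainedModel


def set_adapters_for_text_encoder(
    adapter_names: Union[List[str], str],
    # A string annotation, so transformers is only needed for type checking,
    # never at runtime (previously: Optional[PreTrainedModel]).
    text_encoder: Optional["PreTrainedModel"] = None,
):
    ...
```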

@@ -1,19 +1,40 @@
from typing import TYPE_CHECKING

from ...utils import (
DIFFUSERS_SLOW_IMPORT,

Contributor:

Very nice clean-up!

@@ -1 +1,48 @@
from .pipeline_pixart_alpha import PixArtAlphaPipeline
from typing import TYPE_CHECKING

Contributor:

Very nice clean-up!
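
For context, this moves the pixart __init__.py to the lazy-import structure used across diffusers; a hedged sketch of that pattern:

```python
# Sketch of a lazy pipeline __init__.py following diffusers' _LazyModule convention.
import sys
from typing import TYPE_CHECKING

from ...utils import DIFFUSERS_SLOW_IMPORT, _LazyModule

_import_structure = {"pipeline_pixart_alpha": ["PixArtAlphaPipeline"]}

if TYPE_CHECKING or DIFFUSERS_SLOW_IMPORT:
    from .pipeline_pixart_alpha import PixArtAlphaPipeline
else:
    # Submodules are only imported on first attribute access, so importing
    # diffusers itself never touches soft dependencies like transformers.
    sys.modules[__name__] = _LazyModule(
        __name__, globals()["__file__"], _import_structure, module_spec=__spec__
    )
```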

patrickvonplaten (Contributor)

The PR is good to go for me; just wondering whether we really need this change: #5668 (comment)

@patrickvonplaten patrickvonplaten merged commit 84cd9e8 into main Nov 7, 2023
15 checks passed
@patrickvonplaten patrickvonplaten deleted the loaders-import-bug branch November 7, 2023 16:38
patrickvonplaten added a commit that referenced this pull request Nov 7, 2023
* fix: import bug

* fix

* fix

* fix import utils for lcm

* fix: pixart alpha init

* Fix

---------

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
kashif pushed a commit to kashif/diffusers that referenced this pull request Nov 11, 2023
Make sure DDPM and diffusers can be used without Transformers (huggingface#5668)
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
Make sure DDPM and diffusers can be used without Transformers (huggingface#5668)
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024
Make sure DDPM and diffusers can be used without Transformers (huggingface#5668)
Successfully merging this pull request may close these issues.

name 'PreTrainedTokenizer' is not defined in diffusers 0.22.1