What API design would you like to have changed or added to the library? Why?
The AttentionProcessor type defined in diffusers.models.attention_processor does not cover all attention processor classes. For example, in Stable Diffusion 3, SD3Transformer2DModel uses JointAttnProcessor2_0, yet passing that processor to set_attn_processor on the same class produces a warning because it is missing from the type hint. I therefore believe all processors should be registered in the AttentionProcessor type.
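As a rough sketch of the proposed change (the exact membership of the current alias is abbreviated and illustrative here), the Union alias in attention_processor.py could simply be extended to list the joint processors as well:

from typing import Union

# Sketch: the existing AttentionProcessor alias is a typing.Union of processor
# classes; the idea is to include every processor, e.g. JointAttnProcessor2_0,
# so type hints on set_attn_processor cover them. Membership shown is partial.
AttentionProcessor = Union[
    AttnProcessor,
    AttnProcessor2_0,
    XFormersAttnProcessor,
    JointAttnProcessor2_0,  # currently missing from the alias
    # ... remaining processor classes
]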
What use case would this enable or better enable? Can you give us a code example?
For example:
import torch

from diffusers.models import SD3Transformer2DModel
from diffusers.models.attention_processor import JointAttnProcessor2_0

transformer: SD3Transformer2DModel = SD3Transformer2DModel.from_pretrained(
    'stabilityai/stable-diffusion-3.5-large',
    subfolder='transformer',
    torch_dtype=torch.float16,
)
transformer.set_attn_processor(JointAttnProcessor2_0())
