Add Additional AttentionProcessor Types to Enhance Functionality #9908

@Prgckwb

Description

What API design would you like to have changed or added to the library? Why?
The AttentionProcessor type alias defined in diffusers/models/attention_processor.py does not cover all attention processor classes. For example, in Stable Diffusion 3 the SD3Transformer2DModel uses JointAttnProcessor2_0, but passing that processor to set_attn_processor on the model triggers a warning because the class is missing from the type alias. Therefore, I believe all processors should be registered in the AttentionProcessor type.

What use case would this enable or better enable? Can you give us a code example?
For example:

import torch

from diffusers.models import SD3Transformer2DModel
from diffusers.models.attention_processor import JointAttnProcessor2_0

transformer: SD3Transformer2DModel = SD3Transformer2DModel.from_pretrained(
    'stabilityai/stable-diffusion-3.5-large',
    subfolder='transformer',
    torch_dtype=torch.float16,
)

transformer.set_attn_processor(JointAttnProcessor2_0())

This produces a warning like the one in the screenshot below:
[screenshot SS_2024-11-12_at_16-13-06: warning emitted when calling set_attn_processor with JointAttnProcessor2_0]
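The fix amounts to including every processor class in the Union alias that set_attn_processor checks against. Here is a minimal, self-contained sketch of the pattern; the class names are empty stand-ins for the real diffusers implementations, and the alias names mirror (but are not) the library's own:

```python
from typing import Union, get_args

# Hypothetical stubs standing in for the real classes in
# diffusers.models.attention_processor.
class AttnProcessor2_0: ...
class JointAttnProcessor2_0: ...

# Registering JointAttnProcessor2_0 in the alias means isinstance-style
# checks derived from it will accept the processor instead of warning.
AttentionProcessor = Union[AttnProcessor2_0, JointAttnProcessor2_0]

def is_known_processor(proc) -> bool:
    # get_args(Union[A, B]) returns (A, B), which isinstance accepts
    # as a tuple of allowed types.
    return isinstance(proc, get_args(AttentionProcessor))
```

With JointAttnProcessor2_0 registered, is_known_processor(JointAttnProcessor2_0()) returns True, so a model could validate the processor without emitting the missing-type warning.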
