Error in loading flux2 klein lora adapter #13547

@saeedkhanehgir

Description

Hi,

I trained my LoRA with the DiffSynth repo for the FLUX.2 Klein model. When I run inference with the code below, I get a warning, and I think this problem is important.

Inference code:


```python
import torch
from diffusers import Flux2KleinPipeline
from glob import glob
from PIL import Image

device = "cuda"
dtype = torch.bfloat16

model_path = "./weights/flux2-klein-9b"
pipe = Flux2KleinPipeline.from_pretrained(model_path, torch_dtype=dtype)
pipe.load_lora_weights(
    "./FLUX.2-klein-9B_lora",
    weight_name="epoch-5.safetensors"
)
pipe.fuse_lora(lora_scale=1.0)

pipe.enable_sequential_cpu_offload()  # save some VRAM by offloading the model to CPU

prompt = """You are an expert image adder."""

images_path = glob("./images/*")
for image_path in images_path:
    image = Image.open(image_path)
    image_name = image_path.split('/')[-1]
    image = pipe(
        prompt=prompt,
        image=image,
        height=1024,
        width=1024,
        guidance_scale=1.0,
        num_inference_steps=4,
        generator=torch.Generator(device=device).manual_seed(0)
    ).images[0]
    image.save(f"./output/{image_name}")
```

Warning:

```
No LoRA keys associated to Flux2Transformer2DModel found with the prefix='transformer'. This is safe to ignore if LoRA state dict didn't originally have any Flux2Transformer2DModel related params. You can also try specifying prefix=None to resolve the warning.
```

Can someone help me?
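For context, the warning fires when `load_lora_weights` finds no state-dict keys under the `transformer.` prefix, which typically means the checkpoint was saved with a different key naming convention than diffusers expects. A minimal self-contained sketch of that check, using hypothetical key names (the actual names in a DiffSynth checkpoint may differ):

```python
# Hypothetical key names illustrating the mismatch: a trainer may save LoRA
# keys without the "transformer." prefix that diffusers' loader scans for,
# so the loader finds zero matching keys and emits the warning.
unprefixed_keys = [
    "blocks.0.attn.to_q.lora_A.weight",   # hypothetical unprefixed naming
    "blocks.0.attn.to_q.lora_B.weight",
]
diffusers_style_keys = [
    "transformer.blocks.0.attn.to_q.lora_A.weight",  # what the loader looks for
    "transformer.blocks.0.attn.to_q.lora_B.weight",
]

def matches_prefix(keys, prefix="transformer"):
    # Mirrors the check behind the warning: is any key under the given prefix?
    return any(k.startswith(f"{prefix}.") for k in keys)

print(matches_prefix(unprefixed_keys))      # False -> warning fires, LoRA not applied
print(matches_prefix(diffusers_style_keys)) # True  -> keys would be picked up
```

Printing the keys of your `epoch-5.safetensors` file and comparing them against this prefix would show which convention your checkpoint uses.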
