[Remote code] Add functionality to run remote models, schedulers, pipelines #5472
Conversation
hub_repo_id=None,
hub_revision=None,
class_name=None,
cache_dir=None,
revision=None,
I suggest we keep these as **kwargs. Instead of hub_repo_id or hub_revision, we could also opt for repo_id or revision, respectively following what we do across the library for Hub related utilities.
We need to keep hub_revision, as the revision argument is already used for Git revisions.
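To illustrate the distinction, here is a hypothetical sketch (not the PR's actual implementation): revision pins the Git revision of the checkpoint repo, while hub-related options such as hub_revision travel through **kwargs and select the revision of the repo hosting the remote code.

```python
# Hypothetical sketch of how the two revision arguments could be routed.
# `revision` stays a named argument (Git revision of the checkpoint repo),
# while `hub_revision` is pulled out of **kwargs (revision of the remote-code repo).
def from_pretrained(repo_id, revision=None, **kwargs):
    hub_revision = kwargs.pop("hub_revision", None)
    return {
        "repo_id": repo_id,
        "checkpoint_revision": revision,
        "remote_code_revision": hub_revision,
    }

resolved = from_pretrained("org/pipe", revision="main", hub_revision="v2")
```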
tests/pipelines/test_pipelines.py
Outdated
# Check that only loading custom components "my_unet", "my_scheduler" and explicit custom pipeline works
pipeline = DiffusionPipeline.from_pretrained(
    "/home/patrick/tiny-stable-diffusion-xl-pipe", custom_pipeline="my_pipeline"
The checkpoint path needs to be changed; it currently points to a local directory.
sayakpaul
left a comment
Wow, that was pretty clean!
Can we also add a nice doc? I believe this will be a very powerful feature. We could readily test it out with Show-1: https://github.com/showlab/Show-1.
Also, can we add a test to see if a custom pipeline (loaded with trust_remote_code=True) can work seamlessly with a legacy component from the library?
For example:
from diffusers import DiffusionPipeline, UniPCMultistepScheduler

pipeline = DiffusionPipeline.from_pretrained("hf-internal-testing/tiny-sdxl-custom-components", trust_remote_code=True)
pipeline.scheduler = UniPCMultistepScheduler.from_config(pipeline.scheduler.config)
pipeline = pipeline.to(torch_device)
images = pipeline("test", num_inference_steps=2, output_type="np")[0]
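The scheduler swap above relies on the from_config pattern: the replacement component is constructed from the old component's config. A minimal standalone sketch of that pattern, using hypothetical classes rather than the actual diffusers schedulers:

```python
# Sketch of the `from_config` component-swap pattern: build a replacement
# component from an existing component's hyperparameter config.
class EulerLikeScheduler:
    def __init__(self, num_train_timesteps=1000):
        self.num_train_timesteps = num_train_timesteps
        self.config = {"num_train_timesteps": num_train_timesteps}

class UniPCLikeScheduler:
    def __init__(self, num_train_timesteps=1000):
        self.num_train_timesteps = num_train_timesteps

    @classmethod
    def from_config(cls, config):
        # reuse the old scheduler's hyperparameters for the new one
        return cls(**config)

old = EulerLikeScheduler(num_train_timesteps=25)
swapped = UniPCLikeScheduler.from_config(old.config)
```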
Hmm, I guess it's okay for the first shipment, but it won't support loading a custom UNet in isolation, I think. What if I only have a model repo on the Hub, similar to the ones you have for Transformers? I guess for that we need Autoclasses.
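Conceptually, remote-code loading boils down to importing a module and resolving a class by name, in the spirit of transformers' dynamic-module utilities. A hypothetical sketch, using a stdlib module as a stand-in for downloaded code:

```python
import importlib

def get_class_from_module(module_name, class_name):
    # Resolve a class by name from an importable module. With real remote
    # code, the module file would first be downloaded from the Hub and
    # cached locally before being imported like this.
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# Stand-in for resolving e.g. a custom UNet class from a downloaded script:
cls = get_class_from_module("collections", "OrderedDict")
```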
Facing an issue and proposed a solution here: #5491.
Have been playing with this PR to support a bit more complicated pipelines, such as https://huggingface.co/showlab/show-1-base. Took a while to understand how the pipeline repository should be structured, so documenting everything here. If your pipeline has custom components, each one needs its own implementation file in the repository. In the case of Show-1, we first load the components that are already available in transformers and diffusers:
from transformers import T5Tokenizer, T5EncoderModel, CLIPFeatureExtractor
from diffusers import DPMSolverMultistepScheduler

pipe_id = "showlab/show-1-base"
tokenizer = T5Tokenizer.from_pretrained(pipe_id, subfolder="tokenizer")
text_encoder = T5EncoderModel.from_pretrained(pipe_id, subfolder="text_encoder")
scheduler = DPMSolverMultistepScheduler.from_pretrained(pipe_id, subfolder="scheduler")
feature_extractor = CLIPFeatureExtractor.from_pretrained(pipe_id, subfolder="feature_extractor")

Now, we need to implement the custom UNet. It's already available here: https://github.com/showlab/Show-1/blob/main/showone/models/unet_3d_condition.py. So, we create a Python script called showone_unet_3d_condition.py containing the ShowOneUNet3DConditionModel implementation. Once this is done, we can initialize the UNet:

from showone_unet_3d_condition import ShowOneUNet3DConditionModel

unet = ShowOneUNet3DConditionModel.from_pretrained(pipe_id, subfolder="unet")

And then we implement the custom pipeline class in another Python script. Now that we have all the components, we can fully initialize the pipeline. For sharing with others, we can push this pipeline to the Hub:

pipeline.push_to_hub("custom-t2v-pipeline")

After the pipeline is successfully pushed, we need to perform a couple of changes so that the custom code files are included in the pushed repository and referenced from its metadata.
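To show how the repo metadata ties the components together, here is a rough, hypothetical sketch of what such a pipeline's model_index.json could look like (the pipeline class name and version are assumptions, not taken from the actual repo): each component maps to the module and class that implement it, with custom components pointing at the uploaded Python files instead of a library.

```json
{
  "_class_name": "ShowOnePipeline",
  "_diffusers_version": "0.22.0",
  "tokenizer": ["transformers", "T5Tokenizer"],
  "text_encoder": ["transformers", "T5EncoderModel"],
  "feature_extractor": ["transformers", "CLIPFeatureExtractor"],
  "scheduler": ["diffusers", "DPMSolverMultistepScheduler"],
  "unet": ["showone_unet_3d_condition", "ShowOneUNet3DConditionModel"]
}
```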
Then we're ready for inference:

import torch
from diffusers import DiffusionPipeline

pipeline = DiffusionPipeline.from_pretrained(
    "sayakpaul/show-1-base-with-code", trust_remote_code=True, torch_dtype=torch.float16
).to("cuda")

prompt = "hello"

# Text embeds
prompt_embeds, negative_embeds = pipeline.encode_prompt(prompt)

# Keyframes generation (8x64x40, 2fps)
video_frames = pipeline(
    prompt_embeds=prompt_embeds,
    negative_prompt_embeds=negative_embeds,
    num_frames=8,
    height=40,
    width=64,
    num_inference_steps=2,
    guidance_scale=9.0,
    output_type="pt",
).frames
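With output_type="pt", the returned frames are float tensors with values in [0, 1]. A small, hypothetical post-processing sketch (using numpy arrays as stand-ins for the tensors) for converting them to 8-bit pixel values:

```python
import numpy as np

def frames_to_uint8(frames):
    # Clamp to [0, 1] and scale to 8-bit pixel values.
    arr = np.asarray(frames, dtype=np.float32)
    arr = np.clip(arr, 0.0, 1.0)
    return (arr * 255.0).round().astype("uint8")

# e.g. 8 frames of 40x64 RGB video, as in the keyframe generation above
dummy = np.random.rand(8, 40, 64, 3)
pixels = frames_to_uint8(dummy)
```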
…ingface/diffusers into add_custom_remote_pipelines
That's a great summary! Maybe we could make this a doc page in a follow-up PR? :-)

@patrickvonplaten if you could look into #5491 before merging. Happy to create a doc after that.

The documentation is not available anymore as the PR was closed or merged.

@sayakpaul Think it's ready for a final review :-)
sayakpaul
left a comment
Okay for me to merge.
I will do a follow-up after merge.
…elines (huggingface#5472) * upload custom remote poc * up * make style * finish * better name * Apply suggestions from code review * Update tests/pipelines/test_pipelines.py * more fixes * remove ipdb * more fixes * fix more * finish tests --------- Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
What does this PR do?
This PR adds functionality to run remote models, schedulers, and pipelines.
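The safety model behind this is the same as in transformers: Hub-hosted code is only executed when the caller opts in. A hypothetical sketch of that gate (not the PR's actual code):

```python
def resolve_pipeline_class(has_remote_code, trust_remote_code):
    # Refuse to execute Hub-hosted code unless the caller explicitly allows it.
    if has_remote_code and not trust_remote_code:
        raise ValueError(
            "This pipeline contains custom code; pass `trust_remote_code=True` "
            "to allow executing it."
        )
    # Otherwise, load either the remote class or a built-in library class.
    return "remote" if has_remote_code else "library"
```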
You can try it out by looking at the tests and how these two example repos are structured: