Conversation

@lstein (Collaborator) commented Apr 5, 2023

This PR introduces a new set of ModelManager methods that enable you to retrieve the individual parts of a Stable Diffusion pipeline model, including the vae, text_encoder, unet, tokenizer, etc.

To use:

```
from invokeai.backend import ModelManager

manager = ModelManager('/path/to/models.yaml')

# get the VAE
vae = manager.get_model_vae('stable-diffusion-1.5')

# get the unet
unet = manager.get_model_unet('stable-diffusion-1.5')

# get the tokenizer
tokenizer = manager.get_model_tokenizer('stable-diffusion-1.5')

# etc etc
feature_extractor = manager.get_model_feature_extractor('stable-diffusion-1.5')
scheduler = manager.get_model_scheduler('stable-diffusion-1.5')
text_encoder = manager.get_model_text_encoder('stable-diffusion-1.5')

# if no model name is provided, defaults to the model currently loaded in the GPU, if any
vae = manager.get_model_vae()
```
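
For illustration, here is one way the retrieved parts could be reassembled into a standalone diffusers pipeline. This is a sketch, not part of the PR: it assumes the components returned by ModelManager are the standard diffusers/transformers objects that StableDiffusionPipeline expects.

```
from diffusers import StableDiffusionPipeline

from invokeai.backend import ModelManager

manager = ModelManager('/path/to/models.yaml')
name = 'stable-diffusion-1.5'

# reassemble the individually retrieved parts into a full pipeline
pipe = StableDiffusionPipeline(
    vae=manager.get_model_vae(name),
    text_encoder=manager.get_model_text_encoder(name),
    tokenizer=manager.get_model_tokenizer(name),
    unet=manager.get_model_unet(name),
    scheduler=manager.get_model_scheduler(name),
    safety_checker=None,  # safety checker omitted in this sketch
    feature_extractor=manager.get_model_feature_extractor(name),
    requires_safety_checker=False,
)

image = pipe("a watercolor painting of a lighthouse").images[0]
```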

…parts

- New method is ModelManager.get_sub_model(model_name: str, model_part: SDModelComponent)

To use:

```
from invokeai.backend import ModelManager, SDModelComponent as sdmc
manager = ModelManager('/path/to/models.yaml')
vae = manager.get_sub_model('stable-diffusion-1.5', sdmc.vae)
```
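
The PR does not show the internals here, but one natural reading is that the per-part getters above are thin wrappers over get_sub_model. The sketch below illustrates that idea; the SDModelComponent members and the wrapper bodies are assumptions for illustration, not the actual implementation.

```
from enum import Enum

# hypothetical sketch -- the real SDModelComponent lives in invokeai.backend
# and its members may differ
class SDModelComponent(str, Enum):
    vae = "vae"
    text_encoder = "text_encoder"
    tokenizer = "tokenizer"
    unet = "unet"
    scheduler = "scheduler"
    feature_extractor = "feature_extractor"

class ModelManager:
    def get_sub_model(self, model_name: str, model_part: SDModelComponent):
        """Return the requested part of the named pipeline (actual logic lives in invokeai)."""
        ...

    # the per-part convenience methods can then simply delegate to get_sub_model
    def get_model_vae(self, model_name: str = None):
        return self.get_sub_model(model_name, SDModelComponent.vae)

    def get_model_unet(self, model_name: str = None):
        return self.get_sub_model(model_name, SDModelComponent.unet)
```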
@blessedcoolant (Collaborator) commented

Code reads fine to me. Pretty straightforward. Do I need to test any use cases here?

@lstein (Collaborator, Author) commented Apr 6, 2023

> Code reads fine to me. Pretty straightforward. Do I need to test any use cases here?

I don't think so. I ran through all the test cases and didn't find any problems.

@lstein requested a review from Kyle0654 April 7, 2023 02:03
@lstein enabled auto-merge April 7, 2023 02:03
@lstein merged commit e5f8b22 into main Apr 7, 2023
@lstein deleted the feat/return-submodels branch April 7, 2023 05:39