
[Bug]: Setting "default_loras": [] in config.txt disables loras functionality in UI #2427

Closed
VictorZakharov opened this issue Mar 3, 2024 · 4 comments · Fixed by #2430
Labels: bug (Something isn't working)

VictorZakharov commented Mar 3, 2024

Checklist

  • The issue exists on a clean installation of Fooocus
  • The issue exists in the current version of Fooocus
  • The issue has not been reported before recently
  • The issue has been reported before but has not been fixed yet

What happened?

Somehow I had this in my Fooocus config; I was updating from a previous version.

"default_loras": [],

In the old version of Fooocus, this did not prevent the LoRAs portion of the UI from showing, but in 2.2.0 it did.
At first I did not realize the issue was config related, but after some discussion with @mashb1t it turned out to be entirely config related. Easy fix, but the behavior was somewhat unexpected.

Steps to reproduce the problem

Add this to config.txt on Fooocus 2.2.0 and restart Fooocus:

"default_loras": [],

What should have happened?

The LoRAs UI should still be displayed, with all LoRA slots shown as empty / not selected.

What browsers do you use to access Fooocus?

Google Chrome

Where are you running Fooocus?

Locally

What operating system are you using?

Windows 10

Console logs

[System ARGV] ['G:\\Git\\StabilityMatrix\\Packages\\Fooocus\\launch.py']
Python 3.10.11 (tags/v3.10.11:7d4cc5a, Apr  5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
Fooocus version: 2.2.0
Total VRAM 12282 MB, total RAM 130983 MB
xformers version: 0.0.22.post4
Set vram state to: NORMAL_VRAM
Always offload VRAM
Device: cuda:0 NVIDIA GeForce RTX 4070 Ti : native
VAE dtype: torch.bfloat16
Using xformers cross attention
Refiner unloaded.
Running on local URL:  http://127.0.0.1:7865

To create a public link, set `share=True` in `launch()`.
model_type EPS
UNet ADM Dimension 2816
Using xformers attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using xformers attention in VAE
extra {'cond_stage_model.clip_g.logit_scale', 'cond_stage_model.clip_l.logit_scale', 'cond_stage_model.clip_l.transformer.text_model.embeddings.position_ids', 'cond_stage_model.clip_g.transformer.text_model.embeddings.position_ids', 'cond_stage_model.clip_l.text_projection'}
Base model loaded: G:\Git\StabilityMatrix\Models\StableDiffusion\juggernautXL_v9Rundiffusionphoto2.safetensors
Request to load LoRAs [] for model [G:\Git\StabilityMatrix\Models\StableDiffusion\juggernautXL_v9Rundiffusionphoto2.safetensors].
Fooocus V2 Expansion: Vocab with 642 words.
Fooocus Expansion engine loaded for cuda:0, use_fp16 = True.
Requested to load SDXLClipModel
Requested to load GPT2LMHeadModel
Loading 2 new models
[Fooocus Model Management] Moving model(s) has taken 0.29 seconds
Started worker with PID 3960
App started successful. Use the app with http://127.0.0.1:7865/ or 127.0.0.1:7865

Additional information

I have not updated my GPU driver recently. The only thing that changed was updating Fooocus to 2.2.0.

VictorZakharov added the bug (Something isn't working) and triage (This needs an (initial) review) labels Mar 3, 2024
mashb1t removed the triage (This needs an (initial) review) label Mar 3, 2024
mashb1t (Collaborator) commented Mar 3, 2024

This could be prevented by adding "default_max_lora_number": 5, to the default.json preset, but generally speaking default.json already includes default_loras, which is the fallback when default_max_lora_number is not set.
Not sure if this is really a bug or only an individual issue. Let's re-evaluate in a few days.
Thanks for reporting!
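
To illustrate the mechanism, here is a minimal sketch of how a config loader could derive the number of LoRA slots, assuming default_max_lora_number takes precedence and the length of default_loras is the fallback. Function and variable names here are hypothetical, not Fooocus's actual code:

import json

# Hypothetical loader sketch; the key names match config.txt,
# but the logic is an assumption, not Fooocus's implementation.
def resolve_lora_slot_count(config_path="config.txt"):
    with open(config_path, encoding="utf-8") as f:
        config = json.load(f)
    # Preferred: an explicit slot count.
    if "default_max_lora_number" in config:
        return config["default_max_lora_number"]
    # Fallback: infer the slot count from default_loras.
    # An empty list yields 0 slots, hiding the LoRAs UI --
    # the behavior reported in this issue.
    return len(config.get("default_loras", []))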

VictorZakharov (Author) commented Mar 3, 2024

@mashb1t Even if it's an edge case, if setting it to [] is not invalid, then Fooocus should handle it. And if the current behavior is intended, the UI should offer a way to re-enable LoRAs, because the logical connection from [] to a missing LoRAs section in the UI is not obvious. I would also argue that for any invalid config, the UI should revert to its default state: if no config option is meant to hide the LoRAs UI, there should be no way for it to end up hidden. Non-critical, but still a bug.

mashb1t (Collaborator) commented Mar 3, 2024

@VictorZakharov I understand your point and may consider it.
Still, setting default_loras to [] instead of an array of "None" entries is kinda nonsense.

mashb1t (Collaborator) commented Mar 3, 2024

@VictorZakharov I've added a fallback value for when the list of default_loras is empty. See #2430
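
A minimal sketch of that kind of empty-list fallback, assuming [name, weight] entries and 5 slots by default (both assumptions; see #2430 for the actual change):

# Hypothetical sketch; not the literal change from #2430.
def with_lora_fallback(default_loras, max_lora_number=5):
    if not default_loras:
        # Pad with disabled "None" entries so the UI still renders slots.
        default_loras = [["None", 1.0] for _ in range(max_lora_number)]
    return default_loras

print(with_lora_fallback([]))  # five empty slots instead of a hidden UI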

mashb1t closed this as completed Mar 3, 2024