
OSError: runwayml/stable-diffusion-v1-5 does not appear to have a file named config.json. #16

Open
wudabingm opened this issue Apr 2, 2024 · 8 comments


@wudabingm

Traceback (most recent call last):
File "/home/lhs/project/nerf...wu/depth-fm-main/inference.py", line 113, in <module>
main(args)
File "/home/lhs/project/nerf...wu/depth-fm-main/inference.py", line 64, in main
model = DepthFM(args.ckpt)
^^^^^^^^^^^^^^^^^^
File "/home/lhs/project/nerf...wu/depth-fm-main/depthfm/dfm.py", line 21, in __init__
self.vae = AutoencoderKL.from_pretrained(vae_id, subfolder="vae")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lhs/.conda/envs/depthfm/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 119, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/lhs/.conda/envs/depthfm/lib/python3.11/site-packages/diffusers/models/modeling_utils.py", line 569, in from_pretrained
config, unused_kwargs, commit_hash = cls.load_config(
^^^^^^^^^^^^^^^^
File "/home/lhs/.conda/envs/depthfm/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 119, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/lhs/.conda/envs/depthfm/lib/python3.11/site-packages/diffusers/configuration_utils.py", line 402, in load_config
raise EnvironmentError(
OSError: runwayml/stable-diffusion-v1-5 does not appear to have a file named config.json.

@GaoLL1026

I also ran into this problem. Has it been solved?

@Xudong-Mao

The problem may have been resolved by using Google Colab, but I encountered the same issue when attempting to run it locally.

@wudabingm (Author)

> The problem may have been resolved by using Google Colab, but I encountered the same issue when attempting to run it locally.

Could you tell me how to run it on Google Colab? Also, about the config error: I can run it locally, but not on the server.

@wudabingm (Author)

> I also ran into this problem. Has it been solved?

Not solved yet, my friend.

@Xudong-Mao

> Could you tell me how to run it on Google Colab? Also, about the config error: I can run it locally, but not on the server.

Just look at the code in the inference.ipynb file, change `dev = 'cuda:4'` to `dev = 'cuda:0'`, and execute all the code from start to finish.

@wudabingm (Author)

> Just look at the code in the inference.ipynb file, change `dev = 'cuda:4'` to `dev = 'cuda:0'`, and execute all the code from start to finish.

But I hit the same error at the load-model step in inference.ipynb: runwayml/stable-diffusion-v1-5 does not appear to have a file named config.json.

@gopin95 commented Apr 17, 2024

Has anyone solved this?

@woshiwahah commented Apr 30, 2024

> Has anyone solved this?

HF_ENDPOINT=https://hf-mirror.com python inference.py --num_steps 2 --ensemble_size 4 --img assets/dog.png --ckpt checkpoints/depthfm-v1.ckpt

Running the command with this prefix works; the error is caused by an unstable connection to Hugging Face.
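The same mirror workaround can also be applied from inside a script or notebook instead of the shell. A minimal sketch, assuming the mirror URL from the comment above; note that some versions of huggingface_hub read the endpoint at import time, so the variable must be set before any Hugging Face imports:

```python
import os

# Point huggingface_hub at a mirror BEFORE importing huggingface_hub,
# diffusers, or depthfm, since the endpoint may be read at import time.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

# Only after setting the endpoint (sketch, not executed here):
# from depthfm import DepthFM
# model = DepthFM("checkpoints/depthfm-v1.ckpt")
```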
