
Hi, thank you for your work. I have no idea why this error occurs #3

justinday123 opened this issue Mar 11, 2024 · 3 comments

@justinday123
```
Traceback (most recent call last):
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 270, in hf_raise_for_status
    response.raise_for_status()
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/checkpoints/stable-diffusion-2-1-base/resolve/main/tokenizer/tokenizer_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/transformers/utils/hub.py", line 430, in cached_file
    resolved_file = hf_hub_download(
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1374, in hf_hub_download
    raise head_call_error
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1247, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1624, in get_hf_file_metadata
    r = _request_wrapper(
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 402, in _request_wrapper
    response = _request_wrapper(
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 426, in _request_wrapper
    hf_raise_for_status(response)
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 320, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 404 Client Error. (Request ID: Root=1-65ef52c7-40b3d3317d5897a76a3b514d;7b602761-da5b-451b-9a6f-8b425e3bbf99)

Repository Not Found for url: https://huggingface.co/checkpoints/stable-diffusion-2-1-base/resolve/main/tokenizer/tokenizer_config.json.
Please make sure you specified the correct repo_id and repo_type.
If you are trying to access a private or gated repo, make sure you are authenticated.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "inference.py", line 42, in <module>
    tokenizer = CLIPTokenizer.from_pretrained(args.sd_path, subfolder="tokenizer")
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1947, in from_pretrained
    resolved_config_file = cached_file(
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/transformers/utils/hub.py", line 451, in cached_file
    raise EnvironmentError(
OSError: checkpoints/stable-diffusion-2-1-base is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>
```

Can you check it? Thanks.
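For context, `from_pretrained` only treats the path as a local folder if that folder actually exists; otherwise the string is sent to the Hub as a repo id, which is why the 404 points at `https://huggingface.co/checkpoints/...`. A minimal sketch of that fallback, using the path from the traceback above:

```python
import os

# value passed as args.sd_path in inference.py
sd_path = "checkpoints/stable-diffusion-2-1-base"

if os.path.isdir(sd_path):
    source = f"local folder: {sd_path}"
else:
    # with no local folder present, the string is interpreted as a
    # Hub repo id, producing the 404 seen in the traceback
    source = f"https://huggingface.co/{sd_path}"

print(source)
```

So the error simply means the folder `checkpoints/stable-diffusion-2-1-base` does not exist (or is not visible from the working directory) when `inference.py` runs.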

@justinday123 (Author)

Also, if I create the directory checkpoints/stable-diffusion-2-1-base and move Stable Diffusion's .pt file into it, a different error occurs:

```
Traceback (most recent call last):
  File "inference.py", line 42, in <module>
    tokenizer = CLIPTokenizer.from_pretrained(args.sd_path, subfolder="tokenizer")
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2008, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'checkpoints/stable-diffusion-2-1-base'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'checkpoints/stable-diffusion-2-1-base' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.
```
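A single `.pt`/`.ckpt` file is not enough here: `CLIPTokenizer.from_pretrained(..., subfolder="tokenizer")` expects the full diffusers-style repository layout on disk. A quick sanity check for the tokenizer files (a sketch; the exact file list is an assumption based on the stable-diffusion-2-1-base repo layout on the Hub):

```python
from pathlib import Path

root = Path("checkpoints/stable-diffusion-2-1-base")

# files the tokenizer loader looks for under the "tokenizer" subfolder
required = [
    "tokenizer/tokenizer_config.json",
    "tokenizer/vocab.json",
    "tokenizer/merges.txt",
    "tokenizer/special_tokens_map.json",
]

# a directory containing only a .pt checkpoint leaves all of these missing
missing = [f for f in required if not (root / f).is_file()]
print("missing:", missing)
```

If any of these are reported missing, the `OSError` above is expected.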

@yrcong (Owner)

yrcong commented Mar 11, 2024

Hi,
it seems something is wrong with your SD checkpoint. I am not sure where you downloaded the Stable Diffusion .pt from.

Please download SD 2.1 directly from Hugging Face (https://huggingface.co/stabilityai/stable-diffusion-2-1-base) with:

```
git lfs install
git clone https://huggingface.co/stabilityai/stable-diffusion-2-1-base
```

In any case, please make sure you can sample images with SD first.

@justinday123 (Author)

When I put "v2-1_512-ema-pruned.ckpt" into checkpoints/stable-diffusion-2-1-base, the next error occurs:
```
/workspace/jaewoong/FLATTEN/flatten/models/pipeline_flatten.py:37: FutureWarning: Importing DiffusionPipeline or ImagePipelineOutput from diffusers.pipeline_utils is deprecated. Please import from diffusers.pipelines.pipeline_utils instead.
  from diffusers.pipeline_utils import DiffusionPipeline
/opt/conda/envs/flatten/lib/python3.8/site-packages/diffusers/models/cross_attention.py:30: FutureWarning: Importing from cross_attention is deprecated. Please import from diffusers.models.attention_processor instead.
  deprecate(
Traceback (most recent call last):
  File "inference.py", line 42, in <module>
    tokenizer = CLIPTokenizer.from_pretrained(args.sd_path, subfolder="tokenizer")
  File "/opt/conda/envs/flatten/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2008, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'checkpoints/stable-diffusion-2-1-base'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'checkpoints/stable-diffusion-2-1-base' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.
```
