OSError: [Errno 28] No space left on device #9
I uninstalled it from the I: drive and reinstalled it on the C: drive. I have 343 GB of free space, and the log shows:

2024-03-02 22:48:04.079 | INFO | iopaint.runtime:setup_model_dir:82 - Model directory: C:\Users\mdmen.cache
2024-03-02 22:48:05.901 | INFO | iopaint.cli:start:153 - Image will be saved to C:\IOPaint\IOPaint-v1\IOOUT
I'm sorry for the inconvenience. The only related issue I found in huggingface_hub is huggingface/huggingface_hub#1498, and the solution there was also to free up disk space. Could you please check whether there are a lot of files in the Recycle Bin? (Windows Explorer doesn't seem to count files in the Recycle Bin as taking up space on the C: drive, but in reality they do.)
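As a quick sanity check, the small Python sketch below (not part of IOPaint; the drive letters are only examples) prints the free space the download code actually sees on each drive, which can differ from what Explorer suggests when the Recycle Bin is holding large files:

```python
import shutil

# Print free space as the OS reports it for each drive of interest.
# The drive letters are examples; adjust them to wherever IOPaint
# stores its models and outputs on your machine.
for drive in ("C:\\", "I:\\"):
    try:
        usage = shutil.disk_usage(drive)
        print(f"{drive} free: {usage.free / 1024**3:.1f} GiB "
              f"of {usage.total / 1024**3:.1f} GiB")
    except OSError as exc:
        print(f"{drive} not available: {exc}")
```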
I have 365 GB left on my I: drive and keep getting this "no space left on device" error:

2024-03-02 19:25:31.976 | INFO | iopaint.runtime:setup_model_dir:82 - Model directory: I:\IOPaint-v1\Models
2024-03-02 19:25:33.539 | INFO | iopaint.cli:start:153 - Image will be saved to I:\IOPaint-v1\Theoutputs
{
"host": "127.0.0.1",
"port": 8080,
"inbrowser": true,
"model": "diffusionbee/fooocus_inpainting",
"no_half": false,
"low_mem": false,
"cpu_offload": false,
"disable_nsfw_checker": true,
"local_files_only": false,
"cpu_textencoder": false,
"device": "cuda",
"input": null,
"output_dir": "I:\IOPaint-v1\Theoutputs",
"quality": 95,
"enable_interactive_seg": true,
"interactive_seg_model": "vit_b",
"interactive_seg_device": "cuda",
"enable_remove_bg": true,
"remove_bg_model": "briaai/RMBG-1.4",
"enable_anime_seg": false,
"enable_realesrgan": true,
"realesrgan_device": "cuda",
"realesrgan_model": "RealESRGAN_x4plus",
"enable_gfpgan": true,
"gfpgan_device": "cuda",
"enable_restoreformer": true,
"restoreformer_device": "cuda"
}
2024-03-02 19:25:35.095 | INFO | iopaint.plugins:build_plugins:32 - Initialize InteractiveSeg plugin
2024-03-02 19:25:35.095 | INFO | iopaint.plugins.interactive_seg:_init_session:62 - SegmentAnything model path: I:\IOPaint-v1\Models\torch\hub\checkpoints\sam_vit_b_01ec64.pth
2024-03-02 19:25:35.907 | INFO | iopaint.plugins:build_plugins:38 - Initialize RemoveBG plugin
2024-03-02 19:25:37.009 | INFO | iopaint.plugins:build_plugins:46 - Initialize RealESRGAN plugin: RealESRGAN_x4plus, cuda
2024-03-02 19:25:37.529 | INFO | iopaint.plugins.realesrgan:_init_model:73 - RealESRGAN model path: I:\IOPaint-v1\Models\torch\hub\checkpoints\RealESRGAN_x4plus.pth
2024-03-02 19:25:37.929 | INFO | iopaint.plugins:build_plugins:56 - Initialize GFPGAN plugin
2024-03-02 19:25:37.929 | INFO | iopaint.plugins:build_plugins:58 - Use realesrgan as GFPGAN background upscaler
2024-03-02 19:25:37.959 | INFO | iopaint.plugins.gfpgan_plugin:init:21 - GFPGAN model path: I:\IOPaint-v1\Models\torch\hub\checkpoints\GFPGANv1.4.pth
2024-03-02 19:25:39.001 | INFO | iopaint.plugins:build_plugins:69 - Initialize RestoreFormer plugin
2024-03-02 19:25:39.001 | INFO | iopaint.plugins.restoreformer:init:21 - RestoreFormer model path: I:\IOPaint-v1\Models\torch\hub\checkpoints\RestoreFormer.pth
Working with z of shape (1, 256, 16, 16) = 65536 dimensions.
2024-03-02 19:25:39.792 | INFO | iopaint.model_manager:init_model:38 - Loading model: diffusionbee/fooocus_inpainting
2024-03-02 19:25:40.162 | INFO | iopaint.model.utils:handle_from_pretrained_exceptions:985 - variant=fp16 not found, try revision=fp16
Couldn't connect to the Hub: 404 Client Error. (Request ID: Root=1-65e3c384-5d0f61961107b3d0375ea0dc;9bf5befe-fbbb-49e9-842e-875c2cfabf08)
Revision Not Found for url: https://huggingface.co/api/models/diffusionbee/fooocus_inpainting/revision/fp16.
Invalid rev id: fp16.
Will try to load from local cache.
2024-03-02 19:25:40.192 | INFO | iopaint.model.utils:handle_from_pretrained_exceptions:989 - revision=fp16 not found, try revision=main
unet\diffusion_pytorch_model.safetensors not found
unet/diffusion_pytorch_model.bin: 0%| | 10.5M/5.14G [00:00<01:57, 43.6MB/s]
text_encoder_2/pytorch_model.bin: 1%|▎ | 10.5M/1.39G [00:00<00:43, 31.6MB/s]
text_encoder/pytorch_model.bin: 4%|█▉ | 10.5M/246M [00:00<00:11, 21.3MB/s]
Fetching 17 files: 18%|███████████▎ | 3/17 [00:00<00:03, 4.03it/s]
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\model\utils.py:982 in │
│ handle_from_pretrained_exceptions │
│ │
│ 979 │
│ 980 def handle_from_pretrained_exceptions(func, **kwargs): │
│ 981 │ try: │
│ ❱ 982 │ │ return func(**kwargs) │
│ 983 │ except ValueError as e: │
│ 984 │ │ if "You are trying to load the model files of the `variant=fp16`" in str(e): │
│ 985 │ │ │ logger.info("variant=fp16 not found, try revision=fp16") │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return _inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\diffusers\pipelines\pipeline_utils.py:1111 in │
│ from_pretrained │
│ │
│ 1108 │ │ │ │ │ f'The provided pretrained_model_name_or_path "{pretrained_model_name │
│ 1109 │ │ │ │ │ " is neither a valid local path nor a valid repo id. Please check th │
│ 1110 │ │ │ │ ) │
│ ❱ 1111 │ │ │ cached_folder = cls.download( │
│ 1112 │ │ │ │ pretrained_model_name_or_path, │
│ 1113 │ │ │ │ cache_dir=cache_dir, │
│ 1114 │ │ │ │ resume_download=resume_download, │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\diffusers\pipelines\pipeline_utils.py:1726 in download │
│ │
│ 1723 │ │ │ │ │ "if such variant modeling files are not available. Doing so will lea │
│ 1724 │ │ │ │ │ "modeling files is deprecated." │
│ 1725 │ │ │ │ ) │
│ ❱ 1726 │ │ │ │ deprecate("no variant default", "0.24.0", deprecation_message, standard │
│ 1727 │ │ │ │
│ 1728 │ │ │ # remove ignored filenames │
│ 1729 │ │ │ model_filenames = set(model_filenames) - set(ignore_filenames) │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\diffusers\utils\deprecation_utils.py:18 in deprecate │
│ │
│ 15 │ │
│ 16 │ for attribute, version_name, message in args: │
│ 17 │ │ if version.parse(version.parse(__version__).base_version) >= version.parse(versi │
│ ❱ 18 │ │ │ raise ValueError( │
│ 19 │ │ │ │ f"The deprecation tuple {(attribute, version_name, message)} should be r │
│ 20 │ │ │ │ f" version {__version__} is >= {version_name}" │
│ 21 │ │ │ ) │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ValueError: The deprecation tuple ('no variant default', '0.24.0', "You are trying to load the model files of the
`variant=fp16`, but no such modeling files are available. The default model files: {'unet/diffusion_pytorch_model.bin',
'text_encoder_2/pytorch_model.bin', 'vae/diffusion_pytorch_model.bin', 'text_encoder/pytorch_model.bin'} will be loaded
instead. Make sure to not load from `variant=fp16` if such variant modeling files are not available. Doing so will lead
to an error in v0.24.0 as defaulting to non-variant modeling files is deprecated.") should be removed since diffusers'
version 0.26.3 is >= 0.24.0
During handling of the above exception, another exception occurred:
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_errors.py:304 in │
│ hf_raise_for_status │
│ │
│ 301 │ │
│ 302 │ """ │
│ 303 │ try: │
│ ❱ 304 │ │ response.raise_for_status() │
│ 305 │ except HTTPError as e: │
│ 306 │ │ error_code = response.headers.get("X-Error-Code") │
│ 307 │ │ error_message = response.headers.get("X-Error-Message") │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\requests\models.py:1021 in raise_for_status │
│ │
│ 1018 │ │ │ ) │
│ 1019 │ │ │
│ 1020 │ │ if http_error_msg: │
│ ❱ 1021 │ │ │ raise HTTPError(http_error_msg, response=self) │
│ 1022 │ │
│ 1023 │ def close(self): │
│ 1024 │ │ """Releases the connection back to the pool. Once this method has been │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
HTTPError: 404 Client Error: Not Found for url:
https://huggingface.co/api/models/diffusionbee/fooocus_inpainting/revision/fp16
The above exception was the direct cause of the following exception:
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ I:\IOPaint-v1\installer\lib\site-packages\diffusers\pipelines\pipeline_utils.py:1671 in download │
│ │
│ 1668 │ │ model_info_call_error: Optional[Exception] = None │
│ 1669 │ │ if not local_files_only: │
│ 1670 │ │ │ try: │
│ ❱ 1671 │ │ │ │ info = model_info(pretrained_model_name, token=token, revision=revision) │
│ 1672 │ │ │ except (HTTPError, OfflineModeIsEnabled, requests.ConnectionError) as e: │
│ 1673 │ │ │ │ logger.warn(f"Couldn't connect to the Hub: {e}.\nWill try to load from l │
│ 1674 │ │ │ │ local_files_only = True │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return _inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\hf_api.py:2220 in model_info │
│ │
│ 2217 │ │ if files_metadata: │
│ 2218 │ │ │ params["blobs"] = True │
│ 2219 │ │ r = get_session().get(path, headers=headers, timeout=timeout, params=params) │
│ ❱ 2220 │ │ hf_raise_for_status(r) │
│ 2221 │ │ data = r.json() │
│ 2222 │ │ return ModelInfo(**data) │
│ 2223 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_errors.py:311 in │
│ hf_raise_for_status │
│ │
│ 308 │ │ │
│ 309 │ │ if error_code == "RevisionNotFound": │
│ 310 │ │ │ message = f"{response.status_code} Client Error." + "\n\n" + f"Revision Not │
│ ❱ 311 │ │ │ raise RevisionNotFoundError(message, response) from e │
│ 312 │ │ │
│ 313 │ │ elif error_code == "EntryNotFound": │
│ 314 │ │ │ message = f"{response.status_code} Client Error." + "\n\n" + f"Entry Not Fou │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
RevisionNotFoundError: 404 Client Error. (Request ID:
Root=1-65e3c384-5d0f61961107b3d0375ea0dc;9bf5befe-fbbb-49e9-842e-875c2cfabf08)
Revision Not Found for url: https://huggingface.co/api/models/diffusionbee/fooocus_inpainting/revision/fp16.
Invalid rev id: fp16
The above exception was the direct cause of the following exception:
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\model\utils.py:987 in │
│ handle_from_pretrained_exceptions │
│ │
│ 984 │ │ if "You are trying to load the model files of the `variant=fp16`" in str(e): │
│ 985 │ │ │ logger.info("variant=fp16 not found, try revision=fp16") │
│ 986 │ │ │ try: │
│ ❱ 987 │ │ │ │ return func(**{**kwargs, "variant": None, "revision": "fp16"}) │
│ 988 │ │ │ except Exception as e: │
│ 989 │ │ │ │ logger.info("revision=fp16 not found, try revision=main") │
│ 990 │ │ │ │ return func(**{**kwargs, "variant": None, "revision": "main"}) │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return _inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\diffusers\pipelines\pipeline_utils.py:1111 in │
│ from_pretrained │
│ │
│ 1108 │ │ │ │ │ f'The provided pretrained_model_name_or_path "{pretrained_model_name │
│ 1109 │ │ │ │ │ " is neither a valid local path nor a valid repo id. Please check th │
│ 1110 │ │ │ │ ) │
│ ❱ 1111 │ │ │ cached_folder = cls.download( │
│ 1112 │ │ │ │ pretrained_model_name_or_path, │
│ 1113 │ │ │ │ cache_dir=cache_dir, │
│ 1114 │ │ │ │ resume_download=resume_download, │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return _inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\diffusers\pipelines\pipeline_utils.py:1920 in download │
│ │
│ 1917 │ │ │ │ raise │
│ 1918 │ │ │ else: │
│ 1919 │ │ │ │ # 2. we forced `local_files_only=True` when `model_info` failed │
│ ❱ 1920 │ │ │ │ raise EnvironmentError( │
│ 1921 │ │ │ │ │ f"Cannot load model {pretrained_model_name}: model is not cached loc │
│ 1922 │ │ │ │ │ " while trying to fetch metadata from the Hub. Please check out the │
│ 1923 │ │ │ │ │ " above." │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
OSError: Cannot load model diffusionbee/fooocus_inpainting: model is not cached locally and an error occured while
trying to fetch metadata from the Hub. Please check out the root cause in the stacktrace above.
During handling of the above exception, another exception occurred:
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ I:\IOPaint-v1\installer\lib\site-packages\typer_config\decorators.py:92 in wrapped │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\cli.py:212 in start │
│ │
│ 209 │ │ restoreformer_device=restoreformer_device, │
│ 210 │ ) │
│ 211 │ print(api_config.model_dump_json(indent=4)) │
│ ❱ 212 │ api = Api(app, api_config) │
│ 213 │ api.launch() │
│ 214 │
│ 215 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\api.py:158 in __init__ │
│ │
│ 155 │ │ │
│ 156 │ │ self.file_manager = self._build_file_manager() │
│ 157 │ │ self.plugins = self._build_plugins() │
│ ❱ 158 │ │ self.model_manager = self._build_model_manager() │
│ 159 │ │ │
│ 160 │ │ # fmt: off │
│ 161 │ │ self.add_api_route("/api/v1/gen-info", self.api_geninfo, methods=["POST"], respo │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\api.py:387 in _build_model_manager │
│ │
│ 384 │ │ ) │
│ 385 │ │
│ 386 │ def _build_model_manager(self): │
│ ❱ 387 │ │ return ModelManager( │
│ 388 │ │ │ name=self.config.model, │
│ 389 │ │ │ device=torch.device(self.config.device), │
│ 390 │ │ │ no_half=self.config.no_half, │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\model_manager.py:31 in __init__ │
│ │
│ 28 │ │ ): │
│ 29 │ │ │ controlnet_method = self.available_models[name].controlnets[0] │
│ 30 │ │ self.controlnet_method = controlnet_method │
│ ❱ 31 │ │ self.model = self.init_model(name, device, **kwargs) │
│ 32 │ │
│ 33 │ @property │
│ 34 │ def current_model(self) -> ModelInfo: │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\model_manager.py:67 in init_model │
│ │
│ 64 │ │ │ │ ModelType.DIFFUSERS_SDXL_INPAINT, │
│ 65 │ │ │ │ ModelType.DIFFUSERS_SDXL, │
│ 66 │ │ │ ]: │
│ ❱ 67 │ │ │ │ return SDXL(device, **kwargs) │
│ 68 │ │ │
│ 69 │ │ raise NotImplementedError(f"Unsupported model: {name}") │
│ 70 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\model\base.py:279 in __init__ │
│ │
│ 276 │ def __init__(self, device, **kwargs): │
│ 277 │ │ self.model_info = kwargs["model_info"] │
│ 278 │ │ self.model_id_or_path = self.model_info.path │
│ ❱ 279 │ │ super().__init__(device, **kwargs) │
│ 280 │ │
│ 281 │ @torch.no_grad() │
│ 282 │ def __call__(self, image, mask, config: InpaintRequest): │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\model\base.py:35 in __init__ │
│ │
│ 32 │ │ """ │
│ 33 │ │ device = switch_mps_device(self.name, device) │
│ 34 │ │ self.device = device │
│ ❱ 35 │ │ self.init_model(device, **kwargs) │
│ 36 │ │
│ 37 │ @abc.abstractmethod │
│ 38 │ def init_model(self, device, **kwargs): │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\model\sdxl.py:57 in init_model │
│ │
│ 54 │ │ │ │ │ "madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch_dtype │
│ 55 │ │ │ │ ) │
│ 56 │ │ │ │ model_kwargs["vae"] = vae │
│ ❱ 57 │ │ │ self.model = handle_from_pretrained_exceptions( │
│ 58 │ │ │ │ StableDiffusionXLInpaintPipeline.from_pretrained, │
│ 59 │ │ │ │ pretrained_model_name_or_path=self.model_id_or_path, │
│ 60 │ │ │ │ torch_dtype=torch_dtype, │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\iopaint\model\utils.py:990 in │
│ handle_from_pretrained_exceptions │
│ │
│ 987 │ │ │ │ return func(**{**kwargs, "variant": None, "revision": "fp16"}) │
│ 988 │ │ │ except Exception as e: │
│ 989 │ │ │ │ logger.info("revision=fp16 not found, try revision=main") │
│ ❱ 990 │ │ │ │ return func(**{**kwargs, "variant": None, "revision": "main"}) │
│ 991 │ │ raise e │
│ 992 │ except OSError as e: │
│ 993 │ │ previous_traceback = traceback.format_exc() │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return _inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\diffusers\pipelines\pipeline_utils.py:1111 in │
│ from_pretrained │
│ │
│ 1108 │ │ │ │ │ f'The provided pretrained_model_name_or_path "{pretrained_model_name │
│ 1109 │ │ │ │ │ " is neither a valid local path nor a valid repo id. Please check th │
│ 1110 │ │ │ │ ) │
│ ❱ 1111 │ │ │ cached_folder = cls.download( │
│ 1112 │ │ │ │ pretrained_model_name_or_path, │
│ 1113 │ │ │ │ cache_dir=cache_dir, │
│ 1114 │ │ │ │ resume_download=resume_download, │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return _inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\diffusers\pipelines\pipeline_utils.py:1872 in download │
│ │
│ 1869 │ │ │
│ 1870 │ │ # download all allow_patterns - ignore_patterns │
│ 1871 │ │ try: │
│ ❱ 1872 │ │ │ cached_folder = snapshot_download( │
│ 1873 │ │ │ │ pretrained_model_name, │
│ 1874 │ │ │ │ cache_dir=cache_dir, │
│ 1875 │ │ │ │ resume_download=resume_download, │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return _inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\_snapshot_download.py:308 in │
│ snapshot_download │
│ │
│ 305 │ │ for file in filtered_repo_files: │
│ 306 │ │ │ _inner_hf_hub_download(file) │
│ 307 │ else: │
│ ❱ 308 │ │ thread_map( │
│ 309 │ │ │ _inner_hf_hub_download, │
│ 310 │ │ │ filtered_repo_files, │
│ 311 │ │ │ desc=f"Fetching {len(filtered_repo_files)} files", │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\tqdm\contrib\concurrent.py:94 in thread_map │
│ │
│ 91 │ │ [default: max(32, cpu_count() + 4)]. │
│ 92 │ """ │
│ 93 │ from concurrent.futures import ThreadPoolExecutor │
│ ❱ 94 │ return _executor_map(ThreadPoolExecutor, fn, *iterables, **tqdm_kwargs) │
│ 95 │
│ 96 │
│ 97 def process_map(fn, *iterables, **tqdm_kwargs): │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\tqdm\contrib\concurrent.py:76 in _executor_map │
│ │
│ 73 │ │ if not (3, 0) < sys_version < (3, 5): │
│ 74 │ │ │ map_args.update(chunksize=chunksize) │
│ 75 │ │ with PoolExecutor(**pool_kwargs) as ex: │
│ ❱ 76 │ │ │ return list(tqdm_class(ex.map(fn, *iterables, **map_args), **kwargs)) │
│ 77 │
│ 78 │
│ 79 def thread_map(fn, *iterables, **tqdm_kwargs): │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\tqdm\std.py:1195 in __iter__ │
│ │
│ 1192 │ │ time = self._time │
│ 1193 │ │ │
│ 1194 │ │ try: │
│ ❱ 1195 │ │ │ for obj in iterable: │
│ 1196 │ │ │ │ yield obj │
│ 1197 │ │ │ │ # Update and possibly print the progressbar. │
│ 1198 │ │ │ │ # Note: does not call self.update(1) for speed optimisation. │
│ │
│ I:\IOPaint-v1\installer\lib\concurrent\futures\_base.py:621 in result_iterator │
│ │
│ 618 │ │ │ │ while fs: │
│ 619 │ │ │ │ │ # Careful not to keep a reference to the popped future │
│ 620 │ │ │ │ │ if timeout is None: │
│ ❱ 621 │ │ │ │ │ │ yield _result_or_cancel(fs.pop()) │
│ 622 │ │ │ │ │ else: │
│ 623 │ │ │ │ │ │ yield _result_or_cancel(fs.pop(), end_time - time.monotonic()) │
│ 624 │ │ │ finally: │
│ │
│ I:\IOPaint-v1\installer\lib\concurrent\futures\_base.py:319 in _result_or_cancel │
│ │
│ 316 def _result_or_cancel(fut, timeout=None): │
│ 317 │ try: │
│ 318 │ │ try: │
│ ❱ 319 │ │ │ return fut.result(timeout) │
│ 320 │ │ finally: │
│ 321 │ │ │ fut.cancel() │
│ 322 │ finally: │
│ │
│ I:\IOPaint-v1\installer\lib\concurrent\futures\_base.py:458 in result │
│ │
│ 455 │ │ │ │ if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]: │
│ 456 │ │ │ │ │ raise CancelledError() │
│ 457 │ │ │ │ elif self._state == FINISHED: │
│ ❱ 458 │ │ │ │ │ return self.__get_result() │
│ 459 │ │ │ │ else: │
│ 460 │ │ │ │ │ raise TimeoutError() │
│ 461 │ │ finally: │
│ │
│ I:\IOPaint-v1\installer\lib\concurrent\futures\_base.py:403 in __get_result │
│ │
│ 400 │ def __get_result(self): │
│ 401 │ │ if self._exception: │
│ 402 │ │ │ try: │
│ ❱ 403 │ │ │ │ raise self._exception │
│ 404 │ │ │ finally: │
│ 405 │ │ │ │ # Break a reference cycle with the exception in self._exception │
│ 406 │ │ │ │ self = None │
│ │
│ I:\IOPaint-v1\installer\lib\concurrent\futures\thread.py:58 in run │
│ │
│ 55 │ │ │ return │
│ 56 │ │ │
│ 57 │ │ try: │
│ ❱ 58 │ │ │ result = self.fn(*self.args, **self.kwargs) │
│ 59 │ │ except BaseException as exc: │
│ 60 │ │ │ self.future.set_exception(exc) │
│ 61 │ │ │ # Break a reference cycle with the exception 'exc' │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\_snapshot_download.py:283 in │
│ _inner_hf_hub_download │
│ │
│ 280 │ # so no network call happens if we already │
│ 281 │ # have the file locally. │
│ 282 │ def _inner_hf_hub_download(repo_file: str): │
│ ❱ 283 │ │ return hf_hub_download( │
│ 284 │ │ │ repo_id, │
│ 285 │ │ │ filename=repo_file, │
│ 286 │ │ │ repo_type=repo_type, │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\utils\_validators.py:118 in _inner_fn │
│ │
│ 115 │ │ if check_use_auth_token: │
│ 116 │ │ │ kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.name, has_token=ha │
│ 117 │ │ │
│ ❱ 118 │ │ return fn(*args, **kwargs) │
│ 119 │ │
│ 120 │ return _inner_fn # type: ignore │
│ 121 │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\file_download.py:1492 in │
│ hf_hub_download │
│ │
│ 1489 │ │ │ │ if local_dir is not None: │
│ 1490 │ │ │ │ │ _check_disk_space(expected_size, local_dir) │
│ 1491 │ │ │ │
│ ❱ 1492 │ │ │ http_get( │
│ 1493 │ │ │ │ url_to_download, │
│ 1494 │ │ │ │ temp_file, │
│ 1495 │ │ │ │ proxies=proxies, │
│ │
│ I:\IOPaint-v1\installer\lib\site-packages\huggingface_hub\file_download.py:550 in http_get │
│ │
│ 547 │ │ │ for chunk in r.iter_content(chunk_size=DOWNLOAD_CHUNK_SIZE): │
│ 548 │ │ │ │ if chunk: # filter out keep-alive new chunks │
│ 549 │ │ │ │ │ progress.update(len(chunk)) │
│ ❱ 550 │ │ │ │ │ temp_file.write(chunk) │
│ 551 │ │ │ │ │ new_resume_size += len(chunk) │
│ 552 │ │ │ │ │ # Some data has been downloaded from the server so we reset the numb │
│ 553 │ │ │ │ │ _nb_retries = 5 │
│ │
│ I:\IOPaint-v1\installer\lib\tempfile.py:483 in func_wrapper │
│ │
│ 480 │ │ │ func = a │
│ 481 │ │ │ @_functools.wraps(func) │
│ 482 │ │ │ def func_wrapper(*args, **kwargs): │
│ ❱ 483 │ │ │ │ return func(*args, **kwargs) │
│ 484 │ │ │ # Avoid closing the file as long as the wrapper is alive, │
│ 485 │ │ │ # see issue #18879. │
│ 486 │ │ │ func_wrapper._closer = self._closer │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
OSError: [Errno 28] No space left on device
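One detail worth checking from the log above: even though the model directory points at I:, huggingface_hub writes downloads and their temporary files into its own cache directory, which may still resolve to the system drive depending on the environment. The sketch below is only a diagnostic suggestion (an assumption, not something confirmed in this thread); it prints where huggingface_hub will actually place files:

```python
import os
from huggingface_hub.constants import HUGGINGFACE_HUB_CACHE

# Show where huggingface_hub resolves its download cache. If this points at
# a nearly-full drive, downloads can fail with "No space left on device"
# even when the configured output drive has plenty of room.
print("HF_HOME:", os.environ.get("HF_HOME"))
print("Hub cache:", HUGGINGFACE_HUB_CACHE)
```

If the cache resolves to a drive that is short on space, one possible workaround (assuming default huggingface_hub behaviour, not anything IOPaint-specific) is to set the HF_HOME environment variable to a directory on the larger drive in the same terminal before launching IOPaint.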