[2025-04-14 22:59:21,269] [INFO] [real_accelerator.py:158:get_accelerator] Setting ds_accelerator to cuda (auto detect)
W0414 22:59:21.478000 39480 torch\distributed\elastic\multiprocessing\redirects.py:27] NOTE: Redirects are currently not supported in Windows or MacOs.
File Not Found
Fetching 2 files: 100%|██████████| 2/2 [00:02<00:00, 1.05s/it]
* Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
Traceback (most recent call last):
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\gradio\queueing.py", line 625, in process_events
response = await route_utils.call_process_api(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\gradio\route_utils.py", line 322, in call_process_api
output = await app.get_blocks().process_api(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\gradio\blocks.py", line 2137, in process_api
result = await self.call_function(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\gradio\blocks.py", line 1663, in call_function
prediction = await anyio.to_thread.run_sync( # type: ignore
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 2470, in run_sync_in_worker_thread
return await future
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 967, in run
result = context.run(func, *args)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\gradio\utils.py", line 890, in wrapper
response = f(*args, **kwargs)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\gradio\utils.py", line 890, in wrapper
response = f(*args, **kwargs)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\inference\gradio_composite_demo\cogstudio.py", line 727, in generate
latents, seed = infer(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\inference\gradio_composite_demo\cogstudio.py", line 217, in infer
init(name, image_input, video_input, dtype, full_gpu)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\inference\gradio_composite_demo\cogstudio.py", line 57, in init
init_txt2vid(name, dtype_str, full_gpu)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\inference\gradio_composite_demo\cogstudio.py", line 84, in init_txt2vid
dtype = init_core(name, dtype_str)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\inference\gradio_composite_demo\cogstudio.py", line 67, in init_core
pipe = CogVideoXPipeline.from_pretrained(name, torch_dtype=dtype).to(device)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 773, in from_pretrained
cached_folder = cls.download(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 1557, in download
cached_folder = snapshot_download(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\_snapshot_download.py", line 296, in snapshot_download
thread_map(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\tqdm\contrib\concurrent.py", line 69, in thread_map
return _executor_map(ThreadPoolExecutor, fn, *iterables, **tqdm_kwargs)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\tqdm\contrib\concurrent.py", line 51, in _executor_map
return list(tqdm_class(ex.map(fn, *iterables, chunksize=chunksize), **kwargs))
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\gradio\helpers.py", line 697, in __next__
return next(current_iterable.iterable) # type: ignore
File "concurrent\futures\_base.py", line 621, in result_iterator
File "concurrent\futures\_base.py", line 319, in _result_or_cancel
File "concurrent\futures\_base.py", line 458, in result
File "concurrent\futures\_base.py", line 403, in __get_result
File "concurrent\futures\thread.py", line 58, in run
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\_snapshot_download.py", line 270, in _inner_hf_hub_download
return hf_hub_download(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\file_download.py", line 961, in hf_hub_download
return _hf_hub_download_to_cache_dir(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\file_download.py", line 1112, in _hf_hub_download_to_cache_dir
_download_to_tmp_and_move(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\file_download.py", line 1675, in _download_to_tmp_and_move
http_get(
File "T:\Programs\StabilityMatrix\Data\Packages\Cogstudio\venv\lib\site-packages\huggingface_hub\file_download.py", line 452, in http_get
temp_file.write(chunk)
OSError: [Errno 28] No space left on device
What happened?
Cogstudio downloads models to `.cache/huggingface/hub` in the user home directory, regardless of where Stability Matrix is configured to store data.

Steps to reproduce
Relevant logs
Version
v2.13.4
What Operating System are you using?
Windows
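As a possible workaround until this is fixed: the Hugging Face download cache can be redirected before the pipeline is loaded, either via the `HF_HUB_CACHE` (or `HF_HOME`) environment variable supported by `huggingface_hub`, or by passing `cache_dir=` to `from_pretrained`. A minimal sketch, assuming a hypothetical target directory on the drive where Stability Matrix keeps its data:

```python
import os

# Redirect the Hugging Face hub cache *before* importing diffusers/huggingface_hub.
# The path below is a hypothetical example; any directory on a drive with
# enough free space will do.
os.environ["HF_HUB_CACHE"] = r"T:\Programs\StabilityMatrix\Data\HuggingFaceCache"

# Alternatively, pass the cache location explicitly when loading the pipeline,
# e.g. (names taken from the traceback above):
# pipe = CogVideoXPipeline.from_pretrained(
#     name, torch_dtype=dtype, cache_dir=os.environ["HF_HUB_CACHE"]
# )
```

Note the environment variable must be set before `huggingface_hub` is imported, since the cache path is resolved at import time.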