Mac M1 virtual machine running Ubuntu fails to launch; every 'device' entry in the base config file has been set to 0 #9

Open
lh08164311 opened this issue May 22, 2024 · 0 comments

/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio_client/documentation.py:103: UserWarning: Could not get documentation group for <class 'gradio.mix.Parallel'>: No known documentation group for module 'gradio.mix'
warnings.warn(f"Could not get documentation group for {cls}: {exc}")
/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio_client/documentation.py:103: UserWarning: Could not get documentation group for <class 'gradio.mix.Series'>: No known documentation group for module 'gradio.mix'
warnings.warn(f"Could not get documentation group for {cls}: {exc}")
Caching examples at: '/home/parallels/Downloads/ai_webui-main/gradio_cached_examples/108'
Caching example 1/8
Ultralytics YOLOv8.0.120 🚀 Python-3.11.9 torch-2.1.1
Traceback (most recent call last):
File "/home/parallels/Downloads/ai_webui-main/webui.py", line 85, in
launch_webui(**opt.dict)
File "/home/parallels/Downloads/ai_webui-main/webui.py", line 52, in launch_webui
sam_tab(segmentation_args, ai_handler)
File "/home/parallels/Downloads/ai_webui-main/utils/gradio_tabs/image_tabs.py", line 50, in sam_tab
gr.Examples(examples=examples,
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio/helpers.py", line 74, in create_examples
client_utils.synchronize_async(examples_obj.create)
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio_client/utils.py", line 855, in synchronize_async
return fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, **kwargs) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/fsspec/asyn.py", line 103, in sync
raise return_result
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/fsspec/asyn.py", line 56, in _runner
result[0] = await coro
^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio/helpers.py", line 276, in create
await self.cache()
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio/helpers.py", line 332, in cache
prediction = await Context.root_block.process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio/blocks.py", line 1392, in process_api
result = await self.call_function(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio/blocks.py", line 1097, in call_function
prediction = await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/gradio/utils.py", line 703, in wrapper
response = f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home/parallels/Downloads/ai_webui-main/tools/fastsam_handler.py", line 43, in segment_everything
results = self.model(input,
^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/ultralytics/yolo/engine/model.py", line 111, in call
return self.predict(source, stream, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/ultralytics/yolo/engine/model.py", line 250, in predict
self.predictor.setup_model(model=self.model, verbose=is_cli)
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/ultralytics/yolo/engine/predictor.py", line 297, in setup_model
device = select_device(self.args.device, verbose=verbose)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/parallels/anaconda3/envs/aiwebui/lib/python3.11/site-packages/ultralytics/yolo/utils/torch_utils.py", line 75, in select_device
raise ValueError(f"Invalid CUDA 'device={device}' requested."
ValueError: Invalid CUDA 'device=0' requested. Use 'device=cpu' or pass valid CUDA device(s) if available, i.e. 'device=0' or 'device=0,1,2,3' for Multi-GPU.

torch.cuda.is_available(): False
torch.cuda.device_count(): 0
os.environ['CUDA_VISIBLE_DEVICES']: None
See https://pytorch.org/get-started/locally/ for up-to-date torch install instructions if no CUDA devices are seen by torch.
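The traceback shows that torch cannot see any CUDA device inside the Apple Silicon VM (torch.cuda.is_available() is False), yet the FastSAM handler still requests device=0. A minimal workaround sketch, assuming the device string from the base config is eventually passed through to the Ultralytics model call (the variable and keyword names below are illustrative, not the project's actual API):

    import torch

    # Inside an Apple Silicon VM there is no NVIDIA GPU, so
    # torch.cuda.is_available() returns False and 'device=0' is rejected
    # by Ultralytics' select_device. Fall back to the CPU in that case.
    device = "0" if torch.cuda.is_available() else "cpu"

    # Pass the resolved device wherever the base config currently hard-codes 0,
    # e.g. as an inference argument to the FastSAM model (keyword assumed):
    # results = model(source, device=device)

Equivalently, changing every device entry in the base config file from 0 to cpu (exact key layout assumed) should let the web UI start on a machine where torch sees no CUDA devices.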

IMPORTANT: You are using gradio version 3.39.0, however version 4.29.0 is available, please upgrade.
