
CUDA problem help please #16

Closed
vikolaz opened this issue May 8, 2024 · 7 comments

vikolaz commented May 8, 2024

It gives me this error. I should already have the CUDA toolkit installed:

```
To create a public link, set `share=True` in `launch()`.
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 411, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\route_utils.py", line 707, in __call__
    await self.app(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 75, in app
    await response(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\responses.py", line 352, in __call__
    await send(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 50, in sender
    await send(message)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 50, in sender
    await send(message)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\middleware\errors.py", line 161, in _send
    await send(message)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 549, in send
    raise RuntimeError("Response content shorter than Content-Length")
RuntimeError: Response content shorter than Content-Length
Traceback (most recent call last):
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\queueing.py", line 527, in process_events
    response = await route_utils.call_process_api(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\route_utils.py", line 270, in call_process_api
    output = await app.get_blocks().process_api(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\blocks.py", line 1847, in process_api
    result = await self.call_function(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\blocks.py", line 1433, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\anyio\_backends\_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\anyio\_backends\_asyncio.py", line 851, in run
    result = context.run(func, *args)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\utils.py", line 805, in wrapper
    response = f(*args, **kwargs)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\infer.py", line 58, in infer_image
    free_memory = torch.cuda.mem_get_info()[0]
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\torch\cuda\memory.py", line 661, in mem_get_info
    device = torch.cuda.current_device()
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\torch\cuda\__init__.py", line 769, in current_device
    _lazy_init()
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\torch\cuda\__init__.py", line 289, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
```
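For context: the failing call is `torch.cuda.mem_get_info()`, which raises exactly this `AssertionError` whenever the installed torch wheel is a CPU-only build, regardless of the CUDA toolkit on the system. A minimal defensive sketch (a hypothetical helper, not part of Lip_Wise) that probes for a usable CUDA build instead of crashing:

```python
def free_gpu_memory_or_none():
    """Return free GPU memory in bytes, or None when a CUDA build is unavailable."""
    try:
        import torch
    except ImportError:
        return None  # torch is not installed at all
    if not torch.cuda.is_available():
        # Covers both "no GPU/driver" and "CPU-only torch wheel" (this issue).
        return None
    return torch.cuda.mem_get_info()[0]
```

Guarding on `torch.cuda.is_available()` before any `torch.cuda.*` call is the standard way to avoid triggering `_lazy_init()` on a CPU-only install.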

pawansharmaaaa (Owner) commented:

Install the GPU build of PyTorch from here: https://pytorch.org/get-started/locally/


vikolaz commented May 9, 2024

Hi, thanks for the response.
I actually tried several times to install the PyTorch GPU build from https://pytorch.org/get-started/locally/:

```
conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
```

There is surely something I'm doing wrong; I'm not an expert. I don't know if I ran the torch command in the right place/folder.

If you can help further, that would be super.

Thank you!!


vikolaz commented May 9, 2024

I've tried to downgrade the NVIDIA toolkit, as I had 12.1.

Now I have reinstalled toolkit version 11.8 and reinstalled the related PyTorch build.

This is what my system shows:

```
C:\>python -c "import torch; print('PyTorch version:', torch.__version__)"
PyTorch version: 2.3.0+cu118

C:\>python -c "import torchvision; print('Torchvision version:', torchvision.__version__)"
Torchvision version: 0.18.0+cu118

C:\>nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Wed_Sep_21_10:41:10_Pacific_Daylight_Time_2022
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0
```
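As a side note, the `+cu118` suffix in those version strings is what distinguishes a CUDA-enabled wheel from a CPU-only one (`+cpu` or no suffix), so the output above suggests the right build was installed at the system level. A tiny illustrative check (hypothetical helper, just to show the convention):

```python
def wheel_has_cuda_tag(version: str) -> bool:
    """True when a torch/torchvision version string carries a CUDA build tag.

    Examples: "2.3.0+cu118" is a CUDA build; "2.3.0+cpu" and "2.3.0" are not.
    """
    return "+cu" in version
```

The catch, as the rest of the thread shows, is that this check must pass inside the environment the app actually runs in, not just for the system-wide Python.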

pawansharmaaaa (Owner) commented:


Hello, I don't think this is a problem with your CUDA toolkit; it is an issue with the PyTorch installation. Did you run the setup script?


vikolaz commented May 9, 2024

Yes I did. I also tried deleting the folder and starting a fresh setup.

pawansharmaaaa (Owner) commented:


Do one thing: activate the environment and install the GPU build of PyTorch with pip.
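For reference, that sequence looks roughly like the following on Windows. The environment folder name `.lip-wise` is an assumption taken from the tracebacks earlier in the thread, and `cu118` matches the CUDA 11.8 toolkit reported by `nvcc --version` above; adjust both if your setup differs.

```shell
:: Activate the project's virtual environment (name assumed from the tracebacks).
.lip-wise\Scripts\activate

:: Install a CUDA 11.8 build of PyTorch from the official wheel index.
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118

:: Verify that the environment's torch now sees CUDA.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```

The key point is that the conda install earlier likely went into a different environment than the one Lip_Wise runs in; installing with pip inside the activated `.lip-wise` environment targets the right interpreter.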


vikolaz commented May 9, 2024

Yes, that solved the issue!!

Thank you very much
