GPU 0 #635
Unanswered
tiomaldy asked this question in General Q&A
-
Hi.
I think CUDA will automatically be used when it is available on your local system, but you can also try to force it by setting the relevant argument.
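I don't remember the exact argument name, but under the hood it comes down to the usual PyTorch device selection. A rough sketch of that mechanism, not this repo's actual code:

```python
import torch

# Use the GPU automatically when one is visible to PyTorch, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Forcing a specific card instead (this will fail at runtime if CUDA isn't usable):
# device = torch.device("cuda:0")

print(device)
```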
-
I just bought an RTX 3090 and just ran into the same problem. I got an error message saying to check out the instructions at https://pytorch.org/get-started/locally/
I'm going to try to figure it out; let me know if you have any luck.
On Jul 10, 2021, at 10:41 PM, tiomaldy wrote:
It works now, but I can't use my RTX 3080 because this version of torch/PyTorch doesn't work with the newer RTX graphics cards.
-
You have to install the PyTorch binaries with the CUDA 11 runtime from the link in my last response, or you can use Lambda Stack to install it. The link for that is https://lambdalabs.com/blog/install-tensorflow-and-pytorch-on-rtx-30-series/
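Once a CUDA 11 build is installed you can sanity-check it from Python before training anything (a quick generic check, nothing repo-specific):

```python
import torch

print(torch.__version__)          # should show a CUDA build, e.g. something like 1.9.0+cu111
print(torch.version.cuda)         # CUDA runtime the wheel was built against (None on CPU-only builds)
print(torch.cuda.is_available())  # must be True, otherwise training falls back to the CPU

if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))        # e.g. the RTX 3080/3090
    print(torch.cuda.get_device_capability(0))  # RTX 30-series reports (8, 6); older cu10x wheels
                                                # lack sm_86 kernels, which is why they reject these cards
```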
-
I want to use the GPU to train the model. What do I need to change in the code to use the GPU instead of the CPU?
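From what I can tell, in plain PyTorch (setting aside whatever flag this repo exposes) the usual change is just to pick a device and move the model plus every batch onto it. A generic sketch with made-up toy names, not this project's code:

```python
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy model and random data, purely to show where .to(device) goes;
# the real model and dataloader come from the project itself.
model = nn.Linear(10, 2).to(device)              # move the model's parameters to the GPU once
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    inputs = torch.randn(32, 10).to(device)       # move every batch to the same device
    targets = torch.randint(0, 2, (32,)).to(device)

    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
```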