Would like to have a way to confirm whether a GPU is actually being utilized. Maybe some kind of command or option when running a given model to test/log individual machine performance.
Hey @puddlejumper90, try ollama ps in the upcoming release. It will tell you what percentage of the model is loaded on the CPU, the GPU, or split across both. The output looks like:
NAME           ID            SIZE    PROCESSOR  UNTIL
mixtral:8x22b  bf88270436ed  82 GB   100% GPU   4 minutes from now
llama3:latest  71a106a91016  5.9 GB  100% GPU   About a minute from now
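If you want to check this in a script, the PROCESSOR column can be pulled out with awk. A minimal sketch, assuming the column layout shown above (SIZE spans two fields, e.g. "82 GB", so the processor share lands in fields 5 and 6); here a copy of the sample output stands in for a live `ollama ps` call:

```shell
# Sample output copied from the reply above; in practice you would
# capture this with: sample_output=$(ollama ps)
sample_output='NAME           ID            SIZE    PROCESSOR  UNTIL
mixtral:8x22b  bf88270436ed  82 GB   100% GPU   4 minutes from now'

# Skip the header row and print the model name and its CPU/GPU share.
# Fields: $1=NAME $2=ID $3-$4=SIZE ("82 GB") $5-$6=PROCESSOR ("100% GPU")
gpu_share=$(echo "$sample_output" | awk 'NR > 1 { print $1, $5, $6 }')
echo "$gpu_share"
```

A model fully offloaded to the GPU shows "100% GPU"; a partial offload shows a split like "48%/52% CPU/GPU", so checking for the "100% GPU" string is a quick way to confirm full GPU utilization.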