
Confirm GPU usage command #4458

Closed
puddlejumper90 opened this issue May 15, 2024 · 3 comments
Labels
feature request New feature or request

Comments

@puddlejumper90

Would like to have a way to confirm whether a GPU is actually being utilized. Maybe some kind of command or option, when running a given model, to test/log individual machine performance.

@puddlejumper90 puddlejumper90 added the feature request New feature or request label May 15, 2024
@pdevine
Contributor

pdevine commented May 15, 2024

Hey @puddlejumper90, try ollama ps in the upcoming release. It will tell you what percentage of the model is loaded on the CPU, the GPU, or split across both. The output looks like:

NAME         	ID          	SIZE  	PROCESSOR	UNTIL
mixtral:8x22b	bf88270436ed	82 GB 	100% GPU 	4 minutes from now
llama3:latest	71a106a91016	5.9 GB	100% GPU 	About a minute from now
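For scripting, the PROCESSOR column above can be checked programmatically. A minimal sketch (the check_gpu helper is hypothetical, and the column layout is assumed from the sample output above):

```shell
#!/bin/sh
# check_gpu: read `ollama ps`-style output on stdin and print the names of
# any loaded models that are NOT fully offloaded to the GPU.
check_gpu() {
  # Skip the header row; print column 1 (NAME) when the line lacks "100% GPU".
  awk 'NR > 1 && $0 !~ /100% GPU/ {print $1}'
}

# Demo with synthetic output; in practice you would pipe in: ollama ps | check_gpu
printf 'NAME ID SIZE PROCESSOR UNTIL\nllama3:latest 71a106a91016 5.9GB 100%% GPU now\nbig:model abc 82GB 48%%/52%% CPU/GPU now\n' | check_gpu
# prints "big:model"
```

An empty result means every loaded model is running entirely on the GPU; any name printed is partially or fully on the CPU.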

@pdevine pdevine closed this as completed May 15, 2024
@puddlejumper90
Author

Lifesaver, thank you @pdevine!

@pdevine
Contributor

pdevine commented May 16, 2024

@puddlejumper90 lmk how it goes. We just shipped the feature yesterday.
