[AutoMM] Log GPU info #3291
Conversation
def get_gpu_message(detected_num_gpus: int, used_num_gpus: int):
    gpu_message = ""
    gpu_message += f"{detected_num_gpus} GPUs are detected, and {used_num_gpus} GPUs will be used.\n"
    for i in range(detected_num_gpus):
        free_memory, total_memory = torch.cuda.mem_get_info(i)
        gpu_message += f"   - GPU {i} name: {torch.cuda.get_device_name(i)}\n"
        gpu_message += f"   - GPU {i} memory: {free_memory * 1e-9:.2f}GB/{total_memory * 1e-9:.2f}GB (Free/Total)\n"
    gpu_message += f"CUDA version is {torch.version.cuda}.\n"
What happens if CUDA isn't installed (a CPU-only torch build)? Is that possible, or is CUDA always present even on CPU?
Good point. Added CUDA detection to print CUDA version.
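For context, a minimal sketch of what such a guard could look like, assuming the standard `torch.cuda` API (`is_available`, `mem_get_info`, `get_device_name`); this is an illustration, not the actual diff from the PR:

```python
import torch


def get_gpu_message(detected_num_gpus: int, used_num_gpus: int) -> str:
    """Build a human-readable summary of detected GPUs and CUDA version."""
    gpu_message = f"{detected_num_gpus} GPUs are detected, and {used_num_gpus} GPUs will be used.\n"
    for i in range(detected_num_gpus):
        # Per-device queries are only reached when GPUs were actually detected.
        free_memory, total_memory = torch.cuda.mem_get_info(i)
        gpu_message += f"   - GPU {i} name: {torch.cuda.get_device_name(i)}\n"
        gpu_message += f"   - GPU {i} memory: {free_memory * 1e-9:.2f}GB/{total_memory * 1e-9:.2f}GB (Free/Total)\n"
    # On CPU-only torch builds torch.version.cuda is None, so only report
    # the CUDA version when CUDA is actually usable.
    if torch.cuda.is_available():
        gpu_message += f"CUDA version is {torch.version.cuda}.\n"
    return gpu_message
```

With this guard, calling `get_gpu_message(0, 0)` on a CPU-only build returns just the "0 GPUs are detected" line instead of appending "CUDA version is None."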
Job PR-3291-8bbc2f6 is done.
LGTM, this looks great!
Issue #, if available:
#3284
Description of changes:
Log AG version, torch version, and GPU info.
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.