
Conversation

@walmartbaggg
Contributor

Add more GPUs to the list, all fp16 performance.

Comment on lines 88 to 89
tflops: 17.972, // source: https://www.techpowerup.com/gpu-specs/a16-pcie.c3794
memory: [64], // this one you do not have to add, as its 4x.
Member


Yeah, it's a bit confusing as this is a bundle of 4 GPUs in a single card. I'm not sure if we should report the per-GPU specs instead, as the use cases of 4x16 GB are different to 1x64 GB cards.

Contributor Author


Alright, I will just remove it, since it would be inaccurate anyway; it's basically one GPU, but in four parts? I'm always confused by those.

Member


Yes, another idea would be to use the per-unit specs: 4.49 tflops and 16 GB of RAM, but that could be somewhat confusing as well.
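For reference, a per-unit entry might look roughly like the one quoted in the diff above, just divided by four. This is only a sketch: the object key ("A16") and surrounding layout are assumptions based on the two diff lines shown here, and the per-unit figures (about 4.49 fp16 TFLOPS and 16 GB) come from the numbers mentioned in this thread and the linked TechPowerUp page.

"A16": {
  tflops: 4.493, // fp16, per GPU unit (the card totals ~17.972 across 4 units); source: https://www.techpowerup.com/gpu-specs/a16-pcie.c3794
  memory: [16], // GB per unit; the A16 card bundles 4 such units for 64 GB total
},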

@walmartbaggg requested a review from pcuenca on October 21, 2024, 00:24
@pcuenca (Member) left a comment


Thank you! 🙌

@pcuenca (Member) commented Oct 23, 2024

I see the CI failing in other PRs as well; merging.

@pcuenca merged commit da39402 into huggingface:main on Oct 23, 2024
2 of 4 checks passed