[AutoMM] Support model parameter number statistics #3289
Conversation
Job PR-3289-6861f0c is done.
```python
def model_size(self) -> float:
    precision_to_bits = {64: 64, 32: 32, 16: 16, "bf16": 16}
    if self._config is not None:
        precision = precision_to_bits.get(self._config.env.precision, 16)
```
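For reference, a minimal standalone sketch of the bit-based approach in this excerpt (the helper name and the MB conversion are illustrative; the diff itself only maps the configured precision to a bit width, with a 16-bit fallback):

```python
def model_size_in_mb(num_parameters: int, precision) -> float:
    # Map the configured precision to bits per parameter; unknown
    # values fall back to 16 bits, as in the excerpt above.
    precision_to_bits = {64: 64, 32: 32, 16: 16, "bf16": 16}
    bits = precision_to_bits.get(precision, 16)
    return num_parameters * bits / 8 / 1024**2  # bits -> bytes -> MB

print(model_size_in_mb(1_000_000, "bf16"))  # ~1.91 MB at 16 bits/parameter
```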
Wondering how mixed precision can be handled here? Shall we use `total_size_in_bytes = sum(p.numel() * p.element_size() for p in self._model.parameters())` to address it?
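A minimal sketch of the suggested approach, using a toy model (the model itself is illustrative; `element_size()` reports each tensor's per-element byte width, so layers of different dtypes are counted at their true sizes):

```python
import torch.nn as nn

def total_size_in_bytes(model: nn.Module) -> int:
    # Sum parameter count times per-element byte width; this handles
    # mixed precision because each tensor reports its own element size.
    return sum(p.numel() * p.element_size() for p in model.parameters())

# Toy mixed-precision model: one fp32 layer and one fp16 layer.
model = nn.Sequential(nn.Linear(128, 64), nn.Linear(64, 10).half())
print(total_size_in_bytes(model))  # fp32 params at 4 bytes, fp16 params at 2
```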
Good suggestion. Updated.
```python
from torch.nn.parameter import UninitializedParameter

if isinstance(p, UninitializedParameter):
    warnings.warn(
```
Wondering how frequently this is going to happen? If it is common, shall we just run one dummy forward pass to initialize these layers and then calculate?
This follows PyTorch Lightning: https://github.com/Lightning-AI/lightning/blob/984f49f7195ddc67e961c7c498ee6e19fc0cecb5/src/lightning/pytorch/utilities/model_summary/model_summary.py#L236
In some cases, parameters are uninitialized.
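For context, a minimal sketch of where uninitialized parameters show up: lazy modules such as `torch.nn.LazyLinear` hold `UninitializedParameter`s until the first forward pass (the module and shapes here are illustrative):

```python
import torch
from torch import nn
from torch.nn.parameter import UninitializedParameter

lazy = nn.LazyLinear(out_features=8)  # in_features is inferred on first use
p = next(lazy.parameters())
print(isinstance(p, UninitializedParameter))  # True; its shape is not yet known

lazy(torch.randn(2, 16))  # a dummy forward pass materializes the parameters
p = next(lazy.parameters())
print(isinstance(p, UninitializedParameter))  # False after initialization
```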
OK, as long as we are consistent with Lightning, this should be good.
LGTM
LGTM!
Job PR-3289-8b00b1e is done.
Issue #, if available:
Description of changes:
Support calling `predictor.total_parameters`, `predictor.trainable_parameters`, and `predictor.model_size`.
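A minimal usage sketch of the new properties (the toy DataFrame and `time_limit` value are illustrative; assumes a fitted `MultiModalPredictor`):

```python
import pandas as pd
from autogluon.multimodal import MultiModalPredictor

# Toy dataset; any DataFrame with a label column works.
train_data = pd.DataFrame({"text": ["good movie", "bad movie"] * 50,
                           "label": [1, 0] * 50})

predictor = MultiModalPredictor(label="label")
predictor.fit(train_data, time_limit=60)  # short demo fit

print(predictor.total_parameters)      # total number of model parameters
print(predictor.trainable_parameters)  # parameters with requires_grad=True
print(predictor.model_size)            # estimated model size, as a float
```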
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.