Adding a mini benchmark for cross entropy loss #4472
Conversation
Review updated until commit 0f0c72e
Force-pushed from 9db2924 to 8364a62
Force-pushed from 8364a62 to beca294
benchmarks/python/core.py (outdated):

```python
if isinstance(out, torch.Tensor):
    iobytes += out.element_size() * out.numel()
```

Comment on this line:

```python
if isinstance(outputs, torch.Tensor) and outputs.dim() == 0:
```
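For context on the diff above, here is a minimal runnable sketch of the bytes-per-tensor accounting it performs; the `tensor_bytes` helper name is mine for illustration, not a function from the PR:

```python
import torch

def tensor_bytes(t: torch.Tensor) -> int:
    # Bytes occupied by a tensor's data: per-element size times element count.
    return t.element_size() * t.numel()

out = torch.zeros(4, 8, dtype=torch.float32)
print(tensor_bytes(out))  # 4 bytes/element * 32 elements = 128
```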
On line 161 we are wrapping a single tensor output as `outputs = [outputs]`. So wouldn't the check here always be false?
I am not sure that's quite right.

```python
a = torch.tensor(5)
print(isinstance(a, Iterable))
```

This should print `True`, so `outputs` could be a single torch tensor and never get wrapped. Please let me know if I am mistaken.
Oh, you are right, and I didn't know that!
@Priya2698 so it looks like lines 158-161 aren't doing the right thing?
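To make the pitfall discussed here concrete: a `torch.Tensor` is itself an instance of `Iterable`, so an `Iterable` check cannot distinguish a lone tensor from a list of tensors. The sketch below shows one way to normalize outputs robustly; `normalize_outputs` is a hypothetical helper name, not the PR's actual code:

```python
from collections.abc import Iterable
import torch

def normalize_outputs(outputs):
    # Check for a lone tensor explicitly, before any Iterable check:
    # torch.Tensor defines __iter__, so isinstance(outputs, Iterable)
    # is True even for a single (or 0-dim) tensor.
    if isinstance(outputs, torch.Tensor):
        return [outputs]
    if isinstance(outputs, Iterable):
        return list(outputs)
    return [outputs]

print(isinstance(torch.tensor(5), Iterable))        # True
print(len(normalize_outputs(torch.tensor(5))))      # 1
print(len(normalize_outputs([torch.zeros(2)] * 3))) # 3
```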
jjsjann123
left a comment
LGTM
!test
Does this dynamic library really need to be in this directory?
Thanks for catching that! That was a mistake.
This adds a benchmark for cross entropy loss where we can vary the vocab size. There are no linear operators before the cross entropy. The targets/labels are padded/sliced to mimic other models we have seen, and the input is cast to FP32 and squeezed, also to mimic other Hugging Face models we have seen.
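The shape of the computation described above can be sketched as follows. This is a hedged illustration, not the PR's actual benchmark: the function name, shapes, and `vocab_size` are my assumptions, chosen only to show the pad/slice of labels and the FP32 cast before the loss:

```python
import torch
import torch.nn.functional as F

def cross_entropy_fwd(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    # Pad/slice the labels (the shift pattern seen in Hugging Face causal-LM
    # models): pad with ignore_index (-100), then drop the first position.
    shift_labels = F.pad(labels, (0, 1), value=-100)[..., 1:].contiguous()
    # Cast the input to FP32 and squeeze the batch dim before the loss.
    x = logits.float().squeeze(0)
    return F.cross_entropy(x.view(-1, x.size(-1)), shift_labels.view(-1))

# Hypothetical sizes; the benchmark varies the vocab size.
vocab_size, seq_len = 32768, 128
logits = torch.randn(1, seq_len, vocab_size)
labels = torch.randint(0, vocab_size, (1, seq_len))
loss = cross_entropy_fwd(logits, labels)
print(loss.shape)  # torch.Size([]) -- a scalar loss
```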