Add minimum_length to extended header for BC #14320
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/14320
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit a9baded with merge base 4a4f5a0. This comment was automatically generated by Dr. CI and updates every 15 minutes.
exir/_serialize/test/test_program.py
Outdated
# Magic bytes
b"eh00"
# uint32_t header size (little endian)
# uint32_t header size (little endian) --> 32
? What's this?
Oh, it's just the number 32 in little endian, in hex.
The header size (0x20) is 32 bytes. This is the new one with segment_data_size.
Below is the old one (0x18), 24 bytes.
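For readers skimming the diff, here is a minimal standalone sketch (not the actual test code) of how those 32- and 24-byte header sizes show up as little-endian uint32 values:

```python
import struct

# The header-size field is a uint32 stored little-endian, right after the
# b"eh00" magic. The new header (with segment_data_size) is 32 bytes; the
# old header was 24 bytes.
new_size = struct.pack("<I", 32)  # b"\x20\x00\x00\x00" (0x20 == 32)
old_size = struct.pack("<I", 24)  # b"\x18\x00\x00\x00" (0x18 == 24)

# Reading the field back out of a serialized header prefix:
(header_len,) = struct.unpack("<I", new_size)
assert header_len == 32
```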
# To find the header, callers should provide at least this many bytes of
# the head of the serialized Program data.
NUM_HEAD_BYTES: ClassVar[int] = 64
Is 64 just a random number with some leeway for future expansion?
This is in sync with the C++ kNumHeadBytes; I think the value 64 is arbitrary leeway.
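As a usage sketch (hypothetical helper names; only NUM_HEAD_BYTES comes from the diff): callers only need to hand the parser the first NUM_HEAD_BYTES of the file, which comfortably covers the 32-byte extended header with room to spare.

```python
NUM_HEAD_BYTES = 64  # mirrors the C++ kNumHeadBytes

def read_head(path: str) -> bytes:
    """Read just enough of a .pte file to locate and parse the extended header."""
    with open(path, "rb") as f:
        return f.read(NUM_HEAD_BYTES)

# head = read_head("model.pte")
# header = parse_extended_header(head)  # see the parsing sketch further down
```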
Summary: pytorch#14320
An error occurs when we have older PTE files with extended header size 24: when we call 'from_bytes', we expect header size 32 after the segment_data_size field was added in D81938296. This is BC on the C++ side because we enforce a minimum length. Add a minimum length on the Python side to make the change BC.
Reviewed By: JacobSzwejbka
Differential Revision: D82492169
Force-pushed e35c0ea to 0fc66b5 (Compare)
Hi @lucylq -- this is concerning. Is the extended header a BC-breaking change for the C++ runtime? We shouldn't make any BC-breaking changes; in other words, older .PTE files should be runnable with the new runtime. This PR only seems to mitigate on the Python side -- it doesn't make sure older PTE files are still loadable.
Hi @mnachin, the PRs landed last week are not BC breaking on the C++ side; we read segment_data_size only if it exists. There isn't an explicit test in executorch since we generate our PTE files on the fly (so they have segment_data_size), but there are tests from internal et users that cover this. I can check in an older PTE file if we want to explicitly test this in C++ as well? It was a breaking change for Python (which is my bad), which this PR fixes.
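To make that tolerance concrete on the Python side, here is a hedged sketch of a minimum-length-aware parse (hypothetical helper, not the executorch implementation; the field layout after the magic and size is my assumption): it accepts any header of at least 24 bytes and only reads segment_data_size when the header is long enough to contain it.

```python
import struct

MAGIC = b"eh00"
MINIMUM_HEADER_LENGTH = 24   # old headers: magic + size + two uint64 fields
EXPECTED_HEADER_LENGTH = 32  # new headers also carry segment_data_size

def parse_extended_header(data: bytes) -> dict:
    # Sketch of a BC-tolerant parse: reject only headers shorter than the
    # minimum, and treat segment_data_size as optional.
    if data[0:4] != MAGIC:
        raise ValueError("not an extended header")
    (length,) = struct.unpack_from("<I", data, 4)
    if length < MINIMUM_HEADER_LENGTH:
        raise ValueError(f"header too short: {length}")
    program_size, segment_base_offset = struct.unpack_from("<QQ", data, 8)
    segment_data_size = None
    if length >= EXPECTED_HEADER_LENGTH:
        (segment_data_size,) = struct.unpack_from("<Q", data, 24)
    return {
        "program_size": program_size,
        "segment_base_offset": segment_base_offset,
        "segment_data_size": segment_data_size,
    }
```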
Summary: pytorch#14320
An error occurs when we have older PTE files with extended header size 24: when we call 'from_bytes', we expect header size 32 after the segment_data_size field was added in D81938296. This is BC on the C++ side because we enforce a minimum length. Add a minimum length on the Python side to make the change BC.
Reviewed By: JacobSzwejbka, hyxu2006
Differential Revision: D82492169
Force-pushed 0fc66b5 to a6e0f9c (Compare)
Summary: pytorch#14320
An error occurs when we have older PTE files with extended header size 24: when we call 'from_bytes', we expect header size 32 after the segment_data_size field was added in D81938296. This is BC on the C++ side because we enforce a minimum length. Add a minimum length on the Python side to make the change BC.
Reviewed By: JacobSzwejbka, hyxu2006
Differential Revision: D82492169
Force-pushed a6e0f9c to a9baded (Compare)
Yeah, going back to release/0.6, generate an old .PTE file, save it as something like "model_v_0_6.pte", and then load it in the new runtime. We should also look into automating this soon.
Actually, we should start with release/0.4 -- that's when we released the beta.
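A hedged sketch of what that regression test could look like (hypothetical file name and test; parse_extended_header is the sketch from the comment above, which assumes the header sits at the start of the provided bytes, so the real offset may differ): check an old .pte into the repo and assert that the new parser still accepts its 24-byte header.

```python
import unittest

class TestOldPTEStillParses(unittest.TestCase):
    def test_release_0_6_header(self) -> None:
        # A .pte exported with an older release, checked into the test data.
        with open("model_v_0_6.pte", "rb") as f:
            head = f.read(64)  # NUM_HEAD_BYTES worth of the file
        header = parse_extended_header(head)  # sketch defined earlier
        # Old files predate segment_data_size, so it should come back unset.
        self.assertIsNone(header["segment_data_size"])

if __name__ == "__main__":
    unittest.main()
```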
The Python side looks good nevertheless.
@pytorchbot cherry-pick --onto release/1.0 -c regression
Cherry picking #14320: The cherry pick PR is at #14334, and it is recommended to link a regression cherry pick PR with an issue. The following tracker issues are updated. Details for Dev Infra team: raised by workflow job.
Summary:
An error occurs when we have older PTE files with extended header size 24.
When we call 'from_bytes', we expect header size 32 after adding the segment_data_size field.
This is BC on the C++ side because we enforce a minimum length.
Add a minimum length on the Python side to make the change BC.
Differential Revision: D82492169
Pull Request resolved: pytorch#14320