
Make inductor config hashing more portable #127022

Closed · wants to merge 1 commit

Conversation

oulgen (Contributor) commented May 23, 2024

Summary: masnesral and I noticed that the config contains non-portable artifacts. Let's fix that.

Test Plan: adhoc testing

Differential Revision: D57748025

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @peterbell10 @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @chauhang

pytorch-bot bot commented May 23, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/127022

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit aa5d81a with merge base b0871f9:

UNSTABLE - The following job failed but was likely due to flakiness present on trunk and has been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D57748025

@oulgen oulgen requested review from aorenste and masnesral May 23, 2024 21:50
@oulgen oulgen added ciflow/trunk Trigger trunk jobs on your pull request topic: not user facing topic category labels May 23, 2024
@@ -656,11 +656,11 @@ def __init__(
         self.system_info = CacheBase.get_system()

         try:
-            self.inductor_config = config.save_config()
+            self.inductor_config = config.save_config_for_cache()
A reviewer (Contributor) commented:
nit: Naming this "save_config_for_hash()" might be slightly more accurate? Just mentioning; I don't have a strong opinion.

@@ -156,6 +156,19 @@ def save_config(self) -> bytes:
             config.pop(key)
         return pickle.dumps(config, protocol=2)

+    def save_config_for_cache(self) -> bytes:
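The body of the new method is not shown in this hunk. A minimal sketch of what a portability-filtered config serializer could look like; the class, the config keys, and the list of dropped keys below are illustrative assumptions, not the actual PyTorch implementation:

```python
import pickle


class ConfigModule:
    """Toy stand-in for an inductor-style config module; not the real one."""

    def __init__(self):
        # Hypothetical entries: some portable, some machine-specific.
        self._config = {
            "max_autotune": False,
            "cpp.threads": 8,
            # Absolute paths and environment-derived values differ across
            # machines, so they must not feed into a portable cache hash.
            "cuda.cutlass_dir": "/home/alice/cutlass",
        }

    def save_config(self) -> bytes:
        """Serialize the full config (may contain non-portable values)."""
        return pickle.dumps(dict(self._config), protocol=2)

    def save_config_for_cache(self) -> bytes:
        """Serialize the config with non-portable entries removed, so the
        resulting bytes hash identically across machines."""
        config = dict(self._config)
        # Assumed list of machine-specific keys to drop before hashing.
        for key in ("cuda.cutlass_dir",):
            config.pop(key, None)
        return pickle.dumps(config, protocol=2)
```

With this split, two machines that only differ in local paths produce identical `save_config_for_cache()` bytes while their full `save_config()` output still differs.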
A reviewer (Contributor) commented:
It occurs to me that we could stick the whole dict into a field in the FxGraphDetails object rather than pickling it here and storing the serialized bytes in the details. The only advantage to doing that is the debug output we were inspecting today would show each field separately instead of just calling it "bytes"
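The alternative the reviewer describes could be sketched roughly as follows; the class names and fields here are illustrative assumptions (the real object is `FxGraphHashDetails`-like internals not shown in this PR):

```python
import dataclasses
import pickle
from typing import Any, Dict


@dataclasses.dataclass
class FxGraphDetailsBytes:
    """Current approach: config is pre-pickled, so debug dumps of the
    details object show only an opaque bytes blob."""
    inductor_config: bytes


@dataclasses.dataclass
class FxGraphDetailsDict:
    """Suggested approach: keep the raw dict in the details object; the
    auto-generated repr then shows each config field separately."""
    inductor_config: Dict[str, Any]


config = {"max_autotune": False, "cpp.threads": 8}

opaque = FxGraphDetailsBytes(inductor_config=pickle.dumps(config, protocol=2))
readable = FxGraphDetailsDict(inductor_config=dict(config))
# repr(readable) lists each key/value pair, which is friendlier when
# inspecting why two cache keys differ; repr(opaque) is a bytes literal.
```

Both variants carry the same information; the dict form only trades a little hashing-time work for more useful debug output.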

oulgen added a commit that referenced this pull request May 23, 2024
Summary: masnesral and I noticed that the config contains non-portable artifacts. Let's fix that.

Test Plan: adhoc testing

Reviewed By: masnesral

Differential Revision: D57748025

facebook-github-bot pushed a commit that referenced this pull request May 24, 2024


@facebook-github-bot commented:

@pytorchbot merge -f 'Landed internally'

(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)

@pytorchmergebot (Collaborator) commented:

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f only as a last resort; consider -i/--ignore-current instead to continue the merge while ignoring current failures. That allows currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status.

titaiwangms pushed a commit to titaiwangms/pytorch that referenced this pull request May 28, 2024

Pull Request resolved: pytorch#127022
Approved by: https://github.com/masnesral
@github-actions github-actions bot deleted the export-D57748025 branch June 25, 2024 01:57