Conversation

@lucylq (Contributor) commented Oct 16, 2024

Following changes in torchtune:

Update the ET downstream and remove pad_max_tiles from preprocess.
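
For readers following along: the change amounts to deleting one knob from the preprocess configuration, since the upstream torchtune image transform no longer pads images out to the maximum tile count. A minimal sketch of the shape of the change; the config class and field names below are illustrative, not the exact ExecuTorch definitions (see examples/models/llama3_2_vision/preprocess in the repo for the real ones):

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a CLIP-style preprocess config before/after this PR.
@dataclass
class PreprocessConfig:
    image_mean: List[float] = field(
        default_factory=lambda: [0.48145466, 0.4578275, 0.40821073]
    )
    image_std: List[float] = field(
        default_factory=lambda: [0.26862954, 0.26130258, 0.27577711]
    )
    tile_size: int = 224
    max_num_tiles: int = 4
    resample: str = "bilinear"
    # pad_max_tiles: bool = True  # removed: the torchtune transform no longer
    # pads every image out to max_num_tiles, so downstream consumers handle a
    # variable tile count instead.
```

As the option name suggests, the practical consequence is that preprocess outputs carry however many tiles the image actually produces, rather than always being padded to max_num_tiles.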

Test Plan:
With the AOTI tests commented out (not working at the moment), testing eager/export/et:

python -m unittest examples/models/llama3_2_vision/preprocess/test_preprocess.py 
...

----------------------------------------------------------------------
Ran 4 tests in 21.129s

OK

@pytorch-bot bot commented Oct 16, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/6295

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

✅ No Failures

As of commit 0bbd026 with merge base 5c3439d:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label Oct 16, 2024
@lucylq changed the title from "Remove pad-max-tiles from preprocess" to "Remove pad_max_tiles from preprocess" Oct 16, 2024
@lucylq lucylq marked this pull request as ready for review October 16, 2024 17:49
@facebook-github-bot (Contributor) commented:

@lucylq has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented:

@lucylq merged this pull request in 2e67e3a.

@lucylq lucylq deleted the lfq.remove-pad-max-tiles-pp branch January 24, 2025 19:41