
Fix pre-processing for TVM VM inference #302

Merged: 6 commits into main on Feb 7, 2022

Conversation

@zhiqwang (Owner) commented Feb 7, 2022

We use the torchvision layout resize in pre-processing for backward compatibility. This is a follow-up to #293.
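For context, here is a minimal sketch of what a torchvision-layout resize in pre-processing might look like. The function name `preprocess` and the target `size` are illustrative assumptions, not taken from this PR's diff:

```python
import torch
from torchvision.transforms import functional as F


def preprocess(image: torch.Tensor, size: int = 640) -> torch.Tensor:
    # Hypothetical sketch, not the actual code changed in this PR.
    # torchvision's functional resize operates on channel-first (C, H, W)
    # tensors, so the layout stays consistent for downstream runtimes
    # such as the TVM VM.
    image = F.resize(image, [size, size])
    # Add a batch dimension: (1, C, H, W).
    return image.unsqueeze(0)


if __name__ == "__main__":
    dummy = torch.rand(3, 480, 640)
    print(preprocess(dummy).shape)  # torch.Size([1, 3, 640, 640])
```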

@zhiqwang added the enhancement (New feature or request) and API (Library use interface) labels on Feb 7, 2022
@codecov bot commented Feb 7, 2022

Codecov Report

Merging #302 (5348e6e) into main (75ae710) will increase coverage by 0.01%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##             main     #302      +/-   ##
==========================================
+ Coverage   94.92%   94.93%   +0.01%     
==========================================
  Files          11       11              
  Lines         729      731       +2     
==========================================
+ Hits          692      694       +2     
  Misses         37       37              
| Flag | Coverage Δ |
| --- | --- |
| unittests | 94.93% <100.00%> (+0.01%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
| --- | --- |
| test/test_models_transform.py | 100.00% <100.00%> (ø) |
| test/test_onnx.py | 94.11% <100.00%> (ø) |

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 75ae710...5348e6e.

@CLAassistant commented Feb 7, 2022

CLA assistant check
All committers have signed the CLA.

@zhiqwang changed the title from "Add torchvision layout pre-processing" to "Fix pre-processing for TVM VM inference" on Feb 7, 2022
@zhiqwang added the bug / fix (Something isn't working) label and removed the enhancement (New feature or request) label on Feb 7, 2022
@zhiqwang merged commit 1fea1a6 into main on Feb 7, 2022
@zhiqwang deleted the fix-tvm-preprocess branch on February 7, 2022 at 16:13
Labels: API (Library use interface), bug / fix (Something isn't working)

2 participants