
Remove bad eval when onnx session used #2034

Merged
2 commits merged into master on Dec 14, 2022

Conversation

@msaroufim (Member) commented Dec 13, 2022

Fixes #2033

I believe this error occurred when I merged the dynamo branch with the onnx branch. This line is no longer needed because eval() is already called directly when the model is in eager mode or TorchScript.

If the ONNX test were enabled by default, we would have caught this when merging the torch.compile PR.
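The failure mode in #2033 can be illustrated with a minimal sketch: an onnxruntime InferenceSession exposes run() but has no eval() method, so the handler must only switch eager/TorchScript modules into inference mode. The class and function names below are illustrative stand-ins, not the actual base_handler code.

```python
class EagerModel:
    """Stand-in for a torch.nn.Module: supports eval()."""
    def __init__(self):
        self.training = True

    def eval(self):
        self.training = False
        return self


class OnnxSession:
    """Stand-in for onnxruntime.InferenceSession: run() exists, eval() does not."""
    def run(self, output_names, input_feed):
        return [input_feed["x"]]


def prepare(model):
    # The bad code called model.eval() unconditionally, which raises
    # AttributeError for an InferenceSession. Guard it instead.
    if hasattr(model, "eval"):
        model.eval()
    return model
```

With this guard, prepare(EagerModel()) flips the module out of training mode, while prepare(OnnxSession()) leaves the session untouched instead of raising `'InferenceSession' object has no attribute 'eval'`.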

In a follow-up PR I'm going to:

  • Refactor the base handler a bit more
  • Make all the optimization-related dependencies mandatory
  • Make our requirements.txt more modular
  • Get back to doing releases with our non-dev Docker binaries

Tests

pip install onnx
(serve) ubuntu@ip-172-31-17-70:~/serve/test/pytest$ pytest test_onnx.py 
======================================================================== test session starts =========================================================================
platform linux -- Python 3.8.13, pytest-7.2.0, pluggy-1.0.0
rootdir: /home/ubuntu/serve
collected 5 items                                                                                                                                                    

test_onnx.py .....                                                                                                                                             [100%]

========================================================================= 5 passed in 2.08s ==========================================================================

torch.compile tests

(base) ubuntu@ip-172-31-17-70:~$ curl http://127.0.0.1:8080/predictions/densenet161 -T kitten_small.jpg

{
  "Siberian_husky": 0.006150901783257723,
  "grille": 0.005219979677349329,
  "submarine": 0.004688442684710026,
  "puck": 0.004634885583072901,
  "mountain_tent": 0.004277920816093683
}
2022-12-13T19:14:28,584 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9000
2022-12-13T19:14:28,589 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2022-12-13T19:14:28,590 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - [PID]2795
2022-12-13T19:14:28,590 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-13T19:14:28,591 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Python runtime: 3.8.13
2022-12-13T19:14:28,608 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9000.
2022-12-13T19:14:28,649 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - model_name: densenet161, batchSize: 1
2022-12-13T19:14:36,533 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Compiled model with backend inductor
2022-12-13T19:14:49,290 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Backend received inference at: 1670958889

@codecov bot commented Dec 13, 2022

Codecov Report

Merging #2034 (8176c72) into master (3a0f8f8) will increase coverage by 0.01%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master    #2034      +/-   ##
==========================================
+ Coverage   53.36%   53.38%   +0.01%     
==========================================
  Files          71       71              
  Lines        3225     3224       -1     
  Branches       56       56              
==========================================
  Hits         1721     1721              
+ Misses       1504     1503       -1     
Impacted Files                     Coverage Δ
ts/torch_handler/base_handler.py   0.00% <ø> (ø)


@msaroufim msaroufim merged commit 580cc88 into master Dec 14, 2022
Successfully merging this pull request may close these issues.

0.7.0 can't load onnx model with bug: 'InferenceSession' object has no attribute 'eval'