
support batch inference for crnn and segocr #407

Merged: 2 commits merged into open-mmlab:main on Aug 3, 2021

Conversation

cuhk-hbsun (Collaborator) commented:

Support batch inference for all of the resize modes used by the text recognition algorithms, including:

  1. Resize to a fixed shape;
  2. Keep-aspect-ratio resize, then pad to a fixed shape;
  3. Keep-aspect-ratio resize with no fixed output shape.

With this change, CRNN and SegOCR (which use keep-aspect-ratio resize) can run batch inference during testing; a sketch of the padded keep-aspect-ratio mode is given after this note.
(Note: the eval metric may differ slightly between batch_size=1 and batch_size>1.)
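
The padded keep-aspect-ratio mode (item 2) is what makes batching possible for variable-width text crops: every crop ends up with the same height and padded width, so a whole batch can be stacked into one tensor. Below is a minimal NumPy/OpenCV sketch of that idea; it is illustrative only, not MMOCR's actual resize transform, and the function name and default sizes are made up for the example.

```python
import cv2
import numpy as np

def resize_keep_ratio_pad(img, dst_h=32, max_w=160, pad_value=0):
    """Resize a 3-channel crop to a fixed height, keep the aspect ratio,
    then right-pad the width to max_w so crops can be batched.

    Illustrative sketch only; MMOCR's real transform differs in details.
    """
    h, w = img.shape[:2]
    new_w = min(max_w, max(1, int(round(w * dst_h / h))))
    resized = cv2.resize(img, (new_w, dst_h))
    canvas = np.full((dst_h, max_w, img.shape[2]), pad_value, dtype=img.dtype)
    canvas[:, :new_w] = resized
    # Fraction of the width that is real content; padding is one reason
    # batched results can differ slightly from batch_size=1 results.
    valid_ratio = new_w / max_w
    return canvas, valid_ratio

# Once all crops share the same (dst_h, max_w) shape they can be stacked:
# batch = np.stack([resize_keep_ratio_pad(c)[0] for c in crops])
```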

codecov bot commented Aug 3, 2021

Codecov Report

Merging #407 (4219569) into main (a2af2b4) will decrease coverage by 0.08%.
The diff coverage is 83.33%.

❗ Current head 4219569 differs from the pull request's most recent head 1b6c11d. Consider uploading reports for commit 1b6c11d to get more accurate results.

@@            Coverage Diff             @@
##             main     #407      +/-   ##
==========================================
- Coverage   83.21%   83.12%   -0.09%     
==========================================
  Files         137      137              
  Lines        9155     9162       +7     
  Branches     1299     1301       +2     
==========================================
- Hits         7618     7616       -2     
- Misses       1244     1252       +8     
- Partials      293      294       +1     
Flag         Coverage            Δ
unittests    83.12% <83.33%>     (-0.09%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmocr/apis/inference.py 75.64% <ø> (-1.47%) ⬇️
mmocr/core/visualize.py 91.10% <0.00%> (-0.66%) ⬇️
mmocr/models/textrecog/convertors/seg.py 91.78% <100.00%> (+2.89%) ⬆️
...s/textrecog/recognizer/encode_decode_recognizer.py 91.95% <100.00%> (+0.59%) ⬆️
...mocr/models/textrecog/recognizer/seg_recognizer.py 89.55% <100.00%> (+0.48%) ⬆️
mmocr/datasets/pipelines/custom_format_bundle.py 88.23% <0.00%> (-5.89%) ⬇️
mmocr/datasets/pipelines/transforms.py 80.36% <0.00%> (-1.16%) ⬇️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update a2af2b4...1b6c11d.

gaotongxiao (Collaborator) left a comment:
Batch inference hurts the performance of both models. Users should be clearly warned in the docs.
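
For readers who do enable this, test-time batching in MMCV-style configs of this era is normally controlled by the dataloader batch size. A hedged illustration follows; the key names use the usual `samples_per_gpu` convention, but the exact key read by `tools/test.py` may vary between MMOCR versions, and the dataset fields are placeholders, not values from this PR.

```python
# Hypothetical config fragment (not taken from this PR).
data = dict(
    samples_per_gpu=16,   # >1 turns on batched evaluation where supported
    workers_per_gpu=2,
    test=dict(
        type='OCRDataset',
        # ann_file / img_prefix / pipeline omitted; placeholders only
    ))
```

Given the eval-metric caveat above, anyone raising the test batch size should compare scores against a batch_size=1 run before trusting the numbers.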

gaotongxiao merged commit 3707d67 into open-mmlab:main on Aug 3, 2021
gaotongxiao pushed a commit to gaotongxiao/mmocr that referenced this pull request on Jul 15, 2022:
* support batch inference for crnn and segocr