
Batch inference is not support currently, as the image size might be different in a batch #3181

Closed
chenhuagg opened this issue Jul 10, 2023 · 7 comments

Comments

@chenhuagg

No description provided.

@chenhuagg (Author)

File "D:\mmsegmentation-main\tools\train.py", line 104, in
main()
File "D:\mmsegmentation-main\tools\train.py", line 100, in main
runner.train()
File "C:\Users\ss\AppData\Local\conda\conda\envs\gg\lib\site-packages\mmengine\runner\runner.py", line 1706, in train
model = self.train_loop.run() # type: ignore
File "C:\Users\ss\AppData\Local\conda\conda\envs\gg\lib\site-packages\mmengine\runner\loops.py", line 284, in run
self.runner.val_loop.run()
File "C:\Users\ss\AppData\Local\conda\conda\envs\gg\lib\site-packages\mmengine\runner\loops.py", line 363, in run
self.run_iter(idx, data_batch)
File "C:\Users\ss\AppData\Local\conda\conda\envs\gg\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "C:\Users\ss\AppData\Local\conda\conda\envs\gg\lib\site-packages\mmengine\runner\loops.py", line 383, in run_iter
outputs = self.runner.model.val_step(data_batch)
File "C:\Users\ss\AppData\Local\conda\conda\envs\gg\lib\site-packages\mmengine\model\base_model\base_model.py", line 132, in val_step
data = self.data_preprocessor(data, False)
File "C:\Users\ss\AppData\Local\conda\conda\envs\gg\lib\site-packages\torch\nn\modules\module.py", line 1190, in _call_impl
return forward_call(*input, **kwargs)
File "d:\mmsegmentation-main\mmseg\models\data_preprocessor.py", line 135, in forward
assert len(inputs) == 1, (
AssertionError: Batch inference is not support currently, as the image size might be different in a batch
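
For context, the failing check sits in `mmseg/models/data_preprocessor.py`, inside the data preprocessor's `forward()`. A minimal, self-contained reconstruction of that guard, based only on the traceback above (not the verbatim upstream source), looks roughly like this:

```python
from typing import List

import torch


def check_inference_batch(inputs: List[torch.Tensor]) -> None:
    """Mimic the guard hit in the traceback above (a sketch, not the exact
    mmseg code): during val/test the data preprocessor rejects any batch
    that contains more than one image."""
    assert len(inputs) == 1, (
        'Batch inference is not support currently, '
        'as the image size might be different in a batch')


# A val batch with a single image passes; two or more images trip the assertion.
check_inference_batch([torch.rand(3, 512, 512)])        # OK
# check_inference_batch([torch.rand(3, 512, 512)] * 2)  # raises AssertionError
```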

@edsml-hmc122

Please read the existing issues before posting: #2965 #2644 #1645 #1444 #1370 #125

@chenhuagg (Author)

I read the existing issues linked above, but none of them resolves my problem. My Potsdam dataset was generated with tools\dataset_converters, so the image sizes should all be consistent. No matter how I adjust batch_size, this error is still raised.

@chenhuagg (Author)

@edsml-hmc122

@edsml-hmc122

It is clearly stated in the previous issues that it doesn't matter whether your images are all the same size -- batched inference is not currently supported in any case.

@chenhuagg (Author)

@edsml-hmc122 How do I set it up so that inference is not batched? Should I set batch_size=1?

@edsml-hmc122

The test and val dataloaders each need to have batch size 1. Most likely your val dataloader is the issue, but I can't really say without seeing the config.
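
As an illustration (an editor's sketch, not a snippet from this thread): in an MMSegmentation 1.x config the val/test dataloaders are plain dicts, so forcing `batch_size=1` on both avoids the assertion. The dataset paths below assume the standard Potsdam layout produced by tools/dataset_converters, and `test_pipeline` is assumed to be defined earlier in the config:

```python
# Sketch of the two dataloaders that must use batch_size=1 for inference.
val_dataloader = dict(
    batch_size=1,  # batched inference is not supported, so this must stay 1
    num_workers=2,
    persistent_workers=True,
    sampler=dict(type='DefaultSampler', shuffle=False),
    dataset=dict(
        type='PotsdamDataset',
        data_root='data/potsdam',
        data_prefix=dict(
            img_path='img_dir/val', seg_map_path='ann_dir/val'),
        pipeline=test_pipeline))
test_dataloader = val_dataloader  # testing should also run with batch_size=1
```

The train_dataloader can keep a larger batch_size; only the val and test dataloaders need batch size 1.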

xiexinch added a commit that referenced this issue Jul 20, 2023

## Motivation

#3181
#2965
#2644
#1645
#1444
#1370
#125

## Modification

Remove the assertion at data_preprocessor

nahidnazifi87 pushed a commit to nahidnazifi87/mmsegmentation_playground that referenced this issue Apr 5, 2024