
Why does MOT only support batch_size=1? #9322

Open
1 task done
youqugit opened this issue Mar 12, 2025 · 1 comment
@youqugit
youqugit commented Mar 12, 2025

Search before asking

  • I have searched the question and found no related answer.

Please ask your question

While reading the code, I noticed that the MOT model only supports batch_size=1, and there is an assert that strictly restricts the MOT batch_size to 1. Is it the model that doesn't support batching, or the tracking algorithm that runs afterwards?
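For context, the restriction I mean is an assertion of roughly this shape (an illustrative sketch only; the names below are placeholders, not the exact PaddleDetection code):

```python
# Hypothetical sketch of the kind of check being asked about; the names
# here are placeholders and not the actual PaddleDetection source.
if model_type == 'MOT':
    assert batch_size == 1, 'MOT model only supports batch_size=1'
```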

@leo-q8
Collaborator

leo-q8 commented Mar 25, 2025

It's the tracking algorithm that doesn't support it. The model itself can run batched inference, but the tracking logic in PaddleDetection only considers the current frame together with the prediction results from frames before it, so batch_size > 1 is not allowed.
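To make the constraint concrete, here is a minimal sketch of a frame-by-frame tracking loop (illustrative only; `detector.predict` and `tracker.update` are made-up names, not PaddleDetection classes). The detector could in principle process several frames at once, but the tracker's state after frame t (active track IDs, motion estimates) is an input to the association step for frame t+1, so frames have to be consumed in order:

```python
def track_video(detector, tracker, frames):
    """Run MOT over an ordered sequence of frames.

    Hypothetical sketch: `detector` and `tracker` are placeholder objects,
    not PaddleDetection APIs.
    """
    results = []
    for frame in frames:                  # frames must be processed in order
        dets = detector.predict([frame])  # detection alone could be batched
        tracks = tracker.update(dets)     # association uses state built from
        results.append(tracks)            # all previous frames, so this step
    return results                        # cannot run on a batch of frames
```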
