
Inference Time #5

Closed
bharatsubedi opened this issue Feb 14, 2022 · 3 comments

Comments

@bharatsubedi

I tried your network and got good results, but I ran into a problem with inference speed. Could you please let me know how I can increase the speed of recognition?

@simplify23
Owner

Compared with parallel attention mechanisms, our method has no advantage in inference speed, but it is not far behind traditional encoder-decoder methods. If faster inference is required, try decoding 2–3 characters at each timestep, or distilling a CTC student network. We plan to design a rapid-inference version of CDistNet V2 later.
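The first suggestion above can be sketched in a few lines. This is a hypothetical illustration, not the actual CDistNet code: `decode` and `toy_step` are made-up names, and a real model would return logits per position rather than characters. The point is only that predicting k characters per decoder call cuts the number of autoregressive steps roughly k-fold.

```python
def decode(model_step, max_len, chars_per_step=2, eos="</s>"):
    """Greedy autoregressive decoding that asks the model for
    `chars_per_step` characters per call instead of one."""
    out = []
    calls = 0
    while len(out) < max_len:
        # model_step(prefix) -> up to `chars_per_step` predicted chars
        preds = model_step(out)[:chars_per_step]
        calls += 1
        for c in preds:
            if c == eos:
                return "".join(out), calls
            out.append(c)
    return "".join(out), calls

# Toy stand-in "model": spells out the word "speed", then emits EOS.
def toy_step(prefix):
    target = "speed"
    i = len(prefix)
    return list(target[i:i + 2]) or ["</s>"]

text, calls = decode(toy_step, max_len=10, chars_per_step=2)
# 5 characters decoded in 4 calls (3 content steps + 1 EOS step),
# versus 6 calls with one character per step.
```

The trade-off is accuracy: predicting several characters in parallel weakens the left-to-right conditioning, which is why the reply frames it as a speed/quality compromise rather than a free win.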

@bharatsubedi
Author

I changed the network structure and now get a faster inference time.
Thank you very much.

@WongVi

WongVi commented May 13, 2022

> We plan to design a rapid inference version of CDistNet V2 later.

Waiting for the rapid version.
