Inference Time #5
Comments
Compared with parallel attention mechanisms, our method has no advantage in inference speed, but it is not far behind traditional encoder-decoder methods. If faster inference is required, try decoding 2-3 characters per timestep, or distil a CTC student network. We plan to design a rapid-inference version, CDistNet V2, later.
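A rough sketch of the second suggestion (distilling a CTC student) is shown below, assuming a PyTorch setup. The `CTCStudent` architecture and the `teacher_decode` helper are hypothetical placeholders for illustration only, not part of the CDistNet codebase.

```python
# Minimal sequence-level distillation sketch: a lightweight CTC student is
# trained on pseudo-labels produced by a slower autoregressive teacher.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CTCStudent(nn.Module):
    """Small conv + BiLSTM backbone with a per-frame CTC head (placeholder)."""
    def __init__(self, num_classes, hidden=256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=(2, 1), padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, None)),            # collapse the height axis
        )
        self.rnn = nn.LSTM(128, hidden, bidirectional=True, batch_first=True)
        self.head = nn.Linear(2 * hidden, num_classes)  # class 0 = CTC blank

    def forward(self, images):
        feat = self.backbone(images).squeeze(2).permute(0, 2, 1)  # (B, W, C)
        feat, _ = self.rnn(feat)
        return self.head(feat)                                    # (B, W, K)

def distill_step(student, images, teacher_decode, optimizer):
    """One distillation step: the teacher's greedy predictions serve as
    pseudo-labels for the student's CTC loss."""
    with torch.no_grad():
        # Hypothetical helper: returns padded label indices (B, S), excluding
        # the blank index 0, plus the true label lengths (B,).
        targets, target_lens = teacher_decode(images)
    logits = student(images)                                      # (B, W, K)
    log_probs = F.log_softmax(logits, dim=-1).permute(1, 0, 2)    # (W, B, K)
    input_lens = torch.full((images.size(0),), log_probs.size(0), dtype=torch.long)
    loss = F.ctc_loss(log_probs, targets, input_lens, target_lens, blank=0)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At inference time the student runs in a single non-autoregressive pass and is greedily decoded, which is where the speedup comes from; accuracy will typically trail the teacher somewhat.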
Changed the network structure and got faster inference time now.
Waiting for the rapid-inference version.
I tried your network and got good results, but I faced the problem of slow inference speed. Could you please let me know how I can increase the speed of recognition?