
Neon optimization for Int8 inference. #523

Closed

Alan-Turing-Ko opened this issue Aug 9, 2018 · 1 comment

Comments

@Alan-Turing-Ko

I've tested Int8 inference on the ARM platform and it works very well.
Thank you for the great library.
The one drawback of Int8 inference is speed:
on my hardware, it is twice as slow as fp32 inference.
I checked the code and found that no NEON optimization for conv3x3 is implemented yet.
When will it be available?
I can't wait to see the amazing improvement.

@nihui
Member

nihui commented Aug 20, 2018

conv3x3 int8 inference has been implemented in git master.
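
For context, the speedup from a NEON int8 path comes from widening multiply-accumulate intrinsics that process eight int8 values per instruction instead of one. The sketch below shows that core operation in C with arm_neon.h; the function `dot_s8` and its loop structure are illustrative assumptions only, not ncnn's actual conv3x3 kernel.

```c
#include <arm_neon.h>
#include <stdint.h>

// Minimal sketch: dot product of two int8 buffers, n a multiple of 8.
// vmull_s8 widens each int8*int8 product to int16 so a single multiply
// cannot overflow, and vpadalq_s16 pairwise-accumulates the products
// into 32-bit lanes. An int8 conv3x3 kernel builds on this same
// widening multiply-accumulate pattern for its filter taps.
static int32_t dot_s8(const int8_t* a, const int8_t* b, int n)
{
    int32x4_t acc = vdupq_n_s32(0);
    for (int i = 0; i < n; i += 8)
    {
        int8x8_t va = vld1_s8(a + i);        // load 8 signed bytes
        int8x8_t vb = vld1_s8(b + i);
        int16x8_t prod = vmull_s8(va, vb);   // int8 x int8 -> int16, 8 lanes
        acc = vpadalq_s16(acc, prod);        // fold int16 pairs into int32 acc
    }
    // horizontal sum of the four int32 accumulator lanes
    int32x2_t lo = vadd_s32(vget_low_s32(acc), vget_high_s32(acc));
    return vget_lane_s32(vpadd_s32(lo, lo), 0);
}
```

Compared with a scalar int8 loop, each iteration retires eight multiply-accumulates, which is the kind of vectorization that closes the gap the reporter observed against fp32.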
