
Will ppl.nn x86 support int8 inference? #56

Closed
shiwenloong opened this issue Jul 27, 2021 · 1 comment

@shiwenloong

As can be seen in the OpenVINO benchmarks, int8 inference achieves better performance than fp32 on Intel CPUs. Will ppl.nn x86 support int8 inference?

@Alcanderian
Contributor

ppl.nn x86 plans to support more features (a very long list), including VNNI int8 inference.
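For context, the speedup from int8 inference comes from quantizing fp32 tensors to int8, accumulating products in int32, and dequantizing the result; AVX512-VNNI fuses that int8-multiply/int32-accumulate pattern into a single instruction (`vpdpbusd`). Below is a minimal numpy sketch of that pattern (the function name `quantize_symmetric` and the per-tensor scaling scheme are illustrative assumptions, not ppl.nn's actual implementation):

```python
import numpy as np

def quantize_symmetric(x, num_bits=8):
    """Map a float32 tensor to int8 with a single per-tensor scale (illustrative)."""
    qmax = 2 ** (num_bits - 1) - 1                     # 127 for int8
    scale = np.abs(x).max() / qmax
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

# Hypothetical fp32 weights and activations.
rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)
a = rng.standard_normal((64, 64)).astype(np.float32)

qw, sw = quantize_symmetric(w)
qa, sa = quantize_symmetric(a)

# int8 x int8 products accumulated in int32 -- the pattern VNNI
# executes in one instruction -- then dequantized back to float.
acc = qa.astype(np.int32) @ qw.astype(np.int32)
approx = acc.astype(np.float32) * (sa * sw)

# Compare against the fp32 reference matmul.
ref = a @ w
rel_err = np.abs(approx - ref).max() / np.abs(ref).max()
print(f"max relative error: {rel_err:.4f}")
```

The quantized result stays close to the fp32 reference, which is why int8 can trade a small accuracy loss for roughly 4x narrower data and higher per-instruction throughput on VNNI-capable CPUs.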
