This repository has been archived by the owner on Nov 20, 2020. It is now read-only.

Using BigDL 0.8 to speed up inference #2

Open
wzhongyuan opened this issue Apr 23, 2019 · 0 comments
Comments

wzhongyuan commented Apr 23, 2019

Hi Team,

Is there any plan to upgrade BigDL to version 0.8.0, which enables Intel DL Boost and significantly speeds up inference in both latency and throughput?

https://github.com/intel-analytics/BigDL/releases
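
For reference, a minimal sketch of what the consumer-side upgrade might look like, assuming the project pulls BigDL from Maven Central as an sbt dependency. The `SPARK_2.4` artifact suffix and the `bigdl.engineType` property are assumptions taken from the BigDL 0.8.0 release material, not from this repository's build, and should be checked against the BigDL documentation:

```scala
// build.sbt — hypothetical dependency bump to BigDL 0.8.0.
// The "SPARK_2.4" suffix assumes a Spark 2.4 build of BigDL; other
// suffixes exist for older Spark lines, so match the Spark version
// this repo actually targets.
libraryDependencies += "com.intel.analytics.bigdl" % "bigdl-SPARK_2.4" % "0.8.0"

// The MKL-DNN backend (the Intel DL Boost path) is reportedly selected
// at runtime via a JVM system property, e.g. when launching with spark-submit:
//   --conf "spark.driver.extraJavaOptions=-Dbigdl.engineType=mkldnn"
//   --conf "spark.executor.extraJavaOptions=-Dbigdl.engineType=mkldnn"
```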
