Can trained models run in browsers, just for model inference? #3100

Answered by njzjz
ShangChien asked this question in Q&A

It's not easy.

Node.js can bind directly to C/C++ and execute native x86_64 binaries, but browsers can only execute WebAssembly (Wasm). There are successful Wasm builds of small packages such as NumPy, but the upstream TensorFlow C++ library is too hard to build for that target. xref: pyodide/pyodide#50

Another option is tf.js, which does not support all ops, let alone customized ops. At a minimum, the customized ops would need to be rewritten in JavaScript, as sketched below. Contributions are welcome if anyone is willing to spend time on it.
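
For reference, here is a minimal sketch of what the tf.js route could look like: registering a JavaScript kernel for a custom op on the CPU backend with `tf.registerKernel`, then running a converted graph model with `tf.loadGraphModel`. The op name `ProdEnvMatA`, the input name `coord`, the model path, and the placeholder math are all assumptions for illustration, not the actual DeePMD-kit ops or a working port.

```ts
import * as tf from '@tensorflow/tfjs';

// Hypothetical custom op name; the real op names in a converted graph
// are not guaranteed to match this placeholder.
const KERNEL_NAME = 'ProdEnvMatA';

tf.registerKernel({
  kernelName: KERNEL_NAME,
  backendName: 'cpu', // a separate registration is needed per backend (cpu, webgl, ...)
  kernelFunc: ({inputs, backend}) => {
    const x = (inputs as any)['coord']; // assumed input name
    // Placeholder computation: copy the input values unchanged. A real port
    // would re-implement the original C++ kernel's math here in JavaScript.
    const cpu = backend as any; // reaches into tfjs-backend-cpu internals, not a stable public API
    const values = cpu.data.get(x.dataId).values as Float32Array;
    return cpu.makeTensorInfo(x.shape, 'float32', values.slice());
  },
});

// Load a graph model converted ahead of time with tensorflowjs_converter
// (path, input shape, and output handling are assumptions).
async function infer(): Promise<void> {
  const model = await tf.loadGraphModel('model_web/model.json');
  const coord = tf.zeros([1, 192]);
  const out = model.execute({coord}) as tf.Tensor;
  out.print();
  tf.dispose([coord, out]);
  model.dispose();
}

infer();
```

Note that each tf.js backend (cpu, webgl, webgpu, wasm) needs its own kernel registration, so porting every customized op across backends is a large part of the effort.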

Answer selected by ShangChien