
Embed trained network in javascript web app for browser-based inference? #30

Closed · smart-fr opened this issue Dec 28, 2020 · 9 comments

@smart-fr

Question from a beginner: how can I take a player trained with AlphaZero.jl during the research phase and use it for inference in the production phase, through a JavaScript web application that end users would run in their browser?
Is there an equivalent of importing a Keras network into TensorFlow.js that could leverage a Knet or Flux network trained with AlphaZero.jl?
Thanks!

@jonathan-laurent
Owner

This is a good question and something where I believe the current Julia ML ecosystem could definitely improve.

Note that AlphaZero.jl is agnostic to the choice of a deep learning framework. Therefore, what you primarily need is a way to export neural network models from your favorite framework into a format that can be loaded in JavaScript. Here, two relevant Julia packages seem to be FluxJS.jl and ONNX.jl. However, these do not seem to be very mature and I am unsure whether they are still actively maintained. I would be very interested to hear from you if you manage to make them work for your purposes.

Also, using such packages should be sufficient if you just want to deploy the trained network (which should already be a decent player). However, to deploy the full AlphaZero agent, you would also need an MCTS implementation in JavaScript. It might be possible to compile the Julia implementation in AlphaZero.jl into JavaScript using something like julia-wasm.
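
By "deploying just the trained network", I mean playing, at each step, the action to which the network's policy head assigns the highest probability, with no tree search on top. A minimal Julia sketch of that idea (the function and argument names are purely illustrative, not part of AlphaZero.jl):

```julia
# Network-only play: pick the legal action with the highest policy probability,
# without running MCTS. `policy_probs` is assumed to be the probability vector
# produced by the trained network's policy head.
function best_action(policy_probs::Vector{Float64}, legal_actions::Vector{Int})
    scores = [policy_probs[a] for a in legal_actions]
    return legal_actions[argmax(scores)]
end
```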

@smart-fr
Author

smart-fr commented Dec 29, 2020

what you primarily need is a way to export neural network models from your favorite framework into a format that can be loaded in Javascript.

Absolutely. Now, since my purpose is essentially a single application, namely developing a player for my new game, this goal adds to all the choices I had to make, which for now led me to AlphaZero.jl: for the power of AlphaZero, the completeness of your implementation, and the beauty of Julia. I'd like to stick with what AlphaZero.jl offers out of the box, and I see that utilities were developed to manage Flux and Knet models, but not Keras/TensorFlow models? (Even though I appreciate that one could "easily" do so by writing new classes implementing the Network interface.) By the way, the Networks Library documentation seems incomplete?

Thank you for the links you provided. FluxJS.jl could be the middleman between Flux networks and TensorFlow.js; unfortunately, it seems to have an issue with convolutional layers. ONNX.jl doesn't seem to convert models to TensorFlow.js or any other JavaScript inference library?

Since I really need this piece of the puzzle (being able to embed a trained model in a web app), I'll keep searching and will share any findings. If there's another way to fill the gap, I'll be happy to read about it!

@jonathan-laurent
Owner

As I suspected, and given the issue you mentioned, FluxJS.jl does indeed not appear to be actively maintained. Also, ONNX might not work with TensorFlow.js directly, but have you looked at this?

By the way, the Networks Library documentation seems incomplete?

Good catch. I have done some refactoring on master and I haven't updated the documentation yet!

@smart-fr
Author

smart-fr commented Dec 30, 2020

have you looked at this?

I have. So we have ONNXjs covering the last mile, from an ONNX net to the user's browser.
Looking for a way to obtain ONNX nets from AlphaZero.jl out of the box, here is ONNXmutable, which exports Flux models to the ONNX format.
So it seems that a path could be:

  1. AlphaZero.jl to train a Flux model
  2. ONNXmutable to export the Flux model as an ONNX net
  3. ONNXjs to run JavaScript model inference in end users' browsers.
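
A minimal sketch of step 2 in Julia, assuming a plain Flux model has already been extracted from the trained agent; the `save` call is how I understand ONNXmutable's export entry point, which I haven't verified:

```julia
using Flux
using ONNXmutable  # exports Flux models to the ONNX format (entry point assumed)

# Toy stand-in for the trained network: a small convolutional trunk over an
# 8x8 board with 3 input planes, followed by a dense layer.
model = Chain(
    Conv((3, 3), 3 => 16, relu; pad=1),
    Flux.flatten,
    Dense(16 * 8 * 8, 64, relu))

# Export to ONNX so that ONNXjs (or another runtime) can load it in the browser.
ONNXmutable.save("mygame.onnx", model)
```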

Curious to read if anyone has tried this path or a similar one.
I will share my findings here.

@smart-fr
Author

smart-fr commented Feb 2, 2021

I have implemented the Game Interface for my game, and all tests in test_game.jl pass successfully. :-)

I will now try and train a Flux model.
I read here that the Flux backend was broken and the fix was going to be released with v0.4 of AlphaZero.jl.
I can't seem to find a branch with v0.4. You will release it after Julia 1.6 is out, right?

If in the meantime I train a Knet agent, I guess I will be able to reuse the parameters set when time comes to train a Flux agent?

@jonathan-laurent
Owner

v0.4 is almost ready and it is on the common-rl-intf branch. I am waiting to release it officially because it indeed does not work on Julia 1.5 (due to some bugs in Julia itself). It introduces a few changes in GameInterface, but adapting your code should not take long.

You should be able to use this branch right now with 1.6-beta1 if you want to. The main drawback is that I haven't finished updating all the documentation.

If in the meantime I train a Knet agent, I guess I will be able to reuse the parameters set when time comes to train a Flux agent?

This might be possible but it would probably require more work on your end than retraining a new model from scratch. ;-)

@smart-fr
Author

smart-fr commented Feb 2, 2021

You should be able to use this branch right now with 1.6-beta1 if you want to. The main drawback is that I haven't finished updating all the documentation.

Thank you! I think I'll wait for the documentation. Do you have (future) release notes for v0.4? I see in the code changes that there will be more reporting and parameter consistency checks; this will certainly save us exploration time. :)

it would probably require more work on your end than retraining a new model from scratch. ;-)

I meant the parameters in the games/<your-game>/params.jl file; can't they be shared between a Knet and a Flux model?

@jonathan-laurent
Owner

jonathan-laurent commented Feb 2, 2021

Do you have (future) release notes for v0.4?

Not yet. I will write them while updating the documentation.

I meant the parameters in the games/<your-game>/params.jl file; can't they be shared between a Knet and a Flux model?

These are agnostic to the DL framework and can indeed be shared.
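
For illustration, here is roughly what such framework-agnostic settings look like; the exact field names follow the bundled examples and may differ between versions:

```julia
using AlphaZero

# Excerpt in the style of games/<your-game>/params.jl: these hyperparameters
# only describe self-play and search settings, so nothing in them depends on
# whether the network backend is Knet or Flux. Field names may vary by version.
self_play = SelfPlayParams(
    num_games=4_000,
    mcts=MctsParams(
        num_iters_per_turn=400,
        cpuct=1.0))
```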

@Bobingstern

There is another option: after converting to ONNX using ONNXNaiveNasFlux, you can convert to TensorFlow using this, and then from TensorFlow to TensorFlow.js easily using tfjs-converter, if you prefer working with TensorFlow.js.
