
[Taichi.js] JavaScript Backend via Emscripten #394

Closed
yuanming-hu opened this issue Jan 19, 2020 · 22 comments
Labels: feature request, stale, welcome contribution

Comments

@yuanming-hu (Member)

Is your feature request related to a problem? Please describe.
Allowing Taichi to generate JavaScript code will enable many more people to play with state-of-the-art computer graphics in their browsers.

Describe the solution you'd like
More investigation is needed. Emscripten or WASM seem like good ways to go.

The kernel code will still be written in Python, but a ti.export function will be added to dump a kernel into compiled JavaScript. Users can then load the generated JS and run it in an HTML5 page.
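As a rough Python sketch of what that round trip might look like: ti.export does not exist yet, so export_kernel below is a hypothetical stand-in that pretends to lower a Python kernel into a JavaScript source string a browser could load.

```python
# Hypothetical sketch only: "ti.export" is not implemented yet.
# export_kernel stands in for the proposed export step and emits a
# JS function stub instead of actually compiling the kernel body.

def saxpy(a, x, y):
    # Kernel body written in Python (Taichi decorator syntax omitted).
    return [a * xi + yi for xi, yi in zip(x, y)]

def export_kernel(kernel, name):
    # Stand-in for the proposed ti.export: produce JavaScript source.
    return f"function {name}(args) {{ /* compiled body of {kernel.__name__} */ }}"

js_source = export_kernel(saxpy, "saxpy")
# js_source would then be saved to a .js file and loaded from an HTML5 page.
```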

The JavaScript backend does not have to support full Taichi functionality. For example, we can omit some sparse data structure support.

Discussions on/contributions to this are warmly welcome! :-)

@yuanming-hu added the "feature request" and "welcome contribution" labels Jan 19, 2020
@r03ert0 commented Jan 19, 2020

I'd be happy to help!
Do you think something like gpu.js could be used to accelerate some computations?

@yuanming-hu (Member, Author) commented Jan 19, 2020

Good question. I guess one thing to discuss here is "should we do pure JavaScript or leverage WebGL (fragment/compute shaders)?"

If we go WebGL, we get higher performance, but the computational pattern also gets restricted to pure array operations (e.g. a[i, j] = ti.sqrt(b[i, j])). This means we need some language restriction on the frontend and not every Taichi program gets compiled to WebGL. Not sure how compute shaders (see https://github.com/9ballsyndrome/WebGL_Compute_shader and https://www.khronos.org/registry/webgl/specs/latest/2.0-compute/) help with this.
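As a rough illustration of that restriction (plain Python standing in for shader-style code): the WebGL-friendly form maps each output element from the matching input element, while data-dependent writes such as a scatter fall outside it.

```python
import math

# Shader-friendly pattern: each output element depends only on the
# matching input element, mirroring a[i, j] = ti.sqrt(b[i, j]).
b = [[1.0, 4.0], [9.0, 16.0]]
a = [[math.sqrt(v) for v in row] for row in b]

# By contrast, a data-dependent scatter (e.g. a histogram) writes to an
# index computed from the data, which does not fit the pure-map pattern
# and would need the JavaScript path instead.
hist = [0] * 4
for v in [0, 1, 1, 3]:
    hist[v] += 1  # write index depends on the value, not the loop index
```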

If we go JavaScript, then it will run slower, but we can support many more computational patterns. It's also easier, since we can probably directly translate the generated LLVM IR into JavaScript. I would suggest starting with this path.

@yuanming-hu (Member, Author)

Let's narrow the scope to generating JavaScript via Emscripten until WebGL compute shaders are mature.

@yuanming-hu changed the title from "[Taichi.js] JavaScript Backend" to "[Taichi.js] JavaScript Backend via Emscripten" Jan 31, 2020
@yuanming-hu (Member, Author) commented Feb 1, 2020

It seems that Emscripten itself is switching to the LLVM WASM backend.
https://v8.dev/blog/emscripten-llvm-wasm

So one decision to be made: do we directly generate WASM via LLVM or go through Emscripten?

The former saves us from adding the dependency on Emscripten.
The latter can generate JavaScript as well, which has better compatibility. Emscripten also seems better documented than the LLVM WASM backend.

A question to Web experts: how well supported is WASM on current browsers? If everyone's browser already supports WASM (https://caniuse.com/#feat=wasm) then maybe we should directly use the LLVM WASM backend?

An old Rust thread on WASM: rust-lang/rust#33205

Inputs are welcome!

@github-actions

Warning: The issue has been out-of-update for 50 days, marking stale.

@WenheLI commented Sep 21, 2020

> So one decision to be made: do we directly generate WASM via LLVM or go through Emscripten? […]

@yuanming-hu I think directly exporting Taichi to WASM should be fine. The majority of browsers already support this feature, and asm.js could be used as a fallback where WASM is unavailable. Therefore, it should be fine to use WASM in most cases.

@archibate (Collaborator)

In comparison to the Taichi -> LLVM -> WASM approach, it's worth mentioning that we already have some nice progress on the Taichi -> C -> WASM approach: https://github.com/taichi-dev/taichi.js

@WenheLI commented Sep 21, 2020

> In comparison to the Taichi -> LLVM -> WASM approach, it's worth mentioning that we already have some nice progress on the Taichi -> C -> WASM approach: https://github.com/taichi-dev/taichi.js

Cool! May I have some insight into the future plan?

@archibate self-assigned this Sep 21, 2020
@archibate (Collaborator)

> Cool! May I have some insight into the future plan?

Here's my plan:

  1. Release the C backend (which the Emscripten path is based on) on Windows and OS X too.
  2. Make Taichi.js a powerful tool for creating heavy Web VFX.
  3. Set up a server that compiles Taichi kernels into WASM to run on the client, so that people can try Taichi online without installing Python.
  4. We may even consider utilizing WebGL once compute shaders are mature there, via the OpenGL backend.

@WenheLI commented Sep 21, 2020

> Here's my plan: […]

Is this happening under https://github.com/taichi-dev/taichi.js?
I'm interested in this project; is there any starting point for collaboration?

@archibate (Collaborator)

> Is this happening under https://github.com/taichi-dev/taichi.js?

Yes, except that item 1 is actually happening under https://github.com/taichi-dev/taichi.

> I'm interested in this project; is there any starting point for collaboration?

Oh, that would be great! Here's something we could do at the moment:

  1. Add more examples to our online demo.
  2. Set up a server that compiles Taichi and outputs WASM, possibly based on Jupyter notebooks.
  3. Fall back to asm.js when WASM is not available.
  4. Add API documentation for this project.

@yuanming-hu (Member, Author) commented Sep 21, 2020

Thanks for all the discussions here!

On the compiler side, so far there are two approaches to generating WASM/JS.

  • The first is Taichi -> C -> WASM; an initial step in this direction has been nicely done by @archibate.
  • In the long run, we also want to explore Taichi -> LLVM -> WASM. This direction has not started, but it has great potential. Specifically, since the LLVM backend already has good support for all the feature extensions (especially sparse computation), following this path will allow users to demonstrate really cool sparse computation tasks in the browser.

On the web development side, a cool thing we can do is host a TaichiHub website that allows users to share their WASM/JS programs generated by Taichi. Good references are https://allrgb.com/ and https://www.shadertoy.com/. I can help raise some money for hosting the TaichiHub website if that's necessary :-)

@archibate (Collaborator)

> Good references are https://allrgb.com/ and https://www.shadertoy.com/. I can help raise some money for hosting the TaichiHub website if that's necessary :-)

Hi, everyone! Here's my recent progress on TaichiHub: http://142857.red:3389/

@WenheLI commented Sep 28, 2020

> Good references are https://allrgb.com/ and https://www.shadertoy.com/. I can help raise some money for hosting the TaichiHub website if that's necessary :-)
>
> Hi, everyone! Here's my recent progress on TaichiHub: http://142857.red:3389/

We can host the web & service on vercel. It provides a global CDN and it's free! If you think it is a good option, I can help with the deployment.

@yuanming-hu (Member, Author)

> We can host the web & service on vercel. It provides a global CDN and it's free! If you think it is a good option, I can help with the deployment.

Free services are always good :-) We do need to run a Python program on the server, and potentially host a database to store the shader data. Does vercel support that?

@WenheLI commented Sep 28, 2020

> We can host the web & service on vercel. It provides a global CDN and it's free! If you think it is a good option, I can help with the deployment.
>
> Free services are always good :-) We do need to run a Python program on the server, and potentially host a database to store the shader data. Does vercel support that?

vercel provides serverless functions. It supports database connections and a Python runtime. We could use mongodb to store the data, as mongodb also provides a free hosted service. We can discuss the capabilities in detail, but in theory it is totally doable.

@rexwangcc (Collaborator) commented Sep 28, 2020

> Good references are https://allrgb.com/ and https://www.shadertoy.com/. I can help raise some money for hosting the TaichiHub website if that's necessary :-)
>
> Hi, everyone! Here's my recent progress on TaichiHub: http://142857.red:3389/
>
> We can host the web & service on vercel. It provides a global CDN and it's free! If you think it is a good option, I can help with the deployment.

@WenheLI TIL that zeit.co/now has been re-branded to vercel!

@yuanming-hu If we go with a serverless solution instead of containers or a hosted service, it looks like https://vercel.com/docs/serverless-functions/supported-languages provides services similar to AWS Lambda Functions. (We might deploy the website on it later as well, in case we need to improve access speed.)

@yuanming-hu (Member, Author)

> TIL that zeit.co/now has been re-branded to vercel!
>
> @yuanming-hu If we go with a serverless solution instead of containers or a hosted service, it looks like https://vercel.com/docs/serverless-functions/supported-languages provides services similar to AWS Lambda Functions. (We might deploy the website on it later as well, in case we need to improve access speed.)

Wow, sounds like a really nice fit! They also seem to support Python dependencies: https://vercel.com/docs/runtimes#official-runtimes/python/python-dependencies, so we can then just add taichi to requirements.txt. The global CDN feature also sounds nice; it seems that China/US pings are always 300+ms. @archibate what do you think?

@archibate (Collaborator)

> @archibate what do you think?

It would be nice to have a free server! My concern is whether vercel supports installing Emscripten (emcc) as a dependency.
Here's a list of requirements to host TaichiHub:

  1. A /tmp directory; the ActionRecorder needs a place to emit C source files.
  2. Emscripten installed; the server should launch emcc -c /tmp/hello.c -o /tmp/hello.js when requested.
  3. Somewhere to store user shaders so that they can be shared and show up in the gallery.
  4. Caching of the compiled .js and .wasm files for gallery shaders; otherwise it wastes resources when ~10 users request the same shader.
  5. Performance that isn't too low on the free edition, ideally <5s per compilation.

If they provide this support, great! We can host TaichiHub there.
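Requirement 2 could be wrapped in a small helper; the sketch below only assembles the quoted emcc invocation (assuming Emscripten's emcc is installed and on PATH) without actually running it.

```python
# Sketch for requirement 2: build the quoted emcc invocation.
# Assumes emcc is installed and on PATH; the command is only
# constructed here, not executed.
import subprocess

def emcc_command(src, out):
    # Mirrors the command quoted above: emcc -c <src> -o <out>
    return ["emcc", "-c", src, "-o", out]

cmd = emcc_command("/tmp/hello.c", "/tmp/hello.js")
# subprocess.run(cmd, check=True)  # would invoke the real compiler
```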

@WenheLI commented Sep 28, 2020

@archibate We only need to investigate whether vercel allows installing external packages. For the cache, we can use a database to handle it; persistent storage can also be covered by the database.

And in the worst case, if we cannot host compilation on vercel, we can still host the frontend on it. It gives good speeds across the world.

@archibate (Collaborator) commented Sep 28, 2020

> And in the worst case, if we cannot host compilation on vercel, we can still host the frontend on it. It gives good speeds across the world.

Separating the frontend and backend is a nice idea! So here's our workflow:

  1. The user requests the frontend webpages from vercel (accelerated by the CDN).
  2. The user clicks the RUN button to send a request to the vercel server (accelerated by the CDN).
  3. The vercel server checks mongodb for cached WASM; if it is not cached:
  4. the vercel server sends a request to our non-free backend server;
  5. the backend server returns a WASM file as the response;
  6. the vercel server caches that WASM file in mongodb.
  7. The vercel server returns the WASM file to the user's client for execution.

The backend server could also be protected with a password so that only the vercel server can invoke it. WDYT?
If we reach an agreement, I'll transform my current setup on 142857.red into a backend server. Would you mind helping me move the frontend to vercel?

We may even make the frontend server non-Python (non-Flask), as its only job is to respond with HTML and redirect requests to our backend server, where Emscripten and Taichi are hosted.
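Steps 3 through 7 of the workflow above could be sketched like this, with a plain dict standing in for mongodb and a stub function standing in for the non-free backend compiler; it demonstrates that a cached shader never hits the backend twice.

```python
# Sketch of steps 3-7: a dict stands in for mongodb, and a stub
# function stands in for the non-free compilation backend.
wasm_cache = {}
backend_calls = []

def compile_on_backend(shader_source):
    # Placeholder for the request to the backend server (steps 4-5).
    backend_calls.append(shader_source)
    return b"\x00asm" + shader_source.encode()

def get_wasm(shader_source):
    if shader_source not in wasm_cache:                    # step 3: cache miss
        wasm_cache[shader_source] = compile_on_backend(shader_source)  # step 6
    return wasm_cache[shader_source]                       # step 7

first = get_wasm("@ti.kernel ...")
second = get_wasm("@ti.kernel ...")  # served from cache; backend not called again
```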

@WenheLI commented Sep 28, 2020

> Separating the frontend and backend is a nice idea! So here's our workflow: […]
> If we reach an agreement, I'll transform my current setup on 142857.red into a backend server. Would you mind helping me move the frontend to vercel?

Sounds like a plan; we can definitely do it!

6 participants