
WASM support for training #1254

Open
Tracked by #1256
nathanielsimard opened this issue Feb 4, 2024 · 4 comments
Labels
enhancement Enhance existing features

Comments

@nathanielsimard
Member

nathanielsimard commented Feb 4, 2024

Support training on the wasm target. For this, we probably need to have alternative implementations of file checkpointers, file loggers, and similar components. It's not clear whether we want to support training in the browser, or if supporting wasm runtimes alone is sufficient. In the latter case, we could use an alternative file system API provided by such a runtime.

Requires #1253
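A minimal sketch of what "alternative implementations of file checkpointers" could look like: a storage trait with an in-memory backend that works on targets without a file system, such as `wasm32-unknown-unknown`. The trait name and methods here are assumptions for illustration, not Burn's actual API.

```rust
use std::collections::HashMap;

/// Abstraction over checkpoint storage, so training can run on targets
/// where no file system is available (hypothetical sketch, not Burn's API).
trait CheckpointStore {
    fn save(&mut self, epoch: usize, bytes: Vec<u8>);
    fn load(&self, epoch: usize) -> Option<Vec<u8>>;
}

/// In-memory backend usable on any target, including the browser.
struct MemoryStore {
    checkpoints: HashMap<usize, Vec<u8>>,
}

impl MemoryStore {
    fn new() -> Self {
        Self { checkpoints: HashMap::new() }
    }
}

impl CheckpointStore for MemoryStore {
    fn save(&mut self, epoch: usize, bytes: Vec<u8>) {
        self.checkpoints.insert(epoch, bytes);
    }

    fn load(&self, epoch: usize) -> Option<Vec<u8>> {
        self.checkpoints.get(&epoch).cloned()
    }
}

fn main() {
    let mut store = MemoryStore::new();
    store.save(1, vec![1, 2, 3]);
    assert_eq!(store.load(1), Some(vec![1, 2, 3]));
    assert_eq!(store.load(2), None);
    println!("checkpoint roundtrip ok");
}
```

A file-backed implementation of the same trait would then be the default on native targets, with the trainer generic over `CheckpointStore`.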

@antimora
Collaborator

antimora commented Feb 4, 2024

I am personally not in favor of supporting training in the browser. It is such a niche use case that the code base would have to be reworked to accommodate it.

@Luni-4
Collaborator

Luni-4 commented Feb 4, 2024

In my opinion, we can support training in the browser at a later time in a new crate called burn-wasm. That way we do not have to change the other burn crates, but can create an independent piece of software whose goal is R&D: testing the potential of training neural networks inside a browser. So for me it is a yes, but with the label of R&D.

@nathanielsimard
Member Author

I think we can actually support training in the browser without having to change much of the architecture. It's similar to how we can support no-std: by having primitive type stubs in burn-common. So when #1250 is done, it wouldn't be that hard to support training on the web.

We have to keep in mind that supporting wasm can be beneficial as a deployment format, which can be used similarly to Docker, so it can be useful.
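The "primitive type stubs" approach mentioned above could look like target-gated modules that expose the same function on every platform, in the spirit of how no-std support swaps implementations. The module and function names below are illustrative assumptions, not burn-common's actual API.

```rust
// Hypothetical sketch: same logging interface on native and wasm targets,
// selected at compile time with cfg attributes.

#[cfg(not(target_family = "wasm"))]
mod platform {
    use std::io::Write;

    /// On native targets, append log lines to a file.
    pub fn log_line(line: &str) -> std::io::Result<()> {
        let mut f = std::fs::OpenOptions::new()
            .create(true)
            .append(true)
            .open("train.log")?;
        writeln!(f, "{line}")
    }
}

#[cfg(target_family = "wasm")]
mod platform {
    /// On wasm, fall back to stdout; a real implementation might call
    /// into the JS console or a wasm runtime's WASI file system instead.
    pub fn log_line(line: &str) -> std::io::Result<()> {
        println!("{line}");
        Ok(())
    }
}

fn main() {
    // Training code calls the same function regardless of target.
    platform::log_line("epoch 1: loss 0.42").unwrap();
}
```

With this pattern, burn-train would depend only on the shared interface, and the wasm-specific implementation would live behind the cfg gate.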

@ArthurBrussee
Contributor

ArthurBrussee commented Mar 4, 2024

Also just to chime in: I imagine training massive LLMs or even big convnets in the browser is niche. However, consider NeRF- and Gaussian-splat-like models, where training in a browser could be totally fine and useful. Generally, being able to use ML as "just" good numerical optimization is nice! It's not all about massive models.

Tbf, those types of models might not need much from burn-train anyway.

@antimora antimora changed the title [burn-train] wasm support WASM support for training Mar 29, 2024
@antimora antimora added the enhancement Enhance existing features label Mar 29, 2024