
Wasm support #102

Closed
Boscop opened this issue Aug 25, 2023 · 15 comments · Fixed by #122
Labels
enhancement New feature or request

Comments


Boscop commented Aug 25, 2023

Thanks for making this crate, it seems very useful :)

Currently the crate always depends on e.g. tokio which means it can't be compiled to wasm for use in frontends (or serverless wasm workers like on AWS/Cloudflare) that want to make OpenAI API requests.
It would be great if this crate could also be compiled to wasm.

64bit (Owner) commented Aug 26, 2023

Thank you 😌

Given that reqwest supports wasm, I'd like to have wasm support too.

I'm not very familiar with the wasm ecosystem, but it seems tokio has work in progress for it: tokio-rs/tokio#4827

Perhaps a feature flag to switch between tokio and wasm would be a good initial starting point for supporting wasm?
I'd love your input/ideas on the implementation.

64bit added the enhancement label Aug 26, 2023
Boscop (Author) commented Aug 26, 2023

Yes, tokio could be an optional feature.
Wasm doesn't need the tokio runtime (and it wouldn't be desirable because of the bloat), but reqwest works on wasm as well.
Not sure which other deps are only needed outside of wasm.
If there are more deps that aren't needed on wasm, you could put them all under one feature named "runtime".
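A feature layout along these lines might work. This is a hypothetical Cargo.toml sketch, not the crate's actual manifest; the "runtime" feature name and the tokio feature list are illustrative:

```toml
# Hypothetical sketch: a default "runtime" feature pulls in tokio,
# so wasm consumers can opt out of it entirely.
[features]
default = ["runtime"]
runtime = ["dep:tokio"]

[dependencies]
tokio = { version = "1", features = ["fs", "io-util"], optional = true }
```

A wasm consumer would then depend on the crate with `default-features = false` to drop the tokio dependency.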


cosmikwolf commented Sep 1, 2023

I would love to be able to implement a different async solution.

I am building a new non-async app that uses this crate to make async calls by blocking with futures::executor's block_on; then I realized the tokio requirement.

I think futures-rs would be a great choice for async, as it is also compatible with no_std environments. I would love to have a no_std async access to the openai api.

https://github.com/rust-lang/futures-rs
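As a rough illustration of the block_on idea: the sketch below is a minimal, std-only single-future executor in the spirit of futures::executor::block_on. It is not code from this crate, just a demonstration of blocking on a future without any runtime:

```rust
use std::future::Future;
use std::sync::{Arc, Condvar, Mutex};
use std::task::{Context, Poll, Wake, Waker};

// Parks the calling thread until the future's waker fires.
struct Parker {
    ready: Mutex<bool>,
    cvar: Condvar,
}

impl Wake for Parker {
    fn wake(self: Arc<Self>) {
        *self.ready.lock().unwrap() = true;
        self.cvar.notify_one();
    }
}

// Minimal block_on: poll the future, sleep until woken, repeat.
fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = Box::pin(fut);
    let parker = Arc::new(Parker { ready: Mutex::new(false), cvar: Condvar::new() });
    let waker = Waker::from(parker.clone());
    let mut cx = Context::from_waker(&waker);
    loop {
        if let Poll::Ready(out) = fut.as_mut().poll(&mut cx) {
            return out;
        }
        let mut ready = parker.ready.lock().unwrap();
        while !*ready {
            ready = parker.cvar.wait(ready).unwrap();
        }
        *ready = false;
    }
}

fn main() {
    // An immediately-ready future needs no waker round-trip.
    let value = block_on(async { 40 + 2 });
    println!("{value}");
}
```

This relies only on `std::task::Wake` (stable since Rust 1.51), which is why futures-rs can offer the same primitive without pulling in a full runtime like tokio.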

My use case being personal devices that connect to the openAI API for voice to text, and then to chatGPT.

I think wasm devs would also appreciate using this crate.

I may try to help here at some point in the near future. I am currently making a bot to make code upgrades automatically, using your library, so maybe I will point it in this direction to test it out.

64bit (Owner) commented Sep 4, 2023

Hi @cosmikwolf

It seems that support for a different async executor should be a separate issue?
Is it somehow related to WASM as well?

@Doordashcon (Contributor)

Hello @64bit, I have little experience with WASM architecture but would like to pick this up in the coming week.

64bit (Owner) commented Sep 7, 2023

Thank you @cosmikwolf and @Doordashcon for offering to contribute!

I'll let you guys coordinate on this thread.

To consider this resolved we should at least have one working example for AWS lambda or Cloudflare ( or both if you're feeling adventurous :))

ifsheldon (Contributor) commented Oct 15, 2023

+1. I skimmed through the code searching for tokio, and it seems most of the usage relates to files. So probably the easiest first step is to gate file-related ops behind a feature with an optional tokio dependency. Those who want to upload/download audio/images would have to wait a while, but text-only functions should just work on wasm, I guess?

Update:
except this one (and only this one), I guess:

`pub(crate) async fn stream<O>(`
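One way to compile out a tokio-backed helper like that is a cfg feature gate. A toy sketch of the mechanism (the "stream" feature name is hypothetical, not an actual async-openai flag):

```rust
// Sketch only: when the hypothetical "stream" feature is enabled,
// the tokio-backed streaming path is compiled; otherwise it is
// excluded entirely, keeping wasm builds free of tokio.
#[cfg(feature = "stream")]
fn completion_transport() -> &'static str {
    "streaming (tokio)"
}

#[cfg(not(feature = "stream"))]
fn completion_transport() -> &'static str {
    "non-streaming (wasm-friendly)"
}

fn main() {
    // Which function exists is decided at compile time by the feature set.
    println!("{}", completion_transport());
}
```

Because the gated item is removed at compile time rather than at runtime, the wasm build never even links tokio's stream machinery.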

64bit (Owner) commented Oct 15, 2023

Getting started without streaming and files support, but testable through examples, would still be a good first step!

@ifsheldon (Contributor)

Hi all! If you can help test #120 and/or try it on wasm, that would be great.

64bit (Owner) commented Nov 26, 2023

Updates from release notes in 0.17.0:

WASM support lives in the experiments branch. To use it, please pin directly to a git sha in your Cargo.toml. Any discussion or issues related to WASM are welcome in #102. Any WASM-related PRs are welcome in the experiments branch.
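Pinning to a git sha might look like the following sketch. The `<commit-sha>` is a placeholder to be replaced with an actual commit from the experiments branch, not a real value:

```toml
# Illustrative only: substitute a real commit sha from the
# experiments branch for <commit-sha>.
[dependencies]
async-openai = { git = "https://github.com/64bit/async-openai", rev = "<commit-sha>" }
```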

ifsheldon (Contributor) commented Jan 8, 2024

I am maintaining the code for wasm support, and I am trying to stabilize the wasm target(s) in main. So perhaps we can discuss a more detailed plan here?

Current State

wasm32-unknown-unknown is already working. See the example openai-web-app. To my knowledge, WASI support should just work, since wasm32-unknown-unknown is the bare minimum.

Implementation Plan (not complete)

If you have something in mind, please make a comment or help out the implementation.

Tracking List:

@ifsheldon (Contributor)

Since I want to publish a crate that depends on the WASM feature of async-openai, I need to publish WASM support first, so I'd like to publish my fork on crates.io under the name async-openai-wasm soon.

64bit (Owner) commented Apr 28, 2024

Thank you for the heads up. I think that's a good way forward to keep it sustainable.

Do you plan to make it permanent? If so, you're welcome to link the new crate in the README. Then we can also close the experiments branch and remove the related doc from the README.

@ifsheldon (Contributor)

Yeah, I've made a PR. Thanks!

64bit (Owner) commented Jun 5, 2024

WASM support has a new home: https://github.com/ifsheldon/async-openai-wasm, hence closing.

64bit closed this as completed Jun 5, 2024