[Idea] Load model from File Blob #42

Closed
ngxson opened this issue May 16, 2024 · 0 comments · Fixed by #52
Comments

ngxson (Owner) commented May 16, 2024

With the introduction of heapfs, we can now do more low-level things.

The idea is to load a File Blob directly into wllama's heap without creating any intermediate buffer.

This will ultimately allow us to use OPFS, as mentioned in #38.
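
A minimal sketch of what this could look like, assuming an Emscripten-style `Module` that exposes `_malloc` and `HEAPU8` (the actual heapfs API in wllama may differ): the Blob is streamed chunk by chunk, and each chunk is copied straight into the wasm heap, so no intermediate ArrayBuffer holding the whole model is ever built on the JS side.

```ts
// Assumed Emscripten-style interface; the real wllama/heapfs bindings may differ.
interface EmscriptenModule {
  _malloc(size: number): number; // returns a pointer into the wasm heap
  HEAPU8: Uint8Array;            // byte view over the wasm heap
}

async function loadBlobIntoHeap(module: EmscriptenModule, blob: Blob): Promise<number> {
  // Reserve space for the whole model inside the wasm heap up front.
  const ptr = module._malloc(blob.size);
  let offset = 0;

  // Stream the Blob and copy each chunk directly into the heap,
  // so we never hold the full file in a separate JS buffer.
  const reader = blob.stream().getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done || !value) break;
    module.HEAPU8.set(value, ptr + offset);
    offset += value.length;
  }

  return ptr; // native code can read the model starting at this address
}
```

With OPFS, the same loop could consume a `File` obtained from a `FileSystemFileHandle.getFile()` call, since that is also a Blob.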
