Summary
There are many LLM inference libraries. WasmEdge has already integrated llama.cpp, but we want to bring more of them to the community.
Details
Already supported:
PyTorch
TFLite
OpenVINO
llama.cpp
The support priority list:
Tier 1:
burn-rs
Tier 2:
Intel Extension for Transformers
whisper.cpp
RWKV
Tier 3:
vllm
CTranslate2
candle
mlx
Please feel free to add comments and suggestions; we would like to hear the community's voice.
Also, if you are interested in contributing an integration for a new LLM inference library, please let us know :-)
Happy new year!