
ROCm/vllm


About

A high-throughput and memory-efficient inference and serving engine for LLMs
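Below is a minimal sketch of offline batch inference with the vLLM Python API; the model name is a placeholder and any supported Hugging Face model could be substituted. The same interface is assumed to apply to the ROCm build.

    # Minimal sketch of offline batch inference with vLLM's Python API.
    # The model name below is a placeholder, not a recommendation.
    from vllm import LLM, SamplingParams

    prompts = ["Hello, my name is", "The capital of France is"]
    sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

    # Load the model and run batched generation; vLLM handles KV-cache paging
    # and continuous batching internally.
    llm = LLM(model="facebook/opt-125m")
    outputs = llm.generate(prompts, sampling_params)

    for output in outputs:
        print(output.prompt, "->", output.outputs[0].text)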

Resources

License

Code of conduct

Security policy


Languages

  • Python 85.3%
  • Cuda 9.4%
  • C++ 3.7%
  • Shell 0.7%
  • C 0.5%
  • CMake 0.3%
  • Other 0.1%