Pull requests: vllm-project/vllm

[Core][Optimization] remove vllm-nccl
#5091 opened May 28, 2024 by youkaichao

New vllm CLI
#5090 opened May 28, 2024 by EthanqX

[Model] Support MAP-NEO model
#5081 opened May 28, 2024 by xingweiqu

[WIP] Hete spec decode
#5065 opened May 27, 2024 by jiqing-feng (Draft)

[Model] Add Internlm2 LoRA support
#5064 opened May 27, 2024 by Isotr0py

[Frontend] Add tokenize/detokenize endpoints
#5054 opened May 26, 2024 by sasha0552

Chat method for offline llm
#5049 opened May 25, 2024 by nunjunj

Bump version to v0.4.3
#5046 opened May 25, 2024 by simon-mo

ci draft
#5040 opened May 24, 2024 by khluu (Draft)

[FRONTEND] OpenAI tools support named functions
#5032 opened May 24, 2024 by br3no

[BUGFIX] [FRONTEND] Correct chat logprobs
#5029 opened May 24, 2024 by br3no