
@kmaherx (Contributor) commented Oct 2, 2025

See related issue #155.

kmaherx and others added 3 commits October 2, 2025 14:17
Co-authored-by: Simon Schrader <simonschrader96@gmail.com>
@CLAassistant commented Oct 2, 2025

CLA assistant check
All committers have signed the CLA.

@kmaherx (Contributor, Author) commented Oct 4, 2025

Accidentally committed some changes from an unrelated fix (which I'll open a separate issue for later).
Have rolled back to the original relevant commits (da8a0ed).

@SrGonao (Collaborator) commented Oct 5, 2025

Could you pin the vLLM version to at least 0.10.2? Otherwise the new offline changes will not work.

PR #18800: https://github.com/vllm-project/vllm/releases
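For context, a minimal sketch of a runtime guard expressing that constraint (hypothetical and illustrative only; the change requested here is just a dependency pin to `vllm>=0.10.2`, and this snippet assumes the `packaging` library is available):

```python
# Hypothetical guard, not part of this PR: fail early if the installed vLLM
# is too old for the offline backend.
from importlib.metadata import version  # stdlib (Python >= 3.8)

from packaging.version import Version  # assumed available alongside pip

MIN_VLLM = Version("0.10.2")  # offline changes rely on vllm-project/vllm PR #18800

installed = Version(version("vllm"))
if installed < MIN_VLLM:
    raise RuntimeError(
        f"vLLM {installed} is installed, but the offline backend needs "
        f">= {MIN_VLLM}; upgrade with: pip install 'vllm>=0.10.2'"
    )
```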

@kmaherx (Contributor, Author) commented Oct 5, 2025

Sure thing, have updated the vLLM version requirements.

SrGonao merged commit 8a79bb7 into EleutherAI:main on Oct 8, 2025. 3 of 4 checks passed.