forked from vllm-project/vllm
[Bugfix] fix crash if max_tokens=None (vllm-project#2570)

Commit 33ef7a6 (1 parent: 7ce229f)
Showing 2 changed files with 26 additions and 0 deletions.
@@ -0,0 +1,13 @@
+"""Tests for the SamplingParams class.
+"""
+from vllm import SamplingParams
+
+
+def test_max_tokens_none():
+    """max_tokens=None should be allowed"""
+    SamplingParams(temperature=0.01, top_p=0.1, max_tokens=None)
+
+
+if __name__ == "__main__":
+    import pytest
+    pytest.main([__file__])
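The diff above shows only the regression test; the second changed file (the fix itself) is not reproduced here. A plausible shape for such a fix is to resolve `max_tokens=None` to the remaining room in the model's context window before any arithmetic uses it. The sketch below is illustrative only: `resolve_max_tokens`, `prompt_len`, and `context_len` are hypothetical names, not vLLM's actual API.

```python
from typing import Optional


def resolve_max_tokens(max_tokens: Optional[int],
                       prompt_len: int,
                       context_len: int) -> int:
    """Illustrative helper (not vLLM's real code): turn max_tokens=None
    into a concrete token budget instead of crashing on None arithmetic."""
    if max_tokens is None:
        # Assumed default: generate until the context window is full.
        return max(context_len - prompt_len, 0)
    return max_tokens


# A 4096-token context with a 100-token prompt leaves a 3996-token budget;
# an explicit max_tokens is passed through unchanged.
print(resolve_max_tokens(None, 100, 4096))
print(resolve_max_tokens(16, 100, 4096))
```

With a guard like this in place, `SamplingParams(..., max_tokens=None)` can be accepted at construction time, which is exactly what the test above asserts.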