Closed as not planned
I was wondering why we need to explicitly pass a tool parser as an argument when loading models with vllm serve.
I understand that the response format can depend on the model (e.g. <python_tag> for Llama and [TOOL] for Mistral).
But looking at Hugging Face TGI, it doesn't seem to have or need anything like that.
So would it be possible to have a default parser that looks for JSON-esque tokens when auto tool choice is enabled but no parser is passed?
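For context, this is roughly the invocation in question, with the parser passed explicitly (flag names taken from the vLLM serving docs; the model and parser choice here are just illustrative):

```shell
# Serving a Mistral model with auto tool choice requires naming
# a model-specific parser up front:
vllm serve mistralai/Mistral-7B-Instruct-v0.3 \
  --enable-auto-tool-choice \
  --tool-call-parser mistral
```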
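To make the proposal concrete, here is a minimal sketch of what such a fallback parser might look like, assuming tool calls show up as JSON objects with a "name" and an "arguments"/"parameters" key somewhere in the raw model output (the function name and heuristics are hypothetical, not vLLM API):

```python
import json


def extract_tool_calls(text: str) -> list[dict]:
    """Hypothetical fallback parser: scan raw model output for
    JSON-esque tool-call objects when no model-specific parser
    is configured."""
    calls = []
    depth = 0
    start = None
    # Find balanced {...} spans with a simple brace counter, then
    # try to parse each span as JSON.
    for i, ch in enumerate(text):
        if ch == "{":
            if depth == 0:
                start = i
            depth += 1
        elif ch == "}" and depth > 0:
            depth -= 1
            if depth == 0 and start is not None:
                try:
                    obj = json.loads(text[start:i + 1])
                except json.JSONDecodeError:
                    continue
                # Heuristic: objects carrying a "name" plus
                # "arguments"/"parameters" look like tool calls.
                if isinstance(obj, dict) and "name" in obj and (
                    "arguments" in obj or "parameters" in obj
                ):
                    calls.append(obj)
    return calls


# Model-specific wrapper tokens (<python_tag>, [TOOL_CALLS], ...)
# are simply ignored, so one parser covers several formats:
out = '<python_tag>{"name": "get_weather", "arguments": {"city": "Paris"}}'
print(extract_tool_calls(out))
# → [{'name': 'get_weather', 'arguments': {'city': 'Paris'}}]
```

This would obviously be less robust than a parser tuned to each model's grammar (e.g. it ignores streaming and malformed partial JSON), but it might be enough as a default when no parser is passed.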
Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.