Update lm-format-enforcer to 0.10.1 #4631
Merged
This update adds support for configuring lm-format-enforcer via environment variables, which is ideal for the vLLM OpenAI Server.
It was requested by users of vLLM on the project's GitHub, as it lets them get better results when working through the OpenAI server without writing custom code.
There were no dependency or breaking API changes in the library between 0.9.8 (the version previously used by vLLM) and 0.10.1.
From the project's README:
Configuration options
LM Format Enforcer makes use of several heuristics to avoid edge cases that may happen with LLMs generating structured outputs.
There are two ways to control these heuristics:
Option 1: via Environment Variables
There are several environment variables that can be set, that affect the operation of the library. This method is useful when you don't want to modify the code, for example when using the library through the vLLM OpenAI server.
- `LMFE_MAX_CONSECUTIVE_WHITESPACES` - How many consecutive whitespaces are allowed when parsing JsonSchemaObjects. Default: 12.
- `LMFE_STRICT_JSON_FIELD_ORDER` - Should the JsonSchemaParser force the properties to appear in the same order as they appear in the 'required' list of the JsonSchema? (Note: this is consistent with the order of declaration in Pydantic models). Default: False.
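For example, when running the vLLM OpenAI-compatible server, these heuristics could be tuned by exporting the variables before launching the server. This is a sketch; the values and the model name are illustrative, and the launch command assumes the standard vLLM entrypoint:

```shell
# Allow up to 20 consecutive whitespaces when parsing JSON output
# (illustrative value; default is 12)
export LMFE_MAX_CONSECUTIVE_WHITESPACES=20

# Force JSON properties to follow the order of the schema's 'required' list
export LMFE_STRICT_JSON_FIELD_ORDER=true

# Launch the vLLM OpenAI-compatible server (model name is an example)
python -m vllm.entrypoints.openai.api_server --model facebook/opt-125m
```

Because the variables are read from the environment, no change to vLLM's code or the client's request payload is needed.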