Update lm-format-enforcer to 0.10.1 #4631

Merged · 1 commit into vllm-project:main · May 6, 2024

Conversation

noamgat (Contributor) commented May 6, 2024

This update adds support for configuring lm-format-enforcer via environment variables, which is ideal for the vLLM OpenAI server.
It was requested by vLLM users on the project's GitHub, since it gives them a way to get better results when working through the OpenAI server without writing custom code.

There were no dependency changes or breaking API changes in the library between 0.9.8 (the version previously required by vLLM) and 0.10.1.

From the project's README:

Configuration options

LM Format Enforcer makes use of several heuristics to avoid edge cases that may happen with LLMs generating structured output.
There are two ways to control these heuristics:

Option 1: via Environment Variables

There are several environment variables that can be set which affect the operation of the library. This method is useful when you don't want to modify the code, for example when using the library through the vLLM OpenAI server.

  • LMFE_MAX_CONSECUTIVE_WHITESPACES - How many consecutive whitespaces are allowed when parsing JsonSchemaObjects. Default: 12.
  • LMFE_STRICT_JSON_FIELD_ORDER - Should the JsonSchemaParser force the properties to appear in the same order as they appear in the 'required' list of the JsonSchema? (Note: this is consistent with the order of declaration in Pydantic models). Default: False.
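For illustration only (not part of the PR or the README), here is a minimal sketch of how these variables might be set before launching the vLLM OpenAI server. The entrypoint, model name, and chosen values are assumptions for the example; the variable names and defaults come from the README quote above.

```python
# Illustrative sketch (assumed usage): put the LMFE_* variables into the
# environment of the vLLM OpenAI server process so lm-format-enforcer sees
# them. Values below are arbitrary examples, not recommendations.
import os
import subprocess

env = os.environ.copy()
env["LMFE_MAX_CONSECUTIVE_WHITESPACES"] = "6"   # library default: 12
env["LMFE_STRICT_JSON_FIELD_ORDER"] = "True"    # library default: False

# Entrypoint and model are assumptions chosen for the example.
subprocess.run(
    [
        "python", "-m", "vllm.entrypoints.openai.api_server",
        "--model", "facebook/opt-125m",
    ],
    env=env,
    check=True,
)
```

Exporting the same variables in the shell before starting the server achieves the same effect.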

This adds support for configuration via environment variables, ideal for vLLM OpenAI Server
@simon-mo simon-mo enabled auto-merge (squash) May 6, 2024 19:03
@simon-mo simon-mo merged commit bd99d22 into vllm-project:main May 6, 2024
59 checks passed
dtrifiro pushed a commit to opendatahub-io/vllm that referenced this pull request May 7, 2024
robertgshaw2-neuralmagic pushed a commit to neuralmagic/nm-vllm that referenced this pull request May 19, 2024
dtrifiro pushed a commit to dtrifiro/vllm that referenced this pull request May 21, 2024
Temirulan pushed a commit to Temirulan/vllm-whisper that referenced this pull request Sep 6, 2024