Conversation

@codelion (Member)
fixes #278

Refactored all approach functions and classes to accept an optional request_config dictionary, allowing dynamic configuration of max_tokens for model requests. Updated server dispatch logic to pass request_config where appropriate. This change enables more flexible control over token limits for all LLM-based approaches.
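For illustration, here is a minimal sketch of the pattern this PR describes: an approach function that accepts an optional `request_config` dict and forwards its `max_tokens`, plus a server-side dispatch that passes the config through. The function names, the OpenAI-style client interface, and the default token limit below are assumptions for the example, not the actual optillm code.

```python
from typing import Optional

DEFAULT_MAX_TOKENS = 4096  # assumed fallback; the real default lives in optillm


def plansearch_approach(client, model: str, prompt: str,
                        request_config: Optional[dict] = None) -> str:
    """Hypothetical approach function showing the request_config pattern."""
    # Read max_tokens from the per-request config, falling back to a default.
    max_tokens = DEFAULT_MAX_TOKENS
    if request_config is not None:
        max_tokens = request_config.get("max_tokens", DEFAULT_MAX_TOKENS)

    # `client` is assumed to expose an OpenAI-style chat completions API.
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=max_tokens,
    )
    return response.choices[0].message.content


def dispatch(approach_fn, client, model: str, prompt: str,
             request_config: Optional[dict] = None) -> str:
    # Hypothetical server dispatch: forward request_config to the approach.
    return approach_fn(client, model, prompt, request_config=request_config)
```

Keeping `request_config` optional means existing call sites continue to work unchanged, while the server can override `max_tokens` per request.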
@codelion merged commit b8eafe4 into main on November 20, 2025 (3 checks passed).
@codelion deleted the fix-max-tokens branch on November 20, 2025 at 10:59.

Development

Successfully merging this pull request may close these issues.

optillm/plansearch.py bug