feat: customize model's max_output_tokens #428

Merged
merged 1 commit into main from feat on Apr 23, 2024
Conversation

sigoden (Owner) commented on Apr 23, 2024:

clients:
  - type: ollama
    api_base: http://localhost:11434
    models:
      - name: llama3
        max_input_tokens: 8192
        max_output_tokens: 4096
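
As a rough illustration of how a per-model `max_output_tokens` entry like the one above could be read from YAML, here is a minimal Rust sketch using serde and serde_yaml. The struct and field names are assumptions for this example only, not aichat's actual implementation.

    // Hypothetical sketch: deserializing a clients/models config with a
    // per-model max_output_tokens field. Not aichat's real code.
    use serde::Deserialize;

    #[derive(Debug, Deserialize)]
    struct Config {
        clients: Vec<ClientConfig>,
    }

    #[derive(Debug, Deserialize)]
    struct ClientConfig {
        r#type: String,
        api_base: Option<String>,
        #[serde(default)]
        models: Vec<ModelConfig>,
    }

    #[derive(Debug, Deserialize)]
    struct ModelConfig {
        name: String,
        max_input_tokens: Option<i64>,
        // The setting introduced by this PR: caps how many tokens the model may generate.
        max_output_tokens: Option<i64>,
    }

    fn main() -> Result<(), serde_yaml::Error> {
        let yaml = r#"
    clients:
      - type: ollama
        api_base: http://localhost:11434
        models:
          - name: llama3
            max_input_tokens: 8192
            max_output_tokens: 4096
    "#;
        let config: Config = serde_yaml::from_str(yaml)?;
        println!("{:#?}", config);
        Ok(())
    }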

sigoden merged commit d1aafa1 into main Apr 23, 2024
3 checks passed
sigoden deleted the feat branch on April 23, 2024 at 08:46