This repository was archived by the owner on Jul 4, 2025. It is now read-only.

feat: Add support for inference parameters  #77

@hiro-v

Description


Problem

  • I need to use Nitro as an API server for LLM inference
  • Add support for passing temperature, max_tokens, and penalty in API calls
  • Add support for stream: false (non-streaming responses)
  • Add support for stop sequences
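
As a sketch of what such a call might look like once supported (the endpoint path and port are assumptions based on Nitro exposing an OpenAI-compatible API, and `frequency_penalty` is one possible reading of "penalty"; none of these names are confirmed by this issue):

```python
import json

# Hypothetical request payload for an OpenAI-compatible chat completion
# endpoint (e.g. POST http://localhost:3928/v1/chat/completions -- the
# host, port, and path are assumptions, not confirmed by this issue).
payload = {
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.7,        # sampling temperature
    "max_tokens": 256,         # cap on generated tokens
    "frequency_penalty": 0.5,  # one form of the requested "penalty"
    "stream": False,           # disable token-by-token streaming
    "stop": ["\n\n"],          # stop sequence(s)
}

# Serialize to the JSON body that would be sent with the request.
body = json.dumps(payload)
print(body)
```

The payload mirrors the parameter names used by LM Studio and other OpenAI-compatible servers, which is presumably the convention the issue is asking Nitro to follow.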

Success Criteria

  • All of the listed inference parameters are accepted and applied by the API

Additional context

  • Check LM Studio's inference parameter settings for reference
(Screenshot: LM Studio inference parameters, 2023-10-16)
