OpenAI-Formatted Response APIs

fingerthief edited this page May 11, 2024 · 2 revisions

Integration with OpenAI-Formatted Response APIs

MinimalChat supports integration with any API endpoint that returns responses formatted according to OpenAI's specifications. This feature allows users to connect with a variety of language models hosted externally, providing flexibility and extending the capabilities of the app.

Note

The API endpoint you enter in the settings should be the Base URL only: just the base address and port number.

  • Example: https://192.168.5.30:1234 with nothing after the port number.
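One way to see why only the base address belongs in this field: an OpenAI-compatible client appends the standard request path itself. The snippet below (plain Python, illustrative only; the address is the example value from above) sketches the joining a typical client performs, and what goes wrong if a full path is entered instead.

```python
# Base URL as it should be entered: address and port only.
base_url = "https://192.168.5.30:1234"

# An OpenAI-compatible client appends the standard path on its own:
endpoint = base_url.rstrip("/") + "/v1/chat/completions"
print(endpoint)  # https://192.168.5.30:1234/v1/chat/completions

# Entering a full path in the settings field would therefore double it:
wrong_entry = "https://192.168.5.30:1234/v1/chat/completions"
doubled = wrong_entry.rstrip("/") + "/v1/chat/completions"
print(doubled)  # note the repeated /v1/chat/completions segment
```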

Configuring API Endpoints in MinimalChat

To integrate an external API, follow these steps in the settings panel of MinimalChat:

  1. Model Name: Enter the identifier for the language model you wish to use. This is typically a specific name or value that uniquely identifies the model on the hosting service.

    • Example: If using LM Studio with the DeepSeek Coder model hosted on Hugging Face, the model identifier is LoneStriker/deepseek-coder-7b-instruct-v1.5-GGUF. Enter this value into the Model Name field.
  2. API Endpoint: Specify the URL where the API is hosted. This URL is the endpoint to which MinimalChat will send requests to generate responses from the model.

    • Example: If using LM Studio, the API endpoint might be something like http://192.168.0.45:1234.
  3. API Key: Some APIs require an authentication key to access. Enter the API key provided by the service hosting the model.

    • Example: For LM Studio, the required API key would be lm-studio.
  4. Max Tokens: Define the maximum number of tokens that can be generated in a response. This setting helps manage the length of responses based on the model’s capabilities and the context window size.

    • Note: The default setting typically allows for about half of the model's maximum token limit, though this can be adjusted based on specific needs or model restrictions.
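The four settings above map directly onto the fields of a standard OpenAI-style chat-completions request. The sketch below (plain Python, using the example values from the steps above; the request body shape follows OpenAI's chat-completions convention) shows how a client such as MinimalChat might assemble that request — it only builds the URL and payload, it does not send anything.

```python
import json

# Settings as entered in MinimalChat (example values from the steps above)
base_url = "http://192.168.0.45:1234"  # API Endpoint: base address and port only
model = "LoneStriker/deepseek-coder-7b-instruct-v1.5-GGUF"  # Model Name
api_key = "lm-studio"  # API Key
max_tokens = 2048      # Max Tokens

# The client appends OpenAI's standard chat-completions path to the base URL
url = base_url.rstrip("/") + "/v1/chat/completions"

# The API key travels as a Bearer token, per OpenAI's convention
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_key}",
}

# OpenAI-formatted request body
payload = json.dumps({
    "model": model,
    "max_tokens": max_tokens,
    "messages": [{"role": "user", "content": "Hello!"}],
})

print(url)  # http://192.168.0.45:1234/v1/chat/completions
```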

Benefits of Using OpenAI-Formatted APIs

  • Flexibility: Connect with a wide range of models from different providers that adhere to OpenAI's response format.
  • Customization: Tailor the chat experience by selecting models that best fit the needs of your conversations or application.
  • Scalability: Easily switch between different models or update API settings to enhance capabilities as new models become available.

This integration feature empowers users to expand the functionality of MinimalChat beyond the built-in models, leveraging the vast landscape of AI language models available in the market.

Useful Applications and Services

API Services

  • Hugging Face Inference Endpoints allow you to spin up an endpoint server with the hardware configuration and model of your choice.
  • OpenRouter is an amazing service that lets you choose among a large number of open-source models and connect to them easily through their OpenAI-formatted API endpoint.
  • TogetherAI offers a good variety of models, with a particular emphasis on programming-focused models.

Software and Tools For Local Hosting

  • LM Studio is an awesome tool that lets you download models from Hugging Face to your machine, then load them and host a local API endpoint to communicate with the model.
  • text-generation-webui is another great tool that lets you load local models and host a local API endpoint for interacting with them. It also includes a nice UI for general use.