
ollama-api

Here are 20 public repositories matching this topic...

Predictive Prompt is a simple large language model (LLM) chat window with retro styling. It dynamically populates a dropdown with the models available from a local Ollama instance and uses the streaming API to generate and display results in real time. The output is rendered as markdown with syntax highlighting support.

  • Updated Oct 14, 2024
  • TypeScript
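The two Ollama endpoints a client like this relies on are `GET /api/tags` (to list locally available models for the dropdown) and `POST /api/generate` with `stream: true` (which returns newline-delimited JSON chunks). A minimal TypeScript sketch of that flow, assuming the default Ollama address `http://localhost:11434` and illustrative function names (`parseChunk`, `listModels`, `generate`) that are not taken from the repository:

```typescript
// Each streamed line from /api/generate is a JSON object; `response` carries
// the next token fragment and `done: true` marks the final chunk.
export function parseChunk(line: string): { text: string; done: boolean } {
  const obj = JSON.parse(line);
  return { text: obj.response ?? "", done: obj.done === true };
}

// List the model names available on a local Ollama instance (for the dropdown).
export async function listModels(base = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${base}/api/tags`);
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}

// Stream a completion, invoking onToken for each partial response so the
// UI can render output as it arrives.
export async function generate(
  model: string,
  prompt: string,
  onToken: (t: string) => void,
  base = "http://localhost:11434",
): Promise<void> {
  const res = await fetch(`${base}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buf = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buf += decoder.decode(value, { stream: true });
    const lines = buf.split("\n");
    buf = lines.pop() ?? ""; // keep any incomplete trailing line in the buffer
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = parseChunk(line);
      onToken(chunk.text);
      if (chunk.done) return;
    }
  }
}
```

Buffering on newline boundaries matters because a network read may split a JSON chunk mid-line; parsing only complete lines keeps the stream robust.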
