internet llm - access your ollama (or any other local llm) instance from across the internet
Updated Jun 25, 2024 - Go
HTTP proxy that exposes Vertex AI through ollama's REST API interface, optionally forwarding requests for other models to a local ollama instance. Written in Go.
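The core of such a proxy is a routing decision on the `model` field of an incoming ollama-style request: model names matching the Vertex AI family go to a translating backend, everything else is forwarded to ollama unchanged. A minimal sketch of that routing logic, assuming hypothetical backend URLs and a hypothetical `gemini` model-name prefix (the real project's routing rules may differ):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/http/httputil"
	"net/url"
	"strings"
)

// ollamaRequest mirrors the "model" field shared by ollama's
// /api/generate and /api/chat request bodies.
type ollamaRequest struct {
	Model string `json:"model"`
}

// pickBackend returns the upstream base URL for a given model name.
// Both URLs and the "gemini" prefix are illustrative assumptions.
func pickBackend(model string) string {
	if strings.HasPrefix(model, "gemini") {
		return "http://localhost:8081" // backend translating to Vertex AI
	}
	return "http://localhost:11434" // ollama's default port
}

// proxyHandler reads the body to inspect the model name, then hands the
// request to a reverse proxy pointed at the chosen backend.
func proxyHandler(w http.ResponseWriter, r *http.Request) {
	body, err := io.ReadAll(r.Body)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	var req ollamaRequest
	_ = json.Unmarshal(body, &req) // non-JSON bodies fall through to ollama
	target, _ := url.Parse(pickBackend(req.Model))
	r.Body = io.NopCloser(bytes.NewReader(body)) // restore body for forwarding
	httputil.NewSingleHostReverseProxy(target).ServeHTTP(w, r)
}

func main() {
	// Demonstrate the routing decision without starting a server.
	for _, m := range []string{"gemini-1.5-pro", "llama3"} {
		fmt.Printf("%s -> %s\n", m, pickBackend(m))
	}
	_ = proxyHandler // wire into http.HandleFunc("/api/", proxyHandler) to serve
}
```

To run it as an actual proxy, register `proxyHandler` with `http.HandleFunc("/api/", ...)` and call `http.ListenAndServe`; `httputil.NewSingleHostReverseProxy` then takes care of header rewriting and streaming the upstream response back to the client.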