# StreamDeploy LLM App Scaffold

This repo contains a pre-architected, production-ready LLM application that can be easily deployed to the cloud, thanks to the independent containers used for the frontend, the backend, and the LLM service (Ollama).
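As an illustration of that three-container layout, a `docker-compose.yml` for this kind of setup might look roughly like the sketch below. The service names, build paths, and host ports here are assumptions for illustration, not taken from the repo's actual compose file:

```yaml
# Hypothetical sketch of the three-service layout; the real
# docker-compose.yml in the repo may use different names and ports.
services:
  frontend:
    build: ./frontend      # assumed frontend directory
    ports:
      - "3000:3000"
    depends_on:
      - backend
  backend:
    build: ./backend       # assumed backend directory
    ports:
      - "8000:8000"
    depends_on:
      - ollama
  ollama:
    image: ollama/ollama   # official Ollama image
    ports:
      - "11434:11434"      # Ollama's default API port
```

Keeping the three services in separate containers lets each one be scaled, rebuilt, or replaced independently.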

Run the following from the root of the `llm-app-scaffold` directory:

```shell
docker-compose up --build
```

To use a model from Ollama, pull it first with `ollama pull`. For example, to use Mistral, run the following:

```shell
ollama pull mistral
```

Then run the `docker-compose` command above.
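Once the stack is up, a client (for example, the backend container) can reach the Ollama service over its HTTP API. The sketch below is a minimal example, assuming Ollama is reachable at its default address `localhost:11434`; the `build_payload` and `generate` helpers are illustrative names, not part of the scaffold:

```python
import json
import urllib.request

# Default address of the Ollama API; inside docker-compose the backend
# would typically use the service name (e.g. http://ollama:11434) instead.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks Ollama for a single JSON response
    # instead of a stream of partial chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the Ollama service and return the generated text."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

For example, after `ollama pull mistral` and `docker-compose up --build`, calling `generate("mistral", "Say hello")` would return the model's completion as a string.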

*(Screenshot: LLM Application Scaffold)*