Go Docker GenAI Stack 🩵🐳🤖🦜🔗🦙

👋 You can use this project with Visual Studio Code Dev Containers. Take a look at the .devcontainer.json file. The Docker image is defined in this repository: https://github.com/genai-for-all/go-workspace.
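For orientation, a minimal .devcontainer.json could look like the sketch below; the image tag and the extension list are assumptions, so check the actual file in this repository for the real values.

// Hypothetical minimal .devcontainer.json; the image reference and extensions
// are assumptions, not copied from this repository.
{
  "name": "go-docker-genai-stack",
  "image": "ghcr.io/genai-for-all/go-workspace:latest",
  "customizations": {
    "vscode": {
      "extensions": ["golang.go"]
    }
  }
}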

Run all in containers

HTTP_PORT=8888 LLM=deepseek-coder OLLAMA_BASE_URL=http://ollama:11434 docker compose --profile container up

On the first run only, wait for the model to finish downloading.
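The --profile flag and the environment variables suggest a compose file roughly shaped like the sketch below; the service names, ports, and layout are assumptions, and the compose.yaml in this repository is the source of truth.

# Hypothetical shape of the compose file implied by the profiles and variables above.
services:
  ollama:
    image: ollama/ollama
    profiles: ["container"]                # started only with --profile container
    ports:
      - "11434:11434"

  web-app:
    build: .
    profiles: ["container", "webapp"]      # started by both profiles
    ports:
      - "${HTTP_PORT}:${HTTP_PORT}"
    environment:
      - HTTP_PORT=${HTTP_PORT}
      - LLM=${LLM}
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}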

Use the native Ollama install (e.g. on macOS)

Pull the model (first time only):

export LLM=deepseek-coder
ollama pull ${LLM}

Then start the web app:

HTTP_PORT=8888 LLM=deepseek-coder OLLAMA_BASE_URL=http://host.docker.internal:11434 docker compose --profile webapp up

Use the GPU from the Ollama container on Linux or Windows

🚧 This is a work in progress

Query Ollama

curl -H "Content-Type: application/json" http://localhost:8080/prompt \
-d '{
  "question": "what are structs in Golang?"
}'
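If you prefer to call the endpoint from Go instead of curl, a sketch like the one below sends the same JSON body; the response is assumed to be plain text and is simply printed.

// Hypothetical Go client for the /prompt endpoint; the request body mirrors
// the curl example above, and the response handling is an assumption.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Same JSON body as the curl example.
	payload, err := json.Marshal(map[string]string{
		"question": "what are structs in Golang?",
	})
	if err != nil {
		panic(err)
	}

	// Port 8888 matches the HTTP_PORT value used in the compose commands.
	resp, err := http.Post("http://localhost:8888/prompt", "application/json", bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	answer, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(answer))
}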

Rebuild the WebApp image

All in containers

HTTP_PORT=8888 LLM=deepseek-coder OLLAMA_BASE_URL=http://ollama:11434 docker compose --profile container up --build

Use the Ollama local install (e.g. on macOS)

HTTP_PORT=8888 LLM=deepseek-coder OLLAMA_BASE_URL=http://host.docker.internal:11434 docker compose --profile webapp up --build
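The --build flag rebuilds the web app image from its Dockerfile. For orientation, a typical multi-stage Go Dockerfile might look like the sketch below; the Go version, paths, and binary name are assumptions, and the repository's own Dockerfile may differ.

# Hypothetical multi-stage build for the web app; details are assumptions.
FROM golang:1.22-alpine AS builder
WORKDIR /app
COPY . .
RUN go build -o webapp .

FROM alpine:3.19
WORKDIR /app
COPY --from=builder /app/webapp .
# The app is expected to listen on HTTP_PORT (8888 in the examples above).
CMD ["./webapp"]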

Development mode

For developing the application, use the watch command of Docker Compose.

All in containers

HTTP_PORT=8888 LLM=deepseek-coder OLLAMA_BASE_URL=http://ollama:11434 docker compose --profile container watch

Use the Ollama local install (e.g. on macOS)

HTTP_PORT=8888 LLM=deepseek-coder OLLAMA_BASE_URL=http://host.docker.internal:11434 docker compose --profile webapp watch
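The watch commands rely on a develop/watch section attached to the web app service in the compose file; the sketch below shows the general shape, where the action and paths are assumptions rather than this repository's actual configuration.

# Hypothetical develop/watch configuration for the web app service.
services:
  web-app:
    build: .
    develop:
      watch:
        - action: rebuild      # rebuild the image when sources change
          path: .
          ignore:
            - "*.md"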
