PromptPanel
Accelerating your AI agent adoption
Documentation | DockerHub | GitHub
To get started running your first PromptPanel instance:
```shell
docker run --name promptpanel -p 4000:4000 -v PROMPT_DB:/app/database -v PROMPT_MEDIA:/app/media --pull=always promptpanel/promptpanel:latest
```
After running, your environment will be available at:
http://localhost:4000
Read more on running PromptPanel.
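Since the run command above stores data in the named volumes `PROMPT_DB` and `PROMPT_MEDIA`, you can sanity-check that the container is up and the volumes were created with standard Docker commands (a hedged sketch, assuming Docker's defaults and the port mapping above):

```shell
# Confirm the container is running and the named volumes exist.
docker ps --filter name=promptpanel
docker volume inspect PROMPT_DB PROMPT_MEDIA

# The UI should respond on the published port.
curl -I http://localhost:4000
```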
To run Ollama alongside PromptPanel for local / offline inference, launch the following Docker Compose file:

```shell
curl -sSL https://promptpanel.com/content/media/manifest/docker-compose.yml | docker compose -f - up
```
which will run:
```yaml
services:
  promptpanel:
    image: promptpanel/promptpanel:latest
    container_name: promptpanel
    restart: always
    volumes:
      - PROMPT_DB:/app/database
      - PROMPT_MEDIA:/app/media
    ports:
      - 4000:4000
    environment:
      PROMPT_OLLAMA_HOST: http://ollama:11434
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: always
volumes:
  PROMPT_DB:
  PROMPT_MEDIA:
```
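With both containers up, the `ollama` container starts with no models downloaded, so you will want to pull at least one before pointing PromptPanel at it. A hedged example — `llama3` is just an illustrative model name; substitute any model from the Ollama library:

```shell
# Pull a model into the running ollama container, then confirm it is available.
docker exec ollama ollama pull llama3
docker exec ollama ollama list
```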
Your models, conversations, and logic are locked in walled-gardens.
Let's free your AI interface.
- Run any large language model, across any inference provider, any way you want. From commercial models like OpenAI, Anthropic, Gemini, or Cohere - to open source models, either hosted or running locally via Ollama.
- Access controls to assign users to agents without revealing your API tokens or credentials. Enable user sign-up and login with OpenID Connect (OIDC) single sign-on.
- Bring your own data and store it locally on your instance. Use it safely by pairing it with any language model, whether online or offline.
- Create custom agent plugins in Python to extend your AI agent's capabilities and build retrieval augmented generation (RAG) pipelines.
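To make the RAG idea concrete, here is a minimal sketch of the retrieval step such a pipeline performs. This is a generic illustration, not PromptPanel's plugin API: a real plugin would embed documents with a language model, while this sketch fakes embeddings with bag-of-words vectors so it stays self-contained.

```python
# Generic RAG retrieval sketch (NOT PromptPanel's plugin API):
# rank documents by similarity to a query, then the top matches
# would be injected into the model's prompt as context.
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy embedding: a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Ollama runs language models locally",
    "OIDC enables single sign-on",
    "Volumes persist the PromptPanel database",
]
print(retrieve("run a local language model", docs))
# → ['Ollama runs language models locally']
```

A production pipeline swaps the toy `embed` for real embedding vectors and a vector store, but the retrieve-then-prompt shape stays the same.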
Get started developing with a one-click cloud development environment on GitPod:
This `./plugins` directory contains the community plugin agents found in PromptPanel, as well as a sample agent you can use as a template for your own development.
- The `./hello_agent` directory gives you scaffolding for a sample agent.
- The other community plugin agents give you references to sample from.
- The `docker-compose-agent-dev.yml` file gives you a sample with the mounts and environment variables we recommend for development.
For more information about building your first plugin, we recommend reading:
Running the following command from this directory, with a development port set, will bring up a development environment you can use to start developing your agent plugin (note that the `-f` flag must come before the `up` subcommand):

```shell
DEV_PORT=4000 docker compose -f docker-compose-agent-dev.yml up
```
With these settings, your development environment will be available at: http://localhost:4000
Feel free to get in contact with us at:
hello@promptpanel.com