
blurt 🤖

Ramble an idea at your bot. Get a polished GitHub issue back.

Capture ideas before they disappear. blurt takes a rough message — a shower thought, a half-formed feature idea, a bug you noticed — and turns it into a structured GitHub issue, automatically.

No annoying forms. No more messy Notes. Just a message to your bot.

Features

  • Discord + Telegram bots — send ideas from wherever you already are
  • Multiple LLM providers — Gemini, OpenAI, Anthropic, or local Ollama
  • Auto-structured output — title, summary, and tags generated from your raw input
  • GitHub issue creation — issues land directly in your repo
  • Self-hostable — runs with a single docker compose up -d
  • Minimal friction — no UI, no accounts, just a bot message

Example

You send:

what if the rocket could like, figure out on its own when to abort the landing burn, based on terrain or whatever, instead of us hardcoding the altitude threshold. some kind of adaptive thing. probably hard idk

blurt creates:

  • Title: Adaptive landing burn abort trigger based on terrain sensing
  • Summary: Instead of a hardcoded altitude threshold, the landing burn abort logic should dynamically adjust based on real-time terrain data. This would allow the system to respond to uneven or unexpected ground conditions rather than relying on a fixed value.
  • Tags: landing, autonomy, guidance
  • Issue: https://github.com/you/rocketship/issues/42
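
Under the hood, the example above maps to a small structured payload. Here is a sketch of what the LLM step might return, parsed with Python's json module (the field names are an assumption for illustration, not blurt's actual schema):

```python
import json

# Hypothetical structured output from the LLM step; field names are
# illustrative, not blurt's real schema.
raw = """
{
  "title": "Adaptive landing burn abort trigger based on terrain sensing",
  "summary": "Adjust the landing burn abort logic using real-time terrain data instead of a hardcoded altitude threshold.",
  "tags": ["landing", "autonomy", "guidance"]
}
"""
fields = json.loads(raw)
print(fields["title"])
```

Each field then feeds straight into the GitHub issue: title, body, and labels.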

How it works

  1. Send a rough idea to your Discord or Telegram bot
  2. blurt passes it to your configured LLM, which generates a title, summary, and tags
  3. A GitHub issue is created in your repo
  4. The bot replies with the issue title and link
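
The steps above can be sketched as two small functions. Everything here is illustrative structure, not blurt's actual code; the LLM and GitHub calls are injected as stand-ins so the flow is visible without credentials:

```python
def structure_idea(raw_text, llm):
    """Step 2: ask the configured LLM for a title, summary, and tags."""
    prompt = (
        "Turn this rough idea into a GitHub issue with a title, "
        "a summary, and tags:\n\n" + raw_text
    )
    return llm(prompt)  # assumed to return a dict with title/summary/tags

def create_issue(fields, post):
    """Step 3: create the issue via an injected HTTP poster, return its URL."""
    payload = {
        "title": fields["title"],
        "body": fields["summary"],
        "labels": fields["tags"],
    }
    return post("/repos/you/rocketship/issues", payload)

# Steps 1 and 4 are the bot transport: receive the message, reply with the link.
fake_llm = lambda p: {"title": "Adaptive abort trigger",
                      "summary": "Terrain-aware abort logic.",
                      "tags": ["landing"]}
fake_post = lambda path, payload: "https://github.com/you/rocketship/issues/42"
url = create_issue(structure_idea("rough idea text", fake_llm), fake_post)
print(url)
```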

Getting started

You'll need:

  • Docker with Docker Compose
  • A Discord or Telegram bot token
  • An API key for your LLM provider (Gemini, OpenAI, or Anthropic) — or a local Ollama instance
  • A GitHub token with permission to create issues in your repo

Run the quickstart:

curl -fsSL https://raw.githubusercontent.com/sholgaat/blurt/main/quickstart.sh -o /tmp/blurt-setup.sh && bash /tmp/blurt-setup.sh

The script collects your credentials and tells you when to run docker compose up -d.

Prefer to inspect first:

curl -fsSL https://raw.githubusercontent.com/sholgaat/blurt/main/quickstart.sh -o blurt-setup.sh
# read blurt-setup.sh
bash blurt-setup.sh

Advanced

Build images from source

By default, Docker pulls pre-built images. To build locally instead, enable the compose override before running docker compose up -d:

mv docker-compose.override.yml.disabled docker-compose.override.yml

Use a local LLM with Ollama

Don't trust Big LLM with your ideas? No problem — bring your own.

Run the quickstart and select Ollama as your LLM provider when prompted. You'll be asked for the Ollama base URL — you can either:

  • Already have Ollama running? Provide the URL (e.g., http://192.168.1.100:11434)
  • Want to set it up with blurt? Download the Ollama compose file and include it when running:
curl -fsSL https://raw.githubusercontent.com/sholgaat/blurt/main/docker-compose.ollama.yml -o docker-compose.ollama.yml
docker compose -f docker-compose.yml -f docker-compose.ollama.yml up -d
docker compose exec ollama ollama pull phi3:mini

The Docker network will make Ollama accessible at http://ollama:11434 from the backend.

Model data is persisted in the ollama-data volume across restarts.
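
For reference, Ollama exposes a simple HTTP generate endpoint, so the backend's call from inside the compose network can be sketched like this. The endpoint path and payload follow Ollama's documented /api/generate API; the helper itself is illustrative and only builds the request without sending it:

```python
import json
from urllib import request

OLLAMA_URL = "http://ollama:11434"  # service name on the compose network

def build_generate_request(prompt, model="phi3:mini"):
    """Build a POST to Ollama's /api/generate endpoint without sending it."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body.encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("Rewrite this rough idea as a GitHub issue: ...")
print(req.full_url)
```

Sending it with urllib.request.urlopen(req) returns a JSON body whose response field holds the model's text.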
