# 🌸 Lotus

Autonomous AI control agent for Windows & macOS, driven from Telegram and powered by local AI.

Windows · macOS · Python · Swift · Telegram · Ollama

Lotus turns your computer into an AI-controlled remote workstation. Send natural-language commands from Telegram, and Lotus operates your desktop: opening files, playing music, taking screenshots, running research, and chatting with you through a private, local LLM. No cloud, no data leakage, no surprises.


## ⚡ Pick your platform

|                     | Windows                                 | macOS                                     |
|---------------------|-----------------------------------------|-------------------------------------------|
| Latest version      | v2.2.0-STABLE                           | v2.0.1                                    |
| Installer           | LotusSetup.exe (Inno Setup, autonomous) | Lotus-2.0.1.dmg (drag-to-install)         |
| GUI                 | System-tray control panel               | Native Swift menu-bar app (Lotus.app)     |
| Background service  | Hidden pythonw.exe via scheduled task   | launchd user agent (com.lotus.botservice) |
| Source              | Windows-MCP/                            | Mac-MCP/                                  |
| Lead                | @SatyamPote                             | @JayashBhandary                           |
| Install / dev guide | Windows guide →                         | macOS guide →                             |

TL;DR for first-time users:

1. Get a Telegram bot token from @BotFather and your Telegram user ID from @userinfobot.
2. Install Ollama and pull a small model: `ollama pull qwen2.5:3b`.
3. Follow the platform-specific install guide above.
4. DM your bot. Try `dashboard`, `take screenshot`, or `play lo-fi`.


## What is Lotus?

Lotus is a cross-platform AI agent that bridges your computer and Telegram. It runs as a background service, listens for messages from a small set of allowed Telegram users, interprets them, and translates them into real actions on your machine: file lookups, app automation, music playback, screen capture, deep research, and conversational AI.

The intelligence layer is fully local: Lotus integrates with Ollama to run open-weight LLMs (Llama, Qwen, Phi, etc.) on your own hardware. Nothing about your files, your queries, or your conversations leaves the machine, except of course the Telegram messages you choose to send.

The two flavors share the same philosophy and the same MCP-style tool surface, but each is implemented natively for its platform. Pick the right guide above for installation, build instructions, troubleshooting, and platform-specific architecture details.


## 🚀 Feature Tour

These features are available on both Windows and macOS.

### ⚖️ Strict Priority Routing

Commands are matched against a hardened priority chain (System > Files > Music > Research) before the LLM ever sees them. This guarantees that literal commands like `open report.pdf` or `volume up` execute deterministically and don't get hallucinated into something else. The AI is invoked only for genuinely ambiguous or open-ended queries.
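A first-match priority chain of this kind can be sketched as follows. This is an illustrative sketch, not the actual Lotus dispatch code; the tier names and patterns are assumptions based on the command catalog below:

```python
import re

# Ordered (pattern, tier) pairs: System > Files > Music > Research.
# First match wins; the LLM is consulted only if nothing matches.
PRIORITY_CHAIN = [
    (re.compile(r"^(dashboard|lock|sleep|shutdown|volume (up|down))$"), "system"),
    (re.compile(r"^(find|open|send|ls|cd|tree)\b"), "files"),
    (re.compile(r"^(play|pause|resume|stop|next|prev)\b"), "music"),
    (re.compile(r"^research\b"), "research"),
]

def route(command: str) -> str:
    """Return the tier that should handle `command`, or 'llm' as fallback."""
    text = command.strip().lower()
    for pattern, tier in PRIORITY_CHAIN:
        if pattern.match(text):
            return tier
    return "llm"
```

With this shape, `route("open report.pdf")` always resolves to the file tier, and only free-form text like "what should I cook tonight?" falls through to the model.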

### 🔍 Multi-Source Research Engine

The `research <topic>` command runs a tiered pipeline:

1. Wikipedia: primary structured source, fast and citation-friendly
2. DuckDuckGo Instant Answer API: fallback for current events
3. Web scraping with markdownify: final fallback for arbitrary URLs

Results are aggregated, the LLM produces a structured summary, and you receive a professional PDF report plus inline images via Telegram.
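The tiered fallback can be sketched as a chain of fetchers tried in order. The fetchers below are placeholders; the real pipeline calls Wikipedia, the DuckDuckGo API, and markdownify-based scraping:

```python
from typing import Callable, Optional

def tiered_research(topic: str,
                    sources: list[Callable[[str], Optional[str]]]) -> str:
    """Try each source in priority order; the first non-empty result wins."""
    for fetch in sources:
        try:
            result = fetch(topic)
        except Exception:
            continue  # network error, rate limit, parse failure: fall through
        if result:
            return result
    return f"No results found for {topic!r}."

# Placeholder fetchers standing in for Wikipedia / DDG / scraping:
def wiki_stub(topic: str) -> Optional[str]:
    return None  # pretend Wikipedia had no article

def ddg_stub(topic: str) -> Optional[str]:
    return f"DDG instant answer for {topic}"
```

Because failures simply fall through, a rate-limited or offline source degrades the answer quality rather than breaking the command.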

### 🗣️ Voice Feedback

Every action emits a spoken confirmation in a clear, natural female voice. Local TTS plays through your speakers; the same audio is sent as a Telegram Voice Note for remote acknowledgement when you're away from the machine.

### 📦 Managed Storage with Auto-Cleaning

Lotus reserves a 2 GB sandbox under your user data dir for downloads, research artifacts, and screen recordings. An LRU cleanup keeps it under quota, so your disk doesn't fill up if you forget about it.
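An LRU cleanup of this kind amounts to: sort files by last access time and delete the oldest until the sandbox is back under quota. A minimal sketch (the function and its name are illustrative; only the 2 GB figure comes from the description above):

```python
from pathlib import Path

QUOTA_BYTES = 2 * 1024**3  # the 2 GB sandbox quota

def lru_cleanup(sandbox: Path, quota: int = QUOTA_BYTES) -> list[Path]:
    """Delete least-recently-accessed files until total size <= quota."""
    files = [p for p in sandbox.rglob("*") if p.is_file()]
    total = sum(p.stat().st_size for p in files)
    # Oldest access time first = least recently used.
    files.sort(key=lambda p: p.stat().st_atime)
    deleted = []
    for p in files:
        if total <= quota:
            break
        total -= p.stat().st_size
        p.unlink()
        deleted.append(p)
    return deleted
```

Sorting by `st_atime` rather than `st_mtime` is what makes this "least recently *used*": a file you keep re-opening survives even if it was downloaded long ago (assuming the filesystem records access times).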

### 🎵 Stable Music System

Single-instance enforcement: only one player is ever alive at a time, so queueing a new song cleanly stops the previous one.

- `play <query>`: searches and streams via yt-dlp
- `pause` / `resume` / `stop`
- `next` / `prev`: step through the session queue
- `volume up` / `volume down`: system mixer hooks
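Single-instance enforcement can be sketched as a guard that terminates the previous player process before spawning a new one. The `mpv` invocation is illustrative (the real wiring differs per platform), and the `player` parameter exists here only to make the sketch testable:

```python
import subprocess
from typing import Optional

class SingleInstancePlayer:
    """At most one player subprocess is alive at any time."""

    def __init__(self) -> None:
        self._proc: Optional[subprocess.Popen] = None

    def play(self, url: str, player: tuple = ("mpv", "--no-video")) -> None:
        # Queueing a new song cleanly stops the previous one.
        self.stop()
        self._proc = subprocess.Popen(
            [*player, url],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        )

    def stop(self) -> None:
        if self._proc is not None and self._proc.poll() is None:
            self._proc.terminate()
            self._proc.wait(timeout=5)
        self._proc = None
```

Calling `stop()` unconditionally at the top of `play()` is the whole trick: there is no code path that leaves two players running.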

### 🤖 Private Local AI (Ollama)

Default model: `qwen2.5:3b`, which runs comfortably on a recent MacBook or any PC with 8 GB of RAM. Want bigger? Swap in `llama3.1:8b`, `phi4`, or any model in the Ollama library; Lotus picks it up without code changes.
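Model swapping stays a config-only change because Ollama exposes one local HTTP endpoint and the model is just a field in the request body. A minimal stdlib sketch, assuming the daemon is running and the model has been pulled (function names here are illustrative, not Lotus internals):

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's non-streaming generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Switching to a bigger model is then just `ask("llama3.1:8b", prompt)`; nothing else changes.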

### 📹 Screen & Media Tools

- `take screenshot`: instant PNG of the current desktop
- `record screen <seconds>`: captures video (ffmpeg under the hood)
- `download <youtube-url>`: pulls audio or video via yt-dlp

### 🖼️ Polished Telegram UI

Every reply is wrapped in a clean monospaced frame with a header banner. File listings, dashboards, and research summaries are visually distinct and pleasant to read on mobile.


## 🛠️ Command Reference

A condensed catalog. Send `help` to your bot for an in-chat version. All commands work identically on Windows and macOS.

### 📂 File Management

| Command | What it does |
|---------|--------------|
| `find <query>` | Fuzzy search across user dirs and the storage sandbox |
| `open <filename>` | Open the file in its default application |
| `send <filename>` | Upload the file to Telegram |
| `ls` | List the current working directory |
| `cd <path>` | Change the bot's working directory |
| `tree` | Print a directory tree (depth-limited) |

### 🎵 Media & Music

| Command | What it does |
|---------|--------------|
| `play <song name>` | Search + stream audio |
| `pause` / `resume` / `stop` | Standard playback control |
| `volume up` / `volume down` | System volume nudge |
| `next` / `prev` | Skip in the session queue |
| `now playing` | Show current track and elapsed time |

### 🔍 Research & Intelligence

| Command | What it does |
|---------|--------------|
| `research <topic>` | Wikipedia → DDG → scrape → PDF |
| `list research` | Most recent reports with timestamps |
| `say <text>` | Speak text via local TTS + Telegram voice note |
| `chat <prompt>` | One-shot LLM completion (Ollama) |

### 🖥️ System Control

| Command | What it does |
|---------|--------------|
| `dashboard` | Battery, CPU, RAM, disk, uptime, IP |
| `lock` / `sleep` / `shutdown` | Power management |
| `take screenshot` | Capture the desktop as PNG |
| `record screen <seconds>` | Capture a screen video |

Platform-specific commands (e.g. `whatsapp send` on Windows) are documented in the per-platform READMEs.


πŸ—οΈ Architecture

Lotus is a three-tier system on both platforms.

```
┌────────────────────────────────────────────────────────┐
│                       Telegram                         │  ← user
└──────────────────────────┬─────────────────────────────┘
                           │  long-poll updates
┌──────────────────────────▼─────────────────────────────┐
│            bot_service.py (background)                 │
│                                                        │
│  ┌──────────────────┐   ┌────────────────────────┐     │
│  │ telegram_bot     │   │ control_api (HTTP)     │     │
│  │  - command parse │   │  - GET /api/status     │     │
│  │  - priority chain│   │  - GET /api/logs       │     │
│  └────────┬─────────┘   │  - POST /api/restart   │     │
│           │             └────────────────────────┘     │
│  ┌────────▼─────────────────────────────────────┐      │
│  │ MCP-style tool surface (mac_mcp / win_mcp)   │      │
│  │  - desktop (mouse, keyboard, screenshot)     │      │
│  │  - filesystem (find, open, ls, cd)           │      │
│  │  - media (yt-dlp, ffmpeg, mpv)               │      │
│  │  - research (wiki, DDG, scrape, PDF)         │      │
│  │  - tts / voice                               │      │
│  └────────┬─────────────────────────────────────┘      │
└───────────┼────────────────────────────────────────────┘
            │
┌───────────▼────────────┐    ┌──────────────────────┐
│      Ollama daemon     │    │    Native GUI        │
│   (local LLM, http)    │    │ Lotus.app / Tray     │
│                        │    │ (status + control)   │
└────────────────────────┘    └──────────────────────┘
```
- **Telegram bot**: python-telegram-bot, long-polling, gated by an allowlist of user IDs. Anything from outside the list is dropped.
- **MCP server**: a fastmcp-based tool surface exposed to both the bot loop and (optionally) external MCP clients such as Claude Desktop.
- **Control API**: a tiny uvicorn HTTP server on localhost:40510, used by the GUI to query status and trigger restarts. Bound to loopback only.
- **GUI**: a thin client over the control API. The bot service is authoritative; the GUI never owns state.
- **Ollama**: an out-of-process local LLM server. Lotus speaks to it over HTTP at http://127.0.0.1:11434.
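The loopback-only status endpoint can be sketched with the stdlib alone. This is a stand-in to show the shape of the surface, not the real implementation (Lotus's control API is uvicorn-based, and the handler names here are illustrative):

```python
import json
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

START_TIME = time.time()

def status_payload() -> dict:
    """What GET /api/status reports: health plus uptime in seconds."""
    return {"status": "ok", "uptime_s": round(time.time() - START_TIME, 1)}

class ControlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/status":
            body = json.dumps(status_payload()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

# To run (note the bind address: loopback only, never a network interface):
#   HTTPServer(("127.0.0.1", 40510), ControlHandler).serve_forever()
```

Binding to `127.0.0.1` rather than `0.0.0.0` is the entire exposure story: the GUI on the same machine can reach it, and nothing else can.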

The platform-specific process supervision and bundle layout are documented in each platform's README.

### Why Telegram?

- **Universal**: the same client works on iOS, Android, web, and desktop.
- **Free message API**: no SMS / Twilio dependencies.
- **Bot tokens are revocable**: if a token leaks, you regenerate it via BotFather.
- **End-to-end optional**: Lotus uses the standard Bot API, but you can layer a private channel or Telegram MTProxy on top if you want extra hop secrecy.

## ⚙️ Configuration

### Bot config (`config.json`)

The schema is identical on both platforms. The location differs:

| Platform | Path |
|----------|------|
| Windows | `%LOCALAPPDATA%\Lotus\config.json` |
| macOS | `~/Library/Application Support/Lotus/config.json` |

| Field | Type | Description |
|-------|------|-------------|
| `name` | string | Display name (used in greetings: "Hello <name>") |
| `telegram_token` | string | Bot token from BotFather |
| `allowed_user_id` | string | Comma-separated list of Telegram user IDs allowed to issue commands |
| `model_name` | string | Ollama model identifier; must already have been pulled with `ollama pull` |
| `created_at` | string | ISO-8601 timestamp written by the wizard |

Example:

```json
{
  "name": "Jayash",
  "telegram_token": "1234567890:ABC-defGhIjKlmNoPqRStUvWxYz1234567890",
  "allowed_user_id": "1327255784,9876543210",
  "model_name": "qwen2.5:3b",
  "created_at": "2026-05-09 00:33:21"
}
```
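Since `allowed_user_id` is a comma-separated string, the service has to normalize it into a set of numeric IDs before comparing against incoming Telegram messages. A sketch of how such a config could be loaded (the field names come from the schema above; the parsing code itself is illustrative):

```python
import json
from pathlib import Path

def load_config(path: Path) -> dict:
    cfg = json.loads(path.read_text(encoding="utf-8"))
    # Fail fast on anything the wizard should have written.
    for field in ("telegram_token", "allowed_user_id", "model_name"):
        if not cfg.get(field):
            raise ValueError(f"config.json missing required field: {field}")
    # "1327255784,9876543210" -> {1327255784, 9876543210}
    cfg["allowed_ids"] = {
        int(uid) for uid in cfg["allowed_user_id"].split(",") if uid.strip()
    }
    return cfg
```

Parsing once at startup means every incoming update check is a constant-time set lookup.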

### Shared environment variables

| Variable | Default | Effect |
|----------|---------|--------|
| `LOTUS_CONTROL_PORT` | `40510` | Port the local control API listens on |
| `ANONYMIZED_TELEMETRY` | `true` | Set to `false` to disable optional PostHog event reporting |

Platform-specific environment variables are documented in each platform's README.

### Control API (localhost-only)

```shell
PORT=40510   # or read from the platform's port file

curl http://127.0.0.1:$PORT/api/status     # service health + uptime
curl http://127.0.0.1:$PORT/api/logs       # last 100 log lines
curl http://127.0.0.1:$PORT/api/config     # current config (token redacted)
curl -X POST http://127.0.0.1:$PORT/api/restart
curl -X POST http://127.0.0.1:$PORT/api/stop
```

The native GUI on each platform uses this same surface; there is no private API.


## 🔐 Privacy & Security

Lotus is designed to be private by default:

- ✅ **Local LLM only**: Ollama runs on your machine. Prompts and conversations never touch a third-party API.
- ✅ **Allowlist authentication**: Telegram user IDs not in `allowed_user_id` are silently ignored. The bot does not respond to, log, or acknowledge them.
- ✅ **Loopback control API**: bound to 127.0.0.1 only; not exposed on any network interface.
- ✅ **No outbound telemetry by default** for the macOS app's installer steps. Set `ANONYMIZED_TELEMETRY=false` in `.env` to also disable the bot's optional PostHog events.
- ✅ **Token storage**: `config.json` is mode 0600 after the wizard writes it. The macOS Swift GUI redacts tokens in the Settings view.
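Silent allowlist gating can be sketched as a guard applied before any command handling. This is illustrative (the real bot wires the check into python-telegram-bot's update loop, and `dispatch` stands in for the command engine); the IDs reuse the config example above:

```python
from typing import Optional

ALLOWED_IDS = {1327255784, 9876543210}  # parsed from allowed_user_id

def handle_update(user_id: int, text: str) -> Optional[str]:
    """Return a reply for allowed users; silently drop everyone else."""
    if user_id not in ALLOWED_IDS:
        return None  # no response, no log, no acknowledgement
    return dispatch(text)

def dispatch(text: str) -> str:
    return f"ok: {text}"  # placeholder for the real command engine
```

Returning `None` rather than an error message is deliberate: an unauthorized sender learns nothing, not even that the bot is alive.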

### Threat model (briefly)

| Concern | Mitigation |
|---------|------------|
| Bot token leaks | Revoke via BotFather, regenerate, rewrite `config.json` |
| Allowed user's phone is compromised | Remove their ID from `allowed_user_id`, restart |
| Local code execution by a permitted user | Lotus is a remote-control agent; trust the allowlist accordingly |
| Network sniffer on home wifi | Telegram traffic is TLS; control API is loopback-only |
| Malicious DMG / EXE | Verify the SHA-256 from the Release page against `SHA256SUMS.txt` |

## 📁 Repository Layout

```
Lotus/
├── README.md                      ← you are here (connector)
│
├── Windows-MCP/                   # Windows AI agent
│   ├── README.md                  ← Windows install + dev guide
│   └── ...                        # Inno Setup, tray app, command engine
│
├── Mac-MCP/                       # macOS native menu-bar app + bot
│   ├── README.md                  ← macOS install + dev guide
│   ├── ControlPanel/              # Swift Package - Lotus.app source
│   ├── src/mac_mcp/               # Python MCP server + Telegram bot
│   ├── bot_service.py             # bot service entry point
│   ├── pyproject.toml
│   └── SETUP.md
│
├── release-notes/                 # per-release curated notes (vX.Y.Z.md)
│   ├── README.md
│   ├── TEMPLATE.md
│   ├── v1.0.0.md
│   ├── v2.0.0.md
│   └── v2.0.1.md
│
└── .github/
    ├── workflows/swift.yml        # macOS build & release pipeline
    └── RELEASING.md               # pipeline runbook
```

## 👥 Contributors

Lotus is the product of two leads, one on each platform:

**Satyam Pote**
Project creator · Windows lead
@SatyamPote

Designed and built the original Lotus agent, the Windows tray app, the priority routing engine, the multi-source research pipeline, and the Inno Setup deployment story.

**Jayash Bhandary**
macOS lead
@JayashBhandary
📧 findjayash@gmail.com
💼 LinkedIn · 📸 Instagram

Designed and built the macOS native menu-bar app (`Lotus.app`), the universal-binary build pipeline, the standalone DMG installer with bundled `uv` runtime, the writable runtime-dir architecture, and the GitHub Actions release workflow.

### Contributing

Pull requests and issues are welcome. Please:

1. Open an issue first for anything non-trivial; it saves you from building something we'd want done differently.
2. For UI work, include a screenshot or short screen recording.
3. For new MCP tools, add a docstring describing the user-facing command, expected arguments, and what state it touches.
4. Keep the curated release notes in `release-notes/` up to date; they ship as the GitHub Release body.

## 📜 License

This project is released under the MIT License; see LICENSE for the full text.

The bundled `uv` binary used by the macOS installer is distributed by Astral under the MIT/Apache-2.0 license. Ollama models you pull are subject to their respective upstream licenses.


Built for stability. Built for privacy. Built for both Windows and Mac.

🌸

Windows guide → · macOS guide →
