MIRQ Client Lua Plugins bring local-first AI chat, guild collaboration helpers, and workflow automation directly into MIRQ channels and DMs.
If you are looking for:
- AI chat in MIRQ with local model support
- Team collaboration assistants for shared channels
- Privacy-aware plugin automation with local memory
this repository is the starting point.
| Component | What it does | Main triggers | Local dependency |
|---|---|---|---|
| `hello-world` | Minimal plugin sanity check | (none) | none |
| `lmstudio-assistant` | Local AI assistant with memory, skills, and optional MCP tool calls | `/ai ...`, `@ai ...` | LM Studio API (`127.0.0.1:1234`) |
| `otacon-assistant` | OpenClaw-to-MIRQ integration assistant with trust controls, privacy modes, and guild context | `/otacon ...`, `otacon ...`, `@openclaw ...` | Otacon bridge (`127.0.0.1:8787`) |
| `otacon-bridge` | Python localhost bridge for `otacon-assistant` | HTTP endpoints | Python 3 + `openclaw` CLI |
- Copy plugin folders into your MIRQ client plugin directory.
- Open MIRQ and click the `Plugins` button in the footer.
- Click `Run / Reload` in the plugin editor.
- Start the local backend(s):
  - `otacon-assistant`: run the bridge
  - `lmstudio-assistant`: run the LM Studio local server
MIRQ client-side plugins are loaded from:
- Windows: `%LOCALAPPDATA%\mirq\plugins\`
- macOS: `~/Library/Application Support/mirq/plugins/`
- Linux: `~/.local/share/mirq/plugins/`
Each plugin must contain:
- `plugin.json`
- `main.lua`
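For install scripts that need to work across platforms, the three directories above can be resolved programmatically. A minimal sketch (the paths come from this README; the helper name is mine):

```python
import os
import sys
from pathlib import Path

def mirq_plugin_dir(platform: str = sys.platform) -> Path:
    """Resolve the MIRQ client plugin directory for the given OS."""
    if platform.startswith("win"):
        # %LOCALAPPDATA%\mirq\plugins\
        return Path(os.environ.get("LOCALAPPDATA", "")) / "mirq" / "plugins"
    if platform == "darwin":
        return Path.home() / "Library" / "Application Support" / "mirq" / "plugins"
    # Linux and other POSIX systems
    return Path.home() / ".local" / "share" / "mirq" / "plugins"
```

Copying a plugin folder into `mirq_plugin_dir()` is then the same operation on every OS.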
```powershell
Copy-Item -Recurse -Force .\hello-world "$env:LOCALAPPDATA\mirq\plugins\"
Copy-Item -Recurse -Force .\lmstudio-assistant "$env:LOCALAPPDATA\mirq\plugins\"
Copy-Item -Recurse -Force .\otacon-assistant "$env:LOCALAPPDATA\mirq\plugins\"
```

- MIRQ discovers plugin subfolders with a `plugin.json`.
- Manifest fields are validated (`id`, `entry`, `apiVersion`, `capabilities`).
- Supported API version is `1.x`.
- Runtime hooks include `onLoad`, `onUnload`, `onMessageCreate`, and optional `onTick`.
- Capability gates are enforced (for example `chat:write`, `dm:write`, `http:localhost`).
- `mirq.http.post(...)` is restricted to localhost for client plugin security.
- Plugin local memory/skills persist in plugin state (`.mirq_state.json`).
- Plugin callbacks run with an instruction budget to reduce runaway scripts.
- In the client startup flow, bundled `otacon-assistant` and `lmstudio-assistant` are synced into runtime plugin storage when available.
- Plugin-authored messages are visually identified in the MIRQ UI flow.
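The manifest checks described above can be approximated as follows. This is a sketch of the validation rules this README names, not MIRQ's actual loader code; the function and error strings are illustrative:

```python
def validate_manifest(manifest: dict) -> list:
    """Return a list of problems; an empty list means the basic checks pass."""
    problems = []
    # All four manifest fields are required.
    for field in ("id", "entry", "apiVersion", "capabilities"):
        if field not in manifest:
            problems.append(f"missing field: {field}")
    # Only the 1.x API line is supported.
    api = str(manifest.get("apiVersion", ""))
    if api and not api.startswith("1."):
        problems.append(f"unsupported apiVersion: {api} (expected 1.x)")
    return problems

ok = {"id": "hello-world", "entry": "main.lua",
      "apiVersion": "1.0", "capabilities": ["chat:write"]}
print(validate_manifest(ok))  # []
```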
`otacon-assistant` is designed to integrate the OpenClaw assistant with MIRQ chat channels and DMs.
OpenClaw:
- Website: https://openclaw.ai/
- The CLI is used by the local bridge (`openclaw agent ...`) to generate responses.
Default bridge endpoints in `otacon-assistant/main.lua`:

- Sync: `http://127.0.0.1:8787/mirq/assist` (default mode)
- Async: `http://127.0.0.1:8787/mirq/submit` + `http://127.0.0.1:8787/mirq/outbox`
To switch to async mode, set `transport_mode = "submit_async"` in `otacon-assistant/main.lua`.
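The two transports differ only in how the reply comes back: sync mode receives it in the POST response, while async mode submits the request and later polls the outbox. A rough sketch of the client side in Python (the payload field names are guesses for illustration; the real schema lives in `otacon-assistant/main.lua`):

```python
import json
import urllib.request

BRIDGE = "http://127.0.0.1:8787"

def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Build a JSON POST request against the local Otacon bridge."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(BRIDGE + path, data=body,
                                  headers={"Content-Type": "application/json"})

# Hypothetical message payload, for illustration only.
msg = {"channel_id": "c1", "author_id": "u1", "text": "otacon hello"}

sync_req = build_request("/mirq/assist", msg)    # sync: reply arrives in the response body
submit_req = build_request("/mirq/submit", msg)  # async: enqueue the request...
poll_req = build_request("/mirq/outbox", {})     # ...then poll for finished replies
```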
Commands:
- `/otacon help`
- `/otacon enable` | `/otacon disable`
- `/otacon privacy <strict|balanced|open>` (admin)
- `/otacon whoactive`
- `/otacon profile [user_id]`
- `/otacon remember <fact>`
- `/otacon admin add <user_id> | remove <user_id> | list` (admin)
- `/otacon trust set <user_id> <blocked|low|normal|high>` (admin)
Behavior highlights:
- Works in channels and DMs.
- Adds requester trust/admin status and guild context to the bridge payload.
- Includes privacy guardrails before forwarding sensitive requests.
- Supports channel and guild allowlists in config.
From this repo:
```shell
python otacon-bridge/otacon_bridge.py
```

Optional bridge environment settings:

- `OTACON_ALLOWED_GUILD_IDS`
- `OTACON_ALLOWED_CHANNEL_IDS`
- `OTACON_SESSION_MODE` (`author` or `channel`)
- `OTACON_AGENT_TIMEOUT_SECONDS`
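On the bridge side, these settings would typically be read once at startup. A minimal sketch of how the allowlists and session mode could be parsed; the comma-separated format, the defaults, and the "empty means allow everything" rule are assumptions, not the bridge's documented behavior:

```python
import os

def parse_id_list(value: str) -> set:
    """Split a comma-separated env value like '123,456' into a set of IDs."""
    return {part.strip() for part in value.split(",") if part.strip()}

allowed_guilds = parse_id_list(os.environ.get("OTACON_ALLOWED_GUILD_IDS", ""))
allowed_channels = parse_id_list(os.environ.get("OTACON_ALLOWED_CHANNEL_IDS", ""))
session_mode = os.environ.get("OTACON_SESSION_MODE", "author")  # "author" or "channel"
timeout = float(os.environ.get("OTACON_AGENT_TIMEOUT_SECONDS", "60"))

def guild_allowed(guild_id: str) -> bool:
    # Assumed convention: an empty allowlist allows everything.
    return not allowed_guilds or guild_id in allowed_guilds
```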
Default model endpoint in `lmstudio-assistant/main.lua`:

`http://127.0.0.1:1234/v1/chat/completions`

Default MCP endpoint:

`http://127.0.0.1:8788/mcp/call`
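The `/v1/chat/completions` endpoint is LM Studio's OpenAI-compatible chat API, so the request the plugin sends looks roughly like this Python equivalent (the model name and message content are placeholders; LM Studio serves whichever model is currently loaded):

```python
import json
import urllib.request

LMSTUDIO_URL = "http://127.0.0.1:1234/v1/chat/completions"

payload = {
    "model": "local-model",  # placeholder; LM Studio uses the loaded model
    "messages": [
        {"role": "system", "content": "You are a helpful MIRQ channel assistant."},
        {"role": "user", "content": "Summarize today's discussion."},
    ],
    "temperature": 0.7,
}

req = urllib.request.Request(
    LMSTUDIO_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Sending it requires LM Studio's local server to be running:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```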
Commands:
- `/ai help`
- `/ai enable` | `/ai disable`
- `/ai remember <fact>`
- `/ai profile`
- `/ai skill set <name> <text>`
- `/ai skill get <name>` | `/ai skill list` | `/ai skill del <name>`
- `/ai mcp <server> <tool> <json-args>`
Behavior highlights:
- Responds to `/ai ...`, `@ai ...`, and DMs.
- Builds lightweight profile/history and channel memory context.
- Supports local reusable skills and optional localhost MCP tool routing.
Minimal starter plugin used to verify load and logging:
- `hello-world/main.lua`
- Logs a startup message with `mirq.log(...)`.
- Plugin not loading:
  - Confirm `plugin.json` is valid JSON.
  - Confirm the `entry` file exists.
  - Confirm `apiVersion` is compatible (`1.x`).
- Otacon bridge errors:
  - Ensure the bridge is running on `127.0.0.1:8787`.
  - Verify allowlist env vars are not blocking your guild/channel.
- LM Studio errors:
  - Ensure the local server is running on `127.0.0.1:1234`.
  - Confirm a model is loaded and the OpenAI-compatible endpoint is enabled.
- HTTP permission errors:
  - Client plugins only allow localhost HTTP calls by design.
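The localhost restriction amounts to a host check on the target URL. Something along these lines, as an illustration of the rule rather than MIRQ's actual implementation:

```python
from urllib.parse import urlparse

LOCAL_HOSTS = {"127.0.0.1", "localhost", "::1"}

def is_localhost_url(url: str) -> bool:
    """True if the URL targets the local machine; client plugins may only call these."""
    return urlparse(url).hostname in LOCAL_HOSTS

assert is_localhost_url("http://127.0.0.1:1234/v1/chat/completions")
assert not is_localhost_url("https://example.com/api")
```

So a plugin calling `mirq.http.post` against a non-local host fails the gate before any request is made.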
```
client-lua-plugins/
  hello-world/
  lmstudio-assistant/
  otacon-assistant/
  otacon-bridge/
```