Control AI coding assistants (Claude Code, Codex) remotely from Telegram, GitHub, and more. Built for developers who want to code from anywhere, with persistent sessions and flexible workflows.
Quick Start: Core Configuration • AI Assistant Setup • Platform Setup • Start the App • Usage Guide
- Multi-Platform Support: Interact via Telegram, GitHub issues/PRs, and more in the future
- Multiple AI Assistants: Choose between Claude Code or Codex (or both)
- Persistent Sessions: Sessions survive container restarts with full context preservation
- Codebase Management: Clone and work with any GitHub repository
- Flexible Streaming: Real-time or batch message delivery per platform
- Generic Command System: User-defined commands versioned with Git
- Docker Ready: Simple deployment with Docker Compose
System Requirements:
- Docker & Docker Compose (for deployment)
- Node.js 20+ (for local development only)
Accounts Required:
- GitHub account (for repository cloning via the `/clone` command)
- At least one of: Claude Pro/Max subscription OR Codex account
- At least one of: Telegram account OR GitHub account (for interaction)
🚀 Production Deployment: This guide covers local development setup. To deploy remotely for 24/7 operation on a cloud VPS (DigitalOcean, AWS, Linode, etc.), see the Cloud Deployment Guide.
Get started:

```bash
git clone https://github.com/coleam00/remote-agentic-coding-system
cd remote-agentic-coding-system
```

Create environment file:

```bash
cp .env.example .env
```

Set these required variables:
| Variable | Purpose | How to Get |
|---|---|---|
| `DATABASE_URL` | PostgreSQL connection | See database options below |
| `GH_TOKEN` | Repository cloning | Generate a token with `repo` scope |
| `GITHUB_TOKEN` | Same as `GH_TOKEN` | Use the same token value |
| `PORT` | HTTP server port | Default: 3000 (optional) |
| `WORKSPACE_PATH` | Clone destination | Default: `./workspace` (optional) |
GitHub Personal Access Token Setup:
- Visit GitHub Settings > Personal Access Tokens
- Click "Generate new token (classic)" → select the `repo` scope
- Copy the token (starts with `ghp_...`) and set both variables:

```bash
# .env
GH_TOKEN=ghp_your_token_here
GITHUB_TOKEN=ghp_your_token_here  # Same value
```

Database Setup - Choose One:
Option A: Remote PostgreSQL (Supabase, Neon)
Set your remote connection string:
```bash
DATABASE_URL=postgresql://user:password@host:5432/dbname
```

Run migrations manually after first startup:

```bash
# Download the migration file or use psql directly
psql $DATABASE_URL < migrations/001_initial_schema.sql
```

This creates 3 tables:

- `remote_agent_codebases` - Repository metadata
- `remote_agent_conversations` - Platform conversation tracking
- `remote_agent_sessions` - AI session management
Option B: Local PostgreSQL (via Docker)
Use the `with-db` profile for automatic PostgreSQL setup:

```bash
DATABASE_URL=postgresql://postgres:postgres@postgres:5432/remote_coding_agent
```

The database will be created automatically when you start with `docker compose --profile with-db`.
You must configure at least one AI assistant. Both can be configured if desired.
🤖 Claude Code
Recommended for Claude Pro/Max subscribers.
Get OAuth Token (Preferred Method):
```bash
# Install Claude Code CLI first: https://docs.claude.com/claude-code/installation
claude setup-token
# Copy the token starting with sk-ant-oat01-...
```

Set environment variable:

```bash
CLAUDE_CODE_OAUTH_TOKEN=sk-ant-oat01-xxxxx
```

Alternative: API Key (if you prefer pay-per-use credits):
- Visit console.anthropic.com/settings/keys
- Create a new key (starts with `sk-ant-`)
- Set environment variable:

```bash
CLAUDE_API_KEY=sk-ant-xxxxx
```

Set as default assistant (optional):
If you want Claude to be the default AI assistant for new conversations without codebase context, set this environment variable:
```bash
DEFAULT_AI_ASSISTANT=claude
```

🤖 Codex
Authenticate with Codex CLI:
```bash
# Install Codex CLI first: https://docs.codex.com/installation
codex login
# Follow browser authentication flow
```

Extract credentials from auth file:
On Linux/Mac:
```bash
cat ~/.codex/auth.json
```

On Windows:

```cmd
type %USERPROFILE%\.codex\auth.json
```

Set all four environment variables:
```bash
CODEX_ID_TOKEN=eyJhbGc...
CODEX_ACCESS_TOKEN=eyJhbGc...
CODEX_REFRESH_TOKEN=rt_...
CODEX_ACCOUNT_ID=6a6a7ba6-...
```

Set as default assistant (optional):
If you want Codex to be the default AI assistant for new conversations without codebase context, set this environment variable:
```bash
DEFAULT_AI_ASSISTANT=codex
```

How Assistant Selection Works:
- Assistant type is set per codebase (auto-detected from `.claude/commands/` or `.codex/` folders)
- Once a conversation starts, the assistant type is locked for that conversation
- `DEFAULT_AI_ASSISTANT` (optional) is used only for new conversations without codebase context
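The folder-based auto-detection described above can be sketched roughly as follows (the function name and fallback logic are illustrative assumptions, not the project's actual implementation):

```typescript
import { existsSync } from "node:fs";
import { join } from "node:path";

type Assistant = "claude" | "codex";

// Pick an assistant for a codebase by checking for assistant-specific
// folders, falling back to DEFAULT_AI_ASSISTANT, then a hard default.
function detectAssistant(repoPath: string, fallback: Assistant = "claude"): Assistant {
  if (existsSync(join(repoPath, ".claude", "commands"))) return "claude";
  if (existsSync(join(repoPath, ".codex"))) return "codex";
  const env = process.env.DEFAULT_AI_ASSISTANT;
  return env === "claude" || env === "codex" ? env : fallback;
}
```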
You must configure at least one platform to interact with your AI assistant.
💬 Telegram
Create Telegram Bot:
- Message @BotFather on Telegram
- Send `/newbot` and follow the prompts
- Copy the bot token (format: `123456789:ABCdefGHIjklMNOpqrsTUVwxyz`)
Set environment variable:
```bash
TELEGRAM_BOT_TOKEN=123456789:ABCdefGHI...
```

Configure streaming mode (optional):

```bash
TELEGRAM_STREAMING_MODE=stream  # stream (default) | batch
```

For streaming mode details, see Advanced Configuration.
🔗 GitHub Webhooks
Requirements:
- GitHub repository with issues enabled
- `GITHUB_TOKEN` already set in Core Configuration above
- Public endpoint for webhooks (see ngrok setup below for local development)
Step 1: Generate Webhook Secret
On Linux/Mac:
```bash
openssl rand -hex 32
```

On Windows (PowerShell):

```powershell
-join ((1..32) | ForEach-Object { '{0:x2}' -f (Get-Random -Maximum 256) })
```

Save this secret - you'll need it for steps 3 and 4.
Step 2: Expose Local Server (Development Only)
Using ngrok (Free Tier)
```bash
# Install ngrok: https://ngrok.com/download
# Or: choco install ngrok (Windows)
# Or: brew install ngrok (Mac)

# Start tunnel
ngrok http 3000

# Copy the HTTPS URL (e.g., https://abc123.ngrok-free.app)
# ⚠️ Free tier URLs change on restart
```

Keep this terminal open while testing.
Using Cloudflare Tunnel (Persistent URLs)
```bash
# Install: https://developers.cloudflare.com/cloudflare-one/connections/connect-apps/install-and-setup/
cloudflared tunnel --url http://localhost:3000
# Get persistent URL from Cloudflare dashboard
```

Persistent URLs survive restarts.
For production deployments, use your deployed server URL (no tunnel needed).
Step 3: Configure GitHub Webhook
Go to your repository settings:
- Navigate to: `https://github.com/owner/repo/settings/hooks`
- Click "Add webhook"
- Note: For multiple repositories, you'll need to add the webhook to each one individually
Webhook Configuration:
| Field | Value |
|---|---|
| Payload URL | Local: `https://abc123.ngrok-free.app/webhooks/github`<br>Production: `https://your-domain.com/webhooks/github` |
| Content type | `application/json` |
| Secret | Paste the secret from Step 1 |
| SSL verification | Enable SSL verification (recommended) |
| Events | Select "Let me select individual events": ✅ Issues ✅ Issue comments ✅ Pull requests |
Click "Add webhook" and verify it shows a green checkmark after delivery.
Step 4: Set Environment Variables
```bash
WEBHOOK_SECRET=your_secret_from_step_1
```

Important: The `WEBHOOK_SECRET` must match exactly what you entered in GitHub's webhook configuration.
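GitHub uses this shared secret to sign each delivery with HMAC-SHA256 and sends the result in the `X-Hub-Signature-256` header. The app verifies this internally; a minimal Node sketch of the check (illustrative, not the project's actual code):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Verify a GitHub webhook delivery: recompute the HMAC-SHA256 of the raw
// request body with the shared secret and compare it (constant-time) to
// the value GitHub sent in the X-Hub-Signature-256 header.
function verifyWebhook(secret: string, rawBody: string, signatureHeader: string): boolean {
  const expected =
    "sha256=" + createHmac("sha256", secret).update(rawBody).digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader);
  // timingSafeEqual throws on length mismatch, so guard first
  return a.length === b.length && timingSafeEqual(a, b);
}
```

If deliveries show as failed in GitHub's "Recent Deliveries" tab with a 401/403, a secret mismatch at this step is the usual cause.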
Step 5: Configure Streaming (Optional)
```bash
GITHUB_STREAMING_MODE=batch  # batch (default) | stream
```

For streaming mode details, see Advanced Configuration.
Usage:
Interact by @mentioning @remote-agent in issues or PRs:
@remote-agent can you analyze this bug?
@remote-agent /command-invoke prime
@remote-agent review this implementation
First mention behavior:
- Automatically clones the repository to `/workspace`
- Detects and loads commands from `.claude/commands/` or `.agents/commands/`
- Injects full issue/PR context for the AI assistant
Subsequent mentions:
- Resumes existing conversation
- Maintains full context across comments
Choose the Docker Compose profile based on your database setup:
Option A: With Remote PostgreSQL (Supabase, Neon, etc.)
Starts only the app container (requires DATABASE_URL set to remote database in .env):
```bash
# Start app container
docker compose --profile external-db up -d --build

# View logs
docker compose logs -f app
```

Option B: With Local PostgreSQL (Docker)
Starts both the app and PostgreSQL containers:
```bash
# Start containers
docker compose --profile with-db up -d --build

# Wait for startup (watch logs)
docker compose logs -f app-with-db

# Database tables are created automatically via init script
```

Option C: Local Development (No Docker)
Run directly with Node.js (requires local PostgreSQL or remote DATABASE_URL in .env):
```bash
npm run dev
```

Stop the application:

```bash
docker compose --profile external-db down  # If using Option A
docker compose --profile with-db down      # If using Option B
```

Once your platform adapter is running, you can use these commands:
| Command | Description | Example |
|---|---|---|
| `/help` | Show available commands | `/help` |
| `/clone <url>` | Clone a GitHub repository | `/clone https://github.com/user/repo` |
| `/repos` | List cloned repositories | `/repos` |
| `/status` | Show conversation state | `/status` |
| `/getcwd` | Show current working directory | `/getcwd` |
| `/setcwd <path>` | Change working directory | `/setcwd /workspace/repo` |
| `/command-set <name> <path>` | Register a custom command | `/command-set analyze .claude/commands/analyze.md` |
| `/load-commands <folder>` | Bulk load commands from a folder | `/load-commands .claude/commands` |
| `/command-invoke <name> [args]` | Execute a custom command | `/command-invoke plan "Add dark mode"` |
| `/commands` | List registered commands | `/commands` |
| `/reset` | Clear active session | `/reset` |
🚀 Initial Setup
```
You: /clone https://github.com/anthropics/anthropic-sdk-typescript

Bot: ✅ Repository cloned successfully!
📦 Codebase: anthropic-sdk-typescript
📁 Path: /workspace/anthropic-sdk-typescript
📋 Detected .claude/commands/ folder

You: /load-commands .claude/commands

Bot: ✅ Loaded 5 commands:
• prime - Research codebase
• plan - Create implementation plan
• execute - Implement feature
• validate - Run validation
• commit - Create git commit
```
💬 Asking Questions
```
You: What files are in this repo?

Bot: 🔍 Let me analyze the repository structure for you...
[Claude streams detailed analysis]
```
🔧 Working with Commands
```
You: /command-invoke prime

Bot: 🔍 Starting codebase research...
[Claude analyzes codebase structure, dependencies, patterns]

You: /command-invoke plan "Add retry logic to API calls"

Bot: 📝 Creating implementation plan...
[Claude creates detailed plan with steps]
```
ℹ️ Checking Status
```
You: /status

Bot: 📊 Conversation Status
🤖 Platform: telegram
🧠 AI Assistant: claude
📦 Codebase: anthropic-sdk-typescript
🔗 Repository: https://github.com/anthropics/anthropic-sdk-typescript
📁 Working Directory: /workspace/anthropic-sdk-typescript
🔄 Active Session: a1b2c3d4...
📋 Registered Commands:
• prime - Research codebase
• plan - Create implementation plan
• execute - Implement feature
• validate - Run validation
• commit - Create git commit
```
🔄 Reset Session
```
You: /reset

Bot: ✅ Session cleared. Starting fresh on next message.
📦 Codebase configuration preserved.
```
Create an issue or comment on an existing issue/PR:
@your-bot-name can you help me understand the authentication flow?
Bot responds with analysis. Continue the conversation:
@your-bot-name can you create a sequence diagram for this?
Bot maintains context and provides the diagram.
Streaming Modes Explained
Stream Mode: Messages are sent in real-time as the AI generates responses.
Configuration:
```bash
TELEGRAM_STREAMING_MODE=stream
GITHUB_STREAMING_MODE=stream
```

Pros:
- Real-time feedback and progress indication
- More interactive and engaging
- See AI reasoning as it works
Cons:
- More API calls to platform
- May hit rate limits with very long responses
- Creates many messages/comments
Best for: Interactive chat platforms (Telegram)
Batch Mode: Only the final summary message is sent after the AI completes processing.
Configuration:
```bash
TELEGRAM_STREAMING_MODE=batch
GITHUB_STREAMING_MODE=batch
```

Pros:
- Single coherent message/comment
- Fewer API calls
- No spam or clutter
Cons:
- No progress indication during processing
- Longer wait for first response
- Can't see intermediate steps
Best for: Issue trackers and async platforms (GitHub)
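The per-platform defaults described above (stream for Telegram, batch for GitHub) can be resolved from the environment with a small helper along these lines (the helper name is illustrative):

```typescript
type StreamingMode = "stream" | "batch";
type Platform = "telegram" | "github";

// Resolve the streaming mode for a platform from the environment,
// defaulting to stream for Telegram and batch for GitHub as documented.
function streamingMode(
  platform: Platform,
  env: Record<string, string | undefined> = process.env
): StreamingMode {
  const raw =
    platform === "telegram"
      ? env.TELEGRAM_STREAMING_MODE
      : env.GITHUB_STREAMING_MODE;
  if (raw === "stream" || raw === "batch") return raw;
  return platform === "telegram" ? "stream" : "batch";
}
```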
Concurrency Settings
Control how many conversations the system processes simultaneously:
```bash
MAX_CONCURRENT_CONVERSATIONS=10  # Default: 10
```

How it works:
- Conversations are processed with a lock manager
- If max concurrent limit reached, new messages are queued
- Prevents resource exhaustion and API rate limits
- Each conversation maintains its own independent context
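The queueing behavior above can be sketched as a small limiter (a simplified illustration, not the project's actual lock manager): at most `max` conversations run at once, and extra work waits in a FIFO queue until a slot frees up.

```typescript
// Minimal concurrency limiter sketch: run() admits at most `max` tasks
// at a time; the rest wait in FIFO order for a slot.
class ConversationLimiter {
  private active = 0;
  private queue: Array<() => void> = [];

  constructor(private max: number) {}

  async run<T>(task: () => Promise<T>): Promise<T> {
    // Re-check after each wake-up, condition-variable style
    while (this.active >= this.max) {
      await new Promise<void>((resolve) => this.queue.push(resolve));
    }
    this.active++;
    try {
      return await task();
    } finally {
      this.active--;
      this.queue.shift()?.(); // wake the next queued conversation, if any
    }
  }
}
```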
Check current load:
```bash
curl http://localhost:3000/health/concurrency
```

Response:

```json
{
  "status": "ok",
  "active": 3,
  "queued": 0,
  "maxConcurrent": 10
}
```

Tuning guidance:
- Low resources: Set to 3-5
- Standard: Default 10 works well
- High resources: Can increase to 20-30 (monitor API limits)
Health Check Endpoints
The application exposes health check endpoints for monitoring:
Basic Health Check:
```bash
curl http://localhost:3000/health
```

Returns: `{"status":"ok"}`

Database Connectivity:

```bash
curl http://localhost:3000/health/db
```

Returns: `{"status":"ok","database":"connected"}`

Concurrency Status:

```bash
curl http://localhost:3000/health/concurrency
```

Returns: `{"status":"ok","active":0,"queued":0,"maxConcurrent":10}`
Use cases:
- Docker healthcheck configuration
- Load balancer health checks
- Monitoring and alerting systems (Prometheus, Datadog, etc.)
- CI/CD deployment verification
Custom Command System
Create your own commands by adding markdown files to your codebase:
1. Create command file:
```bash
mkdir -p .claude/commands
cat > .claude/commands/analyze.md << 'EOF'
You are an expert code analyzer.
Analyze the following aspect of the codebase: $1
Provide:
1. Current implementation analysis
2. Potential issues or improvements
3. Best practices recommendations
Focus area: $ARGUMENTS
EOF
```

2. Load commands:
/load-commands .claude/commands
3. Invoke your command:
/command-invoke analyze "security vulnerabilities"
Variable substitution:
- `$1`, `$2`, `$3`, etc. - Positional arguments
- `$ARGUMENTS` - All arguments as a single string
- `$PLAN` - Previous plan from session metadata
- `$IMPLEMENTATION_SUMMARY` - Previous execution summary
Commands are version-controlled with your codebase, not stored in the database.
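The substitution rules above can be sketched with a small template helper (illustrative only; the helper name is an assumption, and `$PLAN`/`$IMPLEMENTATION_SUMMARY` would be filled from session metadata rather than the argument list):

```typescript
// Render a command template: replace $ARGUMENTS with the full argument
// string, then $1..$n with positional arguments (missing ones become "").
function renderCommand(template: string, args: string[]): string {
  return template
    .replace(/\$ARGUMENTS/g, args.join(" "))
    .replace(/\$(\d+)/g, (_match, n) => args[Number(n) - 1] ?? "");
}
```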
```
┌─────────────────────────────────────────────┐
│     Platform Adapters (Telegram, GitHub)    │
└─────────────────────┬───────────────────────┘
                      │
                      ▼
┌─────────────────────────────────────────────┐
│                 Orchestrator                │
│    (Message Routing & Context Management)   │
└─────────────────────┬───────────────────────┘
                      │
             ┌────────┴────────┐
             │                 │
             ▼                 ▼
      ┌─────────────┐  ┌────────────────┐
      │   Command   │  │  AI Assistant  │
      │   Handler   │  │    Clients     │
      │   (Slash)   │  │ (Claude/Codex) │
      └──────┬──────┘  └────────┬───────┘
             │                  │
             └────────┬─────────┘
                      ▼
┌─────────────────────────────────────────────┐
│             PostgreSQL (3 Tables)           │
│   • Codebases  • Conversations  • Sessions  │
└─────────────────────────────────────────────┘
```
- Adapter Pattern: Platform-agnostic via `IPlatformAdapter` interface
- Strategy Pattern: Swappable AI assistants via `IAssistantClient` interface
IAssistantClientinterface - Session Persistence: AI context survives restarts via database storage
- Generic Commands: User-defined markdown commands versioned with Git
- Concurrency Control: Lock manager prevents race conditions
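The two interfaces named above might look roughly like this (the method names and shapes are illustrative assumptions, not the project's actual signatures):

```typescript
// Hypothetical shape of the platform adapter: each platform implements
// the same contract, so the orchestrator stays platform-agnostic.
interface IPlatformAdapter {
  platform: "telegram" | "github";
  // Deliver a message or comment back to the platform conversation.
  sendMessage(conversationId: string, text: string): Promise<void>;
}

// Hypothetical shape of the assistant strategy: Claude and Codex clients
// are swappable behind one interface, with session resume support.
interface IAssistantClient {
  assistant: "claude" | "codex";
  // Run a prompt, optionally resuming a persisted session.
  run(prompt: string, sessionId?: string): Promise<{ output: string; sessionId: string }>;
}
```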
3 tables with `remote_agent_` prefix:

- `remote_agent_codebases` - Repository metadata
  - Commands stored as JSONB: `{command_name: {path, description}}`
  - AI assistant type per codebase
  - Default working directory
- `remote_agent_conversations` - Platform conversation tracking
  - Platform type + conversation ID (unique constraint)
  - Linked to codebase via foreign key
  - AI assistant type locked at creation
- `remote_agent_sessions` - AI session management
  - Active session flag (one per conversation)
  - Session ID for resume capability
  - Metadata JSONB for command context
Check if application is running:
```bash
docker compose ps
# Should show 'app' or 'app-with-db' with state 'Up'
```

Check application logs:

```bash
docker compose logs -f app          # If using --profile external-db
docker compose logs -f app-with-db  # If using --profile with-db
```

Verify bot token:
```bash
# In your .env file
grep TELEGRAM_BOT_TOKEN .env
```

Test with health check:

```bash
curl http://localhost:3000/health
# Expected: {"status":"ok"}
```

Check database health:

```bash
curl http://localhost:3000/health/db
# Expected: {"status":"ok","database":"connected"}
```

For local PostgreSQL (with-db profile):
```bash
# Check if postgres container is running
docker compose ps postgres

# Check postgres logs
docker compose logs -f postgres

# Test direct connection
docker compose exec postgres psql -U postgres -c "SELECT 1"
```

For remote PostgreSQL:
```bash
# Verify DATABASE_URL
echo $DATABASE_URL

# Test connection directly
psql $DATABASE_URL -c "SELECT 1"
```

Verify tables exist:
```bash
# For local postgres
docker compose exec postgres psql -U postgres -d remote_coding_agent -c "\dt"
# Should show: remote_agent_codebases, remote_agent_conversations, remote_agent_sessions
```

Verify GitHub token:
```bash
grep GH_TOKEN .env
# Should have both GH_TOKEN and GITHUB_TOKEN set
```

Test token validity:

```bash
# Test GitHub API access
curl -H "Authorization: token $GH_TOKEN" https://api.github.com/user
```

Check workspace permissions:
```bash
# Use the service name matching your profile
docker compose exec app ls -la /workspace          # --profile external-db
docker compose exec app-with-db ls -la /workspace  # --profile with-db
```

Try manual clone:

```bash
docker compose exec app git clone https://github.com/user/repo /workspace/test-repo
# Or app-with-db if using --profile with-db
```

Verify webhook delivery:
- Go to your webhook settings in GitHub
- Click on the webhook
- Check "Recent Deliveries" tab
- Look for successful deliveries (green checkmark)
Check webhook secret:
```bash
grep WEBHOOK_SECRET .env
# Must match exactly what you entered in GitHub
```

Verify ngrok is running (local dev):
```bash
# Check ngrok status
curl http://localhost:4040/api/tunnels
# Or visit http://localhost:4040 in browser
```

Check application logs for webhook processing:
```bash
docker compose logs -f app | grep GitHub          # --profile external-db
docker compose logs -f app-with-db | grep GitHub  # --profile with-db
```

Clean and rebuild:
```bash
# Stop containers (use the profile you started with)
docker compose --profile external-db down  # or --profile with-db

# Clean build
rm -rf dist node_modules
npm install
npm run build

# Restart (use the profile you need)
docker compose --profile external-db up -d --build  # or --profile with-db
```

Check for type errors:
```bash
npm run type-check
```

Check logs for specific errors:
```bash
docker compose logs app          # If using --profile external-db
docker compose logs app-with-db  # If using --profile with-db
```

Verify environment variables:
```bash
# Check if .env is properly formatted (include your profile)
docker compose --profile external-db config  # or --profile with-db
```

Rebuild without cache:
```bash
docker compose --profile external-db build --no-cache  # or --profile with-db
docker compose --profile external-db up -d             # or --profile with-db
```

Check port conflicts:
```bash
# See if port 3000 is already in use
# Linux/Mac:
lsof -i :3000
# Windows:
netstat -ano | findstr :3000
```