A prototype demonstrating a Symfony application using a PHP library to communicate with LLM agents via WebSocket and JSON-RPC.
```
┌─────────────────┐    Library     ┌──────────────────┐    WebSocket    ┌─────────────────┐
│   Symfony App   │ ←────────────→ │   PHP Library    │ ←─────────────→ │     Python      │
│   (Web Demo)    │  Integration   │ (Communication)  │     JSON-RPC    │   (LLM Logic)   │
└─────────────────┘                └──────────────────┘                 └─────────────────┘
```
- Symfony App: Web application with FrankenPHP and chatbot interface
- PHP-Python Library: PHP library that handles communication with LLM services
- Python LLM Agent: Simulates an LLM agent that makes function calls
- PostgreSQL: Database for persistence
- Docker and Docker Compose v2
- PHP 8.4+ with Composer
- Python 3.11+ with aiohttp
- PostgreSQL 18 (optional)
```
php_python_proto/
├── php-python-lib/      # PHP library for LLM agent communication
├── symfony-app/         # Symfony 7.3 application with FrankenPHP
│   └── frankenphp/      # FrankenPHP configuration (Caddyfile, entrypoint)
├── agent-python/        # Python LLM agent server
└── compose.yaml         # Docker Compose configuration
```
- Copy the environment file:

  ```
  cp .env.example .env
  ```

- Start all services:

  ```
  docker compose up -d
  ```

- Access the application:
  - Symfony app: http://localhost (HTTP) or https://localhost (HTTPS)
  - Python agent: ws://localhost:9000/ws
  - PostgreSQL: localhost:5432
  - Caddy metrics: http://localhost:2019/metrics
- View logs:

  ```
  docker compose logs -f symfony-app
  ```

- Stop services:

  ```
  docker compose down
  ```

- Install dependencies for a local (non-Docker) setup:

  ```
  cd php-python-lib
  composer install

  cd symfony-app
  composer install

  pip install aiohttp
  ```

- Run with Docker:

  ```
  # Start in detached mode
  docker compose up -d

  # Or with logs
  docker compose up
  ```

- Run the Python agent locally:

  ```
  cd agent-python
  python agent_server.py
  ```

- Run the Symfony app locally:

  ```
  cd symfony-app
  php -S localhost:8000 -t public
  ```

- Open your browser and go to: http://localhost (Docker) or http://localhost:8000 (local)
- Enter your name
- Send messages to the LLM agent (try words or phrases like "hello", "test message", etc.)
- Watch the agent analyze your text using the available functions:
  - getStringLength: gets the length of the text
  - countWords: counts the words in the text
  - reverseString: reverses the text
- FrankenPHP: Modern PHP application server with HTTP/2, HTTP/3, and automatic HTTPS
- PostgreSQL 18: Database server
- Python 3.11: LLM agent server with aiohttp
- Caddy: Built-in web server (via FrankenPHP)
- Hot-reload for development
- Multi-stage Docker builds (dev/prod)
- Health checks for all services
- Persistent database volume
- Automatic HTTPS certificates
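As a rough illustration of how these pieces fit together (this is not the repository's actual compose.yaml; the build targets, health check command, user name, and volume name below are assumptions), a compose file along these lines would wire up the three services with a health check and a persistent volume:

```yaml
# Illustrative sketch only; see compose.yaml in the repository for the real configuration.
services:
  symfony-app:
    build:
      context: ./symfony-app
      target: dev                 # multi-stage build: a prod target is assumed to exist as well
    ports:
      - "80:80"
      - "443:443"
    depends_on:
      database:
        condition: service_healthy

  agent-python:
    build: ./agent-python
    ports:
      - "9000:9000"               # WebSocket endpoint (ws://localhost:9000/ws)

  database:
    image: postgres:18
    ports:
      - "5432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U app"]   # hypothetical database user
      interval: 10s
      timeout: 5s
      retries: 5
    volumes:
      - db_data:/var/lib/postgresql   # persistent data; exact mount point depends on the image

volumes:
  db_data:
```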
The library provides an AgentClient class that:
- Connects to the Python WebSocket server
- Registers PHP functions that the agent can call
- Handles JSON-RPC communication
- Returns raw message exchange data for debugging
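A minimal sketch of how the client might be used from PHP. The namespace and method names (connect, registerFunction, sendMessage, getRawMessages) are assumptions for illustration, not the library's confirmed API:

```php
<?php

// Hypothetical namespace and method names -- check the library source for the real API.
use PhpPythonLib\AgentClient;

// Connect to the Python agent's WebSocket endpoint.
$client = new AgentClient('ws://localhost:9000/ws');
$client->connect();

// Register a PHP function that the agent is allowed to call back via JSON-RPC.
$client->registerFunction('getStringLength', function (string $text): int {
    return mb_strlen($text);
});

// Send a user message; the agent may call registered functions before replying.
$reply = $client->sendMessage('hello');

// Raw JSON-RPC frames exchanged during the call, useful for debugging.
$rawMessages = $client->getRawMessages();

echo $reply, PHP_EOL;
```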
The Symfony application:
- Provides a simple chatbot web interface
- Runs on FrankenPHP with worker mode for optimal performance
- Uses the PHP library to communicate with the agent
- Displays both the agent response and raw JSON-RPC messages
- Implements demo functions for text analysis
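As a sketch of what the text-analysis demo functions might look like (the real implementations live in the Symfony app; only the names and the described behaviour come from this demo):

```php
<?php

// Sketches of the documented demo functions; they would be registered with the
// agent client in the same way as the example in the library section above.
$functions = [
    'getStringLength' => fn (string $text): int => mb_strlen($text),       // length of the text
    'countWords'      => fn (string $text): int => str_word_count($text),  // number of words
    'reverseString'   => fn (string $text): string => strrev($text),       // reversed text
];
```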
The Python agent:
- Simulates an LLM that makes function calls
- Uses WebSocket for real-time communication
- Implements JSON-RPC 2.0 protocol
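For illustration, a single function-call round trip over the WebSocket could look roughly like this; the envelope follows JSON-RPC 2.0 and the function name comes from the demo, but the id and parameter shape are invented for the example:

```
→ agent asks PHP to run a registered function
{"jsonrpc": "2.0", "id": 7, "method": "countWords", "params": {"text": "test message"}}

← PHP returns the result
{"jsonrpc": "2.0", "id": 7, "result": 2}
```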