A comprehensive AI-powered research assistant for querying Taiwan's Legislative Yuan (立法院) data. Built with Google's Gemini 2.5 Pro model and the pydantic-ai framework, it provides conversational access to legislative information through a CLI, a web interface, and an API.
🚀 Live Demo: https://andydai.github.io/lybot/ | API: https://lybot-z5pc.onrender.com/v1
- 40+ Specialized Tools for comprehensive legislative data analysis
- Traditional Chinese Interface with natural language understanding
- Modern Web UI with real-time streaming responses and tool visualization
- OpenAI-Compatible API for seamless integration with existing tools
- CLI Tool for terminal-based interactions
- Comprehensive Coverage:
  - 👥 Legislators: profiles, committees, attendance
  - 📜 Bills: search, details, co-signers, progress tracking
  - 🗳️ Voting Records: extraction from gazette PDFs
  - 💬 Interpellations: speeches and position analysis
  - 📊 Analytics: party statistics, cross-party cooperation, performance metrics
- Python >=3.12
- Node.js 18+ (for web frontend)
- Google API Key for Gemini model
- Clone the repository:
git clone https://github.com/yourusername/lybot.git
cd lybot
- Install Python dependencies using uv (a modern Python package manager):
# Install uv if you haven't already
curl -LsSf https://astral.sh/uv/install.sh | sh
# Install project dependencies
uv sync
- Set up your API key and the LLM_MODEL variable (a sketch of how this string is consumed follows these steps):
# GOOGLE
export GEMINI_API_KEY="your-api-key-here"
# OPENAI
export OPENAI_API_KEY="..."
# AZURE
export AZURE_OPENAI_ENDPOINT="..."
export AZURE_OPENAI_API_KEY="..."
export OPENAI_API_VERSION="..."
# LLM Model
export LLM_MODEL="azure:gpt-4.1" # AZURE
export LLM_MODEL="google-gla:gemini-2.5-flash" # GOOGLE
export LLM_MODEL="openai:gpt-4o" # OpenAI
- Run the API server:
uv run python api.py
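The LLM_MODEL value is a provider-prefixed model string. As a rough sketch of how pydantic-ai consumes such a string (the project's actual agent setup in main.py / api.py may differ):

```python
import os
from pydantic_ai import Agent

# Hedged sketch: pydantic-ai can infer a model from strings like "openai:gpt-4o"
# or "google-gla:gemini-2.5-flash"; provider-specific values (e.g. the Azure one
# above) may need extra wiring in the project's code.
agent = Agent(os.environ.get("LLM_MODEL", "google-gla:gemini-2.5-flash"))
```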
The LyBot API can be accessed locally or via the deployed instance:
Default Production URL: https://lybot-z5pc.onrender.com/v1
For Frontend (Web UI):
# Create a .env file in the frontend directory
cd frontend
echo "VITE_API_BASE_URL=http://localhost:8000/v1" > .env
# Or use a custom deployment
echo "VITE_API_BASE_URL=https://your-custom-api.com/v1" > .env
For Python Clients:
# Using OpenAI client
from openai import OpenAI

# Local development
client = OpenAI(
    base_url="http://localhost:8000/v1/",
    api_key="not-needed"  # No API key required for local
)

# Production deployment
client = OpenAI(
    base_url="https://lybot-z5pc.onrender.com/v1/",
    api_key="not-needed"
)
To run the CLI instead:
python main.py
- Start the API server:
./run_api.sh
# or manually: uvicorn api:app --host 0.0.0.0 --port 8000
- In a new terminal, start the frontend:
cd frontend
npm install # first time only
./run_frontend.sh
# or manually: npm run dev
- Open http://localhost:5173 in your browser
# See example_client.py for comprehensive examples
from openai import OpenAI

# Configure the client with your API URL
client = OpenAI(
    base_url="http://localhost:8000/v1/",  # Local development
    # base_url="https://lybot-z5pc.onrender.com/v1/",  # Production
    api_key="not-needed"  # No API key required
)

# Query: "Who is the legislator for Taipei City's 7th electoral district?"
response = client.chat.completions.create(
    model="gemini-2.0-flash-thinking-exp-01-21",
    messages=[{"role": "user", "content": "誰是台北市第七選區的立委?"}]
)
print(response.choices[0].message.content)
- Search by constituency, party, or name
- Get detailed profiles and committee memberships
- Track proposed bills and meeting attendance
- Analyze party statistics
- Search bills by keyword, proposer, or session
- Get bill details and co-signers
- Analyze legislator bill proposals
- Track bill progress
- Search gazette records
- Extract voting records from PDFs
- Calculate attendance rates
- Compare legislator performance
- Cross-party cooperation analysis
- Voting alignment tracking
- Activity ranking
- Performance comparisons
lybot/
├── main.py # CLI entry point with pydantic-ai agent
├── api.py # FastAPI server with OpenAI-compatible endpoints
├── tools/ # Legislative data tools (40+ tools)
│ ├── legislators.py # Legislator queries and profiles
│ ├── bills.py # Bill search and analysis
│ ├── gazettes.py # Gazette and voting records
│ ├── interpellations.py # Speech records
│ ├── meetings.py # Meeting attendance
│ └── analysis.py # Advanced analytics
├── prompts/ # Agent system prompts (Traditional Chinese)
├── models.py # Data models and types
├── frontend/ # React + TypeScript web interface
│ ├── src/
│ │ ├── components/ # UI components (shadcn/ui)
│ │ └── lib/ # API client and utilities
│ └── dist/ # Production build
├── example_client.py # API usage examples
├── run_api.sh # API server launcher
└── pyproject.toml # Project dependencies (managed by uv)
- Backend: Python 3.12+, FastAPI, pydantic-ai, asyncio
- AI Model: Google Gemini 2.5 Pro (with experimental thinking mode)
- Frontend: React, TypeScript, Vite, shadcn/ui, Tailwind CSS v4
- Package Management: uv (Python), npm (Node.js)
- Data Source: Taiwan Legislative Yuan API
- Create a new tool in the tools/ directory
- Follow the existing async pattern with proper error handling (a hedged sketch follows this list)
- Register the tool in the agent configuration
- Update type definitions if needed
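As an illustration only, a new async tool might look roughly like the sketch below; the function name, endpoint URL, and the `@agent.tool_plain` registration are assumptions for this example, not the project's actual code:

```python
import httpx
from pydantic_ai import Agent

agent = Agent("google-gla:gemini-2.5-flash")  # placeholder; use the project's configured agent

@agent.tool_plain  # hypothetical registration; mirror the project's existing pattern
async def search_bills_by_keyword(keyword: str) -> list[dict]:
    """Hypothetical tool: search a Legislative Yuan endpoint by keyword."""
    url = "https://example.invalid/ly-api/bills"  # placeholder URL, not a real endpoint
    try:
        async with httpx.AsyncClient(timeout=30) as client:
            resp = await client.get(url, params={"q": keyword})
            resp.raise_for_status()
            return resp.json().get("items", [])
    except httpx.HTTPError as exc:
        # Surface the failure to the agent instead of raising, so it can adjust its plan.
        return [{"error": str(exc)}]
```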
cd frontend
npm run dev # Development server with hot reload
npm run build # Production build
npm run preview # Preview production build
Key Features:
- Real-time streaming with Server-Sent Events (SSE)
- Tool call visualization showing AI's reasoning process
- Responsive design with mobile support
- Dark/light theme toggle
- Enhanced markdown rendering for legislative transcripts
The API follows OpenAI's chat completions format, making it compatible with most LLM client libraries.
Endpoints:
- POST /v1/chat/completions - Main chat endpoint (streaming and non-streaming)
- GET /v1/models - List available models
- Full CORS support for web clients
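For example, the models endpoint can be queried with a plain HTTP GET (shown against a local server; swap in the production URL as needed):

```python
import httpx

# GET /v1/models returns the models the API exposes (OpenAI-style list)
resp = httpx.get("http://localhost:8000/v1/models")
resp.raise_for_status()
print(resp.json())
```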
- GOOGLE_API_KEY: Required for Gemini model access
- Frontend: Configure in frontend/.env if needed
- API Port: Default 8000, configurable in run_api.sh
# Interactive mode
python main.py
# Example queries:
> 誰是台北市第七選區的立委?  # Who is the legislator for Taipei City's 7th electoral district?
> 民進黨有多少席次?  # How many seats does the DPP hold?
> 最近有哪些關於環保的法案?  # What recent bills relate to environmental protection?
> 分析國民黨和民進黨在勞工議題上的立場差異  # Analyze how the KMT's and DPP's positions differ on labor issues
> 找出跨黨派合作的法案  # Find bills with cross-party cooperation
- Modern chat interface with message history
- Real-time streaming responses with thinking process
- Tool call visualization showing data sources
- Professional Taiwan Legislative Yuan branding
- Dark/light theme support
- Optimized for Chinese typography
Using the Production Deployment:
# No local setup required - use the deployed API
from openai import OpenAI

client = OpenAI(
    base_url="https://lybot-z5pc.onrender.com/v1/",
    api_key="not-needed"
)

# Basic query: "How many legislators does the DPP have?"
response = client.chat.completions.create(
    model="gemini-2.0-flash-thinking-exp-01-21",
    messages=[{"role": "user", "content": "民進黨有幾位立委?"}]
)
print(response.choices[0].message.content)

# Streaming example: "Analyze legislators' attendance rates"
for chunk in client.chat.completions.create(
    model="gemini-2.0-flash-thinking-exp-01-21",
    messages=[{"role": "user", "content": "分析立委出席率"}],
    stream=True
):
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
Using httpx for direct API calls:
import httpx

# Production API - query: "Who is the legislator for Kaohsiung City's 1st electoral district?"
response = httpx.post(
    "https://lybot-z5pc.onrender.com/v1/chat/completions",
    json={
        "model": "gemini-2.0-flash-thinking-exp-01-21",
        "messages": [{"role": "user", "content": "誰是高雄市第一選區立委?"}],
        "stream": False
    },
    timeout=60  # LLM responses can exceed httpx's 5-second default timeout
)
print(response.json()["choices"][0]["message"]["content"])
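For streaming over raw HTTP, a minimal sketch that reads the Server-Sent Events stream, assuming the standard OpenAI-style `data: {...}` chunk format:

```python
import json
import httpx

# Stream a response as Server-Sent Events ("data: {...}" chunks, ending with "data: [DONE]")
with httpx.stream(
    "POST",
    "https://lybot-z5pc.onrender.com/v1/chat/completions",
    json={
        "model": "gemini-2.0-flash-thinking-exp-01-21",
        "messages": [{"role": "user", "content": "分析立委出席率"}],  # "Analyze legislators' attendance rates"
        "stream": True,
    },
    timeout=None,
) as response:
    for line in response.iter_lines():
        if not line.startswith("data: ") or line.endswith("[DONE]"):
            continue
        chunk = json.loads(line[len("data: "):])
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            print(delta, end="")
```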
This project is licensed under the MIT License - see LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request.
- Follow existing code patterns and async conventions
- Add tests for new features
- Update documentation as needed
- Use Traditional Chinese for user-facing strings
- Ensure compatibility with the 11th Legislative Yuan term data
- Data Source: Taiwan Legislative Yuan Open Data
- AI Model: Google Gemini 2.5 Pro with experimental thinking mode
- UI Components: shadcn/ui
- Framework: pydantic-ai for structured AI interactions
Made with ❤️ for Taiwan's democratic transparency