A production-ready starter project demonstrating how to build full-stack applications with AI agents deployed to Vercel, orchestrated by Inngest, and powered by Next.js 15. This architecture pattern enables you to build scalable AI-powered applications with long-running workflows.
🔗 GitHub Repository: https://github.com/brookr/serverless-agents
This starter demonstrates a modern serverless architecture for AI applications:
- AI Agents as Vercel Functions: Python FastAPI agents that deploy as serverless functions
- Next.js Frontend: Modern React application with App Router
- Inngest Orchestration: Reliable workflow management for long-running AI tasks
- Vercel Blob Storage: Persistent storage for generated content
- Two-Stage Processing: Separation of concerns with research and formatting agents
This pattern solves common challenges in AI application development:
- Serverless Scalability: Agents scale automatically with demand
- Long-Running Jobs: Inngest handles workflows that exceed typical timeout limits
- Cost Efficiency: Pay only for actual usage, no idle servers
- Developer Experience: Local development mirrors production exactly
- Type Safety: Full TypeScript support across the stack
- Node.js 18+ and npm
- Python 3.9+
- OpenAI API key
- Vercel account (free tier works)
```bash
# Clone the repository
git clone https://github.com/brookr/serverless-agents.git
cd serverless-agents

# Install Node.js dependencies
npm install

# Install Python dependencies
pip install -r requirements.txt
```
Create a `.env.local` file:
```bash
# Required
OPENAI_API_KEY=your_openai_api_key_here
NEWSLETTER_READ_WRITE_TOKEN=your_vercel_blob_token

# Optional (Inngest will auto-generate if not provided)
INNGEST_SIGNING_KEY=your_inngest_signing_key
```
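Before starting the servers, it can help to fail fast on missing configuration. A minimal sketch (the `missing_vars` helper is illustrative, not part of the repo; the variable names match `.env.local` above):

```python
import os

REQUIRED = ["OPENAI_API_KEY", "NEWSLETTER_READ_WRITE_TOKEN"]

def missing_vars(env=os.environ):
    """Return the required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

# Example: check a partially configured environment before boot.
problems = missing_vars({"OPENAI_API_KEY": "sk-..."})
print(problems)  # ['NEWSLETTER_READ_WRITE_TOKEN']
```

Running a check like this at startup turns a confusing mid-request 500 into an immediate, readable error.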
You'll need three terminals:
Terminal 1 - Python Agent Server:

```bash
uvicorn api.agents:app --reload --reload-dir api --port 8000
```

Terminal 2 - Next.js Development:

```bash
npm run dev
```

Terminal 3 - Inngest Dev Server:

```bash
npx inngest-cli@latest dev --no-discovery -u http://localhost:3000/api/inngest
```
- Main App: http://localhost:3000
- Python API: http://localhost:8000
- Inngest Dashboard: http://localhost:8288
```
├── src/
│   ├── app/                  # Next.js app directory
│   │   ├── api/
│   │   │   ├── inngest/      # Inngest webhook endpoint
│   │   │   └── newsletter/   # API routes for the example app
│   │   └── newsletter/
│   │       └── [slug]/       # Dynamic pages
│   ├── components/           # React components
│   │   └── ui/               # shadcn/ui components
│   ├── inngest/
│   │   ├── client.ts         # Inngest client configuration
│   │   └── functions.ts      # Workflow definitions
│   └── lib/
│       └── *.ts              # Utilities and helpers
├── api/
│   └── agents.py             # FastAPI agent definitions
├── public/                   # Static assets
├── vercel.json               # Vercel routing configuration
├── requirements.txt          # Python dependencies
└── package.json              # Node.js dependencies
```
The `api/agents.py` file demonstrates how to structure AI agents:
```python
# Example agent structure
research_agent = Agent(
    name="Research Agent",
    model="gpt-4.1",
    instructions="...",
    tools=[WebSearchTool()],
)

@app.post("/research")
async def research_endpoint(request: TopicsRequest):
    # Agent logic here
    ...
```
```python
# Example agent structure
formatting_agent = Agent(
    name="Formatting Agent",
    model="o3",  # Using o3 requires a verified OpenAI org; otherwise pick another model.
    ...
)
```
The `src/inngest/functions.ts` file shows how to orchestrate long-running tasks:
```typescript
export const generateNewsletter = inngest.createFunction(
  { id: "generate-newsletter" },
  { event: "newsletter/generate" },
  async ({ event, step }) => {
    // Multi-step workflow with retries and error handling
  }
);
```
API routes in `src/app/api/` demonstrate the integration pattern:
- Webhook endpoint for Inngest
- Status checking endpoints
- Trigger endpoints for workflows
The `vercel.json` file is critical for proper Python API routing. Build the Python function after the Next.js app, and configure the routes to point at the Python entry file:
```json
{
  "builds": [
    { "src": "package.json", "use": "@vercel/next" },
    { "src": "api/agents.py", "use": "@vercel/python" }
  ],
  "routes": [
    { "src": "^/api/agents/(.*)", "dest": "/api/agents.py" },
    { "src": "^/api/agents$", "dest": "/api/agents.py" }
  ]
}
```
This configuration ensures that all `/api/agents/*` paths are routed to your Python FastAPI application.
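You can sanity-check the two route patterns with a quick sketch using plain `re` (the `resolve` helper is illustrative; Vercel's real matcher works the same way for these patterns):

```python
import re

# The same src/dest pairs as in vercel.json above.
routes = [
    (r"^/api/agents/(.*)", "/api/agents.py"),
    (r"^/api/agents$", "/api/agents.py"),
]

def resolve(path):
    """Return the destination of the first matching route, or None."""
    for pattern, dest in routes:
        if re.match(pattern, path):
            return dest
    return None

print(resolve("/api/agents/research"))   # /api/agents.py
print(resolve("/api/agents"))            # /api/agents.py
print(resolve("/api/newsletter/demo"))   # None (falls through to Next.js)
```

Paths that match neither pattern fall through to the Next.js app, which is how the frontend and the Inngest webhook keep their own routes.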
This starter uses a newsletter generator as an example, but the architecture supports any AI-powered application:
- **Document Processing**
  - Replace newsletter generation with document analysis
  - Use multiple agents for extraction, summarization, etc.
- **Data Pipeline**
  - Implement ETL workflows with AI enhancement
  - Chain multiple processing steps
- **Content Generation**
  - Build blog post generators, report writers, etc.
  - Add review and approval workflows
- **Multi-Modal Applications**
  - Integrate image generation APIs
  - Process audio/video with AI agents
Key files to modify for your own use case:

- `api/agents.py` - Define your AI agents and their capabilities
- `src/inngest/functions.ts` - Create your workflow logic
- `src/app/api/` - Add your API endpoints
- `src/app/` - Build your UI
1. **Click the Deploy Button**
   - This will fork the repository to your GitHub account
   - You'll be prompted to enter the required environment variables:
     - `OPENAI_API_KEY` - Your OpenAI API key
     - `NEWSLETTER_READ_WRITE_TOKEN` - Generate this in Vercel Blob storage settings
2. **Configure Inngest Integration** (⚠️ Critical Step)

   After deployment completes:
   - Go to your new project in the Vercel dashboard
   - Navigate to Settings → Integrations
   - Search for and install the Inngest integration
   - Follow the setup wizard to connect your Inngest account
   - This automatically adds `INNGEST_EVENT_KEY` and `INNGEST_SIGNING_KEY`

   Note: Without this step, the API routes will return 404 errors!
3. **Redeploy Your Project**
   - After installing Inngest, go to the Deployments tab
   - Click the three dots menu on the latest deployment
   - Select Redeploy
   - Wait for the deployment to complete
4. **Verify Everything Works**
   - Visit your deployed site
   - Try generating a newsletter
   - Check that all API endpoints respond:
     - `/api/newsletter/[slug]` - Newsletter generation
     - `/api/agents/ping` - Python API health check
     - `/api/agents/research` - AI research agent
     - `/api/agents/format` - AI formatting agent
     - `/api/inngest` - Inngest webhook
Note: If you see an error saying your organization must be verified to use the model `o3-2025-04-16`, you may need to verify your organization on OpenAI. If you don't verify, you can switch the model from "o3" to something else like "gpt-4.1" to get this working.
If you prefer to deploy manually:
```bash
# Clone your fork
git clone https://github.com/YOUR_USERNAME/serverless-agents
cd serverless-agents

# Login to Vercel
vercel login

# Deploy (you'll be prompted for env vars)
vercel --prod
```
Then follow steps 2-4 above to configure Inngest.
| Variable | Description | When Added |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key for AI agents | During deployment |
| `NEWSLETTER_READ_WRITE_TOKEN` | Vercel Blob storage token | During deployment |
| `INNGEST_EVENT_KEY` | Event security key | Auto-added by Inngest |
| `INNGEST_SIGNING_KEY` | Webhook signing key | Auto-added by Inngest |
- **404 on API routes**: Make sure you've installed the Inngest integration and redeployed
- **500 on Python agents**: Check that `OPENAI_API_KEY` is set correctly
- **Newsletter not generating**: Verify all environment variables are present in the Vercel dashboard
- User Interaction → Next.js frontend
- API Route → Triggers Inngest workflow
- Inngest Function → Orchestrates the process
- Agent Calls → Vercel Functions execute AI tasks
- Storage → Results saved to Vercel Blob
- Response → User sees the result
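The flow above can be sketched end to end with hypothetical stand-ins (all function names below are illustrative, not the repo's actual API; the real agents call OpenAI and the real store is Vercel Blob):

```python
def research_agent(topics):
    # Stand-in for stage 1, the /api/agents/research call.
    return {"topics": topics, "notes": [f"notes on {t}" for t in topics]}

def formatting_agent(research):
    # Stand-in for stage 2, the /api/agents/format call.
    sections = "\n".join(f"## {t}" for t in research["topics"])
    return f"# Newsletter\n{sections}"

def save_to_blob(store, slug, content):
    # Stand-in for Vercel Blob storage; returns the public path.
    store[slug] = content
    return f"/newsletter/{slug}"

def generate_newsletter(store, slug, topics):
    """Orchestration sketch: research -> format -> store -> return URL."""
    research = research_agent(topics)
    markdown = formatting_agent(research)
    return save_to_blob(store, slug, markdown)

store = {}
url = generate_newsletter(store, "ai-weekly", ["serverless", "agents"])
print(url)  # /newsletter/ai-weekly
```

In the real project each of these steps runs inside `step.run` in the Inngest function, so a failure in one stage retries that stage alone instead of restarting the whole pipeline.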
- Reliability: Automatic retries and error handling
- Observability: Built-in monitoring and debugging
- Scalability: Handles thousands of concurrent workflows
- Developer Experience: Great local development tools
- Zero Config: Python agents deploy automatically
- Auto-scaling: Handle any load without infrastructure management
- Cost Effective: Pay per execution, not for idle time
- Global Edge: Deploy close to your users
This project is licensed under CC0 1.0 Universal - see the LICENSE.md file.
- Explore the Code: Start with `api/agents.py` and `src/inngest/functions.ts`
- Run Locally: Get the development environment running
- Modify for Your Use Case: Replace the newsletter example with your own AI workflow
- Deploy: Push to production on Vercel
- Share: Let us know what you build!
We welcome contributions! See our Contributing Guidelines for details.
📦 Repository: https://github.com/brookr/serverless-agents
Author: @brookr
Contributors:
- @chrisg220
- YOUR NAME HERE!
Built as a reference architecture for modern AI applications. Use this as a starting point for your own serverless agent projects.