A minimal application demonstrating how to build an OpenAI Apps SDK compatible MCP server with widget rendering in ChatGPT.
This project shows how to integrate an application with the ChatGPT Apps SDK using the Model Context Protocol (MCP). It includes a working MCP server that exposes tools and resources that can be called from ChatGPT, with responses rendered natively in ChatGPT.
The core MCP server implementation that exposes tools and resources to ChatGPT.
Key features:
- Tool registration with OpenAI-specific metadata
- Resource registration that serves HTML content for iframe rendering
- Cross-linking between tools and resources via `templateUri`
OpenAI-specific metadata:

```ts
{
  "openai/outputTemplate": widget.templateUri,    // Links to resource
  "openai/toolInvocation/invoking": "Loading...", // Loading state text
  "openai/toolInvocation/invoked": "Loaded",      // Completion state text
  "openai/widgetAccessible": false,               // Widget visibility
  "openai/resultCanProduceWidget": true           // Enable widget rendering
}
```

Full configuration options: OpenAI Apps SDK MCP Documentation
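As a sketch of how this metadata might be assembled (the `buildToolMeta` helper and the example `ui://` URI are illustrative assumptions; only the `openai/*` keys are the documented fields):

```typescript
// Build the OpenAI-specific metadata attached to a tool registration.
// `templateUri` is the URI of the registered widget resource.
type ToolMeta = Record<string, string | boolean>;

function buildToolMeta(templateUri: string): ToolMeta {
  return {
    "openai/outputTemplate": templateUri,           // links tool output to the widget resource
    "openai/toolInvocation/invoking": "Loading...", // loading state text
    "openai/toolInvocation/invoked": "Loaded",      // completion state text
    "openai/widgetAccessible": false,               // widget visibility
    "openai/resultCanProduceWidget": true,          // enable widget rendering
  };
}

const meta = buildToolMeta("ui://widget/example.html");
console.log(meta["openai/outputTemplate"]); // → "ui://widget/example.html"
```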
A utility to convert Markdown content into PDF documents entirely in Node.js without headless browsers.
Features:
- Pure JS generation: Uses `pdf-lib` and `marked` for lightweight serverless deployment.
- Rich formatting: Supports headers, code blocks, lists, blockquotes, and inline styling.
- Custom Layout Engine: Properly handles text wrapping and pagination.
- Multiple Storage Backends: Supports Vercel Blob, AWS S3, n8n webhooks, and local storage.
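The text-wrapping part of the layout engine can be illustrated with a greedy line-breaking sketch (the `wrapText` function and the fixed character-width assumption are simplifications, not the project's actual implementation, which would measure glyph widths via the embedded font):

```typescript
// Greedy word wrap: place words on a line until the next word would
// exceed the maximum width, then start a new line.
function wrapText(text: string, maxChars: number): string[] {
  const lines: string[] = [];
  let current = "";
  for (const word of text.split(/\s+/).filter(Boolean)) {
    if (current === "") {
      current = word;
    } else if (current.length + 1 + word.length <= maxChars) {
      current += " " + word; // word still fits on the current line
    } else {
      lines.push(current);   // line is full, start a new one
      current = word;
    }
  }
  if (current !== "") lines.push(current);
  return lines;
}

console.log(wrapText("Markdown content rendered into PDF pages", 16));
// → ["Markdown content", "rendered into", "PDF pages"]
```

Pagination works the same way one level up: lines are placed on a page until the next line would overflow the page height.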
Critical: Set `assetPrefix` to ensure static assets are fetched from the correct origin when running inside an iframe.
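A minimal sketch of such a config (`NEXT_PUBLIC_APP_URL` is an assumed variable name, not necessarily one the project defines):

```typescript
// next.config.ts — sketch. assetPrefix makes /_next/* asset requests
// resolve against the deployed origin rather than the ChatGPT iframe's origin.
const nextConfig = {
  assetPrefix:
    process.env.NODE_ENV === "production"
      ? process.env.NEXT_PUBLIC_APP_URL // e.g. "https://your-app.com"
      : undefined,
};

export default nextConfig;
```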
The application supports multiple storage backends for PDF files, controlled via environment variables:
Environment Variables:
- `UPLOAD_VIA_VERCEL=true` - Enable Vercel Blob storage (requires `BLOB_READ_WRITE_TOKEN`)
- `UPLOAD_VIA_S3=true` - Enable AWS S3 storage (requires AWS credentials and bucket configuration)
- `UPLOAD_VIA_N8N_GDRIVE=true` - Enable n8n webhook upload
- `N8N_WEBHOOK_URL` - Custom n8n webhook URL (defaults to the provided secondspring URL)
Storage Behavior:
- Development Mode: In development (`NODE_ENV=development`), the system always uses local disk storage regardless of upload flags
- Production Mode: When upload flags are set, the system uses composite storage that tries multiple methods in order:
  1. Vercel Blob (if `UPLOAD_VIA_VERCEL=true`)
  2. n8n Webhook (if `UPLOAD_VIA_N8N_GDRIVE=true`)
  3. AWS S3 (if `UPLOAD_VIA_S3=true`)
- The first successful upload method is used; if all methods fail, an error is thrown
- If no upload flags are set in production, the system falls back to Vercel Blob (if the token exists) or local storage
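The composite fallback behavior can be sketched as follows (the `Uploader` type and function names are illustrative, not the project's actual code):

```typescript
// Try each enabled uploader in order and return the first successful URL.
// If every uploader fails, throw a combined error — mirroring the
// "first success wins, all failures throw" behavior described above.
type Uploader = (filename: string, data: Uint8Array) => Promise<string>;

async function uploadWithFallback(
  uploaders: Uploader[],
  filename: string,
  data: Uint8Array,
): Promise<string> {
  const errors: unknown[] = [];
  for (const upload of uploaders) {
    try {
      return await upload(filename, data); // first success wins
    } catch (err) {
      errors.push(err); // remember the failure and try the next backend
    }
  }
  throw new Error(`All upload methods failed (${errors.length} errors)`);
}
```

In production the `uploaders` array would be assembled from the enabled flags in the order listed above: Vercel Blob, then the n8n webhook, then S3.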
The application supports uploading files to AWS S3 using access keys. This is useful for production deployments where you want to store generated PDFs in your own S3 bucket.
Step 1: Create AWS S3 Bucket
- Log in to the AWS Console
- Navigate to S3 and create a new bucket
- Note your bucket name and region (e.g., `my-pdf-bucket`, `us-east-1`)
Step 2: Create IAM User with S3 Access
- Go to IAM → Users → Create User
- Create a user (e.g., `pdf-uploader`)
- Attach a policy that grants S3 permissions:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:GetBucketLocation"],
      "Resource": [
        "arn:aws:s3:::your-bucket-name/*",
        "arn:aws:s3:::your-bucket-name"
      ]
    }
  ]
}
```

- Create access keys for the user:
  - Go to the user → Security credentials tab
  - Click "Create access key"
  - Save the Access Key ID and Secret Access Key (you won't be able to see the secret again)
Step 3: Configure Environment Variables
Set the following environment variables in your deployment (Vercel, local .env, etc.):
Option A: Standard AWS SDK Variables (Recommended)

```
UPLOAD_VIA_S3=true
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_S3_BUCKET_NAME=your-bucket-name
AWS_REGION=us-east-1
```

Option B: Custom Variable Names

If you prefer to use custom environment variable names:

```
UPLOAD_VIA_S3=true
AWS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_S3_BUCKET_NAME=your-bucket-name
AWS_REGION=us-east-1
```

Step 4: Configure Bucket for Public Access (Optional)
If you want the generated URLs to be publicly accessible:
- Go to your S3 bucket → Permissions tab
- Edit "Block public access" settings (if needed)
- Add a bucket policy to allow public read access:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```
Note: If you don't configure public access, the URLs will be generated but may return 403 Forbidden errors when accessed directly. You can still use the files programmatically with the same credentials.
Step 5: Verify Configuration
The S3 storage implementation includes several automatic features:
- Automatic Region Detection: If the bucket region differs from your configured `AWS_REGION`, the system automatically detects and uses the correct region
- Endpoint Error Retry: If an endpoint error occurs (wrong region), the system automatically retries with the correct region
- Default Region: If `AWS_REGION` is not set, defaults to `us-east-1`
- Region Parsing: Supports comments in the region variable (e.g., `AWS_REGION=us-east-1 # my region`)
- URL Generation: Generates public URLs as `https://bucket-name.s3.region.amazonaws.com/filename.pdf` (most regions) or `https://bucket-name.s3.amazonaws.com/filename.pdf` (`us-east-1`)
- Content Type: Files are uploaded with `Content-Type: application/pdf`
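The region parsing and URL generation described above can be sketched as pure functions (the names are illustrative, not the project's actual implementation):

```typescript
// Strip an inline "#" comment from the region variable; default to us-east-1.
function parseRegion(raw: string | undefined): string {
  const region = (raw ?? "").split("#")[0].trim();
  return region === "" ? "us-east-1" : region;
}

// us-east-1 uses the legacy global endpoint; other regions are regional.
function publicS3Url(bucket: string, region: string, filename: string): string {
  return region === "us-east-1"
    ? `https://${bucket}.s3.amazonaws.com/${filename}`
    : `https://${bucket}.s3.${region}.amazonaws.com/${filename}`;
}

console.log(parseRegion("us-east-1 # my region")); // → "us-east-1"
console.log(publicS3Url("my-pdf-bucket", "eu-west-2", "doc.pdf"));
// → "https://my-pdf-bucket.s3.eu-west-2.amazonaws.com/doc.pdf"
```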
Example `.env` file:

```
# Enable S3 upload
UPLOAD_VIA_S3=true

# AWS Credentials
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# S3 Configuration
AWS_S3_BUCKET_NAME=my-pdf-bucket
AWS_REGION=us-east-1
```

Security Best Practices:
- Never commit access keys to version control
- Use environment variables or secrets management (Vercel Secrets, AWS Secrets Manager, etc.)
- Rotate access keys regularly
- Use IAM policies with least privilege (only grant necessary S3 permissions)
- Consider using IAM roles instead of access keys when running on AWS infrastructure
Troubleshooting:
- 403 Forbidden errors: Check that your IAM user has `s3:PutObject` and `s3:GetBucketLocation` permissions
- Region mismatch errors: The system should auto-detect and correct this, but ensure `AWS_REGION` matches your bucket region if issues persist
- Endpoint errors: Usually indicates a region mismatch; the system will automatically retry with the correct region
- Missing credentials: Ensure both `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are set (or `AWS_KEY_ID` if using custom names)
- Bucket not found: Verify `AWS_S3_BUCKET_NAME` matches your actual bucket name exactly
- Development mode: Remember that in development, S3 uploads are disabled and local storage is used instead
The `<NextChatSDKBootstrap>` component patches browser APIs to work correctly within the ChatGPT iframe:
What it patches:
- `history.pushState` / `history.replaceState` - Prevents full-origin URLs in history
- `window.fetch` - Rewrites same-origin requests to use the correct base URL
- `<html>` attribute observer - Prevents ChatGPT from modifying the root element
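The URL rewriting performed by the `fetch` patch can be sketched as a pure function (a simplified assumption of what the component does, not its actual source):

```typescript
// Rewrite a same-origin request URL so it targets the app's real base URL
// instead of the iframe's origin. Cross-origin URLs pass through untouched.
function rewriteSameOrigin(url: string, iframeOrigin: string, appBase: string): string {
  const parsed = new URL(url, iframeOrigin);
  if (parsed.origin !== new URL(iframeOrigin).origin) return url; // leave cross-origin alone
  return appBase.replace(/\/$/, "") + parsed.pathname + parsed.search;
}

console.log(rewriteSameOrigin("/api/pdf", "https://chatgpt.com", "https://your-app.com"));
// → "https://your-app.com/api/pdf"
```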
```
npm install
# or
pnpm install
```

Initialize Database:
```
make setup
# or
npx tsx scripts/init-db.ts
pnpm db:push
```

Reset Database (Drops All Tables):
```
make db-reset
# Interactive confirmation required

# For non-interactive use (CI/CD):
make db-reset-force
```

Generate Migrations:
```
make db-generate
# or
pnpm db:generate
```

Apply Migrations:
```
make db-migrate
# or
pnpm db:migrate
```

Setup Row Level Security (RLS):
```
make setup-rls
# or
npx tsx scripts/setup-rls.ts
```

Note: RLS is automatically applied when running `make setup`.
Row Level Security (RLS) Policies:
- SELECT: Users can view their own files (where `user_id` matches `auth.uid()`) or public files (where `user_id` is NULL)
- INSERT: Users can create files with their own `user_id` or public files (NULL `user_id`)
- UPDATE: Users can only update files they own
- DELETE: Users can only delete files they own
- Service Role: Has full access (typically used by server-side applications)
Important:
- When using Supabase's service role connection string (server-side), RLS is bypassed by default
- RLS policies apply when using authenticated user connections (client-side)
- The `user_id` field should match Supabase's `auth.uid()` for RLS to work correctly
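The SELECT policy's visibility rule can be mirrored in TypeScript to make it concrete (a sketch of the logic only, not the actual Postgres policy text):

```typescript
// A file row is visible if the requesting user owns it (user_id matches
// auth.uid()) or the file is public (user_id is NULL).
interface FileRow {
  user_id: string | null;
}

function canSelect(row: FileRow, authUid: string | null): boolean {
  return row.user_id === null || (authUid !== null && row.user_id === authUid);
}

console.log(canSelect({ user_id: null }, null)); // → true  (public file)
console.log(canSelect({ user_id: "u1" }, "u1")); // → true  (owner)
console.log(canSelect({ user_id: "u1" }, "u2")); // → false (someone else's file)
```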
```
npm run dev
# or
pnpm dev
```

Open http://localhost:3000 to see the app.
The MCP server is available at `http://localhost:3000/mcp`.
The easiest way to test your MCP server is using the MCP Inspector:
Option 1: Streamable HTTP (Recommended)
This connects directly to your running Next.js server - no separate script needed!
```
# Terminal 1: Start the dev server
make dev
# or: pnpm dev

# Terminal 2: Start the inspector UI
make inspector
# or: pnpm inspector
```

Then in the Inspector UI:

- Select "Streamable HTTP" as the transport type
- Enter the URL: `http://localhost:3000/mcp`
- Click Connect
Option 2: STDIO Transport (Legacy)
For direct STDIO-based testing without the HTTP server:
```
make inspector-stdio
# or: pnpm inspector:stdio
```

Note: The Streamable HTTP method is preferred because it tests the exact same code path that ChatGPT will use in production.
- Deploy your app.
- In ChatGPT, navigate to Settings → Connectors → Create and add your MCP server URL with the `/mcp` path (e.g., `https://your-app.com/mcp`)
Note: Connecting MCP servers to ChatGPT requires developer mode access. See the connection guide for setup instructions.
```
app/
├── mcp/
│   └── route.ts     # MCP server with tool/resource registration
├── layout.tsx       # Root layout with SDK bootstrap
├── page.tsx         # Homepage content
└── globals.css      # Global styles
middleware.ts        # CORS handling
```
- Tool Invocation: ChatGPT calls a tool registered in `app/mcp/route.ts`
- Resource Reference: The tool response includes `templateUri` pointing to a registered resource
- Widget Rendering: ChatGPT fetches the resource HTML and renders it in an iframe
- Client Hydration: The app hydrates inside the iframe with patched APIs
- Navigation: Client-side navigation uses the patched `fetch`