A FastMCP server that integrates with Perplexity's API to provide web search and grounded AI answers.
- **`search`** (ground yourself first): find relevant sources before asking questions
  - Returns URLs, titles, and snippets
  - Use this when you don't know about a topic
- **`ask`** (get AI answers; the default): AI-synthesized answers with web grounding
  - Uses the `sonar` model (fast and cost-effective)
  - Includes citations and optional images/related questions
- **`ask_more`** (dig deeper): more comprehensive analysis for complex questions
  - Uses the `sonar-pro` model (more capable but pricier)
  - Use when `ask` doesn't provide sufficient depth
- Python 3.10 or higher
- A Perplexity API key
- uv (recommended) or pip
Using uv (recommended):

```bash
uv pip install -e .
```

Or using pip:

```bash
pip install -e .
```

Copy the example environment file:

```bash
cp .env.example .env
```

Edit `.env` and add your Perplexity API key:

```
PERPLEXITY_API_KEY=your_api_key_here
```
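Inside the server, the key can be read and validated at startup. A minimal sketch (the helper name is hypothetical and not part of `server.py`, which may instead load the key via a library such as python-dotenv):

```python
import os

def require_api_key() -> str:
    # Hypothetical startup check: fail fast when the key is missing
    # instead of surfacing an authentication error on the first request.
    key = os.getenv("PERPLEXITY_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "PERPLEXITY_API_KEY is not set; copy .env.example to .env first"
        )
    return key
```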
Test the server locally:

```bash
uv run fastmcp run server.py
```

Or with the fastmcp CLI:

```bash
fastmcp run server.py
```

Install the server for use with Claude Desktop:

```bash
fastmcp install claude-code server.py
```

Or manually add to your Claude Desktop config (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS):

```json
{
  "mcpServers": {
    "perplexity": {
      "command": "uv",
      "args": ["run", "fastmcp", "run", "/absolute/path/to/server.py"],
      "env": {
        "PERPLEXITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

Deploy to fastmcp.cloud for easy hosting:
```bash
git init
git add .
git commit -m "Initial commit"
git remote add origin https://github.com/yourusername/perplexity-mcp.git
git push -u origin main
```

- Visit fastmcp.cloud
- Sign in with GitHub
- Create a new project and connect your repo
- Configure:
  - Entrypoint: `server.py`
  - Environment variables: add `PERPLEXITY_API_KEY`
- Deploy!
Your server will be available at https://your-project-name.fastmcp.app/mcp
FastMCP Cloud automatically:
- ✅ Detects dependencies from `pyproject.toml`
- ✅ Deploys on every push to `main`
- ✅ Creates preview deployments for PRs
- ✅ Handles HTTP transport and authentication
1. Don't know about a topic? → Use `search()`:

   ```python
   search("latest AI research papers on transformers")
   ```

2. Found sources? → Use `ask()` to understand:

   ```python
   ask("What are the key innovations in transformer models?")
   ```

3. Need more depth? → Use `ask_more()`:

   ```python
   ask_more("Explain the mathematical foundations of attention mechanisms in transformers")
   ```
`search(...)` parameters:

- `query`: Search query string
- `max_results`: Number of results (default: 10)
- `recency`: Filter by time: `"day"`, `"week"`, `"month"`, or `"year"`
- `domain_filter`: Include/exclude domains
  - Include: `["wikipedia.org", "github.com"]`
  - Exclude: `["-reddit.com", "-pinterest.com"]`
`ask(...)` parameters:

- `query`: Question to ask
- `reasoning_effort`: `"low"`, `"medium"` (default), or `"high"`
- `search_mode`: `"web"` (default), `"academic"`, or `"sec"`
- `recency`: Time filter
- `domain_filter`: Domain filter
- `return_images`: Include images (default: False)
- `return_related_questions`: Include follow-up questions (default: False)
`ask_more(...)` takes the same parameters as `ask()`, but uses the more powerful `sonar-pro` model.
- Start with `search`: free/cheap way to find sources
- Default to `ask`: uses `sonar` (cost-effective)
- Escalate to `ask_more`: only when needed (more expensive)
```
perplexity-mcp/
├── server.py          # Main FastMCP server
├── pyproject.toml     # Dependencies
├── .env.example       # Environment template
└── README.md          # This file
```
See what FastMCP Cloud will see:

```bash
fastmcp inspect server.py
```

This server uses two Perplexity API endpoints:

- Search API (`/search`): returns ranked search results
- Chat Completions API (`/chat/completions`): returns AI-generated answers

Supported models:

- `sonar`: fast, cost-effective
- `sonar-pro`: more comprehensive
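The chat endpoint can be exercised with nothing but the standard library. The following is a minimal sketch (the server itself likely uses a dedicated HTTP client; the helper names here are hypothetical):

```python
import json
import urllib.request

CHAT_URL = "https://api.perplexity.ai/chat/completions"

def build_request(query: str, api_key: str, model: str = "sonar") -> urllib.request.Request:
    # Minimal chat-completions request body; the response is
    # OpenAI-style, so the answer lives in choices[0].message.content.
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": query}],
    }).encode()
    return urllib.request.Request(
        CHAT_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

def ask_perplexity(query: str, api_key: str, model: str = "sonar") -> str:
    # 60 seconds matches the chat timeout documented in Troubleshooting.
    with urllib.request.urlopen(build_request(query, api_key, model), timeout=60) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```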
If you get authentication errors:
- Verify your API key at https://www.perplexity.ai/settings/api
- Check that `PERPLEXITY_API_KEY` is set correctly
- Make sure there are no extra spaces or quotes
If requests time out:
- The default timeout is 30s for search, 60s for chat
- Complex questions may take longer
- Consider using `reasoning_effort="low"` for faster responses
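The documented defaults above can be captured in a small helper. This is a sketch under stated assumptions, not the server's actual code; in particular, shortening the timeout for low reasoning effort is an illustrative choice, not documented behavior:

```python
def request_timeout(endpoint: str, reasoning_effort: str = "medium") -> float:
    # Documented defaults: 30s for the search endpoint, 60s for chat.
    base = 30.0 if endpoint == "search" else 60.0
    # Assumption: low-effort answers return faster, so a caller could
    # afford a tighter deadline.
    return base / 2 if reasoning_effort == "low" else base
```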
Test individual tools:

```bash
uv run fastmcp dev server.py
```

This opens an interactive development interface.
MIT
Contributions welcome! Please open an issue or PR.