A conversational deep-research system built with CrewAI Flow. It routes between casual chat and web research — when you ask a question worth searching, it generates queries and launches a research agent with Firecrawl to produce a cited answer.
```
User message → @start() starting_flow
             → @router() classify_and_respond   (single LLM call)
                 → "casual_chat" → present_chat_response()  (friendly reply)
                 → "search"      → execute_search()         (inline Agent + Firecrawl)
```
- You send a message — the router classifies your intent in a single `gpt-4.1-mini` call
- Casual chat — greetings, thank-yous, or meta-questions get a quick conversational reply
- Search — anything factual triggers 3–5 search queries, handed to an inline Agent with Firecrawl tools
- The agent searches and scrapes — using `FirecrawlSearchTool` and `FirecrawlScrapeWebsiteTool`
- You get a cited answer — every factual claim has a numbered inline citation
State persists across runs via @persist(), so you can refine your topic over multiple invocations.
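Conceptually, the router step produces a label and the label picks which listener runs next. A plain-Python sketch of that dispatch pattern (illustrative only — this is not the CrewAI API, and `classify` stands in for the real `gpt-4.1-mini` router call):

```python
def classify(message: str) -> str:
    """Stand-in for the gpt-4.1-mini router call (keyword heuristic here)."""
    casual = ("hi", "hello", "thanks", "thank you")
    return "casual_chat" if message.lower().startswith(casual) else "search"

def present_chat_response(message: str) -> str:
    # Real flow: prints the chat reply already stored in state.
    return f"(friendly reply to: {message})"

def execute_search(message: str) -> str:
    # Real flow: inline Agent + Firecrawl tools producing a cited answer.
    return f"(cited research answer for: {message})"

# The route label maps to a listener, like @listen("casual_chat") / @listen("search").
LISTENERS = {"casual_chat": present_chat_response, "search": execute_search}

def run_flow(message: str) -> str:
    route = classify(message)         # one LLM call in the real flow
    return LISTENERS[route](message)  # dispatch to the matching listener
```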
- Python >=3.10, <3.14
- uv for dependency management
Install dependencies:

```
crewai install
```

Create a `.env` file in the project root:

```
OPENAI_API_KEY=your_openai_api_key
FIRECRAWL_API_KEY=your_firecrawl_api_key
```

Then run the flow:

```
crewai run
```

The default user message is set in `FlowState.user_message`. Modify the `kickoff()` function or deploy to CrewAI AMP to accept dynamic input.
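Since both keys are required, it can help to fail fast before kicking off the flow. A hypothetical pre-flight check (not part of the template):

```python
import os

# Both keys must be set (via .env or the environment) before the flow runs.
REQUIRED_KEYS = ("OPENAI_API_KEY", "FIRECRAWL_API_KEY")

def missing_keys(env=os.environ):
    """Return the names of required keys that are unset or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]
```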
The research_frontend/ directory contains a Streamlit chat UI that connects to a deployed CrewAI AMP instance.
```
cd research_frontend
uv sync
uv run streamlit run app.py
```

Before running, configure the AMP connection. Create `research_frontend/.streamlit/secrets.toml`:

```toml
CRW_API_URL = "https://your-crew.crewai.com"
CRW_API_TOKEN = "your_api_token"
```

The frontend features:
- In-memory multi-chat management (new chats, switch, delete)
- Kickoff/poll pattern against the CrewAI AMP API
- Dark theme with coral accent, CrewAI branding
- Animated thinking indicator while research is in progress
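The kickoff/poll pattern in `api.py` boils down to two endpoints plus a bearer token. A sketch of the URL/header helpers — the `/kickoff` and `/status/{id}` paths and the Bearer scheme are assumptions about the AMP API, so check your deployment's docs:

```python
def auth_headers(token: str) -> dict:
    """Headers for every AMP request (Bearer scheme assumed)."""
    return {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

def kickoff_url(base_url: str) -> str:
    """POST here with the user message to start a run."""
    return f"{base_url.rstrip('/')}/kickoff"

def status_url(base_url: str, kickoff_id: str) -> str:
    """GET here repeatedly until the run reports completion."""
    return f"{base_url.rstrip('/')}/status/{kickoff_id}"

# The UI POSTs to kickoff_url(), stores the returned kickoff_id, then
# polls status_url() on an interval — showing the animated thinking
# indicator until the run completes.
```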
The Streamlit frontend lives in the research_frontend/ subdirectory. Heroku deploys it using git subtree.
```
heroku create your-app-name
heroku config:set CRW_API_URL=https://your-crew.crewai.com
heroku config:set CRW_API_TOKEN=your_api_token
git subtree push --prefix research_frontend heroku main
```

Heroku picks up `.python-version` and `pyproject.toml` automatically — uv handles dependency installation natively, so no `requirements.txt` is needed. The `Procfile` starts Streamlit on `$PORT`.
| Model | Purpose |
|---|---|
| `Message` | Chat message with role, content, timestamp |
| `RouterOutput` | Structured LLM response: `user_intent` (`search` or `casual_chat`) plus search queries or chat response |
| `FlowState` | Persisted state: message history, search queries, chat response, final response |
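The shape of those models can be sketched with stdlib dataclasses — the template itself presumably uses Pydantic (so `RouterOutput` can serve as the LLM's structured-output schema), and the exact field names here are assumptions:

```python
from dataclasses import dataclass, field
from typing import List, Literal, Optional

@dataclass
class Message:
    role: str        # "user" or "assistant"
    content: str
    timestamp: str

@dataclass
class RouterOutput:
    # The router fills exactly one side: queries for "search",
    # a direct reply for "casual_chat".
    user_intent: Literal["search", "casual_chat"]
    search_queries: List[str] = field(default_factory=list)
    chat_response: Optional[str] = None

@dataclass
class FlowState:
    # Persisted across runs via @persist().
    user_message: str = ""
    messages: List[Message] = field(default_factory=list)
    search_queries: List[str] = field(default_factory=list)
    chat_response: str = ""
    final_response: str = ""
```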
| Method | Decorator | What it does |
|---|---|---|
| `starting_flow` | `@start()` | Appends user message to history |
| `classify_and_respond` | `@router(starting_flow)` | Single LLM call — classifies intent AND generates response/plan |
| `present_chat_response` | `@listen("casual_chat")` | Prints the chat reply from state |
| `execute_search` | `@listen("search")` | Runs inline Agent with Firecrawl tools, produces cited answer |
- Single router LLM call — no separate calls for classification vs response generation
- Inline Agent — no Crew overhead for a single-agent research task
- Firecrawl tools — `FirecrawlSearchTool` for web search, `FirecrawlScrapeWebsiteTool` for deep page content
- Mandatory citations — every factual claim must have an inline source URL
- `gpt-4.1-mini` — used for both routing (temperature 0.1) and research (temperature 0.2)
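The mandatory-citation convention (numbered inline markers plus a source list) can be illustrated with a small helper — hypothetical, not code from the template:

```python
def cite(claims_with_sources):
    """Turn (claim, source_url) pairs into 'claim [n]' text plus a
    numbered source list; repeated URLs reuse the same number."""
    sources, lines = [], []
    for claim, url in claims_with_sources:
        if url not in sources:
            sources.append(url)
        lines.append(f"{claim} [{sources.index(url) + 1}]")
    body = " ".join(lines)
    refs = "\n".join(f"[{i + 1}] {u}" for i, u in enumerate(sources))
    return f"{body}\n\n{refs}"
```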
```
deep_research_template/
├── pyproject.toml              # Backend dependencies (CrewAI flow)
├── .env                        # OPENAI_API_KEY, FIRECRAWL_API_KEY
├── src/
│   └── deep_research_agent/
│       └── main.py             # Flow, state models, router, agent
└── research_frontend/          # Streamlit chat UI (deployed separately)
    ├── pyproject.toml          # Frontend dependencies (streamlit, requests)
    ├── Procfile                # Heroku: streamlit run app.py
    ├── .python-version         # Python 3.13 for Heroku/uv
    ├── .streamlit/
    │   ├── config.toml         # Dark theme config
    │   └── secrets.toml        # CRW_API_URL, CRW_API_TOKEN (local only)
    ├── app.py                  # Chat UI, session state, sidebar
    ├── api.py                  # AMP API client (kickoff/poll)
    └── assets/
        └── crewai_logo.svg     # Branding
```
| Issue | Fix |
|---|---|
| Missing API keys | Ensure `OPENAI_API_KEY` and `FIRECRAWL_API_KEY` are in `.env` |
| Frontend can't connect | Check `CRW_API_URL` and `CRW_API_TOKEN` in `secrets.toml` or Heroku config vars |
| Wrong Python version | Use Python >=3.10, <3.14 (backend) or 3.13 (frontend/Heroku) |
| Dependencies missing | Run `crewai install` (backend) or `uv sync` (frontend) |
| Stale persisted state | Delete the `.crewai` persistence directory and re-run |
| Heroku deploy fails | Make sure you push the subtree: `git subtree push --prefix research_frontend heroku main` |