🎯 Project #1: Local Multi-Provider Chat Agent (Ollama-first)
Source code: /chat-agent-api
Build a simple HTTP service that:
- Uses Ollama as the primary model (e.g., `llama3`), but keeps the code ready to switch providers via an `init_chat_model`-style config.
- Exposes:
  - Single-turn Q&A (`/ask`).
  - Multi-turn chat with message history using `HumanMessage` and `AIMessage`.
- Wraps everything in a small "agent" abstraction (no tools yet) using `create_agent`, so you can later add tools.
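The provider-switch idea above can be sketched as a single config table that produces the keyword arguments for `init_chat_model`. This is a minimal stdlib-only sketch; the `PROVIDER_CONFIGS` / `model_kwargs` names and the `openai` fallback entry are illustrative assumptions, not from the source.

```python
# Keep the model choice in one place so swapping Ollama for another
# backend is a one-line config change (names here are hypothetical).
PROVIDER_CONFIGS = {
    "ollama": {"model": "llama3", "model_provider": "ollama"},
    "openai": {"model": "gpt-4o-mini", "model_provider": "openai"},
}

def model_kwargs(provider: str = "ollama") -> dict:
    """Return the kwargs we would pass to LangChain's init_chat_model()."""
    try:
        return PROVIDER_CONFIGS[provider]
    except KeyError:
        raise ValueError(f"unknown provider: {provider!r}")

# With LangChain installed, the agent's model would then be built roughly as:
#   from langchain.chat_models import init_chat_model
#   llm = init_chat_model(**model_kwargs("ollama"))
```

Because every call site goes through `model_kwargs()`, no endpoint code needs to know which provider is active.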
Example usage flow:
POST `/chat` with:

```json
{
  "session_id": "abc",
  "message": "Tell me about Luna City"
}
```
- Service loads past messages for `session_id`, calls the agent, and streams tokens back to the client.
- All LLM calls go through Ollama instead of OpenAI.
🎯 Project #2: YouTube Content Researcher API
Source code: /youtube-researcher-api
Runs 100% locally on Ollama, with no external APIs.
What it does:
- `/research` → analyzes video topics, suggests titles/thumbnails
- `/trending` → finds hot YouTube niches (AI/IoT/embedded)
- `/competitors` → analyzes competitor channels
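Since everything runs locally, each endpoint reduces to building a prompt and sending it to the local Ollama server. A minimal sketch for the `/research` case follows; the prompt wording and the `build_research_prompt` helper are assumptions for illustration, not part of the source.

```python
# Hypothetical prompt builder for the /research endpoint (illustrative).
RESEARCH_PROMPT = (
    "Analyze the YouTube video topic below.\n"
    "Suggest 3 possible titles and 2 thumbnail concepts.\n"
    "Topic: {topic}"
)

def build_research_prompt(topic: str) -> str:
    """Fill the research prompt template with the requested topic."""
    return RESEARCH_PROMPT.format(topic=topic)

# The finished prompt would then be POSTed to the local Ollama server
# (http://localhost:11434/api/generate), so no external service is involved.
```

The `/trending` and `/competitors` endpoints would follow the same pattern with different templates.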