A comprehensive GenAI toolkit featuring a local DeepSeek model implementation, document processing with RAG, and integrations with services including Airbnb booking and the OpenAI API.
```mermaid
graph TD
    subgraph "Frontend Interfaces"
        A1[Streamlit UI - Main Chat]
        A2[Streamlit UI - RAG]
        A3[Gradio UI - Airbnb]
    end
    subgraph "AI Processing Layer"
        B1[DeepSeek Local]
        B2[OpenAI API]
        B3[LangChain]
    end
    subgraph "Document Processing"
        C1[PDF Loader]
        C2[Text Splitter]
        C3[Vector Store]
    end
    subgraph "External Services"
        D1[MCP Server]
        D2[Airbnb API]
        D3[OpenAI Services]
    end
    A1 --> B1
    A2 --> C1
    A3 --> D1
    C1 --> C2
    C2 --> C3
    C3 --> B3
    B3 --> B1
    B3 --> B2
    D1 --> D2
    B2 --> D3
```
```mermaid
sequenceDiagram
    participant User
    participant UI as Streamlit UI
    participant PDF as PDF Processor
    participant VS as Vector Store
    participant AI as DeepSeek AI
    User->>UI: Upload PDF
    UI->>PDF: Process Document
    PDF->>VS: Store Vectors
    User->>UI: Ask Question
    UI->>VS: Search Similar Content
    VS->>AI: Provide Context
    AI->>UI: Generate Response
    UI->>User: Display Answer
```
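The flow above can be sketched end to end in plain Python. This is an illustration only, not the repository's implementation: a naive keyword-overlap scorer stands in for the real embedding model and vector store, and all function and variable names here are hypothetical.

```python
# Sketch of the RAG flow: chunk a document, retrieve the best-matching
# chunks for a question, and assemble the prompt context for the model.
# Keyword overlap stands in for real embeddings (illustration only).

def split_text(text, chunk_size=200, overlap=50):
    """Split text into overlapping character chunks."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

def retrieve(chunks, question, k=2):
    """Rank chunks by how many words they share with the question."""
    q_words = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q_words & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

document = "DeepSeek runs locally via Ollama. " * 30
chunks = split_text(document)
context = "\n".join(retrieve(chunks, "How does DeepSeek run?"))
prompt = f"Context:\n{context}\n\nQuestion: How does DeepSeek run?"
```

In the actual apps, `split_text` corresponds to the text splitter, `retrieve` to the vector-store similarity search, and `prompt` is what gets sent to the local DeepSeek model.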
- Local DeepSeek model integration (1.5b and 3b variants)
- RAG implementation with an AI Health Coach assistant agent that processes PDFs, analyzing images and documents and providing recommendations
- Airbnb booking assistant with MCP server
- OpenAI API integration
- Custom-styled UI interfaces
- Vector store for document retrieval
```python
# Key configurations
from langchain_ollama import ChatOllama

llm_engine = ChatOllama(
    model=selected_model,  # chosen from the available DeepSeek models
    base_url="http://localhost:11434",
    temperature=0.3
)
```

```python
# Core components
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_ollama import OllamaEmbeddings, OllamaLLM

EMBEDDING_MODEL = OllamaEmbeddings(model="deepseek-r1:1.5b")
DOCUMENT_VECTOR_DB = InMemoryVectorStore(EMBEDDING_MODEL)
LANGUAGE_MODEL = OllamaLLM(model="deepseek-r1:1.5b")
```

```python
# Airbnb booking agent
from praisonaiagents import Agent, MCP

agent = Agent(
    instructions="""You help book apartments on Airbnb.""",
    llm="gpt-4o-mini",
    tools=MCP("npx -y @openbnb/mcp-server-airbnb")
)
```

```bash
# Clone repository
git clone <repository-url>
```
```bash
# Install dependencies
pip install -r requirements.txt
```

Required packages (`requirements.txt`):

```
streamlit
langchain_core
langchain_community
langchain_ollama
pdfplumber
praisonaiagents
mcp
openai
gradio
requests
```

- Start Main Chat Interface: `streamlit run app.py`
- Launch Document Assistant: `streamlit run rag_deep.py`
- Launch AI Health Coach Agent Assistant: `streamlit run ai_recipe_assistant.py`
- Start Airbnb Search Agent Assistant: `python airbnb_search.py`

```python
# Available models
models = ["deepseek-r1:1.5b", "deepseek-r1:3b"]

# RAG Configuration
CHUNK_SIZE = 1000
CHUNK_OVERLAP = 200
```

- Secure API key management
- Local model execution
- No data persistence
- Safe document handling
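With `CHUNK_SIZE = 1000` and `CHUNK_OVERLAP = 200`, consecutive chunks share 200 characters, so the splitter advances 800 characters of new text per chunk. A rough character-based sketch of the resulting boundaries (the real splitter also respects separators such as paragraph breaks; `doc_length` is a hypothetical value):

```python
CHUNK_SIZE = 1000
CHUNK_OVERLAP = 200
doc_length = 4200  # hypothetical document length in characters

stride = CHUNK_SIZE - CHUNK_OVERLAP  # 800 new characters per chunk
boundaries = [(start, min(start + CHUNK_SIZE, doc_length))
              for start in range(0, doc_length, stride)]

for start, end in boundaries:
    print(f"chunk covers [{start}, {end})")
```

Each chunk starts 200 characters before the previous one ended, so context spanning a chunk boundary is still retrievable from at least one chunk.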
Note: Ensure all API keys and configurations are properly set before running the applications.
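For example, the OpenAI key can be supplied through an environment variable rather than hard-coded (`OPENAI_API_KEY` is the variable the `openai` client reads by default; the helper below is a hypothetical sketch, not part of the repository):

```python
import os

def require_key(name: str) -> str:
    """Return the named environment variable, or fail fast with a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Set {name} before running the applications.")
    return value
```

Calling `require_key("OPENAI_API_KEY")` at startup surfaces a missing key immediately instead of failing mid-request.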