Cerno is an open-source workspace for conducting deep, multi-step research and analysis using autonomous AI agents. Designed for developers and researchers who demand analytical transparency, Cerno exposes every reasoning step—from prompt decomposition to final synthesis—so you can observe, debug, and steer complex agentic workflows with confidence.
## Table of Contents

- Highlights
- Local-First Principles
- Active Development & Community
- Prerequisites
- Getting Started
- Post-Migration Setup
- Docker Installation
- CLI Reference
- Project Structure
- Screenshots
- Use Cases
- Roadmap
- Security & Privacy
- Metrics & Benchmarks
- Contributing
- License
## Highlights

- Model-Agnostic Core: Effortlessly switch between premier LLMs (OpenAI, Google Gemini, Anthropic, DeepSeek) or run local models via Ollama.
- Zero-Config Setup: One CLI, one command—automatically create a virtual environment, install dependencies, and configure your workspace.
- Transparent Execution Plan: Visualize each agent task as it moves through Pending → Running → Success/Error states in real time.
- Verifiable Artifacts: Every source, webpage, and generated file (reports, code, data) is tracked and organized for easy auditing.
- Adaptive Depth: Simple queries spawn lightweight plans; complex directives trigger multi-agent, multi-tool orchestrations.
- Token & Cost Optimization: A manager-worker agent architecture balances quality and cost. Get a complete cost breakdown upon task completion.
- Local-First Ethos: Work offline, retain full control of your data, and avoid vendor lock-in. Cerno’s local-first architecture ensures your research stays where you want it: on your machine.
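To make the manager-worker idea above concrete, here is a purely illustrative Python sketch — not Cerno's actual implementation; every name in it (`CostLedger`, `manager_plan`, `worker_run`) is invented. A cheap "manager" step decomposes the task, "worker" steps handle subtasks, and per-role usage is tallied so a cost breakdown can be reported at the end:

```python
# Illustrative manager-worker sketch (hypothetical names, not Cerno's code).
from dataclasses import dataclass, field


@dataclass
class CostLedger:
    """Tallies a usage proxy per agent role for a final cost breakdown."""
    tokens: dict = field(default_factory=dict)

    def record(self, role: str, n: int) -> None:
        self.tokens[role] = self.tokens.get(role, 0) + n

    def total(self) -> int:
        return sum(self.tokens.values())


def manager_plan(task: str) -> list[str]:
    # Stand-in for a cheap LLM call that decomposes the task into subtasks.
    return [f"research: {task}", f"synthesize: {task}"]


def worker_run(subtask: str, ledger: CostLedger) -> str:
    # Stand-in for a stronger model doing the heavy lifting on one subtask.
    ledger.record("worker", len(subtask))  # character count as a token proxy
    return f"result of {subtask}"


ledger = CostLedger()
plan = manager_plan("EV battery market")
ledger.record("manager", sum(len(s) for s in plan))
results = [worker_run(s, ledger) for s in plan]
print(f"{len(results)} subtasks, usage breakdown: {ledger.tokens}")
```

The design point is the split itself: planning stays on the cheap model, execution on the capable one, and the ledger makes the trade-off visible after the run.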
## Local-First Principles

Cerno embraces a local-first philosophy:
- Data Sovereignty: All research artifacts—notes, reports, intermediate files—live on your local drive by default.
- Offline Capability: Core features work without internet. Use local LLMs (via Ollama) for research when connectivity is limited.
- Privacy & Security: Sensitive prompts and outputs never leave your machine unless explicitly configured.
- Interoperability: Write, export, and share results in standard formats (Markdown, Jupyter notebooks, JSON) without proprietary lock-in.
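For the offline/Ollama path, the `.env` might contain something like the sketch below. The variable names here are assumptions for illustration only — consult `.env.example` for the keys your version of Cerno actually reads:

```ini
# Hypothetical keys — check .env.example for the real names.
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=qwen2.5:14b   # must be a local model that supports tool calling
```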
## Active Development & Community

Cerno is under active development—we’re constantly shipping new features, performance optimizations, and integrations. Your feedback is invaluable:
- Bug Reports: Found an issue? Please open an issue on GitHub with detailed steps to reproduce.
- Feature Requests: Have a great idea? Share it as an issue or discussion ticket.
- Contributions: We welcome pull requests! See our CONTRIBUTING.md for guidelines on setting up your dev environment, coding standards, and how to submit changes.
Let’s build something amazing together! 🚀
## Prerequisites

- Python ≥ 3.10
- Node.js ≥ 18.x & npm
## Getting Started

1. Clone the repository (or download it as an archive):

   ```bash
   git clone https://github.com/divagr18/Cerno-Agentic-Local-Deep-Research.git
   cd Cerno-Agentic-Local-Deep-Research
   ```

2. Run the database migrations:

   ```bash
   # macOS/Linux
   chmod +x cerno
   ./cerno migrate
   # for detailed logs
   ./cerno migrate --verbose
   ```

   ```powershell
   # Windows
   .\cerno migrate
   # for detailed logs
   .\cerno migrate --verbose
   ```
## Post-Migration Setup

After applying migrations, follow these steps to configure your environment and launch Cerno:
1. Copy the `.env` template:

   ```bash
   cp .env.example .env
   ```

   This creates a fresh `.env` file. Open it and fill in your API keys (e.g., `OPENAI_API_KEY`, `GEMINI_API_KEY`) or your local model settings for Ollama. For now, Cerno supports OpenAI, Gemini, Anthropic, DeepSeek, and Ollama-hosted local models that support tool calling; support for more models is coming in the next release.
2. Activate the virtual environment:

   ```bash
   # Windows PowerShell/CMD
   venv\Scripts\activate
   # macOS/Linux
   source venv/bin/activate
   ```

   This ensures that Cerno’s dependencies and CLI are available in your current shell.
3. Start Cerno:

   ```bash
   cerno start
   ```

   This launches both the Django backend and the React frontend. Once running, open http://localhost:5173 in your browser.
4. List all commands:

   ```bash
   cerno --help
   ```

   This displays all available CLI commands and options.
## Docker Installation

Prefer containerized workflows? Follow these steps:

1. Clone the repository and set up `.env` as above.

2. Build and launch with Docker Compose:

   ```bash
   docker-compose up --build
   ```

3. Visit http://localhost:5173.
## CLI Reference

| Command | Description |
|---|---|
| `cerno --help` | Show all commands and usage details |
| `cerno setup` | Re-run the full automated setup |
| `cerno migrate` | Apply database migrations |
| `cerno start` | Launch backend & frontend |
| `cerno start --no-frontend` | Launch only the Django backend |
## Project Structure

```
├── cerno               # CLI bootstrap scripts
├── cerno_cli.py        # Click-based command definitions
├── api/                # Django backend
│   ├── core/           # Settings, wsgi, asgi
│   ├── api/            # Views, serializers, URLs
│   └── agents/         # Agent definitions & tools
├── frontend/           # React + Vite app
├── agent_outputs/      # Generated reports, code, data
├── knowledge_sources/  # Ingested docs for knowledge base
├── pyproject.toml      # Dependencies & CLI entry point
└── docker-compose.yml
```
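The layout above implies a simple file-based workflow: drop documents into `knowledge_sources/` for ingestion, then collect generated artifacts from `agent_outputs/` after a run. A minimal sketch from the repo root — the file name and contents here are invented for illustration:

```shell
# Hypothetical example: seed the knowledge base with a source document.
mkdir -p knowledge_sources agent_outputs
printf '# Background notes for the agents\n' > knowledge_sources/background.md
ls knowledge_sources/   # reports, code, and data will appear under agent_outputs/
```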
## Use Cases

- Academic Research: Automate literature reviews, data extraction, and report generation.
- Market Analysis: Compile insights from news sources, financial data, and social media.
- Competitive Intelligence: Track competitor tooling and summarize key findings.
- Product Development: Prototype multi-agent workflows for user testing and iterative design.
## Roadmap

- v1.1 (Q3 2025): More integrations, advanced visualization modules, and collaborative workspaces.
- v1.2 (Q4 2025): Plugin support, permissioned sharing, and audit trails.
- Future: Community-driven integrations, mobile-first UI, and expanded local model support.
## Security & Privacy

- Encrypted Secrets: API keys and sensitive data are encrypted at rest.
- Audit Logs: Full history of agent actions and user interactions.
## Contributing

Contributions are welcome! Fork, develop, and submit a pull request. For major features, please open an issue first to discuss design and scope.
## License

Distributed under the MIT License. See LICENSE for details.