

Owlscope

           ,_,                      ___           __  _____
          (O,O)                    / __ \_      _/ / / ___/____ __   ___   ___
          (   )                   / / / / | /| / /  \__ \/ ___/ __ \/ __ \/ _ \
          -"-"-                  / /_/ /| |/ |/ /  ___/ / /__/ /_/ / /_/ /  __/
        .-"""""-.                \____/ |__/|__/  /____/\___/\____/ .___/\___/
                                                                 /_/


Open-source intelligence engine for discovering, evaluating, and adopting open-source libraries, frameworks, and agent skills.

English | 简体中文

Features | Quick Start | Usage Guidelines | Integration Testing | API Reference | Self Hosting | Contributing | Code of Conduct | Changelog | Design Doc | Roadmap | Security | Support | Release Notes


What is Owlscope?

Owlscope is an open-source intelligence engine that helps developers and AI Agents discover, evaluate, and choose the best open-source projects, libraries, and Agent Skills.

Whether you're a senior engineer doing architecture selection, a Vibe Coder asking AI to build your app, or an AI Agent automating a workflow — when facing questions like "Which library should I use? Is this project still maintained? What Agent Skill fits best?" — Owlscope delivers deeply analyzed, trustworthy recommendations in seconds.

Key Differentiators

  • 🔍 Deep Semantic Search: Understands developer intent, not just keywords
  • 🌐 Full-Spectrum Coverage: Bridges traditional open-source (GitHub, npm, PyPI) and Agent ecosystems (Qveris, MCP Hub)
  • 📊 Multi-Dimensional Evaluation: 7-dimension scoring for libraries + dedicated Agent Skill evaluation
  • 🤖 Agent-Native: MCP protocol support — AI Agents can call Owlscope as a tool
  • 👶 Beginner-Friendly: Adaptive output that adjusts to user expertise level
  • 📦 Self-Hostable: Full functionality via docker compose up
  • 💡 Idea De-duplication: Input an idea/PRD and detect whether similar open-source implementations already exist

Product Preview

The screenshots below show the current English UI experience.

(Screenshots: Owlscope English interface, views 1–6)

Features

For Professional Developers

  • Natural language + structured search across open-source projects
  • Multi-dimensional project evaluation (activity, security, community health, etc.)
  • Side-by-side comparison of alternatives
  • Dependency health audit
  • Idea/PRD validation against GitHub and open-source ecosystems to avoid rebuilding solved products

For Vibe Coders / Beginners

  • Conversational search — describe what you want to build in plain language
  • Step-by-step guides with difficulty ratings
  • Complete tech stack recommendations

For AI Agents

  • MCP Protocol native support
  • Structured JSON API responses
  • Agent Skill discovery across platforms (Qveris, MCP Hub, etc.)

Quick Start

Using Docker Compose (Recommended)

git clone https://github.com/weijt606/olscope.git
cd olscope

# Option A (CLI, current shell only): set one or more provider keys directly
export DEEPSEEK_API_KEY="your_deepseek_key"
# export OPENAI_API_KEY="your_openai_key"
# export ANTHROPIC_API_KEY="your_anthropic_key"

# Option B (config files, persistent): keep defaults in .env and secrets in .env.local
cp .env.example .env
cp .env.local.example .env.local
cat >> .env.local <<'EOF'
DEEPSEEK_API_KEY=your_deepseek_key
# OPENAI_API_KEY=your_openai_key
# ANTHROPIC_API_KEY=your_anthropic_key
EOF

# Start all services
docker compose up -d

Compose loads both .env and .env.local for the API service, and values in .env.local override duplicate keys.
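The precedence rule can be illustrated with a small Python sketch (Compose implements this itself; the snippet only mimics the "later file wins" merge for duplicate keys):

```python
import os
import tempfile

# Minimal sketch of the .env / .env.local precedence described above:
# files are read in order, and later files override duplicate keys.
def load_env_files(paths):
    merged = {}
    for path in paths:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                merged[key.strip()] = value.strip()  # later files win
    return merged

with tempfile.TemporaryDirectory() as tmp:
    env = os.path.join(tmp, ".env")
    local = os.path.join(tmp, ".env.local")
    with open(env, "w") as fh:
        fh.write("LOG_LEVEL=info\nDEEPSEEK_API_KEY=placeholder\n")
    with open(local, "w") as fh:
        fh.write("DEEPSEEK_API_KEY=real_secret\n")
    merged = load_env_files([env, local])
    print(merged["DEEPSEEK_API_KEY"])  # the .env.local value wins
```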

Important: Owlscope does not provide shared AI provider keys. You must supply your own API keys for LLM-powered features.

The Web App will be available at http://127.0.0.1:3100 and the API at http://127.0.0.1:8010.

Local Development

Backend (Python)

# Install dependencies
pip install -e ".[dev]"

# Option A (CLI, current shell only)
export DEEPSEEK_API_KEY="your_deepseek_key"

# Option B (config files, persistent)
cp .env.example .env
cp .env.local.example .env.local
cat >> .env.local <<'EOF'
DEEPSEEK_API_KEY=your_deepseek_key
# OPENAI_API_KEY=your_openai_key
# ANTHROPIC_API_KEY=your_anthropic_key
EOF

# Start the API server
uvicorn src.api.main:app --reload --host 127.0.0.1 --port 8010

Web App (Next.js)

cd web
npm install
npm run dev

CLI

# Install CLI
pip install -e ".[cli]"

# Search for projects
owlscope search "Python async HTTP client with HTTP/2"

# Ops preflight (local-first)
owlscope ops preflight

# Ops deploy (local direct mode, docker as fallback)
owlscope ops deploy --mode local

# Include frontend startup and checks
owlscope ops deploy --mode local --with-web

# Run checks without leaving background processes
owlscope ops deploy --mode local --with-web --no-detached

# Ops deploy via docker explicitly
owlscope ops deploy --mode docker

# Stop processes started by CLI deploy
owlscope ops stop

Usage Guidelines

  • Bring your own AI API keys (BYOK): set provider keys in .env.local before using LLM-backed features.
  • Recommended minimum: configure at least one provider key such as DEEPSEEK_API_KEY, OPENAI_API_KEY, or ANTHROPIC_API_KEY.
  • Setup methods:
    • CLI method (temporary): export DEEPSEEK_API_KEY="..." in the same shell session before starting API/web.
    • Config file method (persistent): keep non-secret defaults in .env, and write real keys only to .env.local.
  • Security: never commit .env.local or API keys to Git history, screenshots, or issues.
  • Cost control: use lighter models for iterative workflows and monitor token usage in your provider dashboard.
  • Fallback behavior: if external model services are unavailable, some retrieval flows still work with deterministic fallbacks.
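The BYOK precedence above amounts to a first-configured-key-wins lookup, sketched below (illustrative only; the provider variable names are the ones listed above, the helper function is hypothetical, not Owlscope's actual loader):

```python
import os

# Provider keys checked in order, as listed in the guidelines above.
PROVIDER_KEYS = ["DEEPSEEK_API_KEY", "OPENAI_API_KEY", "ANTHROPIC_API_KEY"]

def first_configured_provider(environ=os.environ):
    """Hypothetical helper: return the first provider key that is set."""
    for name in PROVIDER_KEYS:
        if environ.get(name):
            return name
    return None  # no key set: only deterministic fallbacks will work

print(first_configured_provider({"OPENAI_API_KEY": "sk-example"}))
```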

Pre-release key hygiene quick check:

  • Ensure no env secret files are tracked: git ls-files .env .env.local should return nothing.
  • Ensure tracked files do not contain token-like strings (example): git grep -nE "(sk-[A-Za-z0-9]{20,}|ghp_[A-Za-z0-9]{20,}|xoxb-[A-Za-z0-9-]{20,})" -- .
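The same token-shaped patterns can be scanned from Python, e.g. inside a pre-commit hook (a sketch mirroring the git grep expression above; not an exhaustive secret scanner):

```python
import re

# Same token-like patterns as the `git grep` example above.
TOKEN_RE = re.compile(
    r"(sk-[A-Za-z0-9]{20,}|ghp_[A-Za-z0-9]{20,}|xoxb-[A-Za-z0-9-]{20,})"
)

def find_token_like_strings(text):
    """Return all token-shaped substrings found in `text`."""
    return TOKEN_RE.findall(text)

# Synthetic example, not a real key:
sample = "api_key = 'sk-" + "a" * 24 + "'"
print(find_token_like_strings(sample))
```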

Integration Testing

Run this when you want to validate the real /api/v1/compare endpoint with a black-box test.

# Start required infra
docker compose up -d postgres redis

# Enable integration test execution and run the compare black-box case
export OWLSCOPE_RUN_INTEGRATION=1
pytest -q -m integration tests/integration/test_compare_blackbox.py

Notes:

  • The integration suite is excluded from default test runs.
  • CI always runs this test in a dedicated job with PostgreSQL + Redis services.

Release Preflight

Use these commands before deployment:

# Validate production environment vars
python scripts/validate_env.py --mode prod

# One-command release validation (tests + build + smoke)
bash scripts/release_check.sh

# Include black-box integration test in the same run
OWLSCOPE_RELEASE_CHECK_INTEGRATION=1 bash scripts/release_check.sh

Production-readiness docs:

  • Deployment runbook: docs/deployment.md
  • Migration/rollback: docs/migrations.md
  • Smoke checklist: docs/smoke-test.md
  • Release checklist: docs/release-checklist.md

Production launch commands:

docker compose -f docker-compose.yml -f docker-compose.prod.yml up -d --build
bash scripts/post_deploy_check.sh

CLI-first alternative:

owlscope ops deploy --mode local
# if needed: owlscope ops deploy --mode docker
# include web checks: owlscope ops check --with-web
# stop local managed processes: owlscope ops stop

API Reference

REST API

# Search
curl -X POST http://127.0.0.1:8010/api/v1/search \
  -H "Content-Type: application/json" \
  -d '{"query": "lightweight Python web framework"}'

# Evaluate a project
curl http://127.0.0.1:8010/api/v1/evaluate/github:library:encode/httpx

# Compare projects
curl -X POST http://127.0.0.1:8010/api/v1/compare \
  -H "Content-Type: application/json" \
  -d '{"projects": ["github:library:fastapi/fastapi", "github:library:pallets/flask", "github:library:django/django"]}'

# Assess whether an idea is already implemented
curl -X POST http://127.0.0.1:8010/api/v1/idea/assess \
  -H "Content-Type: application/json" \
  -d '{"idea": "AI coding workflow assistant for startup teams", "product_doc": "Need repo indexing, recommendation, and integration guidance"}'

# Export assessment report as Markdown
curl -X POST "http://127.0.0.1:8010/api/v1/idea/assess/export?format=markdown" \
  -H "Content-Type: application/json" \
  -d '{"idea": "AI coding workflow assistant for startup teams", "product_doc": "Need repo indexing, recommendation, and integration guidance"}'

# Export assessment report as JSON envelope
curl -X POST "http://127.0.0.1:8010/api/v1/idea/assess/export?format=json" \
  -H "Content-Type: application/json" \
  -d '{"idea": "AI coding workflow assistant for startup teams", "product_doc": "Need repo indexing, recommendation, and integration guidance"}'

# Batch assess multiple ideas
curl -X POST "http://127.0.0.1:8010/api/v1/idea/assess/batch" \
  -H "Content-Type: application/json" \
  -d '{"items":[{"idea":"Open-source API mocking tool","product_doc":"Need scenario replay"},{"idea":"PR review assistant for OSS maintainers","product_doc":"Need triage automation"}],"limit":6,"max_concurrency":2,"per_item_timeout_seconds":30}'

# Export batch assessment report as Markdown
curl -X POST "http://127.0.0.1:8010/api/v1/idea/assess/batch/export?format=markdown" \
  -H "Content-Type: application/json" \
  -d '{"items":[{"idea":"Open-source API mocking tool","product_doc":"Need scenario replay"},{"idea":"PR review assistant for OSS maintainers","product_doc":"Need triage automation"}],"limit":6,"max_concurrency":2,"per_item_timeout_seconds":30}'

# Export batch assessment report as JSON envelope
curl -X POST "http://127.0.0.1:8010/api/v1/idea/assess/batch/export?format=json" \
  -H "Content-Type: application/json" \
  -d '{"items":[{"idea":"Open-source API mocking tool","product_doc":"Need scenario replay"},{"idea":"PR review assistant for OSS maintainers","product_doc":"Need triage automation"}],"limit":6,"max_concurrency":2,"per_item_timeout_seconds":30}'

# Response includes:
# - verdict + existing_project_probability
# - action_recommendation (build|fork|integrate) + action_rationale
# - decision_signals for explainability
# - similar_projects with evidence_snippets
# - export endpoint supports markdown/json report output
# - batch endpoint returns per-idea results + verdict counts
# - batch supports max_concurrency and per_item_timeout_seconds
# - batch export endpoint supports markdown/json report output

Full API documentation available at http://127.0.0.1:8010/docs (Swagger UI) or http://127.0.0.1:8010/redoc (ReDoc).
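The same search call can be made from Python with only the standard library (a sketch assuming the stack is running at the default local address; only the payload is built here, `search()` must be called with the API up):

```python
import json
import urllib.request

API_BASE = "http://127.0.0.1:8010"  # default local API address from above

def search(query, base_url=API_BASE):
    """POST a query to /api/v1/search and return the parsed JSON response."""
    body = json.dumps({"query": query}).encode("utf-8")
    req = urllib.request.Request(
        base_url + "/api/v1/search",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Build the request body without sending it:
payload = json.dumps({"query": "lightweight Python web framework"})
print(payload)
```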

MCP Protocol

Owlscope exposes MCP tools for AI Agent integration:

Tool Description
owlscope_search Search open-source projects and Agent Skills
owlscope_evaluate Deep evaluation of a specific project/Skill
owlscope_compare Compare multiple projects/Skills
owlscope_check_deps Dependency health check
owlscope_alternatives Find alternative solutions
owlscope_discover_skills Discover Agent Skills from registries
owlscope_stack_suggest Get complete tech stack recommendations

Self Hosting

Owlscope is designed for easy self-hosting. See the Self-Hosting Guide for detailed instructions.

LLM Configuration

Owlscope supports 100+ LLM providers via LiteLLM. Configure your preferred models in src/config/llm.yaml:

llm:
  adapter: litellm
  providers:
    light:
      model: "deepseek/deepseek-chat"
    standard:
      model: "openai/gpt-4o"
    batch:
      model: "ollama/qwen2.5:14b"

Tech Stack

Component Technology
Backend Python (FastAPI)
Frontend Next.js + TailwindCSS + shadcn/ui
Vector DB Qdrant
Database PostgreSQL
Cache Redis
LLM Adapter LiteLLM
Task Queue Celery + Redis
CLI Typer

Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

License

Owlscope is licensed under the Apache License 2.0.


Built with 🦉 by the Owlscope Team
