Enjoy the news itself. Leave the rest to Horizon.
📡 Your own AI-powered news radar. Generates daily briefings in English & Chinese. | Build your own AI news radar
Ranked Daily Briefing | Context, Summary & Discussion
Good news is scattered; bad news is endless. Horizon gives you a personal first pass over Hacker News, Reddit, Telegram, RSS, and GitHub: it fetches, deduplicates, scores, filters, and enriches stories with background context and community discussion.
But Horizon is not just another summarizer. AI is great at reducing noise, but news still needs human taste: the sources you trust, the comments that change how you read a story, and the hidden gems only people can share. Horizon keeps that human layer in the loop with customizable sources, thresholds, models, languages, delivery channels, comment summaries, and a community source hub.
- 📡 Watch Your Own Sources — Track Hacker News, RSS, Reddit, Telegram, Twitter/X, and GitHub releases or user activity in one pipeline
- 🤖 Turn Noise Into a Reading List — Score each item from 0-10 with Claude, GPT, Gemini, DeepSeek, Doubao, MiniMax, or any OpenAI-compatible API
- 🔗 Merge Repeated Stories — Deduplicate the same story across platforms before it reaches your briefing
- 🔍 Understand the Background — Add web-researched context for unfamiliar concepts, companies, projects, and technical terms
- 💬 Read the Conversation — Collect and summarize community comments from Hacker News, Reddit, and other supported sources
- 🌐 Publish in Two Languages — Generate English and Chinese daily briefings from the same source set
- 📝 Ship a Daily Site — Publish generated Markdown as a GitHub Pages daily briefing site
- 📧 Deliver by Email — Run a self-hosted SMTP/IMAP newsletter with automatic subscribe and unsubscribe handling
- 🔔 Push to Chat or Automations — Send templated results to Feishu/Lark, DingTalk, Slack, Discord, or custom webhook endpoints
- 🧙 Start From Your Interests — Use the setup wizard to generate a personalized source configuration
- ⚙️ Tune the Radar — Customize sources, thresholds, models, languages, and delivery channels from one JSON config
```mermaid
%%{init: {
  "theme": "base",
  "themeVariables": {
    "fontFamily": "ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, Segoe UI, sans-serif",
    "fontSize": "18px",
    "primaryTextColor": "#2d2a3e",
    "primaryBorderColor": "#e0dbd3",
    "lineColor": "#7c7891",
    "tertiaryColor": "#faf8f5",
    "clusterBkg": "#f3f0eb",
    "clusterBorder": "#e0dbd3"
  }
}}%%
flowchart LR
    classDef config fill:#fbbf24,stroke:#d4a017,color:#2d2a3e,stroke-width:1.5px;
    classDef source fill:#ede7fb,stroke:#6d4aaa,color:#2d2a3e,stroke-width:1.5px;
    classDef process fill:#ffe8db,stroke:#e0652e,color:#2d2a3e,stroke-width:1.5px;
    classDef output fill:#f9d7e5,stroke:#be185d,color:#2d2a3e,stroke-width:1.5px;

    config["⚙️ Config<br/>sources, thresholds, models, outputs"]

    subgraph sources["Configured Sources"]
        rss["📡 RSS"]
        hn["📰 Hacker News"]
        reddit["💬 Reddit"]
        telegram["✈️ Telegram"]
        twitter["🐦 Twitter / X"]
        github["🐙 GitHub"]
    end

    fetch["📥 Fetch"]
    dedup["🧹 Deduplicate"]
    score["🤖 AI Score & Filter"]
    enrich["🔎 Enrich"]
    summary["📝 Summarize"]

    subgraph outputs["Outputs"]
        direction TB
        site["🌐 Pages"]
        email["📧 Email"]
        webhook["🔔 Webhooks"]
        mcp["🧩 MCP"]
    end

    config --> fetch
    rss --> fetch
    hn --> fetch
    reddit --> fetch
    telegram --> fetch
    twitter --> fetch
    github --> fetch
    fetch --> dedup --> score --> enrich --> summary
    config --> score
    config --> summary
    config --> outputs
    summary --> site
    summary --> email
    summary --> webhook
    summary --> mcp

    class config config
    class rss,hn,reddit,telegram,twitter,github source
    class fetch,dedup,score,enrich,summary process
    class site,email,webhook,mcp output
```
- Define — Configure sources, thresholds, models, languages, and delivery from one JSON config.
- Fetch — Pull latest content from all configured sources concurrently.
- Deduplicate — Merge items pointing to the same story or URL across platforms.
- Score & Filter — Use AI to rank items and keep only those above your threshold.
- Enrich — Search the web for background context and collect community discussion for important items.
- Summarize — Generate a structured Markdown briefing with summaries, tags, and references.
- Deliver — Publish the result to GitHub Pages, email, webhooks such as Feishu, MCP, or local files.
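As a rough illustration of the Deduplicate and Score & Filter steps above, here is a minimal sketch. The item shape, function names, and URL normalization are assumptions for illustration, not Horizon's actual implementation:

```python
from urllib.parse import urlsplit

def canonical_url(url: str) -> str:
    """Normalize a URL so the same story shared on different platforms compares equal."""
    parts = urlsplit(url.lower())
    host = parts.netloc.removeprefix("www.")
    return f"{host}{parts.path.rstrip('/')}"

def deduplicate(items: list[dict]) -> list[dict]:
    """Keep the first item seen for each canonical URL."""
    seen, unique = set(), []
    for item in items:
        key = canonical_url(item["url"])
        if key not in seen:
            seen.add(key)
            unique.append(item)
    return unique

def filter_by_score(items: list[dict], threshold: float = 6.0) -> list[dict]:
    """Keep only items at or above the configured ai_score_threshold."""
    return [item for item in items if item["score"] >= threshold]
```

A real pipeline would also merge metadata (comments, scores) from the duplicates it drops rather than discarding them outright.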
Option A: Local Installation

```shell
git clone https://github.com/Thysrael/Horizon.git
cd horizon

# Install with uv (recommended)
uv sync

# Install test/development extras when needed
uv sync --extra dev

# Or with pip
pip install -e .
```

`dev` is currently defined as an optional extra in pyproject.toml, so use `uv sync --extra dev` for pytest and other development dependencies.
Option B: Docker

```shell
git clone https://github.com/Thysrael/Horizon.git
cd horizon

# Configure environment
cp .env.example .env
cp data/config.example.json data/config.json
# Edit .env and data/config.json with your API keys and preferences

# Run with Docker Compose
docker-compose run --rm horizon

# Or run with custom time window
docker-compose run --rm horizon --hours 48
```

Option A: Interactive wizard (recommended)
```shell
uv run horizon-wizard
```

The wizard asks about your interests (e.g. "LLM inference", "嵌入式" (embedded systems), "web security") and auto-generates data/config.json.
Option B: Manual configuration

```shell
cp .env.example .env                           # Add your API keys
cp data/config.example.json data/config.json   # Customize your sources
```

A minimal manual configuration is shown at the end of this README.
For the full reference, see the Configuration Guide.
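To make the config structure concrete, here is a minimal sketch of loading and sanity-checking data/config.json. The field names follow the example config; the validation logic itself is illustrative, not Horizon's own:

```python
import json
from pathlib import Path

def load_config(path: str = "data/config.json") -> dict:
    """Load the JSON config and apply a few basic sanity checks."""
    config = json.loads(Path(path).read_text(encoding="utf-8"))

    # ai_score_threshold is a 0-10 score cutoff, matching the scoring scale.
    threshold = config.get("filtering", {}).get("ai_score_threshold", 6.0)
    if not 0.0 <= threshold <= 10.0:
        raise ValueError(f"ai_score_threshold must be in [0, 10], got {threshold}")

    if "provider" not in config.get("ai", {}):
        raise ValueError("config must name an AI provider under 'ai.provider'")
    return config
```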
```shell
uv run horizon              # Run with default 24h window
uv run horizon --hours 48   # Fetch from last 48 hours
```

Or with Docker:

```shell
docker-compose run --rm horizon              # Run with default 24h window
docker-compose run --rm horizon --hours 48   # Fetch from last 48 hours
```

The generated report will be saved to data/summaries/.
Horizon works great as a GitHub Actions cron job. See .github/workflows/daily-summary.yml for a ready-to-use workflow that generates and deploys your daily briefing to GitHub Pages automatically.
| Source | What it fetches | Comments |
|---|---|---|
| Hacker News | Top stories by score | Yes (top N comments) |
| RSS / Atom | Any RSS or Atom feed | — |
| Reddit | Subreddits + user posts | Yes (top N comments) |
| Telegram | Public channel messages | — |
| Twitter / X | Tweets from specific users | Yes (top N replies) |
| GitHub | User events & repo releases | — |
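For a sense of what a source scraper does, here is a sketch of fetching Hacker News top stories via the official Firebase API. Horizon's actual scraper internals may differ; the endpoints are the public ones, but the function names and ranking helper are assumptions:

```python
import json
from urllib.request import urlopen

HN_API = "https://hacker-news.firebaseio.com/v0"  # official Hacker News API

def rank_stories(stories: list[dict], min_score: int = 100, limit: int = 5) -> list[dict]:
    """Keep stories at or above min_score, highest-scored first."""
    kept = [s for s in stories if s and s.get("score", 0) >= min_score]
    return sorted(kept, key=lambda s: s["score"], reverse=True)[:limit]

def fetch_top_stories(min_score: int = 100, limit: int = 5) -> list[dict]:
    """Fetch current top-story IDs, load each item, then rank and trim."""
    with urlopen(f"{HN_API}/topstories.json", timeout=10) as resp:
        ids = json.load(resp)[: limit * 10]  # over-fetch, then filter by score
    stories = []
    for story_id in ids:
        with urlopen(f"{HN_API}/item/{story_id}.json", timeout=10) as resp:
            stories.append(json.load(resp))
    return rank_stories(stories, min_score, limit)
```

A production scraper would fetch items concurrently and also walk each story's `kids` field to collect the top N comments.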
Horizon can publish or deliver the generated briefing in several ways:
| Channel | What it does |
|---|---|
| GitHub Pages Daily Site | Copies generated Markdown into docs/ so GitHub Pages can publish a daily-updated briefing site |
| Email Subscription | Sends the daily briefing to subscribers and handles subscribe/unsubscribe requests through SMTP/IMAP |
| Webhook Notification | Pushes success or failure results to Feishu/Lark, DingTalk, Slack, Discord, or any custom webhook endpoint |
| MCP Server | Exposes Horizon pipeline steps as tools so AI assistants can fetch, score, filter, enrich, summarize, and run the full workflow |
For setup details, see the Configuration Guide. For MCP tool references and client setup, see src/mcp/README.md and src/mcp/integration.md.
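As an example of the webhook channel, here is a minimal sketch of pushing a briefing to a Feishu/Lark custom bot. The text-message payload shape is the one Feishu custom bots accept; the function names are illustrative, and other channels (Slack, Discord, DingTalk) expect different bodies:

```python
import json
from urllib.request import Request, urlopen

def feishu_text_payload(text: str) -> dict:
    """Build a Feishu/Lark custom-bot text message body."""
    return {"msg_type": "text", "content": {"text": text}}

def push_briefing(webhook_url: str, summary: str) -> int:
    """POST the briefing summary to the webhook; returns the HTTP status code."""
    body = json.dumps(feishu_text_payload(summary)).encode("utf-8")
    req = Request(webhook_url, data=body,
                  headers={"Content-Type": "application/json"})
    with urlopen(req, timeout=10) as resp:
        return resp.status
```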
| Guide | Description |
|---|---|
| Configuration | AI providers, sources, filtering, email, webhook, GitHub Pages, and MCP setup |
| Scoring | How Horizon evaluates and ranks news items |
| Scrapers | Source scraper details and extension notes |
| MCP Tools | Tool reference for MCP-compatible clients |
Horizon already supports the full daily briefing loop: multi-source collection, AI scoring, deduplication, enrichment, comment summaries, bilingual generation, GitHub Pages publishing, email delivery, webhook delivery, Docker deployment, MCP integration, and the setup wizard.
Planned improvements:
- More source types, such as Discord
- Custom scoring prompts per source
- Publish releases on GitHub
- Publish the package to PyPI for `pip install`
Contributions are welcome! Feel free to open issues or submit pull requests.
Want to share valuable source discoveries with the Horizon community? Please submit them through horizon1123.top.
Great candidates: niche RSS discoveries, active subreddit trends, notable GitHub updates, or Telegram channel highlights in your area of expertise.
- Special thanks to LINUX.DO for providing a promotion platform.
- Special thanks to HelloGitHub for valuable guidance and suggestions.
Minimal manual configuration example (data/config.json):

```json
{
  "ai": {
    "provider": "openai",
    "model": "gpt-4",
    "api_key_env": "OPENAI_API_KEY"
  },
  "sources": {
    "rss": [
      {
        "name": "Simon Willison",
        "url": "https://simonwillison.net/atom/everything/"
      }
    ]
  },
  "filtering": {
    "ai_score_threshold": 6.0,
    "max_items_to_analyze": 5,
    "enrich_important_items": false
  }
}
```