HackerNewsAI CLI

AI-powered Hacker News digest in your terminal



A terminal UI tool that fetches top Hacker News stories and generates a concise AI digest — so you can catch up on tech news without leaving your terminal.

Features

  • AI Digest — an LLM summarizes the top stories into a scannable, themed overview
  • Multi-Provider — Anthropic, OpenAI, Gemini, Ollama, or any OpenAI-compatible API
  • Story Browser — Browse and open stories directly from the terminal
  • Markdown Rendering — Beautiful terminal-native markdown via Glamour
  • Keyboard-Driven — Navigate, refresh, and open links without touching the mouse
  • Configurable — Customize story count, model, and theme via config file

Installation

Homebrew

brew tap n0an/tap
brew install hackernews

From Source

git clone https://github.com/n0an/hackernewsai-cli.git
cd hackernewsai-cli
go install ./cmd/hackernews/

Note: Make sure ~/go/bin is in your PATH. Add this to your ~/.zshrc (or ~/.bashrc):

export PATH="$HOME/go/bin:$PATH"

Setup

Set an API key for your preferred provider:

# Pick one:
export ANTHROPIC_API_KEY=your-key    # Anthropic (default)
export OPENAI_API_KEY=your-key       # OpenAI
export GEMINI_API_KEY=your-key       # Google Gemini

For local models (Ollama, LM Studio, etc.), no API key is needed — just configure the endpoint.

Usage

hackernews

Keyboard Shortcuts

Key        Action
tab        Switch between Digest and Stories views
j / k      Scroll down / up
enter / o  Open story URL in browser
c          Open HN comments page
r          Refresh and regenerate digest
q          Quit

Configuration

Settings are stored in ~/.hackernews-tui.yaml:

# Provider: anthropic, openai, gemini, ollama, openai-compatible
provider: anthropic
model: claude-sonnet-4-6
story_count: 30
theme: auto
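
As a rough illustration, these YAML keys might map onto a Go struct like the one below. The field names, types, and defaults here are assumptions for the sketch; the actual types in internal/config may differ.

```go
package main

import "fmt"

// Config is a hypothetical struct mirroring ~/.hackernews-tui.yaml.
// The yaml struct tags show how a YAML decoder (e.g. gopkg.in/yaml.v3)
// would map the file's keys onto fields.
type Config struct {
	Provider   string `yaml:"provider"`
	Model      string `yaml:"model"`
	StoryCount int    `yaml:"story_count"`
	Theme      string `yaml:"theme"`
}

func main() {
	// Values matching the example config file above.
	cfg := Config{
		Provider:   "anthropic",
		Model:      "claude-sonnet-4-6",
		StoryCount: 30,
		Theme:      "auto",
	}
	fmt.Printf("%s/%s stories=%d theme=%s\n",
		cfg.Provider, cfg.Model, cfg.StoryCount, cfg.Theme)
}
```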

Provider Examples

Anthropic (default):

provider: anthropic
model: claude-sonnet-4-6

OpenAI:

provider: openai
model: gpt-4o-mini
# Reasoning models (o1, o3, gpt-5) are also supported
# model: gpt-5.2

Gemini:

provider: gemini
model: gemini-2.0-flash

Ollama (local):

provider: ollama
ollama_model: llama3.2
ollama_endpoint: http://localhost:11434/v1  # optional, this is the default

OpenAI-compatible (LM Studio, llama.cpp, Jan, etc.):

provider: openai-compatible
local_endpoint: http://localhost:1234/v1
local_model: qwen3-4b
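
Under the hood, the `provider` key presumably selects one of several LLM client implementations. A minimal sketch of that dispatch, assuming a hypothetical interface and constructor (the real internal/provider API is not shown in this README and may differ):

```go
package main

import (
	"errors"
	"fmt"
)

// Provider is a hypothetical abstraction over the supported LLM backends.
type Provider interface {
	Name() string
}

type anthropicProvider struct{}

func (anthropicProvider) Name() string { return "anthropic" }

type ollamaProvider struct{ endpoint string }

func (ollamaProvider) Name() string { return "ollama" }

// newProvider maps the config's `provider` value to a concrete client.
// Only two cases are sketched; the real factory would cover openai,
// gemini, and openai-compatible as well.
func newProvider(kind string) (Provider, error) {
	switch kind {
	case "anthropic":
		return anthropicProvider{}, nil
	case "ollama":
		// Default endpoint from the config example above.
		return ollamaProvider{endpoint: "http://localhost:11434/v1"}, nil
	default:
		return nil, errors.New("unknown provider: " + kind)
	}
}

func main() {
	p, err := newProvider("anthropic")
	if err != nil {
		panic(err)
	}
	fmt.Println(p.Name())
}
```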

Architecture

cmd/hackernews/         — Entry point
internal/
  hackernews/           — HN Firebase API client (concurrent fetching)
  digest/               — AI digest generation (prompt + formatting)
  provider/             — LLM providers (Anthropic, OpenAI, Gemini, Ollama, OpenAI-compatible)
  config/               — YAML config file management
  tui/                  — Bubble Tea UI (views, styles, key handling)

Tech Stack

Component           Library
TUI Framework       Bubble Tea
Markdown Rendering  Glamour
Styling             Lip Gloss
AI (Anthropic)      anthropic-sdk-go
AI (OpenAI)         openai-go
AI (Gemini)         google genai

License

MIT License
