readcli is an EPUB reading CLI with a built-in, spoiler-safe Q&A RAG agent.
It is designed to let you:
- read `.epub` books in the terminal
- track reading progress as a percentage
- ask questions without leaking unread content
- choose between retrieval-based and full-context answering modes
```sh
git clone https://github.com/AlexGoncalves21/readcli.git
cd readcli
go mod tidy
go build -o readcli ./cmd/readcli
```

An OpenAI API key is required only for `ask-rag` and `ask-llm`.
```sh
export OPENAI_API_KEY="<your_key>"
```

Optional local env file:

```sh
set -a; source .env; set +a
```

Quickstart:

```sh
./readcli library add /path/to/book.epub
./readcli books
./readcli read 1
./readcli progress set 1 42%
./readcli ask-rag 1 "Who is Raskolnikov?"
```

Full command reference:

```sh
./readcli library add /path/to/book.epub
./readcli books
./readcli read 1
./readcli read 1 --next
./readcli read 1 --full
./readcli progress 1
./readcli progress set 1 42%
./readcli context 1 --limit 8
./readcli ask-rag 1 "Who is Raskolnikov?"
./readcli ask-llm 1 "Summarize what we know so far"
```

`ask-rag`:
- embeds only pages up to the current progress boundary
- uses hybrid retrieval: vector similarity plus full-text keyword search
- reranks candidates and expands to neighbor pages for better narrative continuity
- returns answers with citations
- should be the default choice for cost and scale
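The hybrid step described above can be sketched as a weighted score fusion over the two candidate lists, with the spoiler-safety check and neighbor expansion applied on top. This is a minimal illustration; the type names, weights, and function signatures here are assumptions, not readcli's actual internals:

```go
package main

import (
	"fmt"
	"sort"
)

// candidate pairs a page with a relevance score from one retriever.
type candidate struct {
	Page  int
	Score float64
}

// fuseHybrid merges vector-similarity and keyword (FTS) candidates by
// weighted score, drops any page past the reader's progress boundary to
// stay spoiler-safe, then expands each hit to its neighbor pages for
// narrative continuity. Weights and ranking here are illustrative.
func fuseHybrid(vec, fts []candidate, boundary int, vecWeight float64) []int {
	scores := map[int]float64{}
	for _, c := range vec {
		if c.Page <= boundary {
			scores[c.Page] += vecWeight * c.Score
		}
	}
	for _, c := range fts {
		if c.Page <= boundary {
			scores[c.Page] += (1 - vecWeight) * c.Score
		}
	}
	// Expand each scored page to its immediate neighbors, still
	// refusing to cross the boundary.
	expanded := map[int]float64{}
	for p, s := range scores {
		for _, n := range []int{p - 1, p, p + 1} {
			if n >= 1 && n <= boundary && s > expanded[n] {
				expanded[n] = s
			}
		}
	}
	pages := make([]int, 0, len(expanded))
	for p := range expanded {
		pages = append(pages, p)
	}
	sort.Slice(pages, func(i, j int) bool { return expanded[pages[i]] > expanded[pages[j]] })
	return pages
}

func main() {
	vec := []candidate{{Page: 12, Score: 0.9}, {Page: 40, Score: 0.8}} // page 40 is past the boundary
	fts := []candidate{{Page: 12, Score: 0.5}, {Page: 7, Score: 0.6}}
	fmt.Println(fuseHybrid(vec, fts, 30, 0.7)) // page 40 never appears in the result
}
```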
`ask-llm`:
- sends all readable text up to the current boundary to the model
- warns when the prompt size approaches the configured context window
- is useful for synthesis over everything read so far
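The context-window warning above can be approximated with a rough characters-per-token heuristic. The 4-chars-per-token estimate, the 0.9 threshold, and the function names below are assumptions for illustration; the real tokenizer would be model-specific:

```go
package main

import "fmt"

// approxTokens estimates token count with the common 4-characters-per-token
// heuristic (an approximation, not an exact tokenizer).
func approxTokens(text string) int {
	return len(text) / 4
}

// warnIfNearLimit returns a warning when the estimated prompt size exceeds
// the given fraction of the context window.
func warnIfNearLimit(prompt string, contextWindow int, threshold float64) (string, bool) {
	t := approxTokens(prompt)
	if float64(t) >= threshold*float64(contextWindow) {
		return fmt.Sprintf("prompt ~%d tokens approaches the %d-token window", t, contextWindow), true
	}
	return "", false
}

func main() {
	long := make([]byte, 500_000) // ~125k estimated tokens
	for i := range long {
		long[i] = 'a'
	}
	if msg, warn := warnIfNearLimit(string(long), 128_000, 0.9); warn {
		fmt.Println(msg)
	}
}
```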
Local SQLite storage lives at `~/.readcli/readcli.db` by default.
Stored data includes:
- books
- pages
- progress
- embeddings
- full-text index data when FTS is available
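Resolving the default database path above is straightforward with the standard library; a minimal sketch (the `override` parameter is an assumed convention, not a documented readcli flag):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// defaultDBPath returns the default SQLite location under the user's home
// directory, or the override if one is supplied.
func defaultDBPath(override string) (string, error) {
	if override != "" {
		return override, nil
	}
	home, err := os.UserHomeDir()
	if err != nil {
		return "", err
	}
	return filepath.Join(home, ".readcli", "readcli.db"), nil
}

func main() {
	p, err := defaultDBPath("")
	if err != nil {
		panic(err)
	}
	fmt.Println(p) // e.g. /home/<user>/.readcli/readcli.db
}
```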
All retrieval and generation is constrained to the current progress boundary (`page <= boundary`).
- EPUB is reflowable, so exact Kindle page parity is not guaranteed.
- Progress uses a character-position ratio for better alignment with Kindle percentages.
- The first `ask-rag` run for a book can be slower while allowed pages are embedded.
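The character-position ratio in the notes above amounts to dividing characters consumed by total characters in the book. A minimal sketch (function name and signature are illustrative):

```go
package main

import "fmt"

// progressPercent computes the character-position ratio: characters read
// up to the reader's position over total characters, as a percentage.
func progressPercent(charPos, totalChars int) float64 {
	if totalChars == 0 {
		return 0
	}
	return 100 * float64(charPos) / float64(totalChars)
}

func main() {
	fmt.Printf("%.1f%%\n", progressPercent(105_000, 250_000)) // prints "42.0%"
}
```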