StackScribe is a next-generation CLI tool and NPM module that automatically generates professional, high-quality comments for your code. It scans your codebase, identifies functions and APIs, and leverages powerful Large Language Models (LLMs) to write detailed explanations—saving you hours of manual documentation.
JavaScript and TypeScript are fully supported today, with partial support for other languages such as Python. Broader support for Python, Java, and more is planned in future releases.
Getting started with StackScribe is quick, whether as a CLI tool or a module in your project.
Install the package
npm install stackscribe
Commands Guide
# To see all available commands
npx stackscribe --help
# To see the version
npx stackscribe --version
Step 1: Configure Your API Key (One-time setup)
Ollama runs locally and requires no key. For other providers, set your API key:
# Groq example
npx stackscribe config --provider groq --apiKey YOUR_GROQ_API_KEY
# OpenAI
npx stackscribe config --provider openai --apiKey YOUR_OPENAI_API_KEY
# Gemini
npx stackscribe config --provider gemini --apiKey YOUR_GEMINI_API_KEY
Step 2: Run on Your Codebase
Generate comments for a single file or an entire directory:
# Run on './src' directory using Groq
npx stackscribe run --path ./src --provider groq --model llama3-8b-8192
# Run on a Python file using Gemini
npx stackscribe run --path ./my_script.py --provider gemini --model gemini-1.5-pro
# Use the default provider from your config
npx stackscribe run --path ./src/myFile.js
Integrate StackScribe directly into scripts, CI/CD pipelines, or other tools.
Step 1: Install the package
npm install stackscribe
Step 2: Import and use
import { main } from "stackscribe";
// Run on './src' directory using default provider
main("./src");
// Specify a provider
main("./src", "gemini");
// Specify provider and model
main("./src", "groq", "llama3-8b-8192");
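For CI/CD pipelines, the CLI can also be wired into an npm script instead of importing the module. The fragment below is a hypothetical package.json snippet (the script name `docs:generate` is an illustrative assumption, not part of StackScribe):

```json
{
  "scripts": {
    "docs:generate": "stackscribe run --path ./src --provider groq --model llama3-8b-8192"
  }
}
```

With this in place, `npm run docs:generate` annotates the `./src` directory using the same flags shown above.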
StackScribe is designed to be simple, powerful, and easy to integrate into modern development workflows.
- 🤖 Intelligent Code Analysis: Uses AST parsing to accurately detect all functions and API calls in JavaScript/TypeScript code, ignoring irrelevant sections; other languages are processed in full, without AST-level filtering.
- 🔗 Multi-Provider LLM Support: Choose from OpenAI, Google Gemini, Groq, or run offline with Ollama. Specify a model per provider for maximum control.
- ✍️ High-Quality Comment Generation: Generates professional, list-style explanations of a function's logic, purpose, and expected inputs/outputs. For example:
/**
 * Handles user login by validating credentials and issuing a JWT.
 * 1. Extracts email and password from the request body.
 * 2. Validates credentials against the database.
 * 3. On success, generates access and refresh tokens.
 * 4. Returns the tokens to the client.
 * @param {object} req - Express request object
 * @param {object} res - Express response object
 */
function loginUser(req, res) {
  // ... function logic
}
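The insertion step behind an example like this is conceptually simple: splice the generated comment into the source just above the function it describes. The stand-alone sketch below illustrates that idea with plain string manipulation; `insertComment` and its line-based approach are illustrative assumptions, not StackScribe's actual API (the tool itself works on the AST via recast so that formatting is preserved):

```typescript
// Simplified sketch: insert a comment block above a given line of source code.
// (Illustrative only; the real tool manipulates an AST via recast.)
function insertComment(source: string, line: number, comment: string): string {
  const lines = source.split("\n");
  // Match the indentation of the target line so the comment aligns with it.
  const indent = lines[line]?.match(/^\s*/)?.[0] ?? "";
  const commentLines = comment.split("\n").map((l) => indent + l);
  lines.splice(line, 0, ...commentLines);
  return lines.join("\n");
}

const src = `function loginUser(req, res) {\n  // ... function logic\n}`;
const annotated = insertComment(src, 0, "/** Handles user login. */");
console.log(annotated.split("\n")[0]); // "/** Handles user login. */"
```

A line-based splice breaks down on reformatted or minified code, which is exactly why an AST-and-recast pipeline is the more robust design choice.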
stackscribe/
├─ bin/
│ └─ stackscribe.ts # CLI commands (yargs)
├─ src/
│ ├─ index.ts # Main logic (CLI + module)
│ ├─ parser.ts # AST parsing
│ ├─ annotator.ts # Insert comments
│ ├─ llm/
│ │ ├─ openai.ts # LLM wrappers
│ │ ├─ gemini.ts
│ │ ├─ groq.ts
│ │ └─ ollama.ts
│ ├─ config.ts # API keys & config
│ └─ utils.ts # Helper functions
├─ package.json
└─ README.md
- Code Parsing: @babel/parser, @babel/traverse
- Code Generation: recast (preserves formatting)
- LLM SDKs: openai, @google/generative-ai, groq-sdk, ollama
- CLI Framework: yargs
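To make the parsing stage concrete, here is a minimal sketch of AST-based function detection. StackScribe uses @babel/parser and @babel/traverse for this; to keep the example self-contained, the sketch below shows the same idea with the TypeScript compiler API (which ships with any TS toolchain), so the function name `findFunctionNames` and the exact traversal are assumptions for illustration, not StackScribe internals:

```typescript
import * as ts from "typescript";

// Walk the AST and collect the names of all declared functions.
// (StackScribe does the equivalent with @babel/parser + @babel/traverse.)
function findFunctionNames(code: string): string[] {
  const sourceFile = ts.createSourceFile(
    "snippet.ts",
    code,
    ts.ScriptTarget.Latest,
    /* setParentNodes */ true
  );
  const names: string[] = [];
  const visit = (node: ts.Node): void => {
    if (ts.isFunctionDeclaration(node) && node.name) {
      names.push(node.name.text);
    }
    ts.forEachChild(node, visit);
  };
  visit(sourceFile);
  return names;
}

console.log(findFunctionNames("function loginUser(req, res) {}\nconst x = 1;"));
// [ "loginUser" ]
```

Because the walk visits real syntax nodes rather than matching text, comments, strings, and unrelated code are naturally ignored, which is the "ignoring irrelevant sections" behavior described above.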
- Phase 1 (MVP): JS/TS support with OpenAI — ✅ Completed
- Phase 2 (Multi-Provider): Gemini, Groq, Ollama integration — ✅ Completed
- Phase 3 (Config): Robust CLI + stackscribe.json for project defaults — 🟡 Partially Completed
- Phase 4 (Multi-Language): Python, Java, etc. support — 🟡 Partially Completed
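Since Phase 3 is only partially complete, the stackscribe.json schema may still change; as a rough illustration, project defaults might look something like the fragment below (the field names are assumptions, mirroring the CLI flags above):

```json
{
  "provider": "groq",
  "model": "llama3-8b-8192",
  "path": "./src"
}
```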