Turn your rough, lazy prompt comments into highly detailed, professional prompts, powered by a local AI running on your own machine. No cloud, no API keys, 100% private.
Imagine you're coding and you write a quick comment like:
```
// make a login page with validation
```
You select that text, press a keyboard shortcut, and the extension rewrites it into something like:
```
Create a responsive login page component using React with TypeScript.
Include email and password input fields with the following validation rules:
- Email must be a valid email format
- Password must be at least 8 characters with one uppercase letter and one number
Use functional components. Style with Tailwind CSS. Use pnpm as the package manager.
Display inline error messages below each field. Add a "Remember Me" checkbox and a
"Forgot Password?" link. Handle form submission with proper loading state...
```
It takes your rough idea and turns it into a detailed, context-aware prompt that any AI coding assistant can understand perfectly.
Before using this extension, you need one thing installed on your computer: a local AI model server. We recommend Ollama: it's free and easy to set up.
- Go to https://ollama.com
- Click Download and choose your operating system (Mac, Windows, or Linux)
- Follow the installer; it's a standard app installation, just click "Next" a few times
- Once installed, Ollama runs quietly in the background (you'll see a small llama icon in your menu bar on Mac)
Open your Terminal app (on Mac, search for "Terminal" in Spotlight) and type:
```
ollama pull llama3
```

This downloads the llama3 model (~4 GB). Wait for it to finish; it only needs to download once.
💡 Tip: If you have a powerful computer with lots of RAM (32 GB+), you can try larger models like `llama3:70b` for even better results. If your computer is older, try the smaller `llama3:8b`.
In your Terminal, type:

```
ollama list
```

You should see llama3 (or whichever model you downloaded) in the list. If you do, you're all set!
- Go to the GitHub Releases page
- Find the latest release (e.g., `v0.1.0`)
- Under Assets, click `prompt-enhancer-local-0.1.0.vsix` to download it
- Save it somewhere easy to find (like your Downloads folder)
- Open VS Code
- Open the Command Palette: press `Cmd + Shift + P` (Mac) or `Ctrl + Shift + P` (Windows/Linux)
- Type "Install from VSIX" and click the option "Extensions: Install from VSIX..."
- A file browser will open; navigate to where you saved the `.vsix` file and select it
- Wait a few seconds; you'll see a notification saying the extension was installed
- Click "Reload Now" (or restart VS Code) to activate the extension
The process is the same: open the Command Palette (`Cmd + Shift + P`), search for "Install from VSIX", and select the downloaded file.
- Open the Extensions sidebar: click the puzzle-piece icon on the left, or press `Cmd + Shift + X` (Mac) / `Ctrl + Shift + X` (Windows/Linux)
- Search for "Prompt Enhancer"
- You should see it listed as installed
That's it! No cloning, no terminal commands, no compiling. The extension is now permanently installed and ready to use.
1. Open any code file in VS Code / Antigravity IDE

2. Type a rough prompt anywhere (as a comment, inline, wherever you like):

   ```
   # add error handling to this function with retries
   ```

3. Select (highlight) that text with your mouse or keyboard

4. Press the keyboard shortcut:

   | Operating System | Shortcut |
   |---|---|
   | Mac | `Cmd + Shift + R` |
   | Windows / Linux | `Ctrl + Shift + E` |

5. Wait a moment; you'll see a notification saying "Enhancing prompt…" at the bottom-right

6. Done! Your selected text is replaced with a detailed, professional prompt
```
You type:        "add dark mode toggle"
        ↓
Extension reads: file language (e.g., TypeScript)
                 project rules from .enhancerc.json
        ↓
Sends to Ollama: a carefully constructed instruction asking the AI
                 to expand your rough text into a detailed prompt
        ↓
You get back:    a multi-line, detailed prompt that respects your
                 project's specific tools and conventions
```
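The middle step talks to Ollama's HTTP API. Here is a minimal sketch of how the rough text, file language, and project rules could be combined into one request; the function names are illustrative (not the extension's actual code), while the JSON body shape matches Ollama's non-streaming `/api/generate` call:

```typescript
// Illustrative sketch: wrap the rough text, file language, and project
// rules into one instruction, then build the JSON body that Ollama's
// /api/generate endpoint expects. Function names are hypothetical.

function buildMetaPrompt(
  rough: string,
  language: string,
  rules: Record<string, string>
): string {
  const ruleLines = Object.entries(rules)
    .map(([key, value]) => `- ${key}: ${value}`)
    .join("\n");
  return [
    "Rewrite the following rough note into a detailed, professional coding prompt.",
    `File language: ${language}`,
    ruleLines ? `Project rules:\n${ruleLines}` : "",
    `Rough note: ${rough}`,
  ]
    .filter(Boolean)
    .join("\n\n");
}

function buildRequestBody(model: string, prompt: string): string {
  // stream: false makes Ollama return a single JSON object whose
  // "response" field holds the generated text
  return JSON.stringify({ model, prompt, stream: false });
}

// Usage (requires Ollama listening on localhost:11434):
// const body = buildRequestBody(
//   "llama3",
//   buildMetaPrompt("add dark mode toggle", "TypeScript", { styling: "Tailwind CSS" })
// );
// const res = await fetch("http://localhost:11434/api/generate", { method: "POST", body });
// const { response } = await res.json();
```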
You can create a file called .enhancerc.json in the root folder of any project to tell the extension about your project's rules and conventions. The enhanced prompts will then automatically include these rules.
- In your project's root folder, create a file named `.enhancerc.json`
- Add your project-specific rules as key-value pairs:
```json
{
  "packageManager": "pnpm",
  "styling": "Tailwind CSS",
  "framework": "Next.js 14 with App Router",
  "testing": "Vitest + React Testing Library",
  "customRules": "Use functional components. Prefer server components when possible."
}
```

💡 A sample file is included as `.enhancerc.example.json`; you can copy and rename it:

```
cp .enhancerc.example.json .enhancerc.json
```
You can add any rules that are relevant to your project. Common examples:
| Key | Example Value |
|---|---|
| `packageManager` | `"npm"`, `"pnpm"`, `"yarn"` |
| `styling` | `"Tailwind CSS"`, `"CSS Modules"`, `"styled-components"` |
| `framework` | `"Next.js 14"`, `"React 18"`, `"Vue 3"` |
| `language` | `"TypeScript strict mode"`, `"Python 3.12"` |
| `testing` | `"Jest"`, `"Vitest"`, `"Pytest"` |
| `customRules` | Any free-text instructions for the AI |
If this file doesn't exist, the extension works fine; it just won't include project-specific rules.
You can also configure global settings in VS Code:
- Open Settings: press `Cmd + ,` (Mac) or `Ctrl + ,` (Windows/Linux)
- Search for "Prompt Enhancer"
- You'll see two settings:
| Setting | Default | What It Does |
|---|---|---|
| LLM Endpoint | `http://localhost:11434/api/generate` | The URL where Ollama (or another local LLM) is running. Change this only if you run Ollama on a different port or machine. |
| Model | `llama3` | The AI model to use. Change this if you downloaded a different model (e.g., `mistral`, `codellama`, `llama3:70b`). |
If the default shortcut conflicts with another extension or you simply prefer a different one:
- Open Keyboard Shortcuts: press `Cmd + K, Cmd + S` (Mac) or `Ctrl + K, Ctrl + S` (Windows/Linux)
- Search for "Prompt Enhancer"
- Click the ✏️ pencil icon next to the command
- Press your desired key combination
- Press Enter to save
Open package.json and find the keybindings section:
```json
"keybindings": [
  {
    "command": "promptEnhancer.enhanceSelectedText",
    "key": "ctrl+shift+e",
    "mac": "cmd+shift+r",
    "when": "editorTextFocus"
  }
]
```

Change the `key` (Windows/Linux) and `mac` values to your preferred combination. Examples:
| Shortcut | Value to Use |
|---|---|
| Cmd + Alt + E | `"cmd+alt+e"` |
| Cmd + K, then E | `"cmd+k cmd+e"` |
| Ctrl + Shift + P | `"ctrl+shift+p"` |
This means the extension can't reach Ollama. Try:
- Check if Ollama is running: look for the llama icon in your menu bar (Mac) or system tray (Windows)
- Start Ollama manually: open Terminal and run:

  ```
  ollama serve
  ```

- Test the connection: run this in Terminal:

  ```
  curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Hi", "stream": false}'
  ```

  If you get a JSON response back, Ollama is working. If not, reinstall Ollama.
You pressed the shortcut without highlighting any text. Select (highlight) the text you want to enhance first, then press the shortcut.
Local LLMs depend on your hardware. If it's slow:
- Try a smaller model: `ollama pull llama3:8b`
- Update the Model setting in VS Code to match: `llama3:8b`
- Close other heavy applications to free up RAM
Create or update the .enhancerc.json file in your project root with more specific rules. The more detail you provide, the better the output.
```
promptEnhancer/
├── .vscode/
│   ├── launch.json             # Debug launch configuration (F5)
│   └── tasks.json              # Build task for compilation
├── src/
│   ├── extension.ts            # Main extension logic
│   ├── configReader.ts         # Reads .enhancerc.json configuration
│   └── llmService.ts           # Communicates with the local LLM
├── out/                        # Compiled JavaScript (auto-generated)
├── package.json                # Extension manifest & settings
├── tsconfig.json               # TypeScript configuration
├── .enhancerc.example.json     # Sample project rules configuration
└── README.md                   # This file
```
If you want to modify the extension or contribute to it, follow these steps:
```
git clone https://github.com/dotheki/prompt-enhancer-local.git
cd prompt-enhancer-local
npm install   # or: pnpm install
```

- Open the project folder in VS Code / Antigravity IDE
- Press F5: this opens a second IDE window (called the Extension Development Host) with your extension loaded
- Make changes to the code in `src/`, then press `Ctrl+Shift+F5` to restart and test again
- You can set breakpoints, view `console.log` output in the Debug Console, and step through the code
```
npm run compile
pnpm dlx @vscode/vsce package --allow-missing-repository
```

This creates a new `.vsix` file you can share or install.
MIT. Use it however you like.