A VS Code extension that monitors and displays context window usage for Large Language Models (LLMs) in Cursor.
- Real-time context window usage monitoring in the status bar
- Support for multiple LLM models including GPT-4, Claude 3, and Gemini
- Visual progress bar showing context window utilization (illustrated in the sketch after this list)
- Automatic updates every 30 seconds
- Integration with chat responses to show context usage
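As a rough illustration of the status bar behavior listed above, here is a minimal TypeScript sketch built on the standard VS Code status bar API with a 30-second refresh. It is not the extension's actual source: `getContextUsage()` and its placeholder numbers are hypothetical stand-ins for however the extension reads usage from the active Cursor session.

```typescript
import * as vscode from 'vscode';

// Hypothetical helper: returns tokens used and the model's context window size.
// The real extension derives these from the active Cursor session.
function getContextUsage(): { used: number; limit: number } {
  return { used: 42_000, limit: 128_000 }; // placeholder values for illustration
}

// Render a small text progress bar, e.g. "▰▰▰▱▱▱▱▱▱▱ 33%".
function renderBar(used: number, limit: number, width = 10): string {
  const ratio = Math.min(used / limit, 1);
  const filled = Math.round(ratio * width);
  return '▰'.repeat(filled) + '▱'.repeat(width - filled) + ` ${Math.round(ratio * 100)}%`;
}

export function activate(context: vscode.ExtensionContext) {
  const item = vscode.window.createStatusBarItem(vscode.StatusBarAlignment.Right, 100);
  item.show();

  const refresh = () => {
    const { used, limit } = getContextUsage();
    item.text = `Context: ${renderBar(used, limit)}`;
    item.tooltip = `${used.toLocaleString()} / ${limit.toLocaleString()} tokens`;
  };

  refresh();
  // Matches the 30-second refresh interval listed above.
  const timer = setInterval(refresh, 30_000);

  // Clean up the status bar item and timer when the extension deactivates.
  context.subscriptions.push(item, { dispose: () => clearInterval(timer) });
}
```

Registering both the status bar item and the timer as subscriptions keeps them from leaking when the extension is disabled or reloaded.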
- Download the VSIX file from the releases page
- Install in VS Code using:
code --install-extension llm-context-window-monitor-0.0.1.vsix
The extension automatically activates when VS Code starts:
- Watch the status bar for real-time context window usage
- Use the chat interface to see context usage after each response
- Configure settings through VS Code's settings panel
Available settings:
cursor.chat.showContextWindow: Enable/disable context window display after chat responses
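For reference, below is a minimal sketch of how an extension can read this setting through the standard VS Code configuration API. The helper name `shouldShowContextWindow` and the fallback default of `true` are assumptions for illustration, not taken from the extension's source.

```typescript
import * as vscode from 'vscode';

// Read cursor.chat.showContextWindow; the fallback of true is an assumption here.
function shouldShowContextWindow(): boolean {
  return vscode.workspace
    .getConfiguration('cursor.chat')
    .get<boolean>('showContextWindow', true);
}

// React when the user toggles the setting in the settings panel.
vscode.workspace.onDidChangeConfiguration((event) => {
  if (event.affectsConfiguration('cursor.chat.showContextWindow')) {
    const enabled = shouldShowContextWindow();
    vscode.window.setStatusBarMessage(
      `Context window display after chat responses: ${enabled ? 'enabled' : 'disabled'}`,
      3000
    );
  }
});
```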
git clone https://github.com/MagickDataLLC/llm-context-window-monitor.git
cd llm-context-window-monitor
npm install
code .
Then press F5 in VS Code to start debugging.
MIT License - see LICENSE.md for details
Report issues on GitHub