Code Performance Analyzer is a Visual Studio Code extension that helps developers analyze code performance and automatically generate performance tests. It combines static complexity analysis with empirical benchmarking to predict complexity, produce performance tests, and visualize results.
Use cases:
- Quickly assess runtime/memory complexity of functions
- Generate performance tests for real-metric time and space performance
- Visualize runtime growth at scale
- Export performance overviews and test results
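For instance, a function like the following nested-loop routine is the kind of snippet the analyzer is meant to classify (an illustrative example, not shipped with the repository; quadratic in both time and space):

```python
def all_pairs(items):
    """Collect every ordered pair of elements: O(n^2) time and space."""
    pairs = []
    for a in items:        # n iterations
        for b in items:    # n iterations each, so n^2 appends total
            pairs.append((a, b))
    return pairs
```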
- OS: Windows / Linux / macOS (development files and batch scripts assume Windows PowerShell)
- Python 3.9+ (backend server and utilities)
- Node.js (for VS Code extension build steps)
- Docker Desktop (recommended for dev container)
- Optional: NVIDIA GPU drivers + Docker with GPU support for model hosting
- Optional: kind (Kubernetes in Docker), for cluster deployment
- Clone the repository
- Clone the model:
  cd src/model/models/student && git clone https://huggingface.co/philippesic/cpa
Running the model from a dev container allows GPU inference, which is much faster than the cluster but requires a CUDA-capable GPU with >10 GB of VRAM
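Before choosing this path, you can check whether a suitable GPU is visible. A rough sketch (not part of the repository) that queries `nvidia-smi` and mirrors the >10 GB VRAM note above:

```python
import shutil
import subprocess

def cuda_gpu_visible(min_vram_gb: float = 10.0) -> bool:
    """Return True if nvidia-smi reports a GPU with at least min_vram_gb of VRAM."""
    if shutil.which("nvidia-smi") is None:
        return False  # driver tooling not installed
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, timeout=10, check=True,
        ).stdout
        # One line per GPU, each the total VRAM in MiB.
        vram_mib = [float(line) for line in out.split() if line.strip()]
    except (subprocess.SubprocessError, OSError, ValueError):
        return False
    return any(v / 1024 >= min_vram_gb for v in vram_mib)
```

If this returns False, use the cluster path below instead.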
- Enter the container:
  dev.bat
- Start the server:
  bash serve.sh
- Confirm the server is running:
  curl 127.0.0.1:5000/health
- Compile the extension in the root directory:
  npm run compile
- Enter the VS Code test environment by pressing F5 from src/extension/extension.ts (using Visual Studio Extension Development)
- Open the repository and experiment with the provided test functions (or your own)
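The health check can also be scripted, which is handy when waiting for the server to come up. A minimal sketch, assuming the /health endpoint simply answers HTTP 200:

```python
import urllib.error
import urllib.request

def server_healthy(url: str = "http://127.0.0.1:5000/health",
                   timeout: float = 2.0) -> bool:
    """Return True if the endpoint answers with HTTP 200, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False  # server not up yet, or unreachable
```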
Running the model from a cluster currently supports CPU inference only, which is slower but works on a much wider range of hardware
- Start (or create) a cluster:
  cicd/start_deploy.bat
- Update the cluster image to the most recent version:
  cicd/update_cluster.bat
- The server will deploy automatically. Enter the test environment and experiment with the extension
Within the test environment, highlight a Python code snippet, right-click, and select Analyze Code Complexity
Results render automatically in the sidebar after analysis
Paste a Python function into the Performance Test input in the sidebar and click Generate. Save the generated file and run it.
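The exact contents of a generated test vary; a hand-written sketch of the same idea, timing a function and sampling peak memory with only the standard library (the `measure` helper and sizes here are illustrative, not the extension's actual output):

```python
import timeit
import tracemalloc

def measure(fn, *args, repeat: int = 5):
    """Return (best_seconds, peak_bytes) for one call of fn(*args)."""
    best = min(timeit.repeat(lambda: fn(*args), number=1, repeat=repeat))
    tracemalloc.start()
    fn(*args)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return best, peak

def all_pairs(items):
    """Quadratic-time example workload."""
    return [(a, b) for a in items for b in items]

# Doubling the input size should roughly quadruple the runtime for O(n^2).
for n in (100, 200, 400):
    secs, peak = measure(all_pairs, list(range(n)))
    print(f"n={n}: {secs:.6f}s, peak {peak} bytes")
```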
Upon generating a performance test, click Export JSON and save the file
Open the command palette and select CPA: Export as CSV
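If you prefer to script the conversion yourself, a sketch with the standard library (the record keys are hypothetical, not the extension's actual export schema):

```python
import csv
import json

def results_json_to_csv(json_path: str, csv_path: str) -> None:
    """Flatten a JSON list of result records into a CSV file.

    Assumes the exported JSON is a list of flat dicts; adjust the
    field handling to match the file the extension actually writes.
    """
    with open(json_path) as f:
        records = json.load(f)
    if not records:
        return
    # Union of keys across records, so sparse fields still get a column.
    fieldnames = sorted({k for r in records for k in r})
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)
```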
├───cicd
├───demo
├───dist
└───src
├───extension
└───model
├───build
├───data
├───examples
├───exported_results
├───models
│ └───student
│ ├───adapters
│ ├───base
│ └───cpa
├───stub-server
├───tree-sitter-python
└───utils