
Hespas

📖 Documentation

Hespas is a distributed ML performance estimation tool built on StableHLO. It takes a StableHLO workload representation of an ML model, splits it into compute and communication regions, estimates compute performance using analytical, simulation, or profiling-based backends, and outputs Chakra execution traces for network simulation with ASTRA-sim.

One workload representation. Multiple simulation backends. Multiple hardware targets.

Installation

git clone https://github.com/imec-int/hespas.git
cd hespas
pip install .

Or with uv:

uv sync

Quick Start

Generate a Chakra trace using the roofline estimator:

hespas_chakra_gen tests/fixtures/configs/config_roofline_a100.json

Use a hardware preset with a custom workload:

hespas_chakra_gen configs/nodes/A100_PCIe_40GB/roofline/config.json \
    --mlir_file tests/fixtures/mlir/jax_example.mlir \
    --output output/jax_example_a100

Main Estimator Backends

Backend    Type                      Hardware Required
Roofline   Analytical                None
XLA        Profiling                 NVIDIA GPU + CUDA
IREE       Compilation + profiling   CPU or GPU
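The roofline backend is the only one that needs no hardware in the loop. As a rough illustration of what an analytical roofline estimate computes (this is the textbook roofline model, not hespas internals; the peak numbers are the published A100 PCIe 40 GB specs):

```python
def roofline_time_s(flops: float, bytes_moved: float,
                    peak_flops: float, peak_bw: float) -> float:
    """Textbook roofline: an op is bound by compute or memory,
    whichever projected time is larger."""
    return max(flops / peak_flops, bytes_moved / peak_bw)

# Published A100 PCIe 40 GB peaks (FP16 tensor core, HBM2 bandwidth):
PEAK_FLOPS = 312e12   # 312 TFLOP/s
PEAK_BW = 1.555e12    # 1555 GB/s

# Example: a 4096x4096x4096 fp16 matmul.
# 2*M*N*K FLOPs; (M*K + K*N + M*N) elements of 2 bytes each.
flops = 2 * 4096**3
bytes_moved = 3 * 4096**2 * 2
t = roofline_time_s(flops, bytes_moved, PEAK_FLOPS, PEAK_BW)
# Here the compute term dominates, so the matmul is compute-bound.
```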

CLI Tools

Command                Description
hespas_chakra_gen      Main tool: split, estimate, and generate Chakra traces
hespas_mlir_splitter   Split MLIR into compute/communication modules
hespas_mlir_analyzer   Analyze StableHLO operator statistics

Publication

Evaluating Cross-Architecture Performance Modeling of Distributed ML Workloads Using StableHLO

Jonas Svedas, Nathan Laubeuf, Ryan Harvey, Arjun Singh, Changhai Man, Abubakr Nada, Tushar Krishna, James Myers, Debjyoti Bhattacharjee

IEEE International Symposium on Performance Analysis of Systems and Software (ISPASS), 2026

arXiv:2604.12090

@inproceedings{svedas2026stablehlo,
  title     = {Evaluating Cross-Architecture Performance Modeling of Distributed ML Workloads Using StableHLO},
  author    = {Svedas, Jonas and Laubeuf, Nathan and Harvey, Ryan and Singh, Arjun and Man, Changhai and Nada, Abubakr and Krishna, Tushar and Myers, James and Bhattacharjee, Debjyoti},
  year      = {2026},
  booktitle = {Proceedings of ISPASS 2026},
  organization = {IEEE}
}

Acknowledgments

This work is funded by the Advanced Research + Invention Agency (ARIA).

License

MIT — © imec
