A unified API that consolidates different Signal Temporal Logic (STL) semantics under a single interface, with native support for NumPy, JAX, and PyTorch.

Unified, backend-agnostic STL robustness semantics in one API. Switch semantics. Switch backends. Keep the same specification:

- One API
- Multiple STL semantics (classical, smooth, cumulative, AGM, D-GMSR... and more!)
- NumPy / JAX / PyTorch backends
- Autograd- and GPU-ready
## Why?
STL robustness definitions are scattered across separate repositories, programming languages, and incompatible interfaces. Classical, smooth, cumulative, AGM, and D-GMSR semantics often live in isolation, making comparisons and experimentation unnecessarily difficult.
PySTL consolidates them under a single, consistent API. The same STL specification can be evaluated under different quantitative semantics without rewriting logic or changing infrastructure.
## How?
Under PySTL's unified API, switching semantics is as simple as changing a function argument. Switching backend (NumPy, JAX, PyTorch) is equally straightforward, enabling automatic differentiation and GPU acceleration out of the box.
```python
sem_jax = create_semantics("agm", backend="jax")
```

All semantics share the same internal interface and operator implementations (AND, OR, NOT, Always, Eventually, Until), ensuring consistent behavior across backends.
PySTL runs natively on NumPy, JAX, and PyTorch, making every supported semantics immediately usable in gradient-based control synthesis, reinforcement learning, and differentiable optimization pipelines. This includes semantics that were not originally designed for autograd workflows.
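As a rough sketch of how a backend-agnostic operator layer can be structured (illustrative only: `get_xp` and `_BACKEND_MODULES` are hypothetical names invented here, not PySTL's actual internals):

```python
import importlib

# Hypothetical dispatch table: map a backend name to its array module,
# imported lazily so optional dependencies load only when requested.
_BACKEND_MODULES = {"numpy": "numpy", "jax": "jax.numpy", "torch": "torch"}

def get_xp(backend="numpy"):
    """Return the array module for a backend name, importing it lazily."""
    try:
        return importlib.import_module(_BACKEND_MODULES[backend])
    except KeyError:
        raise ValueError(f"unknown backend: {backend!r}") from None

# Classical AND is a pointwise min over subformula robustness values.
# NumPy and JAX expose nearly identical module-level APIs, so the same
# call works for both; PyTorch differs slightly and needs thin adapters.
xp = get_xp("numpy")
rho_and = xp.minimum(xp.asarray([0.4, 0.3]), xp.asarray([0.2, 0.5]))
print(rho_and)  # [0.2 0.3]
```

Because the operator code only touches the `xp` namespace, swapping `get_xp("numpy")` for `get_xp("jax")` leaves the specification logic untouched.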
By unifying formal STL semantics with modern ML tooling, PySTL turns robustness evaluation from a fragmented research artifact into a backend-agnostic building block for verification, control, and learning over time-series signals.
| STL Semantics | Sound (Sign-Preserving)? | Smooth? | What Makes It Special | Citation |
|---|---|---|---|---|
| Classical (Space) Robustness | Yes | No | Uses min/max and inf/sup-style operators to compute the signed distance to violation (worst-case semantics). | Donzé, A., & Maler, O. (2010). Robust Satisfaction of Temporal Logic over Real-Valued Signals. In Formal Modeling and Analysis of Timed Systems (FORMATS). doi:10.1007/978-3-642-15297-9_9 |
| Smooth Robustness | Approximate (temperature-dependent) | Yes | Replaces min/max with softmin/softmax (logsumexp) for differentiable approximations of classical robustness. | Leung, K., Aréchiga, N., & Pavone, M. (2022). Backpropagation through Signal Temporal Logic Specifications: Infusing Logical Structure into Gradient-Based Methods. The International Journal of Robotics Research, 42(6), 356-370. doi:10.1177/02783649221082115 |
| Cumulative Robustness | No (redefines satisfaction) | Usually no (piecewise) | Integrates robustness over time instead of only worst-case aggregation; captures sustained behavior. | Haghighi, I., Mehdipour, N., Bartocci, E., & Belta, C. (2019). Control from Signal Temporal Logic Specifications with Smooth Cumulative Quantitative Semantics. IEEE Control Systems Letters. arXiv:1904.11611 |
| AGM Robustness | Yes | Piecewise smooth (non-smooth on sign boundaries) | Uses arithmetic/geometric means to reward both degree and frequency of satisfaction across subformulae and time. | Mehdipour, N., Vasile, C.-I., & Belta, C. (2019). Arithmetic-Geometric Mean Robustness for Control from Signal Temporal Logic Specifications. ACC 2019. doi:10.23919/ACC.2019.8814487 |
| D-GMSR Robustness | Yes | Mostly yes (except boundary points) | Reformulates min/max with structured generalized means; smooth while preserving sign semantics. | Uzun, S., et al. (2024). Discrete Generalized Mean Smooth Robustness (D-GMSR) for Signal Temporal Logic. arXiv:2405.10996 |
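A small, self-contained illustration of the first two rows in plain NumPy (independent of PySTL's API): classical robustness aggregates with a hard min, while smooth robustness replaces it with a logsumexp-based softmin that approaches the classical value as the temperature parameter grows.

```python
import numpy as np

def softmin(x, k=10.0):
    """Smooth lower bound on min(x); approaches min(x) as k -> inf."""
    x = np.asarray(x, dtype=float)
    return -np.log(np.sum(np.exp(-k * x))) / k

# Predicate robustness for "x(t) < 0.6" over a short 1-D signal:
signal = np.array([0.2, 0.3, 0.5])
rho = 0.6 - signal  # signed distance to violation at each time step

# Classical "always" robustness: worst case over the interval.
rho_classical = np.min(rho)        # worst case, about 0.1

# Smooth "always" robustness: differentiable, temperature-dependent.
rho_smooth = softmin(rho, k=50.0)  # close to, and slightly below, 0.1

print(rho_classical, rho_smooth)
```

The same softmin can stand in for Eventually's max via max(x) = -min(-x); the temperature k trades smoothness against tightness of the approximation.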
Supported syntax/backend combinations:

- Syntaxes: `classical`, `smooth`, `cumulative`, `agm`, `dgmsr`
- Backends: `numpy` (default), `jax` (with `--extra jax`), `torch` (with `--extra torch`)
- Total combinations: 15 (5 x 3) when all extras are installed
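The 5 x 3 count can be sketched with a quick enumeration (semantics and backend names taken from the lists above):

```python
from itertools import product

syntaxes = ["classical", "smooth", "cumulative", "agm", "dgmsr"]
backends = ["numpy", "jax", "torch"]

# Every semantics is available on every backend.
combos = list(product(syntaxes, backends))
print(len(combos))  # 15
```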
Using uv (recommended):

```shell
# Regular install, NumPy-only
uv sync

# Regular install, NumPy + JAX
uv sync --extra jax

# Regular install, NumPy + PyTorch
uv sync --extra torch
```

Using pip:
```shell
# NumPy-only install (default)
python -m venv .venv
source .venv/bin/activate
pip install -e .

# NumPy + JAX install
pip install -e ".[jax]"

# NumPy + PyTorch install
pip install -e ".[torch]"
```

NumPy classical semantics:

```python
import numpy as np
from pystl import Predicate, Interval, create_semantics

# signal shape: (time, state_dim)
signal = np.array([
    [0.2, 0.8],
    [0.3, 0.6],
    [0.5, 0.4],
], dtype=float)

p_speed_ok = Predicate("speed_ok", fn=lambda s, t: 0.6 - s[t, 0])
p_alt_ok = Predicate("alt_ok", fn=lambda s, t: s[t, 1] - 0.2)

phi = (p_speed_ok & p_alt_ok).always(Interval(0, 2))

sem = create_semantics("classical", backend="numpy")
rho0 = phi.evaluate(signal, sem, t=0)
print(float(rho0))
```

JAX semantics + gradients:
```python
import jax
import jax.numpy as jnp
from pystl import create_semantics

signal_jax = jnp.asarray(signal)
sem_jax = create_semantics("classical", backend="jax")

rho0_jax = phi.evaluate(signal_jax, sem_jax, t=0)
grad0 = jax.grad(lambda s: phi.evaluate(s, sem_jax, t=0))(signal_jax)
```

NumPy D-GMSR semantics + gradients:
```python
sem_dgmsr = create_semantics("dgmsr", backend="numpy", eps=1e-8, p=2)

rho0, grad0 = phi.evaluate_with_grad(signal, sem_dgmsr, t=0)
print(grad0.shape)  # (time, state_dim)
```

Everyone is welcome to contribute.
Development install using uv:

```shell
# Dev install (adds pytest + pre-commit), NumPy-only
uv sync --dev
pre-commit install

# Dev install (adds pytest + pre-commit), NumPy + JAX
uv sync --dev --extra jax
pre-commit install

# Dev install (adds pytest + pre-commit), NumPy + PyTorch
uv sync --dev --extra torch
pre-commit install
```

Instructions:
- Create a feature branch for your change.
- Implement the change with tests and docs updates when relevant.
- Run checks locally (`uv run pytest`).
- The linter should run automatically on commit.
- Open a pull request.
This project is licensed under the MIT License. See LICENSE for details.
If you use PySTL in your research, please cite it as:
```bibtex
@software{palanques_tost_2026_pystl,
  author    = {Palanques Tost, Eric},
  title     = {PySTL},
  year      = 2026,
  publisher = {Zenodo},
  doi       = {10.5281/zenodo.18707962},
  url       = {https://doi.org/10.5281/zenodo.18707962}
}
```