🔍 What is ProxTorch?
Dive into a rich realm of proximal operators and constraints with ProxTorch, a state-of-the-art Python library built on PyTorch. Whether you are tackling optimization challenges or the complexities of machine learning, ProxTorch is designed for speed, efficiency, and seamless GPU integration.
- 🚀 GPU-Boosted: Experience lightning-fast computations with extensive CUDA support (see the sketch after this list).
- 🔥 PyTorch Synergy: Naturally integrates with all your PyTorch endeavours.
- 📚 Expansive Library: From elemental norms (L0, L1, L2, L∞) to advanced regularizations like Total Variation and Fused Lasso.
- 🤝 User-Friendly: Jump right in! Intuitive design means minimal disruptions to your existing projects.
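As a quick illustration of the first point, here is a minimal sketch of applying a proximal operator on the GPU. It reuses the `L1` operator from the quick-start example below; the tensor size and `sigma` value are arbitrary, and it assumes the operator accepts CUDA tensors like ordinary PyTorch code.

```python
import torch
from proxtorch.operators import L1

# Pick the GPU when one is available, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# A large random tensor on the chosen device (size is arbitrary).
x = torch.randn(1_000_000, device=device)

# L1 proximal operator with an arbitrary regularization strength.
l1_prox = L1(sigma=0.05)

# The proximal step runs wherever the input tensor lives.
result = l1_prox.prox(x)
print(result.device)
```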
Getting started with ProxTorch is a breeze. Install from PyPI with:

```bash
pip install proxtorch
```
Or install from source with:

```bash
git clone <ProxTorch repository URL>
cd ProxTorch
pip install -e .
```
Dive in with this straightforward example:

```python
import torch
from proxtorch.operators import L1

# Define a sample tensor
x = torch.tensor([0.5, -1.2, 0.3, -0.4, 0.7])

# Initialize the L1 proximal operator
l1_prox = L1(sigma=0.1)

# Compute the regularization term's value
reg_value = l1_prox(x)
print("Regularization Value:", reg_value)

# Apply the proximal operator (soft-thresholding for L1)
result = l1_prox.prox(x)
print("Prox Result:", result)
```
ProxTorch ships with a broad set of proximal operators and constraints:

- Regularizers: L1, L2 (Ridge), ElasticNet, GroupLasso, TV (includes TV_2D, TV_3D, TVL1_2D, TVL1_3D), Frobenius
- Norms: TraceNorm, NuclearNorm
- FusedLasso, Huber
- Constraints: L0Ball, L1Ball, L2Ball, L∞Ball (Infinity Norm), Frobenius, TraceNorm, Box (a projection sketch follows this list)
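As a sketch of how a constraint might be used, the snippet below projects a tensor onto an L1 ball. Note that the import path `proxtorch.constraints` and the `radius` constructor argument are assumptions for illustration only; check the documentation for the actual names.

```python
import torch
from proxtorch.constraints import L1Ball  # assumed module path; see the docs

x = torch.tensor([1.5, -2.0, 0.3])

# 'radius' is a hypothetical constructor argument controlling the ball's size.
ball = L1Ball(radius=1.0)

# For a constraint, the proximal operator is the Euclidean projection onto the set.
x_proj = ball.prox(x)
print(x_proj)
```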
Explore the comprehensive documentation on Read the Docs.
ProxTorch stands on the shoulders of giants.
We're thrilled to introduce ProxTorch as an exciting addition to the PyTorch ecosystem. We're confident you'll love it!

Got ideas? Join our vibrant community and make ProxTorch even better!
ProxTorch is proudly released under the MIT License.