Sipsa Labs

A research lab — Systems · Intelligence · Precision. UltraCompress is our flagship publicly-shipped product.


What we ship publicly

Extreme compression infrastructure for large language models. In a six-model head-to-head benchmark, it was the only sub-3-bits-per-weight method we evaluated with zero catastrophic failures.
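To make the headline figure concrete, here is a back-of-envelope size calculation. The 7B parameter count and the 2.5 bits-per-weight value are illustrative round numbers, not benchmark results from this page; the point is only how storage scales with precision.

```python
# Back-of-envelope: what "sub-3 bits per weight" means for on-disk size.
# 7B parameters and 2.5 bpw are hypothetical round numbers for illustration.

def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Storage in decimal gigabytes for n_params weights at a given precision."""
    return n_params * bits_per_weight / 8 / 1e9

for label, bpw in [("fp16", 16), ("int4", 4), ("2.5 bpw (sub-3-bit)", 2.5)]:
    print(f"{label:>20}: {model_size_gb(7e9, bpw):.2f} GB")
# fp16 -> 14.00 GB, int4 -> 3.50 GB, 2.5 bpw -> 2.19 GB
```

At 2.5 bits per weight a 7B model drops from roughly 14 GB in fp16 to a little over 2 GB, which is what puts large models within reach of consumer hardware.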

pip install ultracompress

The CLI is Apache-2.0 licensed. Pre-compressed reference models are distributed via the Hugging Face Hub (rolling release through April–May 2026).

Patent pending — USPTO 64/049,511 + 64/049,517, filed April 25, 2026.


What's coming

  • v0.1 alpha — pre-compressed reference models for Qwen3, Llama, Mistral families releasing throughout April–May 2026
  • v0.2 (Q3 2026) — uc compress (self-compression of customer models) + Track B architectural-compression variants + native exports to llama.cpp, vLLM, TensorRT-LLM, CoreML

Contact

sipsalabs.com
