Why do we need an intelligent block matrix library?
Let's try to construct the KKT matrix from Mattingley and Boyd's CVXGEN paper in numpy and PyTorch:
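A sketch of the plain-numpy version follows; the block sizes and random data are illustrative assumptions, not values from the paper, but they show how every zero and identity block must be sized by hand:

```python
import numpy as np
import numpy.random as npr

npr.seed(0)

# Illustrative sizes for a CVXGEN-style KKT system (assumed for this example).
nx, nineq, neq = 4, 6, 7
Q = npr.randn(nx, nx)
G = npr.randn(nineq, nx)
A = npr.randn(neq, nx)
D = np.diag(npr.rand(nineq))

# With np.bmat, every zero and identity block needs an explicit shape.
K = np.bmat((
    (Q,                     np.zeros((nx, nineq)),  G.T,                      A.T),
    (np.zeros((nineq, nx)), D,                      np.eye(nineq),            np.zeros((nineq, neq))),
    (G,                     np.eye(nineq),          np.zeros((nineq, nineq)), np.zeros((nineq, neq))),
    (A,                     np.zeros((neq, nineq)), np.zeros((neq, nineq)),   np.zeros((neq, neq))),
))
```

Getting any one of those `np.zeros` shapes wrong produces a confusing concatenation error rather than the intended matrix.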
Without block, there is no way to infer the appropriate sizes of the zero and identity matrix blocks. It is an inconvenience to have to think about what size these matrices should be.
block acts a lot like np.bmat and replaces:
- Any constant with an appropriately shaped block matrix filled with that constant.
- The string 'I' with an appropriately shaped identity matrix.
- The string '-I' with an appropriately shaped negated identity matrix.
- [Request more features.]
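As a rough illustration of how this kind of substitution can work, here is a minimal sketch (not block's actual implementation) that infers each placeholder's size from the concrete blocks sharing its row and column:

```python
import numpy as np

def simple_block(rows):
    """Toy block-matrix assembly: replaces scalars, 'I', and '-I' with
    appropriately sized arrays. Assumes every row and every column of the
    layout contains at least one concrete array to infer sizes from."""
    heights = [None] * len(rows)
    widths = [None] * len(rows[0])
    # Pass 1: record each row's height and each column's width.
    for i, row in enumerate(rows):
        for j, b in enumerate(row):
            if hasattr(b, 'shape'):
                heights[i], widths[j] = b.shape
    # Pass 2: expand placeholders and concatenate.
    out = []
    for i, row in enumerate(rows):
        parts = []
        for j, b in enumerate(row):
            if hasattr(b, 'shape'):
                parts.append(b)
            elif b == 'I':
                # Identity placeholders require a square slot.
                parts.append(np.eye(heights[i]))
            elif b == '-I':
                parts.append(-np.eye(heights[i]))
            else:
                parts.append(np.full((heights[i], widths[j]), float(b)))
        out.append(np.hstack(parts))
    return np.vstack(out)

# Usage: the 'I' and 0 blocks get their shapes from Q, G, and G.T.
Q = np.ones((2, 2))
G = np.ones((3, 2))
K = simple_block(((Q, G.T),
                  (G, 'I')))
```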
Isn't constructing large block matrices with a lot of zeros inefficient?
Yes. block is meant to be a quick prototyping tool, and there is probably a more efficient way to solve your system if it has many zero or identity blocks.
How does block handle numpy and PyTorch with the same interface?
I wrote the matrix-sizing logic to be agnostic of the matrix library being used; numpy and PyTorch are just backends. More backends can easily be added for your favorite Python matrix library.
```python
from abc import ABCMeta, abstractmethod

class Backend(metaclass=ABCMeta):
    @abstractmethod
    def extract_shape(self, x): pass

    @abstractmethod
    def build_eye(self, n): pass

    @abstractmethod
    def build_full(self, shape, fill_val): pass

    @abstractmethod
    def build(self, rows): pass

    @abstractmethod
    def is_complete(self, rows): pass
```
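For example, a numpy backend covering these five methods might look roughly like this; the method names follow the abstract interface, but the bodies are illustrative assumptions rather than the library's actual code:

```python
import numpy as np

class NumpyBackend:
    """Hypothetical numpy backend sketch for the interface above."""

    def extract_shape(self, x):
        # Shape of a concrete block, used to size neighboring placeholders.
        return x.shape

    def build_eye(self, n):
        return np.eye(n)

    def build_full(self, shape, fill_val):
        return np.full(shape, float(fill_val))

    def build(self, rows):
        # Assemble fully resolved blocks into one dense matrix.
        return np.vstack([np.hstack(row) for row in rows])

    def is_complete(self, rows):
        # True once every cell is a concrete array (no placeholders left).
        return all(hasattr(b, 'shape') for row in rows for b in row)
```

A PyTorch backend would implement the same five methods with torch.eye, torch.full, and torch.cat, which is what lets the sizing logic stay library-agnostic.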
```
pip install block
```

```python
from block import block
```
- Run tests in
Issues and Contributions
This repository is Apache-licensed.