Matrix decomposition is a way of reducing a matrix into its constituent parts.
What does that mean?
It is an approach that simplifies complex matrix operations: the operations are performed on the decomposed factors rather than on the original matrix itself.
Matrix decomposition types:
- LU Matrix Decomposition
- QR Matrix Decomposition
- Cholesky Decomposition
The LU decomposition is for square matrices and decomposes a matrix into L and U components. It's defined as follows:
A = L . U
Where A is the square matrix that we wish to decompose, L is a lower triangular matrix and U is an upper triangular matrix. This is the factorization that comes from Gaussian elimination.
The LU decomposition is often used to simplify the solving of systems of linear equations, such as finding the coefficients in a linear regression, as well as in calculating the determinant and inverse of a matrix.
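To make this concrete, here is a minimal sketch of the Doolittle LU algorithm in pure NumPy. The matrix values and the function name `lu_decompose` are illustrative assumptions; it omits row pivoting, so it assumes no zero pivots are encountered (in practice you would use `scipy.linalg.lu`, which pivots for stability).

```python
import numpy as np

def lu_decompose(A):
    """Doolittle LU decomposition without pivoting: A = L @ U.
    Assumes A is square and no pivot becomes zero."""
    n = A.shape[0]
    L = np.eye(n)                 # unit lower triangular factor
    U = A.astype(float).copy()    # will be reduced to upper triangular
    for k in range(n):
        for i in range(k + 1, n):
            # eliminate entry (i, k) and record the multiplier in L
            L[i, k] = U[i, k] / U[k, k]
            U[i, :] -= L[i, k] * U[k, :]
    return L, U

# Example square matrix (made-up values)
A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])

L, U = lu_decompose(A)
print(np.allclose(A, L @ U))  # True: the factors reconstruct A
```

Once L and U are known, a system A x = b is solved by two cheap triangular solves (forward substitution with L, back substitution with U), which is why the factorization pays off when solving for many right-hand sides.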
The QR decomposition is for m x n matrices and decomposes a matrix into Q and R components: an orthogonal matrix multiplied by an upper-triangular matrix.
It's defined as follows:
A = Q . R
Where A is the matrix that we wish to decompose, Q is an m x m orthogonal matrix, and R is an m x n upper triangular matrix.
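A quick sketch with NumPy's built-in routine (the matrix values are illustrative assumptions). Note that `np.linalg.qr` returns the "reduced" factorization by default: Q is m x n with orthonormal columns and R is n x n; pass `mode='complete'` for the full m x m Q.

```python
import numpy as np

# Example m x n matrix (m=3, n=2), made-up values
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Reduced QR: Q has orthonormal columns, R is upper triangular
Q, R = np.linalg.qr(A)

print(np.allclose(A, Q @ R))            # True: the factors reconstruct A
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q's columns are orthonormal
```

Because Q.T @ Q = I, a least-squares problem min ||A x - b|| reduces to the triangular system R x = Q.T b, which is the standard numerically stable route.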
The Cholesky decomposition is for square symmetric matrices whose eigenvalues are all greater than zero, so-called positive definite matrices.
It's defined as follows:
A=L . L^T
Where A is the matrix being decomposed, L is the lower triangular matrix and L^T is the transpose of L.
Cholesky is useful in many applications, such as:
- Linear least squares
- Non-linear optimization
- Monte Carlo simulation
- Kalman filters
- Matrix inversion
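The applications above all lean on the same cheap factorization, which NumPy provides directly (the matrix values below are illustrative assumptions; `np.linalg.cholesky` raises `LinAlgError` if the input is not positive definite):

```python
import numpy as np

# Example symmetric positive definite matrix (made-up values)
A = np.array([[4.0, 2.0, 2.0],
              [2.0, 3.0, 1.0],
              [2.0, 1.0, 3.0]])

# Returns the lower triangular factor L with A = L @ L.T
L = np.linalg.cholesky(A)

print(np.allclose(A, L @ L.T))     # True: the factor reconstructs A
print(np.allclose(L, np.tril(L)))  # True: L is lower triangular
```

For positive definite systems, Cholesky is roughly twice as fast as LU because only one triangular factor has to be computed.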
Very simple high-school math, yet very useful for many applications in modern machine learning :)