Five Matrix Decompositions (Cholesky Decomposition, LU Decomposition, QR Decomposition, Spectral Decomposition, SVD)

This article provides a concise explanation of five matrix decomposition techniques frequently used in machine learning. The theoretical background and Python implementations are included.

Source Code

GitHub

  • Jupyter Notebook file: link

Google Colaboratory

  • Google Colaboratory file: link

Execution Environment

The operating system used is macOS. Note that Linux and Unix commands may have different options.

!sw_vers
ProductName:        macOS
ProductVersion:     13.5.1
BuildVersion:       22G90
!python -V
Python 3.9.17

Basic libraries are imported, and watermark is used to confirm their versions. The random seed is also set.

%matplotlib inline
%config InlineBackend.figure_format = 'svg'
import random

import scipy
import numpy as np

import matplotlib
import matplotlib.pyplot as plt

seed = 123
random_state = 123

random.seed(seed)
np.random.seed(seed)

from watermark import watermark

print(watermark(python=True, watermark=True, iversions=True, globals_=globals()))
Python implementation: CPython
Python version       : 3.9.17
IPython version      : 8.17.2

scipy     : 1.11.2
numpy     : 1.25.2
matplotlib: 3.8.1

Watermark: 2.4.3

Cholesky Decomposition

Cholesky decomposition is a method for decomposing positive-definite symmetric matrices into triangular matrices. For a positive-definite symmetric matrix $ \mathbf{A} $, the decomposition is expressed as $ \mathbf{A} = \mathbf{L} \mathbf{L}^\top $, where $ \mathbf{L} $ is a lower triangular matrix with positive diagonal elements.

For example, a $ 3 \times 3 $ positive-definite symmetric matrix factors into the product of a single lower triangular matrix and its transpose.

The computational cost is about $ n^3/3 $ floating-point operations, roughly half that of Gaussian elimination (LU decomposition), which makes Cholesky the preferred factorization when it applies. It requires the matrix to be symmetric and positive-definite; for a symmetric matrix, positive-definiteness can be verified by checking that all eigenvalues are positive.

Python example:

import numpy as np

# Define a positive-definite symmetric matrix
A = np.array([[4.0, 2.0], [2.0, 3.0]])

# Perform Cholesky decomposition
L = np.linalg.cholesky(A)

# Display results
print("A =\n", A.round(2))
print("L =\n", L.round(2))
print("L * L^T =\n", (L @ L.T).round(2))
A =
 [[4. 2.]
 [2. 3.]]
L =
 [[2.   0.  ]
 [1.   1.41]]
L * L^T =
 [[4. 2.]
 [2. 3.]]

Here, $ \mathbf{A} = \begin{pmatrix} 4 & 2 \\ 2 & 3 \end{pmatrix} $ is decomposed into a lower triangular matrix $ \mathbf{L} $.
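Once the factor $ \mathbf{L} $ is available, a linear system $ \mathbf{A} x = b $ can be solved cheaply by forward and back substitution. A minimal sketch using SciPy's `cho_factor` / `cho_solve`, with a hypothetical right-hand side `b`:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Same positive-definite symmetric matrix as above; b is a hypothetical right-hand side
A = np.array([[4.0, 2.0], [2.0, 3.0]])
b = np.array([1.0, 2.0])

# Quick positive-definiteness check: all eigenvalues of the symmetric A must be positive
assert np.all(np.linalg.eigvalsh(A) > 0)

# Factor once, then solve by forward/back substitution
c, low = cho_factor(A)
x = cho_solve((c, low), b)

print("x =", x.round(2))
print("A @ x =", (A @ x).round(2))  # should reproduce b
```

Factoring once and reusing the factor is the main practical payoff when the same $ \mathbf{A} $ appears with many right-hand sides.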


LU Decomposition

LU decomposition factors a square matrix into a lower triangular matrix $ \mathbf{L} $ and an upper triangular matrix $ \mathbf{U} $, expressed as $ \mathbf{A} = \mathbf{L} \mathbf{U} $. Not every square matrix admits this form directly, so in practice partial pivoting is used, giving $ \mathbf{A} = \mathbf{P} \mathbf{L} \mathbf{U} $ with a permutation matrix $ \mathbf{P} $ (the convention returned by `scipy.linalg.lu`).

Example in Python:

import numpy as np
import scipy.linalg as la

# Example matrix
A = np.array([[2.0, 1.0, 1.0], [4.0, 3.0, 3.0], [8.0, 7.0, 9.0]])

# LU decomposition (with partial pivoting)
P, L, U = la.lu(A)

print("A =\n", A)
print("P =\n", P)
print("L =\n", L.round(2))
print("U =\n", U.round(2))
# scipy.linalg.lu returns factors satisfying A = P @ L @ U
print("P*L*U =\n", (P @ L @ U).round(2))
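As with Cholesky, the LU factors are typically computed once and then reused to solve linear systems. A sketch using SciPy's `lu_factor` / `lu_solve`, with a hypothetical right-hand side `b`:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Same matrix as above; b is a hypothetical right-hand side
A = np.array([[2.0, 1.0, 1.0], [4.0, 3.0, 3.0], [8.0, 7.0, 9.0]])
b = np.array([4.0, 10.0, 24.0])

# Factor once (LU with partial pivoting), then solve triangular systems
lu, piv = lu_factor(A)
x = lu_solve((lu, piv), b)

print("x =", x.round(2))
print("A @ x =", (A @ x).round(2))  # should reproduce b
```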

QR Decomposition

QR decomposition factors a matrix into an orthogonal matrix $ \mathbf{Q} $ and an upper triangular matrix $ \mathbf{R} $, expressed as $ \mathbf{A} = \mathbf{Q} \mathbf{R} $. For a tall $ m \times n $ matrix, `np.linalg.qr` returns the reduced factorization, in which $ \mathbf{Q} $ has orthonormal columns. QR decomposition is the standard tool for solving least-squares problems.

Python example:

import numpy as np

# Define a matrix A
A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])

# Perform QR decomposition
Q, R = np.linalg.qr(A)

print("A =\n", A.round(2))
print("Q =\n", Q.round(2))
print("R =\n", R.round(2))
print("Q*R =\n", (Q @ R).round(2))
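A common application of QR is least squares: for an overdetermined system, the minimizer of $ \| \mathbf{A} x - b \| $ is obtained by solving the triangular system $ \mathbf{R} x = \mathbf{Q}^\top b $. A sketch with a hypothetical observation vector `b`, fitting a line $ y = c_0 + c_1 t $ at $ t = 1, 2, 3 $:

```python
import numpy as np

# Design matrix for a line fit at t = 1, 2, 3; b is a hypothetical observation vector
A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Reduced QR, then back-substitution on the small triangular system R x = Q^T b
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)

print("least-squares solution =", x.round(2))
```

This avoids forming the normal equations $ \mathbf{A}^\top \mathbf{A} x = \mathbf{A}^\top b $ explicitly, which is numerically less stable.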

Spectral Decomposition

Spectral decomposition, or eigendecomposition, diagonalizes a matrix using its eigenvalues and eigenvectors. For a symmetric matrix it takes the form $ \mathbf{A} = \mathbf{Q} \mathbf{\Lambda} \mathbf{Q}^\top $, where $ \mathbf{\Lambda} $ is the diagonal matrix of eigenvalues and the columns of $ \mathbf{Q} $ are orthonormal eigenvectors.

Python example:

import numpy as np

# Define a symmetric matrix
A = np.array([[4.0, 1.0], [1.0, 3.0]])

# Compute eigenvalues and eigenvectors (eigh is specialized for symmetric matrices)
eigvals, eigvecs = np.linalg.eigh(A)

print("A =\n", A)
print("Eigenvalues =\n", eigvals.round(2))
print("Eigenvectors =\n", eigvecs.round(2))
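The factors can be checked by reconstructing $ \mathbf{A} = \mathbf{Q} \mathbf{\Lambda} \mathbf{Q}^\top $. A minimal sketch using `np.linalg.eigh`, the routine specialized for symmetric matrices:

```python
import numpy as np

# Same symmetric matrix as above
A = np.array([[4.0, 1.0], [1.0, 3.0]])

# eigh returns eigenvalues in ascending order and orthonormal eigenvectors as columns
eigvals, Q = np.linalg.eigh(A)

# Reconstruct A from the spectral decomposition Q diag(lambda) Q^T
A_rec = Q @ np.diag(eigvals) @ Q.T
print("Q diag(lambda) Q^T =\n", A_rec.round(2))  # should reproduce A
```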

Singular Value Decomposition (SVD)

SVD decomposes any $ m \times n $ matrix $ \mathbf{A} $ into $ \mathbf{U} \mathbf{\Sigma} \mathbf{V}^\top $, where $ \mathbf{U} $ and $ \mathbf{V} $ are orthogonal matrices and $ \mathbf{\Sigma} $ is a diagonal matrix of non-negative singular values.

Python example:

import numpy as np

A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]])

U, S, Vt = np.linalg.svd(A, full_matrices=True)

print("A =\n", A)
print("U =\n", U.round(2))
print("S =\n", S.round(2))
print("V^T =\n", Vt.round(2))
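A standard use of SVD is low-rank approximation: keeping only the largest $ k $ singular values gives the best rank-$k$ approximation in the Frobenius norm (the Eckart–Young theorem). A sketch with $ k = 1 $ on the matrix above:

```python
import numpy as np

# Same matrix as above (it happens to have rank 2)
A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]])
U, S, Vt = np.linalg.svd(A, full_matrices=False)

k = 1  # number of singular values to keep
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

print("rank-1 approximation =\n", A_k.round(2))
# The Frobenius-norm error equals the norm of the discarded singular values
print("approximation error  =", np.linalg.norm(A - A_k).round(2))
```

This truncation is the core computation behind PCA-style dimensionality reduction.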

Conclusion

This article summarized five key matrix decomposition techniques: Cholesky, LU, QR, spectral decomposition, and SVD. These decompositions are the workhorses of numerical linear algebra, used for solving linear systems, least-squares fitting, and dimensionality reduction, which makes them central to machine learning and AI.
