Decompositions


LU

wiki entry

It turns out that a proper permutation of rows (or columns) is sufficient for LU factorization. LU factorization with partial pivoting (LUP) usually refers to LU factorization with row permutations only:

$PA=LU$

where $L$ and $U$ are lower and upper triangular matrices and $P$ is a permutation matrix.

It turns out that all square matrices can be factorized in this form (Pavel Okunev and Charles R. Johnson, 1997),

and the factorization is numerically stable in practice.

This makes LUP decomposition a useful technique.

Look at the discussion and Python 2 code; a quick check with SciPy is sketched below.
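
As a quick sanity check of this factorization (not the linked Python 2 code), here is a minimal sketch using `scipy.linalg.lu` on an arbitrary example matrix. Note that SciPy returns the permutation in the convention $A = P\,L\,U$, so its $P$ is the transpose of the one in $PA = LU$ above.

```python
import numpy as np
from scipy.linalg import lu

# An illustrative 3x3 matrix (any square matrix works).
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])

# SciPy's convention: A = P @ L @ U, hence P.T @ A = L @ U.
P, L, U = lu(A)

print("L =\n", L)   # lower triangular with unit diagonal
print("U =\n", U)   # upper triangular
print("P.T @ A == L @ U:", np.allclose(P.T @ A, L @ U))
```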


Solving linear equations

Given a system of linear equations in matrix form

$A\mathbf{x} = \mathbf{b},$

we want to solve the equation for $\mathbf{x}$, given $A$ and $\mathbf{b}$.

Suppose we have already obtained the $LUP$ decomposition of $A$ such that

$PA=LU$, so $LU\mathbf{x} = P\mathbf{b}$.

In this case the solution is done in two logical steps:

  1. solve the equation $L\mathbf{y} = P\mathbf{b}$ for $\mathbf{y}$.
  2. solve the equation $U\mathbf{x} = \mathbf{y}$ for $\mathbf{x}$.

In both cases we are dealing with triangular matrices ($L$ and $U$), so the systems can be solved directly by forward and backward substitution, without using the Gaussian elimination process (although that process, or an equivalent, is still needed to compute the LU decomposition itself).
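
A minimal sketch of this two-step solve, assuming the SciPy convention $A = P\,L\,U$ from `scipy.linalg.lu` (so the $P\mathbf{b}$ above becomes $P^T\mathbf{b}$ here); the matrix and right-hand side are just illustrative, and `solve_triangular` performs the forward and backward substitutions.

```python
import numpy as np
from scipy.linalg import lu, solve_triangular

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
b = np.array([4., 10., 24.])

P, L, U = lu(A)                          # SciPy: A = P @ L @ U

# Step 1: forward substitution, solve L y = P^T b
y = solve_triangular(L, P.T @ b, lower=True)
# Step 2: backward substitution, solve U x = y
x = solve_triangular(U, y, lower=False)

print(x)
print(np.allclose(A @ x, b))             # sanity check
```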

non-trivial permutation

singular decomposition


Cholesky

wiki

In linear algebra, the Cholesky decomposition is a decomposition of a matrix $\mathbf {A}$ which is

  1. Hermitian
  2. positive-definite

into the product of a lower triangular matrix and its conjugate transpose, i.e.

$\mathbf {A} =\mathbf {LL} ^{*}$

The recipe for the coefficients of $\mathbf{L}$ is:

\begin{eqnarray*} L_{jj} &=& \sqrt{A_{jj} - \sum_{k=1}^{j-1} L_{jk} L_{jk}^{*}} \\ L_{ij} &=& \frac{1}{L_{jj}} \left( A_{ij} - \sum_{k=1}^{j-1} L_{ik} L_{jk}^{*} \right) \quad \text{for } i > j \end{eqnarray*}
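
A minimal sketch of that recipe in Python, for the real symmetric positive-definite case (the helper name `cholesky` and the test matrix are just illustrative); in practice one would call `numpy.linalg.cholesky` directly.

```python
import numpy as np

def cholesky(A):
    """Return lower-triangular L with A = L @ L.T (real symmetric positive-definite A)."""
    n = A.shape[0]
    L = np.zeros_like(A, dtype=float)
    for j in range(n):
        # Diagonal entry: L[j, j] = sqrt(A[j, j] - sum_k L[j, k]^2)
        L[j, j] = np.sqrt(A[j, j] - np.dot(L[j, :j], L[j, :j]))
        for i in range(j + 1, n):
            # Below-diagonal entries of column j
            L[i, j] = (A[i, j] - np.dot(L[i, :j], L[j, :j])) / L[j, j]
    return L

A = np.array([[4., 12., -16.],
              [12., 37., -43.],
              [-16., -43., 98.]])
L = cholesky(A)
print(np.allclose(L @ L.T, A))                 # reconstructs A
print(np.allclose(L, np.linalg.cholesky(A)))   # matches NumPy
```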


QR decompositions

source

There are several different algorithms for calculating the matrices $Q$ and $R$.

We will outline the method of Householder Reflections, which is known to be more numerically stable than the alternative Gram-Schmidt method.

For more details see QR Decomposition using Householder Reflections.

A Householder Reflection is a linear transformation that enables a vector to be reflected through a plane or hyperplane. Essentially, we use this method because we want to create an upper triangular matrix, $R$. The Householder reflection is able to carry out this vector reflection such that all but one of the coordinates disappear. The matrix $Q$ will be built up as a sequence of matrix multiplications that eliminate each coordinate in turn, up to the rank of the matrix $A$.

First step: create the vector $\mathbf{x}$, which is the $k$-th column of the matrix $A$, for step $k$.

We define $\alpha = -\operatorname{sgn}(\mathbf{x}_k)\,||\mathbf{x}||$. The norm $||\cdot||$ used here is the Euclidean norm.

Given the first column vector of the identity matrix $I$ of equal size to $A$, $\mathbf{e}_1 = (1,0,\dots,0)^T$, we create the vector $\mathbf{u}$:

\begin{eqnarray*} \mathbf{u} = \mathbf{x} - \alpha \mathbf{e}_1 \end{eqnarray*}

Once we have the vector $\mathbf{u}$, we need to convert it to a unit vector, which we denote as $\mathbf{v}$:

\begin{eqnarray*} \mathbf{v} = \mathbf{u}/||\mathbf{u}|| \end{eqnarray*}

Second step: form the matrix of the Householder transformation $Q$ associated with $\mathbf{v}$:

\begin{eqnarray*} Q = I - 2 \mathbf{v} \mathbf{v}^T \end{eqnarray*}

Third step: $Q$ is now an $m\times m$ Householder matrix, with $Q\mathbf{x} = \left( \alpha, 0, \dots, 0\right)^T$. We will use $Q$ to transform $A$ to upper triangular form, giving us the matrix $R$.

Write $Q_k$ for $Q$ at the $k$th step; since $k=1$ in this first step, we have $Q_1$ as our first Householder matrix. Multiplying by $A$ gives us:

\begin{eqnarray*} Q_1A = \begin{bmatrix} \alpha_1&\star&\dots&\star\\ 0 & & & \\ \vdots & & A' & \\ 0 & & & \end{bmatrix} \end{eqnarray*}

The whole process is recursive and we repeat the three steps above for the minor matrix $A'$, which will give a second Householder matrix $Q'_2$. Now we have to "pad out" this smaller Householder matrix with elements from the identity matrix such that we can consistently multiply the Householder matrices together. Hence, we define $Q_k$ as the block matrix:

\begin{eqnarray*} Q_k = \begin{pmatrix} I_{k-1} & 0\\ 0 & Q_k'\end{pmatrix} \end{eqnarray*}

Once we have carried out $t$ iterations of this process we have $R$ as an upper triangular matrix:

\begin{eqnarray*} R = Q_t ... Q_2 Q_1 A \end{eqnarray*}

$Q$ is then fully defined as the multiplication of the transposes of each $Q_k$:

\begin{eqnarray*} Q = Q^T_1 Q^T_2 ... Q^T_t \end{eqnarray*}

This gives $A=QR$, the QR Decomposition of $A$.
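
A minimal sketch of the Householder procedure described above for a square matrix $A$ (the helper name `householder_qr` and the test matrix are illustrative; each padded $Q_k$ is built explicitly for clarity rather than efficiency). In practice `numpy.linalg.qr` would be used instead.

```python
import numpy as np

def householder_qr(A):
    """QR decomposition of a square matrix A via Householder reflections."""
    m = A.shape[0]
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(m - 1):
        x = R[k:, k]                                  # column to zero out below the diagonal
        alpha = -np.copysign(np.linalg.norm(x), x[0]) # alpha = -sgn(x_k) * ||x||
        u = x.copy()
        u[0] -= alpha                                 # u = x - alpha * e_1
        v = u / np.linalg.norm(u)
        Qk = np.eye(m)                                # padded Householder matrix Q_k
        Qk[k:, k:] -= 2.0 * np.outer(v, v)
        R = Qk @ R                                    # R = Q_t ... Q_2 Q_1 A
        Q = Q @ Qk.T                                  # Q = Q_1^T Q_2^T ... Q_t^T
    return Q, R

# Example usage on a classic 3x3 test matrix.
A = np.array([[12., -51., 4.],
              [6., 167., -68.],
              [-4., 24., -41.]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A))             # reconstructs A
print(np.allclose(Q.T @ Q, np.eye(3)))   # Q is orthogonal
```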


Recursive version

The same method, implemented recursively rather than with an explicit loop.
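
The original recursive code isn't reproduced here, but one way to sketch it (my formulation, under the same assumptions as above) is: apply a single Householder reflection, recurse on the trailing minor $A'$, and pad the recursive result back out with the identity.

```python
import numpy as np

def householder_qr_rec(A):
    """Recursive Householder QR for a square matrix: returns (Q, R) with A = Q @ R."""
    A = np.asarray(A, dtype=float)
    m = A.shape[0]
    if m == 1:
        return np.eye(1), A.copy()
    # One Householder step on the first column.
    x = A[:, 0]
    alpha = -np.copysign(np.linalg.norm(x), x[0])
    u = x.copy()
    u[0] -= alpha
    v = u / np.linalg.norm(u)
    Q1 = np.eye(m) - 2.0 * np.outer(v, v)
    B = Q1 @ A
    # Recurse on the (m-1) x (m-1) minor, then pad with the identity.
    Qs, Rs = householder_qr_rec(B[1:, 1:])
    Q2 = np.eye(m)
    Q2[1:, 1:] = Qs
    R = B.copy()
    R[1:, 1:] = Rs
    R[1:, 0] = 0.0          # exactly zero below the diagonal in column 0
    return Q1.T @ Q2, R

A = np.array([[12., -51., 4.],
              [6., 167., -68.],
              [-4., 24., -41.]])
Q, R = householder_qr_rec(A)
print(np.allclose(Q @ R, A))
```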


eigenvalues using QR

source

! pip install tabulate
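
The code for this section isn't shown, but the standard approach is the (unshifted) QR algorithm: factor $A_k = Q_k R_k$, set $A_{k+1} = R_k Q_k$, and iterate; each step is a similarity transform, so for a symmetric matrix the diagonal converges to the eigenvalues. A minimal sketch (the function name `qr_eigenvalues` and the test matrix are illustrative; `tabulate` is only used to display the iterates):

```python
import numpy as np
from tabulate import tabulate

def qr_eigenvalues(A, iters=50):
    """Unshifted QR iteration: A_{k+1} = R_k Q_k where A_k = Q_k R_k.
    Each step is a similarity transform, so the eigenvalues are preserved and,
    for symmetric A, the diagonal converges to them."""
    Ak = np.asarray(A, dtype=float).copy()
    rows = []
    for k in range(iters):
        Q, R = np.linalg.qr(Ak)          # could also use the Householder routine above
        Ak = R @ Q                       # R Q = Q^T A_k Q
        rows.append([k + 1] + list(np.round(np.diag(Ak), 6)))
    # Show the last few iterations as a small table.
    print(tabulate(rows[-5:], headers=["iter"] + [f"d{i}" for i in range(Ak.shape[0])]))
    return np.sort(np.diag(Ak))

A = np.array([[4., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
print(qr_eigenvalues(A))
print(np.linalg.eigvalsh(A))   # reference values from NumPy (ascending)
```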


partial pivot

Finding the permutation matrix $P$ in `P @ L @ U`, i.e. the $P$ such that $A = P\,L\,U$ (note this is the transpose of the $P$ in $PA = LU$ above).
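
A minimal sketch of LU with partial pivoting that records the row swaps and builds $P$ explicitly, returning matrices in the $A = P\,L\,U$ convention (the helper name `lu_partial_pivot` and the test matrix are illustrative); the last line compares the permutation against `scipy.linalg.lu`.

```python
import numpy as np
from scipy.linalg import lu

def lu_partial_pivot(A):
    """LU with partial pivoting: returns P, L, U with A = P @ L @ U."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    U = A.copy()
    L = np.eye(n)
    perm = np.arange(n)                      # row permutation as an index array
    for k in range(n - 1):
        # Partial pivoting: bring the largest |entry| in column k to the pivot row.
        p = k + np.argmax(np.abs(U[k:, k]))
        if p != k:
            U[[k, p], k:] = U[[p, k], k:]
            L[[k, p], :k] = L[[p, k], :k]    # swap the already-computed multipliers too
            perm[[k, p]] = perm[[p, k]]
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    P = np.eye(n)[:, perm]                   # column-permuted identity, so A = P @ L @ U
    return P, L, U

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
P, L, U = lu_partial_pivot(A)
print(np.allclose(P @ L @ U, A))
Ps, Ls, Us = lu(A)
print(np.allclose(P, Ps))   # same permutation as SciPy (for this example)
```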