
This article will discuss QR Decomposition in Python. In previous articles we have looked at LU Decomposition in Python and Cholesky Decomposition in Python as two alternative matrix decomposition methods. QR Decomposition is widely used in quantitative finance as the basis for the solution of the linear least squares problem, which itself is used for statistical regression analysis.

Householder QR Decomposition

As in the case of LU Decomposition, where we applied a series of Gauss transformations, we can perform a series of Householder Transformations. If we reflect a vector $\mathbb{x}$ onto $y = \pm \|\mathbb{x}\| \mathbb{e}_1$ (where $\mathbb{e}_1$ is the vector with 1 in the first position and zeros elsewhere) then the reflection zeroes all elements of $\mathbb{x}$ except the first.
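A quick numerical sketch of this reflection, using NumPy with an arbitrary example vector (both the vector and the use of NumPy here are illustrative assumptions, not from the original text):

```python
import numpy as np

# Arbitrary example vector (chosen for illustration only)
x = np.array([3.0, 4.0, 12.0])

# e_1: 1 in the first position, zeros elsewhere
e1 = np.zeros_like(x)
e1[0] = 1.0

# alpha = -sgn(x_1) * ||x||, then the Householder vector u = x - alpha * e_1
alpha = -np.sign(x[0]) * np.linalg.norm(x)
u = x - alpha * e1
v = u / np.linalg.norm(u)

# The reflection H = I - 2 v v^T zeroes every component of x except the first
H = np.eye(3) - 2.0 * np.outer(v, v)
print(H @ x)  # first entry is alpha = -13; the rest are numerically zero
```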

One of the key benefits of using QR Decomposition over other methods for solving linear least squares is that it is more numerically stable, albeit at the expense of being slower to execute. Hence if you are performing a large number of regressions as part of a trading backtest, for instance, you will need to consider carefully whether QR Decomposition is the best fit (excuse the pun).

For a square matrix $A$ the QR Decomposition converts $A$ into the product of an orthogonal matrix $Q$ (i.e. $Q^TQ=I$) and an upper triangular matrix $R$. Hence:

\begin{eqnarray*} A = QR \end{eqnarray*}

There are a few different algorithms for calculating the matrices $Q$ and $R$. We will outline the method of Householder Reflections, which is known to be more numerically stable than the alternative Gram-Schmidt method. I've outlined the Householder Reflections method below.

Note, the following explanation is an expansion of the extremely detailed article on QR Decomposition using Householder Reflections over at Wikipedia.

A Householder Reflection is a linear transformation that enables a vector to be reflected through a plane or hyperplane. Essentially, we use this method because we want to create an upper triangular matrix, $R$. The Householder Reflection is able to carry out this vector reflection such that all but one of the coordinates disappears. The matrix $Q$ will be built up as a sequence of matrix multiplications that eliminate each coordinate in turn, up to the rank of the matrix $A$.

The first step is to create the vector $\mathbb{x}$, which is the $k$-th column of the matrix $A$, for step $k$. We define $\alpha = -\textrm{sgn}(\mathbb{x}_k)\|\mathbb{x}\|$. The norm $\|\cdot\|$ used here is the Euclidean norm. Given the first column vector of the identity matrix, $I$ of equal size to $A$, $\mathbb{e}_1 = (1,0,...,0)^T$, we create the vector $\mathbb{u}$:

\begin{eqnarray*} \mathbb{u} = \mathbb{x} - \alpha \mathbb{e}_1 \end{eqnarray*}

Once we have the vector $\mathbb{u}$, we need to convert it to a unit vector, which we denote as $\mathbb{v}$:

\begin{eqnarray*} \mathbb{v} = \mathbb{u} / \|\mathbb{u}\| \end{eqnarray*}

Now we form the matrix $Q$ out of the identity matrix $I$ and the outer product of $\mathbb{v}$ with itself:

\begin{eqnarray*} Q = I - 2 \mathbb{v} \mathbb{v}^T \end{eqnarray*}

$Q$ is now an $m \times m$ Householder matrix, with $Q\mathbb{x} = \left( \alpha, 0, ..., 0 \right)^T$. We will use $Q$ to transform $A$ to upper triangular form, giving us the matrix $R$. We denote $Q$ as $Q_k$ and, since $k=1$ in this first step, we have $Q_1$ as our first Householder matrix. We multiply this with $A$ to give us:
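To make this first step concrete, here is a short NumPy sketch (the $3 \times 3$ example matrix is an arbitrary choice, not one from the original article):

```python
import numpy as np

# Arbitrary 3x3 example matrix
A = np.array([[12.0, -51.0,   4.0],
              [ 6.0, 167.0, -68.0],
              [-4.0,  24.0, -41.0]])

x = A[:, 0]                                  # first column of A (k = 1)
alpha = -np.sign(x[0]) * np.linalg.norm(x)   # alpha = -sgn(x_1) * ||x||
u = x - alpha * np.array([1.0, 0.0, 0.0])    # u = x - alpha * e_1
v = u / np.linalg.norm(u)                    # unit Householder vector

Q1 = np.eye(3) - 2.0 * np.outer(v, v)        # Q_1 = I - 2 v v^T
print(Q1 @ A)  # the first column is now zero below the diagonal
```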

\begin{eqnarray*} Q_1A = \begin{bmatrix} \alpha_1 & \star & \dots & \star \\ 0 & & & \\ \vdots & & A' & \\ 0 & & & \end{bmatrix} \end{eqnarray*}

The whole process is now repeated for the minor matrix $A'$, which will give a second Householder matrix $Q'_2$. Now we have to 'pad out' this minor matrix with elements from the identity matrix such that we can consistently multiply the Householder matrices together. Hence, we define $Q_k$ as the block matrix:

\begin{eqnarray*} Q_k = \begin{pmatrix} I_{k-1} & 0 \\ 0 & Q_k' \end{pmatrix} \end{eqnarray*}
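A minimal NumPy sketch of this padding step, assuming a hypothetical $2 \times 2$ Householder matrix $Q'_2$ computed from the minor $A'$ (the values below are placeholders for illustration only):

```python
import numpy as np

# Placeholder 2x2 orthogonal matrix standing in for the Householder
# matrix Q'_2 computed from the minor matrix A'
Q2_minor = np.array([[0.0, 1.0],
                     [1.0, 0.0]])

# Pad with the identity block I_{k-1} (here k = 2) so that the result
# acts on the full 3x3 system
k = 2
Q2 = np.eye(3)
Q2[k-1:, k-1:] = Q2_minor
print(Q2)
```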


Once we have carried out $t$ iterations of this process we have $R$ as an upper triangular matrix:

\begin{eqnarray*} R = Q_t \dots Q_2 Q_1 A \end{eqnarray*}

$Q$ is then fully defined as the multiplication of the transposes of each $Q_k$:

\begin{eqnarray*} Q = Q^T_1 Q^T_2 \dots Q^T_t \end{eqnarray*}

This gives $A=QR$, the QR Decomposition of $A$.

To calculate the QR Decomposition of a matrix $A$ with NumPy/SciPy, we can make use of the built-in linalg library via the linalg.qr function. This is significantly more efficient than using a pure Python implementation:
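The original code listing has not survived in this copy of the article; a minimal reconstruction, assuming an arbitrary example matrix, might look like this:

```python
import numpy as np
from scipy import linalg

# Arbitrary example matrix (the article's original matrix is not preserved)
A = np.array([[12.0, -51.0,   4.0],
              [ 6.0, 167.0, -68.0],
              [-4.0,  24.0, -41.0]])

# scipy.linalg.qr returns Q (orthogonal) and R (upper triangular)
Q, R = linalg.qr(A)

print("Q:")
print(Q)
print("R:")
print(R)
```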

The output of the QR decomposition includes $A$, $Q$ and $R$. As a basic sanity check we can see that $R$ is in fact an upper triangular matrix:
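The original output listing is likewise missing; a sketch of such a sanity check, reusing the same arbitrary example matrix, could be:

```python
import numpy as np
from scipy import linalg

A = np.array([[12.0, -51.0,   4.0],
              [ 6.0, 167.0, -68.0],
              [-4.0,  24.0, -41.0]])
Q, R = linalg.qr(A)

# Every entry below the main diagonal of R should be (numerically) zero
print(np.allclose(R, np.triu(R)))  # prints True
```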

You aren't likely to ever need a pure Python implementation of QR Decomposition (homework notwithstanding), but I feel that it is helpful to gain an understanding of the Householder Reflections algorithm, so I have written my own implementation:
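The original listing is also lost from this copy; what follows is a reconstruction of such a pure Python implementation of the Householder Reflections algorithm described above (a sketch, not the article's original code):

```python
import math

def householder_qr(A):
    """QR decomposition of a square matrix A (a list of lists) via
    Householder Reflections. Returns (Q, R) with A = Q R."""
    n = len(A)
    R = [row[:] for row in A]                                  # becomes upper triangular
    Q = [[float(i == j) for j in range(n)] for i in range(n)]  # identity

    for k in range(n - 1):
        # x: entries of column k from the diagonal downwards
        x = [R[i][k] for i in range(k, n)]
        norm_x = math.sqrt(sum(xi * xi for xi in x))
        if norm_x == 0.0:
            continue  # column is already zero below the diagonal

        # alpha = -sgn(x_1) * ||x||, then u = x - alpha * e_1
        alpha = -math.copysign(norm_x, x[0])
        u = x[:]
        u[0] -= alpha
        norm_u = math.sqrt(sum(ui * ui for ui in u))
        v = [ui / norm_u for ui in u]          # unit Householder vector

        # R <- Q_k R, applying Q_k = I - 2 v v^T to the trailing block
        for j in range(k, n):
            dot = sum(v[i - k] * R[i][j] for i in range(k, n))
            for i in range(k, n):
                R[i][j] -= 2.0 * v[i - k] * dot

        # Q <- Q Q_k^T (Q_k is symmetric, so Q_k^T = Q_k)
        for i in range(n):
            dot = sum(Q[i][j] * v[j - k] for j in range(k, n))
            for j in range(k, n):
                Q[i][j] -= 2.0 * dot * v[j - k]

    return Q, R

# Example usage with an arbitrary 3x3 matrix:
A = [[12.0, -51.0,   4.0],
     [ 6.0, 167.0, -68.0],
     [-4.0,  24.0, -41.0]]
Q, R = householder_qr(A)
for row in R:
    print(row)
```

Accumulating $Q$ by right-multiplying with each (symmetric) $Q_k$ avoids ever forming the transposes explicitly.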


The output from the Householder method implemented in pure Python matches the SciPy results above, albeit with a few more significant figures displayed! One has to be extremely careful in numerical algorithms when dealing with floating point arithmetic, but that is a discussion for another day.