OLS via QR decomposition

Published

October 8, 2025

Data

We will analyze the same randomized complete block design (RCBD) as before: 4 genotypes in 3 blocks.

library(dplyr)
library(Matrix)
data <- read.csv("../data/example_1.csv") |>
  mutate(gen = as.factor(gen),
         block = as.factor(block))
head(data)
  block gen yield
1     1  g1   7.4
2     2  g1   6.5
3     3  g1   5.6
4     1  g2   9.8
5     2  g2   6.8
6     3  g2   6.2
str(data)
'data.frame':   12 obs. of  3 variables:
 $ block: Factor w/ 3 levels "1","2","3": 1 2 3 1 2 3 1 2 3 1 ...
 $ gen  : Factor w/ 4 levels "g1","g2","g3",..: 1 1 1 2 2 2 3 3 3 4 ...
 $ yield: num  7.4 6.5 5.6 9.8 6.8 6.2 7.3 6.1 6.4 9.5 ...

Model Matrix

We fit the additive model

\[\texttt{yield} \sim 1 + \texttt{block} + \texttt{gen}\]

X <- model.matrix(yield ~ 1 + block + gen, data = data)
y <- data[["yield"]]
dim(X)
[1] 12  6
X[1:6, ]
  (Intercept) block2 block3 geng2 geng3 geng4
1           1      0      0     0     0     0
2           1      1      0     0     0     0
3           1      0      1     0     0     0
4           1      0      0     1     0     0
5           1      1      0     1     0     0
6           1      0      1     1     0     0
y
 [1] 7.4 6.5 5.6 9.8 6.8 6.2 7.3 6.1 6.4 9.5 8.0 7.4

Coefficients via Normal Equations

The OLS coefficients solve the normal equations \((X^\top X)\beta = X^\top y\).

XtX <- crossprod(X)      # X'X
Xty <- crossprod(X, y)   # X'y
qr(XtX)$rank             # check rank of X'X
[1] 6
beta_normal <- solve(XtX, Xty)
beta_normal
             [,1]
(Intercept)  7.75
block2      -1.65
block3      -2.10
geng2        1.10
geng3        0.10
geng4        1.80
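
Before moving to QR, it is worth noting that the normal equations can also be solved via the Cholesky factorization \(X^\top X = U^\top U\) with \(U\) upper triangular, which avoids an explicit inverse. A minimal sketch, reusing `XtX` and `Xty` from above:

```r
U <- chol(XtX)                                   # upper-triangular Cholesky factor: X'X = U'U
z <- forwardsolve(t(U), Xty)                     # solve U'z = X'y (lower-triangular system)
beta_chol <- backsolve(U, z)                     # solve U beta = z (upper-triangular system)
all.equal(as.vector(beta_chol), as.vector(beta_normal))
```

Two triangular solves replace the general `solve()` call; the coefficients agree with `beta_normal` up to numerical tolerance.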

Coefficients via QR Decomposition

Important

The QR decomposition (also called the QR factorization) of a matrix is a decomposition of the matrix into an orthogonal matrix and a triangular matrix. A QR decomposition of a real square matrix \(A\) is a decomposition of \(A\) as \(A = QR\), where \(Q\) is an orthogonal matrix (i.e. \(Q^\top Q = I\)) and \(R\) is an upper triangular matrix. If \(A\) is nonsingular, then this factorization is unique. — Yanovsky (UCLA, PDF)

We start by factoring the design matrix \(X\) as \[X = QR\] Here \(X\) is \(12 \times 6\), so we use the thin (economy) QR: \(Q\) is \(12 \times 6\) with orthonormal columns (\(Q^\top Q = I\)) and \(R\) is \(6 \times 6\) upper triangular.

Substituting this factorization into the normal equations \[X^\top X \beta = X^\top y,\] we obtain \[(R^\top Q^\top)(Q R)\beta = R^\top Q^\top y.\] Because \(Q^\top Q = I\), this simplifies to \[R^\top R \beta = R^\top Q^\top y.\] Pre-multiplying both sides by \((R^\top)^{-1}\) (which exists because \(X\) has full column rank) gives \[R \beta = Q^\top y.\] Finally, since \(R\) is upper triangular, we can solve for \(\beta\) efficiently using back substitution.

qrX <- qr(X)
Q   <- qr.Q(qrX)
R   <- qr.R(qrX)
beta_qr <- backsolve(R, t(Q) %*% y)
beta_qr
      [,1]
[1,]  7.75
[2,] -1.65
[3,] -2.10
[4,]  1.10
[5,]  0.10
[6,]  1.80
round(Q, 3)
        [,1]   [,2]   [,3]   [,4]   [,5]   [,6]
 [1,] -0.289  0.204  0.354  0.167  0.236  0.408
 [2,] -0.289 -0.408  0.000  0.167  0.236  0.408
 [3,] -0.289  0.204 -0.354  0.167  0.236  0.408
 [4,] -0.289  0.204  0.354 -0.500  0.000  0.000
 [5,] -0.289 -0.408  0.000 -0.500  0.000  0.000
 [6,] -0.289  0.204 -0.354 -0.500  0.000  0.000
 [7,] -0.289  0.204  0.354  0.167 -0.471  0.000
 [8,] -0.289 -0.408  0.000  0.167 -0.471  0.000
 [9,] -0.289  0.204 -0.354  0.167 -0.471  0.000
[10,] -0.289  0.204  0.354  0.167  0.236 -0.408
[11,] -0.289 -0.408  0.000  0.167  0.236 -0.408
[12,] -0.289  0.204 -0.354  0.167  0.236 -0.408
round(R, 3)
  (Intercept) block2 block3  geng2  geng3  geng4
1      -3.464 -1.155 -1.155 -0.866 -0.866 -0.866
2       0.000 -1.633  0.816  0.000  0.000  0.000
3       0.000  0.000 -1.414  0.000  0.000  0.000
4       0.000  0.000  0.000 -1.500  0.500  0.500
5       0.000  0.000  0.000  0.000 -1.414  0.707
6       0.000  0.000  0.000  0.000  0.000 -1.225
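
As a quick sanity check (a sketch, not part of the original workflow), we can verify that the factorization reproduces \(X\) and that the columns of \(Q\) are orthonormal:

```r
# Q %*% R should recover X (dimnames differ, so ignore attributes)
all.equal(Q %*% R, X, check.attributes = FALSE)
# Q'Q should be the 6x6 identity: columns of Q are orthonormal
all.equal(crossprod(Q), diag(6))
```

Both comparisons hold up to floating-point tolerance.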

Check: Same Fit Either Way

all.equal(as.vector(beta_normal), as.vector(beta_qr))
[1] TRUE
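
As a further check, we can compare against R's built-in `lm()`, which itself fits by a QR decomposition internally:

```r
# lm() fits the same additive model; its coefficients should match beta_qr
fit <- lm(yield ~ 1 + block + gen, data = data)
all.equal(unname(coef(fit)), as.vector(beta_qr))
```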

Notes

  • The normal-equations method forms \(X^\top X\) explicitly, which squares the condition number of the problem (\(\kappa(X^\top X) = \kappa(X)^2\)); it is concise but can be numerically unstable when \(X\) is ill-conditioned.
  • The QR method works with \(X\) directly and never forms \(X^\top X\), so it is typically more stable; most linear-model software (including R’s lm()) uses QR or related decompositions under the hood.
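
The stability point above can be illustrated with `kappa()`; a small sketch on this design matrix:

```r
# 2-norm condition numbers: kappa(X'X) equals kappa(X)^2,
# so solving the normal equations amplifies ill-conditioning
kappa(X, exact = TRUE)
kappa(crossprod(X), exact = TRUE)
```

For this small, well-conditioned RCBD the difference is harmless, but for wide or nearly collinear designs the squared condition number is exactly why QR is preferred.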