Singular value decomposition (SVD) is commonly used for data compression and variable reduction, and it plays a large role in machine learning applications. SVD decomposes a matrix into the product of three matrices:
\[
\mathbf{A} = \mathbf{U}\mathbf{D}\mathbf{V}^\intercal
\]
where \(\mathbf{U}\) and \(\mathbf{V}\) are orthogonal matrices and \(\mathbf{D}\) is a diagonal matrix of singular values.
For mathematical convenience, we can take advantage of the fact that U and V are orthogonal matrices (\(\mathbf{U}^{\intercal}\mathbf{U}=\mathbf{V}^{\intercal}\mathbf{V}=\mathbf{I}\)) to express two equations:
\[
\begin{split}
\mathbf{A}\mathbf{A}^\intercal &= \mathbf{U}\mathbf{D}\mathbf{V}^\intercal\mathbf{V}\mathbf{D}\mathbf{U}^\intercal = \mathbf{U}\mathbf{D}^2\mathbf{U}^\intercal \\[1em]
\mathbf{A}^\intercal\mathbf{A} &= \mathbf{V}\mathbf{D}\mathbf{U}^\intercal\mathbf{U}\mathbf{D}\mathbf{V}^\intercal = \mathbf{V}\mathbf{D}^2\mathbf{V}^\intercal
\end{split}
\]
These two equations are called eigenvalue equations, which show up all over the place in statistical work, and they are straightforward to solve computationally. For example, we would solve the second eigenvalue equation to find V and D, and then find U by solving \(\mathbf{A}=\mathbf{UDV}^\intercal\) for U, namely \(\mathbf{U}=\mathbf{A}\mathbf{V}\mathbf{D}^{-1}\) (since \(\mathbf{V}^\intercal\mathbf{V}=\mathbf{I}\)).
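In practice, R's svd() function computes all three components in one call. A minimal sketch, assuming A is the matrix defined earlier (sv_decomp is the same object used in the verification code below):

# Compute the SVD of A in one call
sv_decomp <- svd(A)

# sv_decomp$u and sv_decomp$v hold U and V;
# sv_decomp$d holds the diagonal of D as a vector
sv_decomp$u
diag(sv_decomp$d)
sv_decomp$v

Rounded to two decimal places, the components are: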
\[
\begin{split}
\mathbf{U} &= \begin{bmatrix}
-0.76 & 0.65 \\
0.65 & 0.76 \\
\end{bmatrix} \\[1em]
\mathbf{D} &= \begin{bmatrix}
6.42 & 0 \\
0 & 2.18 \\
\end{bmatrix} \\[1em]
\mathbf{V} &= \begin{bmatrix}
-1.00 & 0.08 \\
0.08 & 1.00 \\
\end{bmatrix}
\end{split}
\]

We can verify that \(\mathbf{A}=\mathbf{UDV}^\intercal\). Note that the svd() function outputs the diagonal elements of the D matrix as a vector, so we need to use the diag() function to construct the actual D matrix.
# Verify the results of the SVD
sv_decomp$u %*% diag(sv_decomp$d) %*% t(sv_decomp$v)
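The eigenvalue route described above can also be carried out by hand with eigen(). A minimal sketch, again assuming the matrix A from earlier; because eigenvector signs are arbitrary, the columns may differ from the svd() output by a factor of -1:

# Solve the second eigenvalue equation: t(A) %*% A = V D^2 t(V)
ev <- eigen(t(A) %*% A)
V <- ev$vectors              # columns are the eigenvectors, i.e. V
D <- diag(sqrt(ev$values))   # singular values are square roots of the eigenvalues
U <- A %*% V %*% solve(D)    # solve A = U D t(V) for U

# U %*% D %*% t(V) should reproduce A (up to sign and rounding)
U %*% D %*% t(V)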