Decompose any 2×2 real matrix into U, Σ, and Vᵀ matrices.
Orthogonal Matrix U:
Singular Values Σ (Sigma):
Orthogonal Matrix Vᵀ (V Transpose):
* Verification: A = U * Σ * Vᵀ
Understanding Singular Value Decomposition (SVD)
Singular Value Decomposition (SVD) is a fundamental theorem in linear algebra that factors a matrix into three specific matrices. It is widely considered the "Swiss Army Knife" of matrix computations due to its robustness and wide range of applications in data science, physics, and engineering.
The decomposition is expressed as:
A = UΣVᵀ
The Components of SVD
U (Left Singular Vectors): An orthogonal matrix whose columns are the eigenvectors of AAᵀ.
Σ (Singular Values): A diagonal matrix containing the square roots of the eigenvalues of AᵀA (or AAᵀ), arranged in descending order.
Vᵀ (Right Singular Vectors): The transpose of an orthogonal matrix V, whose columns are the eigenvectors of AᵀA.
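These eigenvector characterizations follow directly from the factorization itself. Substituting A = UΣVᵀ and using the orthogonality UᵀU = I gives
AᵀA = (UΣVᵀ)ᵀ(UΣVᵀ) = VΣᵀUᵀUΣVᵀ = VΣ²Vᵀ,
so the columns of V are eigenvectors of AᵀA with eigenvalues σᵢ². The same argument applied to AAᵀ produces U.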
How to Calculate SVD for a 2×2 Matrix
While computing the SVD of a large matrix requires iterative algorithms such as the Jacobi method or the QR algorithm, a 2×2 matrix can be decomposed analytically:
Compute the symmetric matrix M = AᵀA.
Find the eigenvalues (λ₁, λ₂) of M by solving the characteristic equation det(M − λI) = 0.
The singular values are σ₁ = √λ₁ and σ₂ = √λ₂.
Find the unit eigenvectors for M to form the columns of V.
Compute the columns of U using the formula uᵢ = (1/σᵢ)Avᵢ.
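To make the recipe concrete, here is a minimal standalone sketch of these five steps in JavaScript. It uses the same arithmetic as the calculator code further down the page; the function name svd2x2 and its return shape are illustrative assumptions, not part of the page's own code.

// Sketch: analytic SVD of a 2x2 matrix [[a, b], [c, d]].
// Returns U, V as row-major 2x2 arrays and S as [sigma1, sigma2].
function svd2x2(a, b, c, d) {
  // Step 1: M = A^T A (symmetric, so s21 = s12)
  var s11 = a * a + c * c;
  var s12 = a * b + c * d;
  var s22 = b * b + d * d;
  // Steps 2-3: eigenvalues from trace and determinant, then sigma_i = sqrt(lambda_i)
  var tr = s11 + s22;
  var det = s11 * s22 - s12 * s12;
  var gap = Math.sqrt(Math.max(0, (tr / 2) * (tr / 2) - det));
  var sig1 = Math.sqrt(Math.max(0, tr / 2 + gap));
  var sig2 = Math.sqrt(Math.max(0, tr / 2 - gap));
  // Step 4: unit eigenvectors of M form the columns of V
  var theta = 0.5 * Math.atan2(2 * s12, s11 - s22);
  var v11 = Math.cos(theta), v21 = Math.sin(theta);
  var v12 = -v21, v22 = v11;
  // Step 5: u_i = (1/sigma_i) A v_i, with fallbacks for near-zero sigma
  var u11 = 1, u21 = 0, u12, u22;
  if (sig1 > 1e-6) {
    u11 = (a * v11 + b * v21) / sig1;
    u21 = (c * v11 + d * v21) / sig1;
  }
  if (sig2 > 1e-6) {
    u12 = (a * v12 + b * v22) / sig2;
    u22 = (c * v12 + d * v22) / sig2;
  } else {
    u12 = -u21;  // any unit vector orthogonal to u1
    u22 = u11;
  }
  return { U: [[u11, u12], [u21, u22]], S: [sig1, sig2], V: [[v11, v12], [v21, v22]] };
}

For instance, svd2x2(3, 2, 2, 3) returns the singular values [5, 1] used in the example below.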
Real-World Example
Consider the matrix A = [[3, 2], [2, 3]].
1. AᵀA results in [[13, 12], [12, 13]].
2. Eigenvalues are λ₁=25, λ₂=1.
3. Singular values are σ₁=5, σ₂=1.
4. The unit eigenvectors of AᵀA are (1, 1)/√2 and (1, −1)/√2, so U = V = (1/√2)[[1, 1], [1, −1]]: geometrically, A stretches the 45° direction by 5 and the −45° direction by 1.
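To verify the factorization, multiply the pieces back together. A short sketch hardcoding the factors from this example (U = V = (1/√2)[[1, 1], [1, −1]], Σ = diag(5, 1)):

// Reconstruct A = U * Sigma * V^T for the worked example.
var r = Math.SQRT1_2;                      // 1/sqrt(2)
var U = [[r, r], [r, -r]];                 // columns are u1, u2
var S = [5, 1];                            // diagonal of Sigma
var Vt = [[r, r], [r, -r]];                // rows are v1^T, v2^T
// Scale the columns of U by the singular values, then multiply by V^T.
var US = [[U[0][0] * S[0], U[0][1] * S[1]],
          [U[1][0] * S[0], U[1][1] * S[1]]];
var A = [[US[0][0] * Vt[0][0] + US[0][1] * Vt[1][0],
          US[0][0] * Vt[0][1] + US[0][1] * Vt[1][1]],
         [US[1][0] * Vt[0][0] + US[1][1] * Vt[1][0],
          US[1][0] * Vt[0][1] + US[1][1] * Vt[1][1]]];
console.log(A); // [[3, 2], [2, 3]] up to floating-point rounding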
Applications of SVD
SVD is not just a theoretical tool; it powers many modern technologies:
Image Compression: By keeping only the largest singular values, we can represent an image with significantly less data (a rank-1 sketch follows this list).
Principal Component Analysis (PCA): SVD is the mathematical engine behind PCA, used for dimensionality reduction in machine learning.
Latent Semantic Analysis: Used in natural language processing to find relationships between documents and terms.
Noise Reduction: SVD helps in filtering out noise from signals by removing small singular values that typically represent random interference.
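To see the compression idea on the example above: dropping σ₂ leaves the rank-1 approximation A ≈ σ₁u₁v₁ᵀ, which is the best rank-1 approximation in the least-squares sense. A minimal sketch (real images involve much larger matrices, but the truncation works the same way):

// Rank-1 truncation: keep sigma1 * u1 * v1^T, drop the sigma2 term.
var r = Math.SQRT1_2;          // 1/sqrt(2)
var sigma1 = 5;
var u1 = [r, r];               // first left singular vector
var v1 = [r, r];               // first right singular vector
var A1 = [[sigma1 * u1[0] * v1[0], sigma1 * u1[0] * v1[1]],
          [sigma1 * u1[1] * v1[0], sigma1 * u1[1] * v1[1]]];
console.log(A1); // [[2.5, 2.5], [2.5, 2.5]] approximates [[3, 2], [2, 3]]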
function calculateSVD() {
var a = parseFloat(document.getElementById('m11').value);
var b = parseFloat(document.getElementById('m12').value);
var c = parseFloat(document.getElementById('m21').value);
var d = parseFloat(document.getElementById('m22').value);
if (isNaN(a) || isNaN(b) || isNaN(c) || isNaN(d)) {
alert("Please enter valid numeric values for all matrix elements.");
return;
}
// A^T * A
var s11 = a*a + c*c;
var s12 = a*b + c*d;
var s21 = s12;
var s22 = b*b + d*d;
// Eigenvalues of A^T * A
var trace = s11 + s22;
var det = s11 * s22 - s12 * s21;
// (trace/2)^2 - det equals ((s11 - s22)/2)^2 + s12^2, so it is
// non-negative up to rounding; clamp to guard against tiny negatives.
var diff = Math.sqrt(Math.max(0, Math.pow(trace / 2, 2) - det));
var l1 = trace / 2 + diff;
var l2 = trace / 2 - diff;
// Singular Values
var sig1 = Math.sqrt(Math.max(0, l1));
var sig2 = Math.sqrt(Math.max(0, l2));
// V Matrix (Eigenvectors of A^T * A): for a symmetric 2x2 matrix, the
// eigenvector of the larger eigenvalue lies at angle
// theta = 0.5 * atan2(2*s12, s11 - s22) from the x-axis.
var thetaV = 0.5 * Math.atan2(2 * s12, s11 - s22);
var v11 = Math.cos(thetaV);
var v21 = Math.sin(thetaV);
var v12 = -Math.sin(thetaV);
var v22 = Math.cos(thetaV);
// U Matrix calculation: u_i = (1/sigma_i) * A * v_i
var u11, u21, u12, u22;
if (sig1 > 0.000001) {
u11 = (a * v11 + b * v21) / sig1;
u21 = (c * v11 + d * v21) / sig1;
} else {
u11 = 1; u21 = 0;
}
if (sig2 > 0.000001) {
u12 = (a * v12 + b * v22) / sig2;
u22 = (c * v12 + d * v22) / sig2;
} else {
// Find orthogonal vector to u1
u12 = -u21;
u22 = u11;
}
// Helper to format numbers
function f(num) {
return Number(num.toFixed(4));
}
// Display Matrices. Note: only the 'matrixU' element id appears in the
// original source; 'matrixSigma' and 'matrixVT' below are assumed ids
// that mirror it.
document.getElementById('matrixU').innerHTML =
'[' + f(u11) + ', ' + f(u12) + ']<br>[' + f(u21) + ', ' + f(u22) + ']';
document.getElementById('matrixSigma').innerHTML =
'[' + f(sig1) + ', 0]<br>[0, ' + f(sig2) + ']';
document.getElementById('matrixVT').innerHTML =
'[' + f(v11) + ', ' + f(v21) + ']<br>[' + f(v12) + ', ' + f(v22) + ']';
}