Understanding Singular Value Decomposition (SVD)
The Singular Value Decomposition is one of the most important matrix factorizations in linear algebra. Every real m x n matrix A can be decomposed as A = U * Sigma * V^T, where U (m x m) and V (n x n) are orthogonal matrices and Sigma is an m x n diagonal matrix whose diagonal entries are the non-negative singular values.
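The factorization above can be checked numerically. The sketch below uses NumPy's `numpy.linalg.svd` on a small matrix chosen purely for illustration, rebuilds the m x n Sigma matrix from the returned vector of singular values, and confirms that the product reconstructs A.

```python
import numpy as np

# Hypothetical 3x2 matrix chosen for illustration.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

# Full SVD: U is 3x3, s is the 1-D vector of singular values, Vt is 2x2.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n Sigma matrix by placing s on the diagonal.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

# The factorization reconstructs A up to floating-point error.
print(np.allclose(A, U @ Sigma @ Vt))  # True
```

Note that `np.linalg.svd` returns V already transposed (`Vt`), so no extra transpose is needed when multiplying the factors back together.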
SVD Components
U Matrix
Orthogonal m x m matrix whose columns are the left singular vectors (eigenvectors of A*A^T).
Sigma Matrix
Diagonal m x n matrix whose diagonal entries are the singular values (square roots of the eigenvalues of A^T*A), conventionally listed in descending order.
V^T Matrix
Transpose of the orthogonal n x n matrix V whose columns are the right singular vectors (eigenvectors of A^T*A).
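The link between the Sigma entries and the eigenvalues of A^T*A described above can be verified directly. This sketch (on an arbitrary example matrix) compares the singular values against the square roots of the eigenvalues of A^T*A:

```python
import numpy as np

# Arbitrary example matrix.
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])

# Singular values only (no U or V needed for this check).
s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of the symmetric matrix A^T A, sorted descending.
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

# Singular values are the square roots of those eigenvalues.
print(np.allclose(s, np.sqrt(eigvals)))  # True
```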
Applications of SVD
SVD is widely used in data science, machine learning, and engineering. It powers principal component analysis (PCA), image compression, recommendation systems, pseudoinverse computation, and noise reduction in signal processing.
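One of the applications listed above, pseudoinverse computation, follows directly from the factorization: the Moore-Penrose pseudoinverse is A^+ = V * Sigma^+ * U^T, where Sigma^+ inverts each non-zero singular value. A minimal sketch, using an arbitrary example matrix and checked against NumPy's built-in `pinv`:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Thin SVD: only as many columns of U / rows of Vt as singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Invert the non-zero singular values; zero out the (near-)zero ones.
s_inv = np.where(s > 1e-12, 1.0 / s, 0.0)

# A^+ = V * Sigma^+ * U^T
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True
```

The tolerance `1e-12` is an illustrative cutoff for treating a singular value as zero; in practice it should be scaled to the matrix's size and largest singular value.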
Key Properties
- Singular values are always non-negative and conventionally ordered from largest to smallest.
- The number of non-zero singular values equals the rank of the matrix.
- SVD exists for every matrix, unlike eigendecomposition, which requires a square matrix.
- The Frobenius norm of A equals the square root of the sum of squared singular values.
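The rank and Frobenius-norm properties above are easy to confirm numerically. The sketch below uses a matrix deliberately constructed to have rank 2 (its third row is the sum of the first two):

```python
import numpy as np

# Rank-2 by construction: row 3 = row 1 + row 2.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 3.0]])

s = np.linalg.svd(A, compute_uv=False)

# The number of non-zero singular values equals the rank.
print(int(np.sum(s > 1e-10)))  # 2

# The Frobenius norm equals sqrt of the sum of squared singular values.
print(np.allclose(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(s**2))))  # True
```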