Singular Value Decomposition Calculator
Calculate SVD of Matrices
Compute Singular Value Decomposition A = UΣVᵀ for any matrix. Get U, Σ, V matrices, singular values, and rank with visualization.
SVD Result: A = UΣVᵀ
Singular Values (σᵢ):
Singular Value Energy Distribution:
SVD Properties:
Verification: UΣVᵀ ≈ A
Matrix Analysis:
Singular Value Decomposition decomposes any matrix into three matrices: U (left singular vectors), Σ (diagonal with singular values), and Vᵀ (right singular vectors transposed).
What is Singular Value Decomposition (SVD)?
Singular Value Decomposition (SVD) is a fundamental matrix factorization technique in linear algebra that decomposes any real or complex matrix into three matrices: A = UΣVᵀ, where U and V are orthogonal/unitary matrices, and Σ is a diagonal matrix containing singular values. SVD reveals the intrinsic geometric structure of a matrix and is widely used in data analysis, signal processing, and machine learning.
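As a concrete illustration (separate from the calculator itself), the decomposition can be computed with NumPy's `numpy.linalg.svd`; the matrix below is an arbitrary example:

```python
import numpy as np

# Arbitrary 3x2 example matrix (any real matrix has an SVD).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Full SVD: U is 3x3, s holds the singular values, Vt is V transposed (2x2).
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the 3x2 Sigma matrix from the singular-value vector s.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# U @ Sigma @ Vt reconstructs A up to floating-point rounding.
A_rebuilt = U @ Sigma @ Vt
```

Note that NumPy returns the singular values as a vector `s` rather than the full Σ matrix, so Σ must be assembled by hand for the full (non-economy) form.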
SVD Components
U Matrix
Left singular vectors
Columns are orthonormal
Σ Matrix
Singular values σ₁ ≥ σ₂ ≥ ... ≥ 0
Non-negative, descending order
V Matrix
Right singular vectors
Rows of Vᵀ are orthonormal
Properties
Orthogonal matrices
Σ contains rank information
SVD Mathematical Details
1. SVD Formula
For any m×n matrix A (real or complex):
A = U Σ Vᵀ
Where:
• U: m×m orthogonal matrix (UᵀU = I)
• Σ: m×n diagonal matrix with σ₁ ≥ σ₂ ≥ ... ≥ σᵣ > 0
• V: n×n orthogonal matrix (VᵀV = I)
• r = rank(A) = number of non-zero singular values
2. Reduced SVD (Economy SVD)
Compact form using only non-zero singular values:
A = Uᵣ Σᵣ Vᵣᵀ
Where:
• Uᵣ: m×r (first r columns of U)
• Σᵣ: r×r diagonal (non-zero singular values)
• Vᵣ: n×r (first r columns of V)
• Storage: O(mr + r² + nr) vs O(m² + mn + n²)
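In NumPy the economy form corresponds to `full_matrices=False`; a small sketch with a tall random matrix (illustrative data only):

```python
import numpy as np

# Tall 100x5 matrix of random data (illustrative only).
A = np.random.default_rng(0).normal(size=(100, 5))

# Economy SVD: U_r is 100x5 rather than 100x100.
U_r, s, Vt_r = np.linalg.svd(A, full_matrices=False)

# (U_r * s) scales each column of U_r by its singular value,
# so this product equals U_r @ diag(s) @ Vt_r.
A_rebuilt = (U_r * s) @ Vt_r
```

For tall or wide matrices the economy form is what most applications use, since the discarded columns of U (or V) multiply only zeros in Σ.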
3. Relation to Eigenvalues
SVD connects to eigenvalues of AᵀA and AAᵀ:
AᵀA = V ΣᵀΣ Vᵀ (eigenvectors = V, eigenvalues = σᵢ²)
AAᵀ = U ΣΣᵀ Uᵀ (eigenvectors = U, eigenvalues = σᵢ²)
Singular values: σᵢ = √λᵢ(AᵀA) = √λᵢ(AAᵀ)
V contains eigenvectors of AᵀA, U contains eigenvectors of AAᵀ
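This relationship is easy to check numerically; a sketch using NumPy (the matrix is arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

# Singular values of A, in descending order.
s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of AtA; eigvalsh returns ascending order, so reverse.
lam = np.linalg.eigvalsh(A.T @ A)[::-1]

# sigma_i should equal sqrt(lambda_i(AtA)) up to rounding.
```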
Real-World Applications
Data Science & Machine Learning
- Principal Component Analysis (PCA): Dimensionality reduction via SVD of covariance matrix
- Recommendation systems: Collaborative filtering (Netflix Prize, matrix completion)
- Natural Language Processing: Latent Semantic Analysis (LSA) for document retrieval
- Image compression: JPEG compression using truncated SVD
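The truncated-SVD idea behind these compression and dimensionality-reduction uses can be sketched as follows; the "image" here is synthetic toy data built to be nearly rank-2:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic nearly rank-2 matrix: two rank-1 patterns plus small noise.
A = (np.outer(rng.normal(size=50), rng.normal(size=40))
     + np.outer(rng.normal(size=50), rng.normal(size=40))
     + 0.01 * rng.normal(size=(50, 40)))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # keep only the top-2 singular triplets
A_k = (U[:, :k] * s[:k]) @ Vt[:k]

# By the Eckart-Young theorem, A_k is the best rank-k approximation
# of A in Frobenius norm; here almost all energy is in the top 2 values.
rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
```

Storing `U[:, :k]`, `s[:k]`, and `Vt[:k]` needs k(m + n + 1) numbers instead of mn, which is the compression payoff when k is small.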
Computer Vision & Graphics
- Face recognition: Eigenfaces using SVD of face image matrices
- Structure from motion: 3D reconstruction from 2D images
- Image denoising: Low-rank approximation via SVD thresholding
- Video compression: Temporal and spatial redundancy reduction
Signal Processing
- Noise reduction: Separating signal from noise using SVD
- Array processing: Direction of arrival estimation
- System identification: Hankel matrix analysis
- Audio processing: Source separation in music
Scientific Computing
- Numerical linear algebra: Solving ill-conditioned linear systems
- Computational fluid dynamics: Proper Orthogonal Decomposition (POD)
- Quantum mechanics: Density matrix renormalization
- Bioinformatics: Gene expression data analysis
SVD Examples and Properties
| Matrix Type | Singular Values | Rank | Special Properties |
|---|---|---|---|
| Identity | σ = [1,1,...,1] | n | U = V = I, Σ = I |
| Zero Matrix | σ = [0,0,...,0] | 0 | Rank zero, any orthogonal U, V work |
| Diagonal | σᵢ = |dᵢᵢ|, sorted | # non-zero diagonal entries | U, V are signed permutations (U = V = I when D ≥ 0 and sorted) |
| Orthogonal | σ = [1,1,...,1] | n | All singular values = 1 |
| Rank-deficient | σᵣ₊₁ = ... = σₙ = 0 | r < min(m,n) | Numerical rank detection |
| Ill-conditioned | σₘₐₓ/σₘᵢₙ large | n | Large condition number |
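The last two table rows (numerical rank and conditioning) can be read directly off the singular values; a sketch using the classically ill-conditioned Hilbert matrix (an illustrative choice):

```python
import numpy as np

# Hilbert matrix H[i,j] = 1/(i+j+1): a standard ill-conditioned example.
n = 6
H = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1)

s = np.linalg.svd(H, compute_uv=False)
cond = s[0] / s[-1]  # kappa_2 = sigma_max / sigma_min

# Numerical rank: count singular values above a tolerance scaled by sigma_1
# (the same convention numpy.linalg.matrix_rank uses by default).
tol = max(H.shape) * np.finfo(float).eps * s[0]
num_rank = int(np.sum(s > tol))
```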
SVD Algorithm Properties
| Algorithm | Complexity | Stability | Best For |
|---|---|---|---|
| Golub-Reinsch | O(mn²) for m≥n | Numerically stable | General matrices |
| Randomized SVD | O(mn log k) | Randomized, approximate | Large matrices |
| Jacobi SVD | O(mn²) | High accuracy | Small matrices |
| Divide & Conquer | O(mn²) | Fast for bidiagonal | Medium matrices |
| Power Method | O(mn k) | Iterative, for top-k | Partial SVD |
| Lanczos | O(mn k) | Good for sparse | Large sparse matrices |
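As one concrete entry from the table, the power method estimates the top singular triplet by iterating with AᵀA; a minimal sketch (`top_singular_value` is a hypothetical helper name, and this simple form assumes a gap between σ₁ and σ₂):

```python
import numpy as np

def top_singular_value(A, iters=200, seed=0):
    """Estimate sigma_1 by power iteration on A^T A (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=A.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = A.T @ (A @ v)       # one application of A^T A
        v /= np.linalg.norm(v)  # renormalize to avoid overflow
    # v now approximates the top right singular vector; sigma_1 = ||A v||.
    return np.linalg.norm(A @ v)

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
sigma1 = top_singular_value(A)
```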
Step-by-Step SVD Calculation
Example: SVD of 2×2 Matrix [[3,0],[0,2]]
- Given matrix A = [[3, 0], [0, 2]]
- Compute AᵀA = [[9, 0], [0, 4]] and AAᵀ = [[9, 0], [0, 4]]
- Find eigenvalues of AᵀA: λ₁ = 9, λ₂ = 4
- Singular values: σ₁ = √9 = 3, σ₂ = √4 = 2
- Find eigenvectors of AᵀA: v₁ = [1,0], v₂ = [0,1] (V = I)
- Find eigenvectors of AAᵀ: u₁ = [1,0], u₂ = [0,1] (U = I)
- Construct Σ = [[3, 0], [0, 2]]
- Result: A = UΣVᵀ = I × [[3,0],[0,2]] × I
- Verification: UΣVᵀ = [[3,0],[0,2]] = A ✓
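The worked example can be confirmed with NumPy:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A)

# Singular values come back sorted descending: [3, 2].
# Reconstruction check (signs of U and V columns may differ, but they
# cancel in the product):
ok = np.allclose((U * s) @ Vt, A)
```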
Example: SVD of 3×2 Matrix [[1,2],[3,4],[5,6]]
- Given A = [[1,2],[3,4],[5,6]] (3×2)
- Compute AᵀA = [[35,44],[44,56]]
- Find eigenvalues: λ₁ ≈ 90.74, λ₂ ≈ 0.26
- Singular values: σ₁ = √λ₁ ≈ 9.53, σ₂ = √λ₂ ≈ 0.51
- Rank = 2 (both singular values non-zero)
- U size: 3×3, Σ size: 3×2, V size: 2×2
- Σ contains σ₁, σ₂ on diagonal, zeros elsewhere
- Used for dimensionality reduction: keep only σ₁ for rank-1 approximation
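The same steps in NumPy, including the rank-1 approximation (whose Frobenius error equals the discarded σ₂):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=True)

# sigma_1 ~ 9.53, sigma_2 ~ 0.51 (so lambda_i = sigma_i**2 ~ 90.74, 0.26)
# Best rank-1 approximation keeps only the top singular triplet.
A1 = s[0] * np.outer(U[:, 0], Vt[0])

# Its Frobenius error equals the discarded singular value sigma_2.
err = np.linalg.norm(A - A1)
```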
Related Calculators
Frequently Asked Questions (FAQs)
Q: What's the difference between SVD and Eigenvalue Decomposition?
A: Eigenvalue decomposition (A = XΛX⁻¹) works only for square diagonalizable matrices, while SVD (A = UΣVᵀ) works for any rectangular matrix. For symmetric positive definite matrices the two coincide, with U = V and σᵢ = λᵢ. The SVD always exists; an eigenvalue decomposition does not.
Q: How do I choose how many singular values to keep?
A: Common methods: 1) Keep singular values above a threshold (e.g., σᵢ > ε·σ₁), 2) Keep enough to capture a target energy percentage (e.g., 95% of total variance), 3) Look for the "elbow" in a scree plot, 4) Use cross-validation for the downstream application.
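Method 2 (the energy threshold) can be sketched as a small helper (`rank_for_energy` is a hypothetical name, and the singular values below are made up for illustration):

```python
import numpy as np

def rank_for_energy(s, energy=0.95):
    """Smallest k whose squared singular values capture `energy` of the total."""
    e = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(e, energy) + 1)

# Illustrative spectrum: one dominant value, a moderate one, then tails.
s = np.array([10.0, 3.0, 1.0, 0.1])
k = rank_for_energy(s, 0.95)  # k = 2: 109/110.01 ~ 99% of energy
```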
Q: Can SVD handle complex matrices?
A: Yes, SVD extends to complex matrices: A = UΣVᴴ where U and V are unitary matrices (UᴴU = I), Σ is real non-negative diagonal, and ᴴ denotes conjugate transpose. The singular values remain real and non-negative.
Q: What's the Moore-Penrose pseudoinverse using SVD?
A: For A = UΣVᵀ, the pseudoinverse A⁺ = VΣ⁺Uᵀ where Σ⁺ is formed by taking reciprocal of non-zero singular values (1/σᵢ) and transposing. This gives the least-squares solution to Ax = b: x = A⁺b.
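A sketch of this construction, checked against NumPy's built-in `np.linalg.pinv` (this example matrix is full rank; zero singular values would simply be skipped rather than inverted):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([1.0, 0.0, 1.0])

# Economy SVD, then invert the (all non-zero, here) singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

# Least-squares solution of Ax ~ b.
x = A_pinv @ b
```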
Master matrix decompositions with Toolivaa's free Singular Value Decomposition Calculator, and explore more advanced tools in our Linear Algebra Calculators collection.