Singular Value Decomposition Calculator - Linear Algebra Tools | Toolivaa

Singular Value Decomposition Calculator

Calculate SVD of Matrices

Compute Singular Value Decomposition A = UΣVᵀ for any matrix. Get U, Σ, V matrices, singular values, and rank with visualization.

A = U Σ Vᵀ
Enter matrix A (m×n). SVD works for any rectangular matrix. U: m×m orthogonal, Σ: m×n diagonal, V: n×n orthogonal.

Identity Matrix

[[1,0,0],[0,1,0],[0,0,1]]
σ = [1,1,1]

Rank 2 Matrix

[[1,2],[3,4],[5,6]]
σ = [9.5, 0.5]

Simple 2×2

[[3,0],[0,2]]
σ = [3, 2]

SVD Result: A = UΣVᵀ

Original Matrix A
=
U Matrix
Σ Matrix
Vᵀ Matrix

Singular Values (σᵢ):

Matrix Size
3×3
Rank
3
Condition Number
1.0
Frobenius Norm
1.73

Singular Value Energy Distribution:

Bar heights show relative importance of each singular value

SVD Properties:

Verification: UΣVᵀ ≈ A

U × Σ × Vᵀ
Original A

Matrix Analysis:

Singular Value Decomposition decomposes any matrix into three matrices: U (left singular vectors), Σ (diagonal with singular values), and Vᵀ (right singular vectors transposed).

What is Singular Value Decomposition (SVD)?

Singular Value Decomposition (SVD) is a fundamental matrix factorization technique in linear algebra that decomposes any real or complex matrix into three matrices: A = UΣVᵀ, where U and V are orthogonal/unitary matrices, and Σ is a diagonal matrix containing singular values. SVD reveals the intrinsic geometric structure of a matrix and is widely used in data analysis, signal processing, and machine learning.
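In practice the decomposition is computed numerically rather than by hand. A minimal sketch using NumPy (the 3×2 matrix here is just the example used elsewhere on this page):

```python
import numpy as np

# Any m x n matrix works; this 3x2 example matches the one used below
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

# np.linalg.svd returns U, the singular values s (1-D, descending), and V^T
U, s, Vt = np.linalg.svd(A)

# Rebuild the m x n "diagonal" matrix Sigma from the singular values
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# The factors reproduce A up to floating-point error
assert np.allclose(U @ Sigma @ Vt, A)
```

Note that NumPy returns Vᵀ directly (not V), and the singular values as a 1-D array rather than the full Σ matrix.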

SVD Components

U Matrix

m×m orthogonal

Left singular vectors

Columns are orthonormal

Σ Matrix

m×n diagonal

Singular values σ₁ ≥ σ₂ ≥ ... ≥ 0

Non-negative, descending order

V Matrix

n×n orthogonal

Right singular vectors

Rows of Vᵀ are orthonormal

Properties

UᵀU = I, VᵀV = I

Orthogonal matrices

Σ contains rank information

SVD Mathematical Details

1. SVD Formula

For any m×n matrix A (real or complex):

A = U Σ Vᵀ
Where:
• U: m×m orthogonal matrix (UᵀU = I)
• Σ: m×n diagonal matrix with σ₁ ≥ σ₂ ≥ ... ≥ σᵣ > 0
• V: n×n orthogonal matrix (VᵀV = I)
• r = rank(A) = number of non-zero singular values

2. Reduced SVD (Economy SVD)

Compact form using only non-zero singular values:

A = Uᵣ Σᵣ Vᵣᵀ
Where:
• Uᵣ: m×r (first r columns of U)
• Σᵣ: r×r diagonal (non-zero singular values)
• Vᵣ: n×r (first r columns of V)
• Storage: O(mr + r² + nr) vs O(m² + mn + n²)
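NumPy's `full_matrices=False` option produces this economy form directly; note it keeps k = min(m, n) columns, which equals r only when A has full rank. A sketch, reusing the 3×2 example:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3x2, rank 2

# Economy SVD: U_r is m x k, Vt_r is k x n, with k = min(m, n)
U_r, s, Vt_r = np.linalg.svd(A, full_matrices=False)
print(U_r.shape, s.shape, Vt_r.shape)  # (3, 2) (2,) (2, 2)

# The compact factors still reproduce A
assert np.allclose(U_r @ np.diag(s) @ Vt_r, A)
```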

3. Relation to Eigenvalues

SVD connects to eigenvalues of AᵀA and AAᵀ:

AᵀA = V ΣᵀΣ Vᵀ (eigenvectors = V, eigenvalues = σᵢ²)
AAᵀ = U ΣΣᵀ Uᵀ (eigenvectors = U, eigenvalues = σᵢ²)
Singular values: σᵢ = √λᵢ(AᵀA) = √λᵢ(AAᵀ)
V contains eigenvectors of AᵀA, U contains eigenvectors of AAᵀ
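This eigenvalue connection is easy to verify numerically (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

# Singular values straight from the SVD ...
s = np.linalg.svd(A, compute_uv=False)

# ... match the square roots of the eigenvalues of A^T A
lam = np.linalg.eigvalsh(A.T @ A)          # eigvalsh returns ascending order
assert np.allclose(np.sqrt(lam[::-1]), s)  # reverse to descending
```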

Real-World Applications

Data Science & Machine Learning

  • Principal Component Analysis (PCA): Dimensionality reduction via SVD of covariance matrix
  • Recommendation systems: Collaborative filtering (Netflix Prize, matrix completion)
  • Natural Language Processing: Latent Semantic Analysis (LSA) for document retrieval
  • Image compression: Low-rank image approximation via truncated SVD

Computer Vision & Graphics

  • Face recognition: Eigenfaces using SVD of face image matrices
  • Structure from motion: 3D reconstruction from 2D images
  • Image denoising: Low-rank approximation via SVD thresholding
  • Video compression: Temporal and spatial redundancy reduction

Signal Processing

  • Noise reduction: Separating signal from noise using SVD
  • Array processing: Direction of arrival estimation
  • System identification: Hankel matrix analysis
  • Audio processing: Source separation in music

Scientific Computing

  • Numerical linear algebra: Solving ill-conditioned linear systems
  • Computational fluid dynamics: Proper Orthogonal Decomposition (POD)
  • Quantum mechanics: Density matrix renormalization
  • Bioinformatics: Gene expression data analysis

SVD Examples and Properties

Matrix Type     | Singular Values        | Rank           | Special Properties
Identity        | σ = [1,1,...,1]        | n              | U = V = I, Σ = I
Zero Matrix     | σ = [0,0,...,0]        | 0              | Rank zero; any orthogonal U, V work
Diagonal        | σ = sorted |dᵢᵢ|       | # non-zero dᵢᵢ | U, V are signed permutations
Orthogonal      | σ = [1,1,...,1]        | n              | All singular values equal 1
Rank-deficient  | σᵣ₊₁ = ... = σₙ = 0    | r < min(m,n)   | Enables numerical rank detection
Ill-conditioned | σₘₐₓ/σₘᵢₙ large        | n              | Large condition number
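The last two rows of the table, numerical rank and condition number, fall straight out of the singular values. A sketch with NumPy (the 1e-10 tolerance is an arbitrary choice, not a universal constant):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
s = np.linalg.svd(A, compute_uv=False)

cond = s[0] / s[-1]            # condition number = sigma_max / sigma_min
rank = int(np.sum(s > 1e-10))  # numerical rank: count of non-negligible sigma_i
```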

SVD Algorithm Properties

Algorithm        | Complexity     | Stability               | Best For
Golub-Reinsch    | O(mn²) for m≥n | Numerically stable      | General matrices
R-SVD            | O(mn log n)    | Randomized, approximate | Large matrices
Jacobi SVD       | O(mn²)         | High accuracy           | Small matrices
Divide & Conquer | O(mn²)         | Fast for bidiagonal     | Medium matrices
Power Method     | O(mnk)         | Iterative, for top-k    | Partial SVD
Lanczos          | O(mnk)         | Good for sparse         | Large sparse matrices

Step-by-Step SVD Calculation

Example: SVD of 2×2 Matrix [[3,0],[0,2]]

  1. Given matrix A = [[3, 0], [0, 2]]
  2. Compute AᵀA = [[9, 0], [0, 4]] and AAᵀ = [[9, 0], [0, 4]]
  3. Find eigenvalues of AᵀA: λ₁ = 9, λ₂ = 4
  4. Singular values: σ₁ = √9 = 3, σ₂ = √4 = 2
  5. Find eigenvectors of AᵀA: v₁ = [1,0], v₂ = [0,1] (V = I)
  6. Find eigenvectors of AAᵀ: u₁ = [1,0], u₂ = [0,1] (U = I)
  7. Construct Σ = [[3, 0], [0, 2]]
  8. Result: A = UΣVᵀ = I × [[3,0],[0,2]] × I
  9. Verification: UΣVᵀ = [[3,0],[0,2]] = A ✓
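The hand calculation above can be checked in a couple of lines (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[3.0, 0.0], [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A)

assert np.allclose(s, [3.0, 2.0])            # sigma_1 = 3, sigma_2 = 2
assert np.allclose(np.abs(U), np.eye(2))     # U = I up to sign convention
assert np.allclose(U @ np.diag(s) @ Vt, A)   # U Sigma V^T reproduces A
```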

Example: SVD of 3×2 Matrix [[1,2],[3,4],[5,6]]

  1. Given A = [[1,2],[3,4],[5,6]] (3×2)
  2. Compute AᵀA = [[35,44],[44,56]]
  3. Find eigenvalues: λ₁ ≈ 90.74, λ₂ ≈ 0.26
  4. Singular values: σ₁ ≈ 9.53, σ₂ ≈ 0.51
  5. Rank = 2 (both singular values non-zero)
  6. U size: 3×3, Σ size: 3×2, V size: 2×2
  7. Σ contains σ₁, σ₂ on diagonal, zeros elsewhere
  8. Used for dimensionality reduction: keep only σ₁ for rank-1 approximation
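Step 8 in code: by the Eckart-Young theorem, the Frobenius-norm error of the best rank-1 approximation equals the discarded singular value σ₂ (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
U, s, Vt = np.linalg.svd(A)

# Rank-1 approximation: keep only the largest singular value
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])

# Eckart-Young: Frobenius error of the best rank-1 approx = discarded sigma_2
err = np.linalg.norm(A - A1, 'fro')
assert np.isclose(err, s[1])
```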

Related Calculators

Frequently Asked Questions (FAQs)

Q: What's the difference between SVD and Eigenvalue Decomposition?

A: Eigenvalue decomposition (A = XΛX⁻¹) applies only to square, diagonalizable matrices, while SVD (A = UΣVᵀ) works for any rectangular matrix. For symmetric positive semidefinite matrices the two coincide, with U = V. SVD always exists; EVD does not.

Q: How do I choose how many singular values to keep?

A: Common methods: 1) Keep singular values above a threshold (e.g., σᵢ > ε·σ₁), 2) Keep enough to capture certain energy percentage (e.g., 95% of total variance), 3) Look for "elbow" in scree plot, 4) Use cross-validation for specific applications.
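Method 2, the energy criterion, is easy to sketch in code; `rank_for_energy` is a hypothetical helper name, not part of any library:

```python
import numpy as np

def rank_for_energy(s, energy=0.95):
    """Smallest k whose leading singular values capture `energy`
    of the total variance sum(sigma_i^2)."""
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    return int(np.searchsorted(cum, energy) + 1)

# With sigma = [9.53, 0.51], sigma_1 alone carries over 99% of the energy
assert rank_for_energy(np.array([9.53, 0.51])) == 1
assert rank_for_energy(np.array([1.0, 1.0])) == 2
```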

Q: Can SVD handle complex matrices?

A: Yes, SVD extends to complex matrices: A = UΣVᴴ where U and V are unitary matrices (UᴴU = I), Σ is real non-negative diagonal, and ᴴ denotes conjugate transpose. The singular values remain real and non-negative.

Q: What's the Moore-Penrose pseudoinverse using SVD?

A: For A = UΣVᵀ, the pseudoinverse A⁺ = VΣ⁺Uᵀ where Σ⁺ is formed by taking reciprocal of non-zero singular values (1/σᵢ) and transposing. This gives the least-squares solution to Ax = b: x = A⁺b.
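As a sketch (the `tol` cutoff and the helper name `pinv_svd` are illustrative; NumPy's built-in `np.linalg.pinv` does the same thing):

```python
import numpy as np

def pinv_svd(A, tol=1e-12):
    """Moore-Penrose pseudoinverse via SVD: A+ = V Sigma+ U^T."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Reciprocate only the non-negligible singular values; zero out the rest
    s_plus = np.array([1.0 / v if v > tol else 0.0 for v in s])
    return Vt.T @ np.diag(s_plus) @ U.T

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
x = pinv_svd(A) @ b  # least-squares solution to Ax = b

assert np.allclose(pinv_svd(A), np.linalg.pinv(A))
```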

Master matrix decompositions with Toolivaa's free Singular Value Decomposition Calculator, and explore more advanced tools in our Linear Algebra Calculators collection.
