Any non-zero matrix $A \in \mathbb{R}^{m,n}$ can be expressed as $A = \sum_{i=1}^{r} \sigma_i u_i v_i^\top = U_r \Sigma V_r^\top$ where
$r = \operatorname{rank} A$
$U_r = [u_1 \cdots u_r]$ is such that $U_r^\top U_r = I_r$
$V_r = [v_1 \cdots v_r]$ is such that $V_r^\top V_r = I_r$
$\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > 0$, and $\Sigma = \operatorname{diag}(\sigma_1, \dots, \sigma_r)$
The positive numbers $\sigma_i$ are called the singular values of $A$
The vectors $u_i$ are called the left singular vectors of $A$
The vectors $v_i$ are called the right singular vectors of $A$
These satisfy $A v_i = \sigma_i u_i$, $A^\top u_i = \sigma_i v_i$, $i = 1,\dots,r$
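As a quick numerical check, the following sketch (a minimal illustration using NumPy; the test matrix and the rank tolerance are assumptions, not from the notes) verifies every property of the theorem on a deliberately rank-deficient matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 3))  # 5x3 matrix of rank 2

U, s, Vt = np.linalg.svd(A)                # full SVD: U is 5x5, Vt is 3x3
r = int(np.sum(s > 1e-10))                 # numerical rank = number of nonzero sigma_i
Ur = U[:, :r]                              # compact factor U_r
Vr = Vt[:r, :].T                           # compact factor V_r
Sr = np.diag(s[:r])                        # Sigma = diag(sigma_1, ..., sigma_r)

assert np.allclose(A, Ur @ Sr @ Vr.T)      # A = U_r Sigma V_r^T
assert np.allclose(Ur.T @ Ur, np.eye(r))   # U_r^T U_r = I_r
assert np.allclose(Vr.T @ Vr, np.eye(r))   # V_r^T V_r = I_r
for i in range(r):                         # A v_i = sigma_i u_i and A^T u_i = sigma_i v_i
    assert np.allclose(A @ Vr[:, i], s[i] * Ur[:, i])
    assert np.allclose(A.T @ Ur[:, i], s[i] * Vr[:, i])
```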
Interpretation:
The singular value decomposition theorem allows us to write any matrix as a sum of dyads, $A = \sum_{i=1}^{r} \sigma_i u_i v_i^\top$, where
the vectors $u_i$, $v_i$ are normalized, with $\sigma_i > 0$ providing the “strength” of the corresponding dyad
the vectors $u_i$, $i = 1,\dots,r$ (respectively $v_i$, $i = 1,\dots,r$) are mutually orthogonal, ensuring that each dyad represents “new information”
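To make the dyad picture concrete, the sketch below (NumPy; the low-rank test matrix is an illustrative assumption) accumulates the dyads $\sigma_i u_i v_i^\top$ one at a time and shows the residual vanishing after $r$ terms:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 6))  # 4x6 matrix of rank 2

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-10))

# Accumulate sigma_i * u_i v_i^T one dyad at a time; after r terms
# the partial sum reproduces A exactly (up to rounding).
approx = np.zeros_like(A)
for i in range(r):
    approx += s[i] * np.outer(U[:, i], Vt[i, :])
    print(f"after {i + 1} dyads: ||A - approx||_F = {np.linalg.norm(A - approx):.2e}")

assert np.allclose(A, approx)
```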
Matrix properties via SVD:
Rank, nullspace, and range:
The rank of $A$ is also the number of non-zero entries on the diagonal of $\tilde{\Sigma}$, the $m \times n$ diagonal matrix in the full SVD $A = U \tilde{\Sigma} V^\top$ (defined below)
Since $r = \operatorname{rank} A$, by the Fundamental Theorem of Linear Algebra, the dimension of the nullspace of $A$ is $\dim \mathcal{N}(A) = n - r$. An orthonormal basis spanning $\mathcal{N}(A)$ is given by the last $n - r$ columns of $V$, i.e. $\mathcal{N}(A) = \mathcal{R}(V_{nr})$, $V_{nr} \doteq [v_{r+1} \cdots v_n]$
Similarly, an orthonormal basis spanning the range of $A$ is given by the first $r$ columns of $U$, i.e. $\mathcal{R}(A) = \mathcal{R}(U_r)$, $U_r \doteq [u_1 \cdots u_r]$
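A short sketch of these facts (NumPy; test matrix and tolerance are again illustrative assumptions) reads the rank, a nullspace basis, and a range basis straight off the SVD:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))  # 5x4 matrix of rank 2
m, n = A.shape

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))                 # rank = nonzero diagonal entries of Sigma~

V_nr = Vt[r:, :].T                         # last n-r columns of V span N(A)
assert V_nr.shape[1] == n - r              # dim N(A) = n - r
assert np.allclose(A @ V_nr, 0)            # A v = 0 for every basis vector v

U_r = U[:, :r]                             # first r columns of U span R(A)
y = A @ rng.standard_normal(n)             # any y in the range of A ...
assert np.allclose(U_r @ (U_r.T @ y), y)   # ... equals its projection onto R(U_r)
```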
Matrix norms:
The squared Frobenius matrix norm of a matrix $A \in \mathbb{R}^{m,n}$ can be defined as $\|A\|_F^2 = \operatorname{trace}(A^\top A) = \sum_{i=1}^{r} \sigma_i^2$, where $\sigma_i$ are the singular values of $A$
Hence the squared Frobenius norm is the sum of the squared singular values
The $\ell_2$-induced norm $\|A\|_2$ is equal to the largest singular value: $\|A\|_2 = \max_{x \neq 0} \frac{\|Ax\|_2}{\|x\|_2} = \sigma_1$
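Both identities are easy to confirm numerically; the sketch below (NumPy, with a random test matrix as an assumption) checks them against the library's built-in norms:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))
s = np.linalg.svd(A, compute_uv=False)     # singular values only

# Frobenius norm: ||A||_F^2 = trace(A^T A) = sum_i sigma_i^2
assert np.isclose(np.trace(A.T @ A), np.sum(s**2))
assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(s**2)))

# l2-induced norm: ||A||_2 = sigma_1, the largest singular value
assert np.isclose(np.linalg.norm(A, 2), s[0])
```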
Solving linear equations via SVD:
The linear equation $Ax = y$ can be fully analyzed via the SVD. If $A = U \tilde{\Sigma} V^\top$ is the (full) SVD of $A$, then $Ax = y$ is equivalent to $\tilde{\Sigma} \tilde{x} = \tilde{y}$, where
$A \in \mathbb{R}^{m,n}$ and $y \in \mathbb{R}^m$
$\tilde{x} \doteq V^\top x$
$\tilde{y} \doteq U^\top y$
Since $\tilde{\Sigma}$ is a diagonal matrix, $\tilde{\Sigma} = \begin{bmatrix} \Sigma & 0_{r,n-r} \\ 0_{m-r,r} & 0_{m-r,n-r} \end{bmatrix}$, $\Sigma = \operatorname{diag}(\sigma_1, \dots, \sigma_r) \succ 0$, the system $\tilde{\Sigma} \tilde{x} = \tilde{y}$ is simple to solve
For $\tilde{\Sigma} \tilde{x} = \tilde{y}$, write $\sigma_i \tilde{x}_i = \tilde{y}_i$ for $i = 1,\dots,r$ and $0 = \tilde{y}_i$ for $i = r+1,\dots,m$
Two cases can occur:
If the last $m - r$ components of $\tilde{y}$ are not all zero, then the above system is infeasible and the solution set is empty. This occurs when $y$ is not in the range of $A$
If $y \in \mathcal{R}(A)$, then the last set of conditions in the above system holds, and we can solve for $\tilde{x}$ with the first set of conditions: $\tilde{x}_i = \tilde{y}_i / \sigma_i$, $i = 1,\dots,r$.
The last $n - r$ components of $\tilde{x}$ are free; these correspond to elements in the nullspace of $A$.
If $A$ is full column rank (its nullspace is reduced to $\{0\}$), then there is a unique solution.
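This procedure translates directly into code. The sketch below (NumPy; the function name solve_via_svd and the tolerance are illustrative choices, not from the notes) returns a particular solution when $y \in \mathcal{R}(A)$ (the free components of $\tilde{x}$ are set to zero, which gives the minimum-norm solution) and reports infeasibility otherwise:

```python
import numpy as np

def solve_via_svd(A, y, tol=1e-10):
    """Solve A x = y through the SVD, following the steps above.

    Returns a solution with the free components of x~ set to zero,
    or None if y is not in the range of A.
    """
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol))                    # numerical rank
    y_t = U.T @ y                               # y~ = U^T y
    if not np.allclose(y_t[r:], 0, atol=tol):   # last m-r components must vanish
        return None                             # y not in R(A): no solution
    x_t = np.zeros(A.shape[1])
    x_t[:r] = y_t[:r] / s[:r]                   # x~_i = y~_i / sigma_i, i = 1..r
    return Vt.T @ x_t                           # x = V x~

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))  # 5x4, rank 2
y_feasible = A @ rng.standard_normal(4)         # y in R(A) by construction
x = solve_via_svd(A, y_feasible)
assert np.allclose(A @ x, y_feasible)           # a valid solution is found
assert solve_via_svd(A, rng.standard_normal(5)) is None  # generic y is infeasible
```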