
How to invert numpy matrices using Singular Value Decomposition?

(Before you tell me, yes, I know you should never invert the matrix. Unfortunately for my calculations, I have a matrix which I have constructed, and it must be inverted somehow.)

I have a large matrix M which is ill-conditioned: numpy.linalg.cond(M) outputs a value of magnitude 1e+22. The matrix M has shape (1000, 1000).

Naturally, numpy.linalg.inv() will result in many precision errors, so I have used numpy.linalg.solve() to invert the matrix instead.

Consider that the matrix inverse A^{-1} is defined by A * A^{-1} = Identity. numpy.linalg.solve() computes the "exact" solution x of the well-determined, i.e., full-rank, linear matrix equation ax = b.

So, I define the identity matrix:

import numpy as np

# identity matrix matching the (1000, 1000) shape of M
iddmatrix = np.identity(1000)


and solve:

inverse = np.linalg.solve(M, iddmatrix)


However, because my matrix is so large and so ill-conditioned, np.linalg.solve() will not give the "exact solution" either. I need another method to invert the matrix.


  1. What is the standard way to implement such an inverse with SVD?

  2. How could I make this ill-conditioned matrix better conditioned?



Any recommendations are appreciated. Thanks!

Answer

Consider what taking the SVD of a matrix actually means. It means that for some matrix M, we can express it as M = UDV* (here let * represent the transpose, because I don't see a good way to typeset that on Stack Overflow).

if M = UDV*:
  then: M^-1 = (UDV*)^-1 = (V*)^-1 (D^-1) (U^-1)

But thanks to the fact that U's columns are the eigenvectors of MM* and V's columns are the eigenvectors of M*M, both U and V are orthogonal matrices, so their inverses are simply their transposes. So we get: M^-1 = V(D^-1)U*. Taking the inverse of the diagonal matrix D is as easy as taking the multiplicative inverse of each of its diagonal elements.
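
In numpy this can be done directly with np.linalg.svd. Below is a minimal sketch (the function name svd_inverse and the rcond cutoff are just illustrative choices, not a standard API): singular values below a relative threshold are dropped before inverting, which is also the usual way to deal with the ill-conditioning in the question, since those tiny singular values are exactly what explode in D^-1.

import numpy as np

def svd_inverse(M, rcond=1e-15):
    # Factor M = U * diag(s) * Vt, with singular values s in decreasing order.
    U, s, Vt = np.linalg.svd(M)

    # Invert only singular values above a relative cutoff; treat the rest
    # as zero (this gives a truncated / pseudo-inverse rather than a true
    # inverse, but it is numerically stable).
    cutoff = rcond * s.max()
    s_inv = np.zeros_like(s)
    s_inv[s > cutoff] = 1.0 / s[s > cutoff]

    # M^-1 = V * diag(1/s) * U^T  (using * for transpose as above)
    return Vt.T @ np.diag(s_inv) @ U.T

This is essentially what np.linalg.pinv(M, rcond=...) computes for you, so in practice you can just call that. It also addresses question 2: truncating the near-zero singular values is what removes the 1e+22 condition number from the computation.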

Better typesetting (kind of) here: http://adrianboeing.blogspot.com/2010/05/inverting-matrix-svd-singular-value.html