Nullspace property

In compressed sensing, the nullspace property gives necessary and sufficient conditions for the reconstruction of sparse signals using the techniques of $\ell_1$-relaxation. The term "nullspace property" originates from Cohen, Dahmen, and DeVore. The nullspace property is often difficult to check in practice, and the restricted isometry property is a more modern condition in the field of compressed sensing.

The technique of $\ell_1$-relaxation

The non-convex $\ell_0$-minimization problem,

$$\min_x \|x\|_0 \quad \text{subject to} \quad Ax = b,$$

is a standard problem in compressed sensing. However, $\ell_0$-minimization is known to be NP-hard in general. As such, the technique of $\ell_1$-relaxation is sometimes employed to circumvent the difficulties of signal reconstruction using the $\ell_0$-norm. In $\ell_1$-relaxation, the $\ell_1$ problem,

$$\min_x \|x\|_1 \quad \text{subject to} \quad Ax = b,$$

is solved in place of the $\ell_0$ problem. Note that this relaxation is convex and hence amenable to the standard techniques of linear programming, a computationally desirable feature. Naturally we wish to know when $\ell_1$-relaxation will give the same answer as the $\ell_0$ problem. The nullspace property is one way to guarantee agreement.
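As a concrete illustration of the reduction to linear programming, here is a minimal sketch in Python, assuming NumPy and SciPy are available and restricting to real matrices for simplicity. The standard split-variable trick writes $x = u - v$ with $u, v \geq 0$, so that $\|x\|_1$ becomes the linear objective $\sum_i (u_i + v_i)$; the helper name l1_minimize is ours, introduced for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def l1_minimize(A, b):
    """Solve min ||x||_1 subject to Ax = b by casting it as a linear program.

    Split x = u - v with u, v >= 0, so that ||x||_1 = sum(u + v).
    Real-valued sketch; the article's complex case needs a second-order
    cone formulation instead.
    """
    m, n = A.shape
    c = np.ones(2 * n)                  # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])           # encodes A(u - v) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
    u, v = res.x[:n], res.x[n:]
    return u - v

# Usage: plant a 2-sparse signal and try to recover it from 8 measurements.
# Whether the minimizer equals x_true depends on A satisfying the
# nullspace property of order 2, which is typical for Gaussian A here.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.0, -2.0]
x_hat = l1_minimize(A, A @ x_true)
print(np.round(x_hat, 4))
```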

Definition

An $m \times n$ complex matrix $A$ has the nullspace property of order $s$ if for all index sets $S$ with $s = |S| \leq n$ we have that: $\|\eta_S\|_1 < \|\eta_{S^C}\|_1$ for all $\eta \in \ker A \setminus \{0\}$. Here $\eta_S$ denotes the vector obtained from $\eta$ by zeroing out the entries outside $S$, and $S^C$ denotes the complement of $S$.
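Because the condition quantifies over the entire null space, it cannot be verified by enumeration, which is one reason the property is hard to check in practice. The randomized spot check below is a sketch, assuming NumPy and SciPy and using our own helper name nsp_spot_check; it can only refute the property, never certify it. It relies on the observation that for a fixed $\eta$ the worst index set $S$ of size $s$ is the one carrying the $s$ largest-magnitude entries of $\eta$.

```python
import numpy as np
from scipy.linalg import null_space

def nsp_spot_check(A, s, n_samples=1000, seed=0):
    """Randomized spot check of the nullspace property of order s.

    For a fixed eta, it suffices to compare the sum of the s largest
    |eta_i| against the sum of the remaining magnitudes.  Sampling
    random null-space vectors can find violations but cannot prove
    that the property holds.
    """
    N = null_space(A)                   # orthonormal basis of ker A
    if N.shape[1] == 0:
        return True                     # trivial null space: holds vacuously
    rng = np.random.default_rng(seed)
    for _ in range(n_samples):
        eta = N @ rng.standard_normal(N.shape[1])
        mags = np.sort(np.abs(eta))[::-1]
        if mags[:s].sum() >= mags[s:].sum():
            return False                # found a violating eta and S
    return True                         # no violation found (not a proof)
```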

Recovery Condition

The following theorem gives a necessary and sufficient condition for the recoverability of a given $s$-sparse vector in $\mathbb{C}^n$. The proof of the theorem is a standard one, and the proof supplied here is summarized from Holger Rauhut.

Theorem: Let $A$ be an $m \times n$ complex matrix. Then every $s$-sparse signal $x \in \mathbb{C}^n$ is the unique solution to the $\ell_1$-relaxation problem with $b = Ax$ if and only if $A$ satisfies the nullspace property with order $s$.

Proof: For the forward direction, let $\eta \in \ker A \setminus \{0\}$ and let $S$ be an index set with $|S| = s$. Notice that $\eta_S$ and $-\eta_{S^C}$ are distinct vectors with $A(-\eta_{S^C}) = A(\eta_S)$ by the linearity of $A$ and the fact that $A\eta = 0$. Since $\eta_S$ is $s$-sparse, uniqueness of the $\ell_1$ minimizer gives $\|\eta_S\|_1 < \|-\eta_{S^C}\|_1 = \|\eta_{S^C}\|_1$ as desired. For the backward direction, let $x$ be $s$-sparse and $z$ another (not necessarily $s$-sparse) vector such that $z \neq x$ and $Az = Ax$. Define the (non-zero) vector $\eta = x - z$ and notice that it lies in the nullspace of $A$. Call $S$ the support of $x$; then $\eta_{S^C} = -z_{S^C}$, and the result follows from an elementary application of the triangle inequality:

$$\|x\|_1 \leq \|x - z_S\|_1 + \|z_S\|_1 = \|\eta_S\|_1 + \|z_S\|_1 < \|\eta_{S^C}\|_1 + \|z_S\|_1 = \|z_{S^C}\|_1 + \|z_S\|_1 = \|z\|_1,$$

establishing the minimality of $x$.
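The equivalence can be observed numerically by combining the two sketches above (l1_minimize and nsp_spot_check, both our own illustrative helpers): a random Gaussian matrix typically passes the order-$2$ spot check, and then the theorem predicts exact $\ell_1$ recovery of every $2$-sparse vector. This is an illustrative experiment under those assumptions, not a proof.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 10))        # random Gaussian measurement matrix

x_true = np.zeros(10)
x_true[[1, 6]] = [3.0, -1.5]            # planted 2-sparse signal

# If no NSP violation is found, the theorem suggests the l1 minimizer
# should coincide with x_true (up to solver tolerance).
if nsp_spot_check(A, s=2):
    x_hat = l1_minimize(A, A @ x_true)
    print(np.allclose(x_hat, x_true, atol=1e-6))   # expected: True
```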
