In mathematics, particularly in linear algebra, the Schur product theorem states that the Hadamard product of two positive definite matrices is also a positive definite matrix. The result is named after Issai Schur (Schur 1911, p. 14, Theorem VII); note that Schur signed as J. Schur in Journal für die reine und angewandte Mathematik.
For any matrices $M$ and $N$, the Hadamard product $M \circ N$, considered as a bilinear form, acts on vectors $a, b$ as

$$a^* (M \circ N) b = \operatorname{tr}\left( M^{\mathsf T} \operatorname{diag}(a^*) \, N \operatorname{diag}(b) \right),$$

where $\operatorname{tr}$ is the matrix trace and $\operatorname{diag}(a)$ is the diagonal matrix having the elements of $a$ as its diagonal entries.
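As a quick numerical sanity check of this identity, here is a minimal NumPy sketch (the random matrices, vectors, and seed are arbitrary illustrative choices; the identity requires no structure on $M$ and $N$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Arbitrary complex matrices and vectors; the identity needs no structure.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
N = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Left side: a* (M ∘ N) b, with ∘ the entrywise (Hadamard) product.
lhs = a.conj() @ (M * N) @ b

# Right side: tr(M^T diag(a*) N diag(b)).
rhs = np.trace(M.T @ np.diag(a.conj()) @ N @ np.diag(b))

assert np.isclose(lhs, rhs)
```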
Suppose $M$ and $N$ are positive definite, and so Hermitian. We can consider their square roots $M^{1/2}$ and $N^{1/2}$, which are also Hermitian, and write

$$\operatorname{tr}\left( M^{\mathsf T} \operatorname{diag}(a^*) \, N \operatorname{diag}(b) \right) = \operatorname{tr}\left( \bar{M}^{1/2} \bar{M}^{1/2} \operatorname{diag}(a^*) \, N^{1/2} N^{1/2} \operatorname{diag}(b) \right) = \operatorname{tr}\left( \bar{M}^{1/2} \operatorname{diag}(a^*) \, N^{1/2} \, N^{1/2} \operatorname{diag}(b) \, \bar{M}^{1/2} \right),$$

using $M^{\mathsf T} = \bar{M}$ (since $M$ is Hermitian) and the cyclic property of the trace. Then, for $a = b$, this is written as $\operatorname{tr}(A^* A)$ for $A = N^{1/2} \operatorname{diag}(a) \, \bar{M}^{1/2}$, and thus is strictly positive for $A \neq 0$, which occurs if and only if $a \neq 0$ (since $N^{1/2}$ and $\bar{M}^{1/2}$ are invertible). This shows that $(M \circ N)$ is a positive definite matrix.
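The construction can be checked numerically as well. The following sketch builds $A = N^{1/2} \operatorname{diag}(a) \bar{M}^{1/2}$ and verifies $a^* (M \circ N) a = \operatorname{tr}(A^* A) = \|A\|_F^2$; the `hermitian_sqrt` and `random_hpd` helpers and the seed are illustrative choices, not part of the proof:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

def hermitian_sqrt(P):
    """Hermitian square root of a Hermitian positive definite matrix."""
    w, V = np.linalg.eigh(P)
    return (V * np.sqrt(w)) @ V.conj().T

def random_hpd(n):
    """Arbitrary Hermitian positive definite matrix: B*B + I."""
    B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return B.conj().T @ B + np.eye(n)

M, N = random_hpd(n), random_hpd(n)
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# A = N^{1/2} diag(a) conj(M)^{1/2}, as in the proof above.
A = hermitian_sqrt(N) @ np.diag(a) @ hermitian_sqrt(M.conj())

# a*(M ∘ N)a equals tr(A*A) = ||A||_F^2, which is > 0 whenever a != 0.
quad = a.conj() @ (M * N) @ a
assert np.isclose(quad, np.linalg.norm(A, "fro") ** 2)
assert quad.real > 0
```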
An alternative proof uses Gaussian integration. Consider first the case $N = M$. Let $X$ be an $n$-dimensional centered Gaussian random variable with covariance $\langle X_i X_j \rangle = M_{ij}$. Then the covariance matrix of $X_i^2$ and $X_j^2$ is

$$\operatorname{Cov}(X_i^2, X_j^2) = \langle X_i^2 X_j^2 \rangle - \langle X_i^2 \rangle \langle X_j^2 \rangle.$$

Using Wick's theorem to expand $\langle X_i^2 X_j^2 \rangle = 2 \langle X_i X_j \rangle^2 + \langle X_i^2 \rangle \langle X_j^2 \rangle$, we have

$$\operatorname{Cov}(X_i^2, X_j^2) = 2 \langle X_i X_j \rangle^2 = 2 M_{ij}^2.$$

Since a covariance matrix is positive semidefinite, this proves that the matrix with elements $M_{ij}^2$ is positive semidefinite.
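A Monte Carlo sketch of this identity (the covariance matrix, sample size, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
samples = 200_000

# A fixed positive definite covariance, chosen arbitrarily for the sketch.
M = np.array([[2.0, 1.0, 0.5],
              [1.0, 2.0, 1.0],
              [0.5, 1.0, 2.0]])

# Centered Gaussian samples with covariance M, squared entrywise.
X = rng.multivariate_normal(np.zeros(3), M, size=samples)
emp = np.cov(X**2, rowvar=False)

# Empirical Cov(X_i^2, X_j^2) approaches 2 M_ij^2 (Wick's theorem).
print(np.abs(emp - 2 * M**2).max())  # small, shrinking with more samples
```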
In the general case, let $X$ and $Y$ be $n$-dimensional centered Gaussian random variables with covariances $\langle X_i X_j \rangle = M_{ij}$, $\langle Y_i Y_j \rangle = N_{ij}$, independent from each other, so that $\langle X_i Y_j \rangle = 0$ for any $i, j$. Then the covariance matrix of $X_i Y_i$ and $X_j Y_j$ is

$$\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i Y_i X_j Y_j \rangle - \langle X_i Y_i \rangle \langle X_j Y_j \rangle.$$

Using Wick's theorem to expand

$$\langle X_i Y_i X_j Y_j \rangle = \langle X_i X_j \rangle \langle Y_i Y_j \rangle + \langle X_i Y_i \rangle \langle X_j Y_j \rangle + \langle X_i Y_j \rangle \langle X_j Y_i \rangle,$$

and also using the independence of $X$ and $Y$, we have

$$\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i X_j \rangle \langle Y_i Y_j \rangle = M_{ij} N_{ij}.$$

Since a covariance matrix is positive semidefinite, this proves that the matrix with elements $M_{ij} N_{ij}$ is positive semidefinite.
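The same kind of Monte Carlo sketch applies in the general case (again with arbitrary positive definite covariances, sample size, and seed):

```python
import numpy as np

rng = np.random.default_rng(3)
samples = 200_000

# Arbitrary positive definite covariances for the sketch.
M = np.array([[2.0, 1.0, 0.5],
              [1.0, 2.0, 1.0],
              [0.5, 1.0, 2.0]])
N = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 0.5],
              [0.0, 0.5, 1.0]])

# Independent centered Gaussians X ~ N(0, M), Y ~ N(0, N).
X = rng.multivariate_normal(np.zeros(3), M, size=samples)
Y = rng.multivariate_normal(np.zeros(3), N, size=samples)

# Empirical Cov(X_i Y_i, X_j Y_j) approaches M_ij N_ij.
emp = np.cov(X * Y, rowvar=False)
print(np.abs(emp - M * N).max())  # → 0 as the sample size grows
```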
A third proof uses eigendecomposition. Let $M = \sum_i \mu_i m_i m_i^{\mathsf T}$ and $N = \sum_i \nu_i n_i n_i^{\mathsf T}$ be the spectral decompositions of $M$ and $N$. Then

$$M \circ N = \sum_{ij} \mu_i \nu_j \, (m_i m_i^{\mathsf T}) \circ (n_j n_j^{\mathsf T}) = \sum_{ij} \mu_i \nu_j \, (m_i \circ n_j)(m_i \circ n_j)^{\mathsf T}.$$

Each $(m_i \circ n_j)(m_i \circ n_j)^{\mathsf T}$ is positive semidefinite (but, except in the one-dimensional case, not positive definite, since it is a matrix of rank at most 1). Also, $\mu_i \nu_j > 0$ because $M$ and $N$ are positive definite; thus the sum $M \circ N$ is also positive semidefinite.
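A short numerical sketch of this decomposition (the `random_spd` helper and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4

def random_spd(n):
    """Arbitrary symmetric positive definite matrix: B B^T + I."""
    B = rng.standard_normal((n, n))
    return B @ B.T + np.eye(n)

M, N = random_spd(n), random_spd(n)

# Spectral decompositions: columns of mvecs/nvecs are the m_i and n_j.
mu, mvecs = np.linalg.eigh(M)
nu, nvecs = np.linalg.eigh(N)

# Rebuild M ∘ N as the double sum μ_i ν_j (m_i ∘ n_j)(m_i ∘ n_j)^T.
H = sum(
    mu[i] * nu[j] * np.outer(mvecs[:, i] * nvecs[:, j], mvecs[:, i] * nvecs[:, j])
    for i in range(n)
    for j in range(n)
)
assert np.allclose(H, M * N)
```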
To show that the result is positive definite requires a further argument. We shall show that for any vector $a \neq 0$, we have $a^{\mathsf T} (M \circ N) a > 0$. Continuing as above, each $a^{\mathsf T} (m_i \circ n_j)(m_i \circ n_j)^{\mathsf T} a \geq 0$, so it remains to show that there exist $i$ and $j$ for which the inequality is strict. For this we observe that

$$a^{\mathsf T} (m_i \circ n_j)(m_i \circ n_j)^{\mathsf T} a = \left( \sum_k m_{i,k} n_{j,k} a_k \right)^2.$$

Since $N$ is positive definite, $a^{\mathsf T} N a = \sum_j \nu_j \left( \sum_k n_{j,k} a_k \right)^2 > 0$, so there is a $j$ for which $\sum_k n_{j,k} a_k \neq 0$; in particular, the vector $b$ with entries $b_k = n_{j,k} a_k$ is nonzero. Then, since $M$ is positive definite, $b^{\mathsf T} M b > 0$, so there is an $i$ for which $\sum_k m_{i,k} n_{j,k} a_k = m_i^{\mathsf T} b \neq 0$. For this $i$ and $j$ we have $\left( \sum_k m_{i,k} n_{j,k} a_k \right)^2 > 0$. This completes the proof.
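A companion sketch of the definiteness argument, exhibiting a strictly positive term for a given $a \neq 0$ and recovering the quadratic form from the weighted terms (the test matrices, vector, and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
B, C = rng.standard_normal((n, n)), rng.standard_normal((n, n))
M, N = B @ B.T + np.eye(n), C @ C.T + np.eye(n)  # arbitrary SPD matrices

mu, mvecs = np.linalg.eigh(M)
nu, nvecs = np.linalg.eigh(N)
a = rng.standard_normal(n)  # any nonzero vector

# terms[i, j] = a^T (m_i ∘ n_j)(m_i ∘ n_j)^T a = (Σ_k m_{i,k} n_{j,k} a_k)^2
terms = np.array(
    [[(mvecs[:, i] * nvecs[:, j] * a).sum() ** 2 for j in range(n)] for i in range(n)]
)
assert terms.max() > 0  # some term is strictly positive...

# ...and the weighted sum recovers the (positive) quadratic form a^T (M ∘ N) a.
assert np.isclose((mu[:, None] * nu[None, :] * terms).sum(), a @ (M * N) @ a)
```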