Transfer matrix

In applied mathematics, the transfer matrix is a formulation, in terms of a block-Toeplitz matrix, of the two-scale equation that characterizes refinable functions. Refinable functions play an important role in wavelet theory and finite element theory.

For the mask $h$, which is a vector with component indexes from $a$ to $b$, the transfer matrix of $h$, which we call $T_h$ here, is defined as

$$(T_h)_{j,k} = h_{2j-k}.$$

More verbosely

$$T_h = \begin{pmatrix}
h_a     &         &         &         &         \\
h_{a+2} & h_{a+1} & h_a     &         &         \\
h_{a+4} & h_{a+3} & h_{a+2} & h_{a+1} & h_a     \\
\ddots  & \ddots  & \ddots  & \ddots  & \ddots  \\
h_b     & h_{b-1} & h_{b-2} & h_{b-3} & h_{b-4} \\
        &         & h_b     & h_{b-1} & h_{b-2} \\
        &         &         &         & h_b
\end{pmatrix}.$$

The effect of $T_h$ can be expressed in terms of the downsampling operator "$\downarrow$":

$$T_h x = (h * x) \downarrow 2.$$
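
As an illustration, here is a minimal numerical sketch (Python with NumPy; not part of the original text) that assembles $T_h$ directly from the definition $(T_h)_{j,k} = h_{2j-k}$ and checks the downsampling relation against a full convolution. The helper name transfer_matrix and the hat-function test mask are illustrative choices, not notation from the article.

    import numpy as np

    def transfer_matrix(h, a):
        """Transfer matrix of the mask h with component indexes a..b,
        defined entrywise by (T_h)_{j,k} = h_{2j-k} for j, k in {a, ..., b}."""
        h = np.asarray(h, dtype=float)
        b = a + len(h) - 1
        idx = np.arange(a, b + 1)
        T = np.zeros((len(idx), len(idx)))
        for ji, j in enumerate(idx):
            for ki, k in enumerate(idx):
                m = 2 * j - k
                if a <= m <= b:              # coefficients outside a..b are zero
                    T[ji, ki] = h[m - a]
        return T

    # hat-function (linear B-spline) mask with indexes -1..1, used as example data
    h, a = np.array([0.5, 1.0, 0.5]), -1
    T = transfer_matrix(h, a)

    # T_h x = (h * x) downsampled by 2, i.e. the even-indexed coefficients of h * x
    x = np.array([1.0, 2.0, 3.0])            # any vector indexed a..b
    full = np.convolve(h, x)                 # h * x, indexes 2a .. 2b
    print(np.allclose(T @ x, full[0::2]))    # True

The double loop mirrors the index formula literally; it is meant for clarity, not speed.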

Properties

  • $T_h x = T_x h$.
  • If you drop the first and the last column and move the odd-indexed columns to the left and the even-indexed columns to the right, then you obtain a transposed Sylvester matrix.
  • The determinant of a transfer matrix is essentially a resultant.
  • More precisely: Let $h_e$ be the even-indexed coefficients of $h$ ($(h_e)_k = h_{2k}$) and let $h_o$ be the odd-indexed coefficients of $h$ ($(h_o)_k = h_{2k+1}$). Then $\det T_h = (-1)^{\lfloor \frac{b-a+1}{4} \rfloor} \cdot h_a \cdot h_b \cdot \operatorname{res}(h_e, h_o)$, where $\operatorname{res}$ is the resultant. This connection allows for fast computation using the Euclidean algorithm.
  • For the trace of the transfer matrix of convolved masks it holds that $\operatorname{tr} T_{g*h} = \operatorname{tr} T_g \cdot \operatorname{tr} T_h$.
  • For the determinant of the transfer matrix of convolved masks it holds that $\det T_{g*h} = \det T_g \cdot \det T_h \cdot \operatorname{res}(g_-, h)$, where $g_-$ denotes the mask with alternating signs, i.e. $(g_-)_k = (-1)^k \cdot g_k$.
  • If $T_h x = 0$, then $T_{g*h}(g_- * x) = 0$. This is a concretion of the determinant property above: from the determinant property one knows that $T_{g*h}$ is singular whenever $T_h$ is singular, and this property also tells how vectors from the null space of $T_h$ can be converted to null-space vectors of $T_{g*h}$ (a numerical check is given in the first sketch after this list).
  • If $x$ is an eigenvector of $T_h$ with respect to the eigenvalue $\lambda$, i.e. $T_h x = \lambda x$, then $x * (1,-1)$ is an eigenvector of $T_{h*(1,1)}$ with respect to the same eigenvalue, i.e. $T_{h*(1,1)}(x * (1,-1)) = \lambda \cdot (x * (1,-1))$.
  • Let $\lambda_a, \dots, \lambda_b$ be the eigenvalues of $T_h$, which implies $\lambda_a + \dots + \lambda_b = \operatorname{tr} T_h$ and, more generally, $\lambda_a^n + \dots + \lambda_b^n = \operatorname{tr}(T_h^n)$. This sum is useful for estimating the spectral radius of $T_h$. There is an alternative possibility for computing the sum of eigenvalue powers, which is faster for small $n$.
  • Let $C_k h$ be the periodization of $h$ with respect to the period $2^k - 1$. That is, $C_k h$ is a circular filter, which means that the component indexes are residue classes with respect to the modulus $2^k - 1$. Then with the upsampling operator $\uparrow$ it holds that $\operatorname{tr}(T_h^n) = \bigl(C_n h * (C_n h \uparrow 2) * (C_n h \uparrow 2^2) * \dots * (C_n h \uparrow 2^{n-1})\bigr)_{[0]_{2^n-1}}$. Actually not $n-2$ convolutions are necessary, but only $2 \cdot \log_2 n$ of them, when applying the strategy of efficient computation of powers. The approach can be sped up further using the fast Fourier transform (a numerical check of the trace formula is given in the second sketch after this list).
  • From the previous statement we can derive an estimate of the spectral radius $\varrho(T_h)$. It holds that $\varrho(T_h) \ge \frac{a}{\sqrt{\# h}} \ge \frac{1}{\sqrt{3 \cdot \# h}}$, where $\# h$ is the size of the filter, and if all eigenvalues are real it is also true that $\varrho(T_h) \le a$, where $a = \Vert C_2 h \Vert_2$.
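
The null-space property for convolved masks can be checked numerically. The following sketch reuses the transfer_matrix helper from the example above; the mask $h = (1, 1, 1, 1)$ is chosen because its even and odd sub-masks coincide (and hence share a root), which makes $T_h$ singular, and the second mask $g$ is an arbitrary illustrative choice.

    # reuses numpy as np and transfer_matrix from the sketch above
    h, ah = np.array([1.0, 1.0, 1.0, 1.0]), 0     # h_e = h_o = (1, 1), so T_h is singular
    Th = transfer_matrix(h, ah)

    _, _, Vt = np.linalg.svd(Th)
    x = Vt[-1]                                    # null-space vector of T_h
    print(np.allclose(Th @ x, 0))                 # True: T_h x = 0

    g, ag = np.array([1.0, 2.0, 3.0]), 0          # arbitrary second mask
    g_alt = g * (-1.0) ** np.arange(ag, ag + len(g))    # (g_-)_k = (-1)^k g_k
    Tgh = transfer_matrix(np.convolve(g, h), ag + ah)   # transfer matrix of g * h
    print(np.allclose(Tgh @ np.convolve(g_alt, x), 0))  # True: T_{g*h}(g_- * x) = 0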

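The trace formula based on periodization and upsampling can be verified numerically as well. The sketch below again reuses transfer_matrix and NumPy from the first example; it uses plain quadratic-time circular convolutions rather than the repeated-squaring or FFT speed-ups mentioned above, and the mask and the power $n$ are arbitrary test choices.

    def periodize(h, a, period):
        """C_k h: fold the mask h (indexes a..b) into residue classes modulo the period."""
        c = np.zeros(period)
        for i, v in enumerate(h):
            c[(a + i) % period] += v
        return c

    def circ_upsample(f, factor, period):
        """Circular upsampling: the value at index i moves to index factor*i mod period."""
        g = np.zeros(period)
        for i, v in enumerate(f):
            g[(factor * i) % period] += v
        return g

    def circ_convolve(f, g, period):
        """Circular convolution modulo the given period."""
        out = np.zeros(period)
        for i, vi in enumerate(f):
            for j, vj in enumerate(g):
                out[(i + j) % period] += vi * vj
        return out

    h, a = np.array([0.5, 1.0, 0.5]), -1           # example mask, indexes -1..1
    Th = transfer_matrix(h, a)

    n = 3
    period = 2**n - 1
    c = periodize(h, a, period)                    # C_n h
    prod = c.copy()
    for m in range(1, n):                          # C_n h * (C_n h up 2) * ... * (C_n h up 2^(n-1))
        prod = circ_convolve(prod, circ_upsample(c, 2**m, period), period)

    # the coefficient at the residue class [0] equals tr(T_h^n)
    print(np.isclose(prod[0], np.trace(np.linalg.matrix_power(Th, n))))  # True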