In linear algebra, an alternant matrix is a matrix with a particular structure, in which each column is obtained by applying one of a sequence of functions to a common list of arguments. An alternant determinant is the determinant of a square alternant matrix. Such a matrix of size $m \times n$ may be written out as
$$
M = \begin{bmatrix}
f_1(\alpha_1) & f_2(\alpha_1) & \cdots & f_n(\alpha_1) \\
f_1(\alpha_2) & f_2(\alpha_2) & \cdots & f_n(\alpha_2) \\
f_1(\alpha_3) & f_2(\alpha_3) & \cdots & f_n(\alpha_3) \\
\vdots & \vdots & \ddots & \vdots \\
f_1(\alpha_m) & f_2(\alpha_m) & \cdots & f_n(\alpha_m)
\end{bmatrix}
$$
or, more succinctly,
$M_{i,j} = f_j(\alpha_i)$ for all indices $i$ and $j$. (Some authors use the transpose of the above matrix.)
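As a concrete illustration, the following sketch builds such a matrix numerically; the helper name `alternant_matrix` and the use of NumPy are assumptions chosen here for illustration, not part of any standard library.

```python
import numpy as np

def alternant_matrix(points, functions):
    """Return the m x n matrix M with M[i, j] = functions[j](points[i]).

    `points` plays the role of (alpha_1, ..., alpha_m) and `functions`
    the role of (f_1, ..., f_n); both names are illustrative only.
    """
    return np.array([[f(a) for f in functions] for a in points])

# A 3 x 3 alternant matrix with f_1 = sin, f_2 = cos, f_3 = exp.
M = alternant_matrix([0.1, 0.5, 0.9], [np.sin, np.cos, np.exp])
print(M.shape)  # (3, 3)
```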
Examples of alternant matrices include Vandermonde matrices, for which $f_i(\alpha) = \alpha^{i-1}$, and Moore matrices, for which $f_i(\alpha) = \alpha^{q^{i-1}}$.
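For instance, the Vandermonde case $f_i(\alpha) = \alpha^{i-1}$ can be built directly from this recipe; the short check below against NumPy's built-in `np.vander` (with `increasing=True`) is a sketch using sample points chosen only for illustration.

```python
import numpy as np

alphas = [2.0, 3.0, 5.0]
n = len(alphas)

# Vandermonde matrix as an alternant: column i applies f_i(alpha) = alpha**(i-1).
V = np.array([[a**k for k in range(n)] for a in alphas])

# Agrees with NumPy's built-in increasing-power Vandermonde matrix.
assert np.allclose(V, np.vander(alphas, increasing=True))
```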
If $n = m$ and the functions $f_j(x)$ are all polynomials, there are some additional results: if $\alpha_i = \alpha_j$ for any $i < j$, then the determinant of the alternant matrix is zero (as two of its rows are then equal). Since the determinant is itself a polynomial in the $\alpha_i$, the factor theorem then gives that $(\alpha_j - \alpha_i)$ divides the determinant for all $1 \leq i < j \leq n$. As such, if one takes
$$
V = \begin{bmatrix}
1 & \alpha_1 & \cdots & \alpha_1^{n-1} \\
1 & \alpha_2 & \cdots & \alpha_2^{n-1} \\
1 & \alpha_3 & \cdots & \alpha_3^{n-1} \\
\vdots & \vdots & \ddots & \vdots \\
1 & \alpha_n & \cdots & \alpha_n^{n-1}
\end{bmatrix}
$$
(a Vandermonde matrix), then $\prod_{i<j}(\alpha_j - \alpha_i) = \det V$ divides every such polynomial alternant determinant. The ratio $\frac{\det M}{\det V}$ is called a bialternant. Taking each function to be $f_j(x) = x^{m_j}$ recovers the classical (bialternant) definition of the Schur polynomials.
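A minimal numerical sketch of these identities, assuming the partition $\lambda = (2, 1)$ in two variables (so $s_{(2,1)}(x, y) = x^2 y + x y^2$) and the increasing-power convention for $V$ used above; the sample points are chosen only for illustration.

```python
import numpy as np
from itertools import combinations

alphas = np.array([2.0, 3.0])
n = len(alphas)

# det V equals the product of (alpha_j - alpha_i) over i < j.
V = np.vander(alphas, increasing=True)
diff_prod = np.prod([alphas[j] - alphas[i]
                     for i, j in combinations(range(n), 2)])
assert np.isclose(np.linalg.det(V), diff_prod)

# Alternant M with f_j(x) = x**m_j for exponents m = (1, 3), which is the
# partition (2, 1) reversed and shifted by (0, 1) to match increasing powers.
m = (1, 3)
M = np.array([[a**mj for mj in m] for a in alphas])

# The bialternant det M / det V evaluates the Schur polynomial at alphas.
x, y = alphas
assert np.isclose(np.linalg.det(M) / np.linalg.det(V), x**2 * y + x * y**2)
```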
Alternant matrices are used in coding theory in the construction of alternant codes.