In mathematics, the Marcinkiewicz–Zygmund inequality, named after Józef Marcinkiewicz and Antoni Zygmund, gives relations between moments of a collection of independent random variables. It is a generalization of the rule for the sum of variances of independent random variables to moments of arbitrary order.
Theorem. If $x_i$, $i = 1, \ldots, n$, are independent random variables such that $E(x_i) = 0$ and $E(|x_i|^p) < +\infty$, where $1 \le p < +\infty$, then
$$A_p\, E\!\left(\left(\sum_{i=1}^{n} |x_i|^2\right)^{p/2}\right) \;\le\; E\!\left(\left|\sum_{i=1}^{n} x_i\right|^p\right) \;\le\; B_p\, E\!\left(\left(\sum_{i=1}^{n} |x_i|^2\right)^{p/2}\right),$$
where $A_p$ and $B_p$ are positive constants which depend only on $p$.
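As an informal illustration of the inequality (not part of the formal statement), the following Python sketch estimates both raw moments by Monte Carlo. The choice of i.i.d. standard normal variables and the exponent $p = 3$ are illustrative assumptions only; the point is that the ratio of the two expectations stays bounded as $n$ grows.

```python
import numpy as np

# Monte Carlo sketch of the Marcinkiewicz-Zygmund inequality.
# Illustrative assumptions: x_i are i.i.d. standard normal
# (so E(x_i) = 0 and E(|x_i|^p) < infinity for every p >= 1),
# and p = 3 is an arbitrary example exponent.
rng = np.random.default_rng(0)
p = 3.0
num_samples = 100_000

for n in (1, 10, 100):
    # Each row is one independent sample of (x_1, ..., x_n).
    x = rng.standard_normal((num_samples, n))
    right_inner = (x**2).sum(axis=1) ** (p / 2)   # (sum_i |x_i|^2)^(p/2)
    middle = np.abs(x.sum(axis=1)) ** p           # |sum_i x_i|^p
    ratio = middle.mean() / right_inner.mean()
    # E(|sum x_i|^p) / E((sum |x_i|^2)^(p/2)) should lie between
    # A_p and B_p, i.e. stay bounded away from 0 and infinity
    # uniformly in n.
    print(f"n = {n:4d}   ratio = {ratio:.3f}")
```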
In the case $p = 2$, the inequality holds with $A_2 = B_2 = 1$, and it reduces to the rule for the sum of variances of independent random variables with zero mean, known from elementary statistics: if $E(x_i) = 0$ and $E(|x_i|^2) < +\infty$, then
$$\operatorname{Var}\left(\sum_{i=1}^{n} x_i\right) = E\!\left(\left|\sum_{i=1}^{n} x_i\right|^2\right) = \sum_{i=1}^{n}\sum_{j=1}^{n} E(x_i \bar{x}_j) = \sum_{i=1}^{n} E(|x_i|^2) = \sum_{i=1}^{n} \operatorname{Var}(x_i),$$
since independence and the zero means make the cross terms $E(x_i \bar{x}_j)$ with $i \ne j$ vanish.
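A minimal numerical check of the $p = 2$ case, under the illustrative assumption that each $x_i$ is uniform on $[-1, 1]$ (mean zero, finite second moment), shows the two sides agreeing exactly up to sampling error, consistent with $A_2 = B_2 = 1$:

```python
import numpy as np

# Sanity check of the p = 2 case: E(|sum x_i|^2) = sum_i E(|x_i|^2)
# for independent zero-mean variables.  Illustrative assumption:
# x_i uniform on [-1, 1], so E(|x_i|^2) = 1/3.
rng = np.random.default_rng(1)
n = 5
num_samples = 1_000_000

x = rng.uniform(-1.0, 1.0, size=(num_samples, n))
lhs = (x.sum(axis=1) ** 2).mean()   # E(|sum_i x_i|^2)
rhs = (x**2).sum(axis=1).mean()     # sum_i E(|x_i|^2)
print(f"E(|sum x_i|^2)   = {lhs:.4f}")
print(f"sum_i E(|x_i|^2) = {rhs:.4f}")  # both close to n/3 = 1.6667
```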