Khintchine inequality


In mathematics, the Khintchine inequality, named after Aleksandr Khinchin and spelled in multiple ways in the Roman alphabet, is a theorem from probability that is also frequently used in analysis. Heuristically, it says that if we pick $N$ complex numbers $x_1, \dots, x_N \in \mathbb{C}$ and add them together, each multiplied by a random sign $\pm 1$, then the expected value of the modulus of the sum (the modulus it will be closest to on average) is not too far off from $\left(|x_1|^2 + \cdots + |x_N|^2\right)^{1/2}$.
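
To make this heuristic concrete, the following small simulation (an illustrative sketch added here, assuming NumPy; the choice of $N = 8$ and Gaussian $x_n$ is arbitrary) compares the average modulus of the random signed sum with $\left(\sum_n |x_n|^2\right)^{1/2}$.

```python
# Monte Carlo sketch of the Khintchine heuristic: compare E|sum_n eps_n x_n|
# with sqrt(sum_n |x_n|^2) for fixed complex x_n and random signs eps_n.
import numpy as np

rng = np.random.default_rng(0)
N = 8
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)  # arbitrary complex x_1, ..., x_N

trials = 100_000
signs = rng.choice([-1, 1], size=(trials, N))   # i.i.d. Rademacher signs, one row per trial
moduli = np.abs(signs @ x)                      # |sum_n eps_n x_n| for each trial

print("E|sum_n eps_n x_n|   ~", moduli.mean())
print("sqrt(sum_n |x_n|^2)  =", np.sqrt(np.sum(np.abs(x) ** 2)))
```

The two printed numbers should agree up to a constant factor of moderate size, as the inequality below predicts for $p = 1$.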

Statement of theorem

Let $\{\varepsilon_n\}_{n=1}^{N}$ be i.i.d. random variables with $P(\varepsilon_n = \pm 1) = \tfrac{1}{2}$ for $n = 1, \dots, N$, i.e., a sequence with Rademacher distribution. Let $0 < p < \infty$ and let $x_1, \dots, x_N \in \mathbb{C}$. Then

$$
A_p \left( \sum_{n=1}^{N} |x_n|^2 \right)^{1/2} \le \left( \mathbb{E}\left| \sum_{n=1}^{N} \varepsilon_n x_n \right|^p \right)^{1/p} \le B_p \left( \sum_{n=1}^{N} |x_n|^2 \right)^{1/2}
$$

for some constants $A_p, B_p > 0$ depending only on $p$ (here $\mathbb{E}$ denotes expectation). The sharp values of the constants $A_p, B_p$ were found by Haagerup; a simpler proof was given later. It is a simple matter to see that $A_p = 1$ when $p \ge 2$, and $B_p = 1$ when $0 < p \le 2$.
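
To see why these values of the constants are natural, note the exact identity at $p = 2$ (a standard computation, included here to fill in the reasoning):

$$
\mathbb{E}\left| \sum_{n=1}^{N} \varepsilon_n x_n \right|^2
= \sum_{m=1}^{N} \sum_{n=1}^{N} x_m \overline{x_n}\, \mathbb{E}[\varepsilon_m \varepsilon_n]
= \sum_{n=1}^{N} |x_n|^2,
$$

since $\mathbb{E}[\varepsilon_m \varepsilon_n] = 1$ if $m = n$ and $0$ otherwise. Thus the middle quantity in the theorem equals the outer quantities exactly when $p = 2$; because $p \mapsto \left(\mathbb{E}|Z|^p\right)^{1/p}$ is nondecreasing in $p$ (by Jensen's inequality), it follows that $A_p = 1$ for $p \ge 2$ and $B_p = 1$ for $0 < p \le 2$.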

Uses in analysis

The uses of this inequality are not limited to applications in probability theory. One example of its use in analysis is the following: if we let $T$ be a linear operator between two $L^p$ spaces $L^p(X, \mu)$ and $L^p(Y, \nu)$, $1 \le p < \infty$, with bounded norm $\|T\| < \infty$, then one can use Khintchine's inequality to show that

$$
\left\| \left( \sum_{n=1}^{N} |T f_n|^2 \right)^{1/2} \right\|_{L^p(Y, \nu)} \le C_p \left\| \left( \sum_{n=1}^{N} |f_n|^2 \right)^{1/2} \right\|_{L^p(X, \mu)}
$$

for some constant $C_p > 0$ depending only on $p$ and $\|T\|$.
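
A sketch of the standard argument behind this estimate (not spelled out in the original text) runs as follows:

$$
\left\| \Big( \sum_{n=1}^{N} |T f_n|^2 \Big)^{1/2} \right\|_{L^p(Y,\nu)}^{p}
\le \frac{1}{A_p^{p}}\, \mathbb{E}\left\| T\Big( \sum_{n=1}^{N} \varepsilon_n f_n \Big) \right\|_{L^p(Y,\nu)}^{p}
\le \frac{\|T\|^{p}}{A_p^{p}}\, \mathbb{E}\left\| \sum_{n=1}^{N} \varepsilon_n f_n \right\|_{L^p(X,\mu)}^{p}
\le \frac{B_p^{p}\,\|T\|^{p}}{A_p^{p}} \left\| \Big( \sum_{n=1}^{N} |f_n|^2 \Big)^{1/2} \right\|_{L^p(X,\mu)}^{p},
$$

where the first and last steps apply Khintchine's inequality pointwise (in $y$ and in $x$, respectively) together with Fubini's theorem, and the middle step uses the linearity and boundedness of $T$; this gives the claim with $C_p = (B_p / A_p)\,\|T\|$.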
