
Multidimensional Chebyshev's inequality


In probability theory, the multidimensional Chebyshev's inequality is a generalization of Chebyshev's inequality, which puts a bound on the probability of the event that a random variable differs from its expected value by more than a specified amount.

Let X be an N-dimensional random vector with expected value $\mu = \operatorname{E}[X]$ and covariance matrix

$$V = \operatorname{E}\left[(X - \mu)(X - \mu)^{T}\right].$$
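As a concrete illustration (not part of the original article), the mean vector and covariance matrix can be estimated from samples exactly as the definitions above read. The sketch below uses a hypothetical toy 2-D distribution, X = (Z₁, Z₁ + Z₂) with Z₁, Z₂ standard normal, whose true covariance is V = [[1, 1], [1, 2]]:

```python
import random

random.seed(2)
# Toy 2-D distribution: X = (Z1, Z1 + Z2), so E[X] = (0, 0) and
# V = [[1, 1], [1, 2]]  (hand-picked for illustration)
samples = []
for _ in range(50_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    samples.append((z1, z1 + z2))

n = len(samples)
# mu = E[X], estimated by the sample mean
mu = [sum(x[i] for x in samples) / n for i in range(2)]
# V = E[(X - mu)(X - mu)^T], estimated entrywise
V = [[sum((x[i] - mu[i]) * (x[j] - mu[j]) for x in samples) / n
      for j in range(2)] for i in range(2)]
print(mu, V)  # mu near (0, 0); V near [[1, 1], [1, 2]]
```

With 50,000 samples the empirical entries land within a few hundredths of the true values.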

If $V$ is a positive-definite matrix, then for any real number $t > 0$:

$$\Pr\left(\sqrt{(X - \mu)^{T} V^{-1} (X - \mu)} > t\right) \le \frac{N}{t^{2}}.$$
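The bound can be checked by simulation (a sketch added here, not from the original article). Assuming a hypothetical 2-D Gaussian with covariance V = [[1, 0.5], [0.5, 2]], the empirical frequency of the event √((X − μ)ᵀV⁻¹(X − μ)) > t should stay below N/t²:

```python
import random

random.seed(0)
N = 2  # dimension
# V = [[1, 0.5], [0.5, 2]] factored as V = L L^T (Cholesky, hand-computed)
L = ((1.0, 0.0), (0.5, 1.3228756555322954))  # 1.3228... = sqrt(1.75)
# V^{-1} by the 2x2 adjugate formula: V^{-1} = adj(V) / det(V)
detV = 1.0 * 2.0 - 0.5 * 0.5
Vinv = ((2.0 / detV, -0.5 / detV), (-0.5 / detV, 1.0 / detV))

t = 2.0
trials = 200_000
exceed = 0
for _ in range(trials):
    z = (random.gauss(0, 1), random.gauss(0, 1))
    # d = X - mu has covariance V, since d = L z with z standard normal
    d = (L[0][0] * z[0], L[1][0] * z[0] + L[1][1] * z[1])
    # y = (X - mu)^T V^{-1} (X - mu), the squared Mahalanobis distance
    y = (d[0] * (Vinv[0][0] * d[0] + Vinv[0][1] * d[1])
         + d[1] * (Vinv[1][0] * d[0] + Vinv[1][1] * d[1]))
    if y ** 0.5 > t:
        exceed += 1

empirical = exceed / trials
bound = N / t ** 2
print(empirical, "<=", bound)
```

For this Gaussian the true probability is far below the bound (N/t² = 0.5), as expected: Chebyshev-type bounds are distribution-free and therefore loose for light-tailed distributions.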

Proof

Since $V$ is positive-definite, so is $V^{-1}$. Define the random variable

$$y = (X - \mu)^{T} V^{-1} (X - \mu).$$

Since $y$ is nonnegative, Markov's inequality applies:

$$\Pr\left(\sqrt{(X - \mu)^{T} V^{-1} (X - \mu)} > t\right) = \Pr\left(\sqrt{y} > t\right) = \Pr\left(y > t^{2}\right) \le \frac{\operatorname{E}[y]}{t^{2}}.$$

Finally,

$$\operatorname{E}[y] = \operatorname{E}\left[(X - \mu)^{T} V^{-1} (X - \mu)\right] = \operatorname{E}\left[\operatorname{trace}\left(V^{-1} (X - \mu)(X - \mu)^{T}\right)\right] = \operatorname{trace}\left(V^{-1} V\right) = N,$$

which completes the proof.
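The identity E[y] = N can also be seen numerically (a check added here, not in the original article). Assuming the same hypothetical covariance V = [[1, 0.5], [0.5, 2]] as above, the sample mean of the squared Mahalanobis distance converges to the dimension N:

```python
import random

random.seed(1)
N = 2  # dimension; the trace identity gives E[y] = N
# V = [[1, 0.5], [0.5, 2]] via its Cholesky factor L, V = L L^T
L = ((1.0, 0.0), (0.5, 1.3228756555322954))  # 1.3228... = sqrt(1.75)
detV = 1.75
Vinv = ((2.0 / detV, -0.5 / detV), (-0.5 / detV, 1.0 / detV))

trials = 100_000
total = 0.0
for _ in range(trials):
    z = (random.gauss(0, 1), random.gauss(0, 1))
    d = (L[0][0] * z[0], L[1][0] * z[0] + L[1][1] * z[1])  # X - mu
    # accumulate y = (X - mu)^T V^{-1} (X - mu)
    total += (d[0] * (Vinv[0][0] * d[0] + Vinv[0][1] * d[1])
              + d[1] * (Vinv[1][0] * d[0] + Vinv[1][1] * d[1]))

print(total / trials)  # sample mean of y, close to N = 2
```

Note the sample mean of y is close to N regardless of the particular V chosen, which is exactly what the trace calculation above asserts.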
