Uncorrelated random variables

In probability theory and statistics, two real-valued random variables, X and Y, are said to be uncorrelated if their covariance, cov(X, Y) = E(XY) − E(X)E(Y), is zero. A set of two or more random variables is called uncorrelated if every pair of them is uncorrelated. If two variables are uncorrelated, there is no linear relationship between them.

Uncorrelated random variables have a Pearson correlation coefficient of zero, except in the trivial case when either variable has zero variance (is a constant). In this case the correlation is undefined.

In general, uncorrelatedness is not the same as orthogonality, except in the special case where either X or Y has an expected value of 0. In this case, the covariance is the expectation of the product, and X and Y are uncorrelated if and only if E(XY) = 0.
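To illustrate the distinction, consider the following small sketch (the particular variables are illustrative choices, not taken from the article):

    import numpy as np

    rng = np.random.default_rng(2)

    # X ~ N(1, 1) has nonzero mean, so orthogonality and
    # uncorrelatedness come apart.
    x = rng.normal(loc=1.0, scale=1.0, size=1_000_000)
    y = 2.0 - x  # E(XY) = 2E(X) - E(X^2) = 2 - 2 = 0

    print(f"E(XY)     ~= {np.mean(x * y):.4f}")  # ~0: orthogonal
    print(f"cov(X, Y) ~= {np.mean(x * y) - np.mean(x) * np.mean(y):.4f}")  # ~-1: correlated

Here X and Y are orthogonal (E(XY) = 0) yet perfectly negatively correlated, since Y is a linear function of X.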

If X and Y are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent. For example, if X is a continuous random variable uniformly distributed on [−1, 1] and Y = X², then X and Y are uncorrelated even though X determines Y and a particular value of Y can be produced by only one or two values of X: by symmetry, E(X) = 0 and E(XY) = E(X³) = 0, so the covariance E(XY) − E(X)E(Y) vanishes.
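A quick numerical check of this example (a minimal sketch in NumPy; the sample size and seed are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)

    # X uniform on [-1, 1]; Y = X^2 is completely determined by X.
    x = rng.uniform(-1.0, 1.0, size=1_000_000)
    y = x ** 2

    # Sample version of E(XY) - E(X)E(Y): close to 0, so uncorrelated.
    cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
    print(f"cov(X, Y) ~= {cov_xy:.5f}")

    # Dependence is obvious nonetheless: Y is a deterministic function of X.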

Example of dependence without correlation

Uncorrelated random variables are not necessarily independent
  • Let X be a random variable that takes the value 0 with probability 1/2, and takes the value 1 with probability 1/2.
  • Let Z be a random variable, independent of X, that takes the value −1 with probability 1/2, and takes the value 1 with probability 1/2.
  • Let U be a random variable constructed as U = XZ.
  • The claim is that U and X have zero covariance (and thus are uncorrelated), but are not independent.

Proof:

First note:

  • E[U] = E[XZ] = E[X]E[Z] = E[X] · 0 = 0
  • Now, cov(U, X) = E[(U − E[U])(X − E[X])] = E[U(X − 1/2)] = E[X²Z − (1/2)XZ] = E[(X² − (1/2)X)Z] = E[X² − (1/2)X] · E[Z] = 0, where the final factorization uses the independence of X and Z, and the last equality uses E[Z] = 0

Independence of U and X means that for all a and b, Pr(U = a ∣ X = b) = Pr(U = a). This is not true, in particular, for a = 1 and b = 0.

  • Pr(U = 1 ∣ X = 0) = Pr(XZ = 1 ∣ X = 0) = 0, since U = XZ = 0 whenever X = 0
  • Pr(U = 1) = Pr(XZ = 1) = Pr(X = 1, Z = 1) = Pr(X = 1) Pr(Z = 1) = 1/4
  • Thus Pr(U = 1 ∣ X = 0) ≠ Pr(U = 1), so U and X are not independent.

Q.E.D.
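The construction above is easy to verify numerically (a minimal sketch in NumPy; sample size and seed are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    # X in {0, 1} with probability 1/2 each; Z in {-1, 1}, independent of X.
    x = rng.integers(0, 2, size=n).astype(float)
    z = rng.choice([-1.0, 1.0], size=n)
    u = x * z

    # Sample covariance of U and X: close to 0, matching the proof.
    cov_ux = np.mean(u * x) - np.mean(u) * np.mean(x)
    print(f"cov(U, X) ~= {cov_ux:.4f}")

    # Conditional vs. unconditional probability of U = 1:
    print(f"Pr(U=1 | X=0) = {np.mean(u[x == 0] == 1):.4f}")  # exactly 0
    print(f"Pr(U=1)      ~= {np.mean(u == 1):.4f}")          # approximately 1/4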

When uncorrelatedness implies independence

There are cases in which uncorrelatedness does imply independence. One of these is the case in which both random variables are two-valued (so each can be linearly transformed to have a binomial distribution with n = 1, i.e., a Bernoulli distribution). Further, two jointly normally distributed random variables are independent if they are uncorrelated, although this does not hold for variables whose marginal distributions are normal and uncorrelated but whose joint distribution is not jointly normal (see Normally distributed and uncorrelated does not imply independent).
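The caveat can be seen with a standard sign-flip construction (a sketch; this particular counterexample is a common textbook device, chosen here for illustration):

    import numpy as np

    rng = np.random.default_rng(1)

    # X ~ N(0, 1); S = +/-1 with probability 1/2 each, independent of X.
    x = rng.standard_normal(1_000_000)
    s = rng.choice([-1.0, 1.0], size=x.shape)

    # Y = S*X is also marginally N(0, 1) by symmetry, and
    # cov(X, Y) = E[S X^2] = E[S] E[X^2] = 0, so X and Y are uncorrelated.
    y = s * x

    print(f"cov(X, Y) ~= {np.mean(x * y) - np.mean(x) * np.mean(y):.4f}")

    # But |Y| = |X| always, so X and Y are not independent,
    # and (X, Y) is not jointly normal.
    print(f"corr(|X|, |Y|) = {np.corrcoef(np.abs(x), np.abs(y))[0, 1]:.4f}")  # exactly 1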
