Multivariate analysis of variance

In statistics, multivariate analysis of variance (MANOVA) is a procedure for comparing multivariate sample means. As a multivariate procedure, it is used when there are two or more dependent variables, and it is typically followed by significance tests involving the individual dependent variables separately. It helps to answer the following questions (a usage sketch in code follows the list):

  1. Do changes in the independent variable(s) have significant effects on the dependent variables?
  2. What are the relationships among the dependent variables?
  3. What are the relationships among the independent variables?
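
As a concrete illustration, here is a minimal sketch of a one-way MANOVA using the MANOVA class from the statsmodels Python library; the column names y1, y2, and group and the simulated data are hypothetical:

```python
# Minimal sketch of a one-way MANOVA with two dependent variables (y1, y2)
# and one three-level factor (group); all names and data are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 30
df = pd.DataFrame({
    "group": np.repeat(["a", "b", "c"], n),
    "y1": rng.normal(0.0, 1.0, 3 * n) + np.repeat([0.0, 0.5, 1.0], n),
    "y2": rng.normal(0.0, 1.0, 3 * n) + np.repeat([0.0, 0.3, 0.6], n),
})

# Both dependent variables enter the model jointly; "group" is the factor.
fit = MANOVA.from_formula("y1 + y2 ~ group", data=df)
print(fit.mv_test())
```

The mv_test() output reports the four standard test statistics discussed below: Wilks' lambda, Pillai's trace, the Hotelling-Lawley trace, and Roy's greatest root.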

Relationship with ANOVA

MANOVA is a generalized form of univariate analysis of variance (ANOVA), although, unlike univariate ANOVA, it uses the covariance between outcome variables in testing the statistical significance of the mean differences.

Where sums of squares appear in univariate analysis of variance, in multivariate analysis of variance certain positive-definite matrices appear. The diagonal entries are the same kinds of sums of squares that appear in univariate ANOVA. The off-diagonal entries are corresponding sums of products. Under normality assumptions about error distributions, the counterpart of the sum of squares due to error has a Wishart distribution.
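
To make that construction concrete, here is a sketch of the hypothesis (between-groups) and error (within-groups) sums-of-squares-and-products (SSCP) matrices for a one-way layout; the helper name and simulated data are hypothetical:

```python
# Sketch: between-groups (H) and within-groups (E) SSCP matrices for a
# one-way design. Diagonals are the univariate sums of squares; the
# off-diagonals are the corresponding sums of products.
import numpy as np

def sscp_matrices(groups):
    """Return (H, E); groups is a list of (n_i x p) arrays, one per level."""
    all_obs = np.vstack(groups)
    grand_mean = all_obs.mean(axis=0)
    H = sum(len(g) * np.outer(g.mean(axis=0) - grand_mean,
                              g.mean(axis=0) - grand_mean) for g in groups)
    E = sum((g - g.mean(axis=0)).T @ (g - g.mean(axis=0)) for g in groups)
    return H, E

rng = np.random.default_rng(1)
groups = [rng.normal(m, 1.0, size=(20, 2)) for m in (0.0, 0.5, 1.0)]
H, E = sscp_matrices(groups)
print(np.diag(H), np.diag(E))  # univariate sums of squares on the diagonals
```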

MANOVA is based on the product of the model variance matrix, $\Sigma_{\text{model}}$, and the inverse of the error variance matrix, $\Sigma_{\text{res}}^{-1}$; that is, $A = \Sigma_{\text{model}} \Sigma_{\text{res}}^{-1}$. The hypothesis that $\Sigma_{\text{model}} = \Sigma_{\text{res}}$ implies that the product $A \sim I$. Invariance considerations imply the MANOVA statistic should be a measure of magnitude of the singular value decomposition of this matrix product, but there is no unique choice owing to the multi-dimensional nature of the alternative hypothesis.

The most common statistics are summaries based on the roots (or eigenvalues) $\lambda_p$ of the $A$ matrix (all four are computed in the sketch after this list):

  • Samuel Stanley Wilks' $\Lambda_{\text{Wilks}} = \prod_{1,\ldots,p} \bigl(1/(1 + \lambda_p)\bigr) = \det(I + A)^{-1} = \det(\Sigma_{\text{res}}) / \det(\Sigma_{\text{res}} + \Sigma_{\text{model}})$, distributed as lambda ($\Lambda$)
  • the Pillai–M. S. Bartlett trace, $\Lambda_{\text{Pillai}} = \sum_{1,\ldots,p} \bigl(\lambda_p/(1 + \lambda_p)\bigr) = \operatorname{tr}\bigl(A(I + A)^{-1}\bigr)$
  • the Lawley–Hotelling trace, $\Lambda_{\text{LH}} = \sum_{1,\ldots,p} \lambda_p = \operatorname{tr}(A)$
  • Roy's greatest root (also called Roy's largest root), $\Lambda_{\text{Roy}} = \max_p(\lambda_p)$, the largest eigenvalue of $A$
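
Here is a sketch of the four statistics as functions of those eigenvalues, reusing H and E from the sscp_matrices helper sketched earlier:

```python
# Sketch: the four classical MANOVA statistics as functions of the
# eigenvalues of A = H @ inv(E), with H and E the SSCP matrices above.
import numpy as np

def manova_statistics(H, E):
    A = H @ np.linalg.inv(E)
    lam = np.linalg.eigvals(A).real  # roots are real for valid SSCP inputs
    return {
        "wilks": np.prod(1.0 / (1.0 + lam)),   # det(I + A)^{-1}
        "pillai": np.sum(lam / (1.0 + lam)),   # tr(A (I + A)^{-1})
        "lawley_hotelling": np.sum(lam),       # tr(A)
        "roy": lam.max(),                      # largest eigenvalue
    }

# e.g. with H and E from the previous sketch: print(manova_statistics(H, E))
```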

Discussion continues over the merits of each, although the greatest root leads only to a bound on significance, which is not generally of practical interest. A further complication is that, except for Roy's greatest root, the distribution of these statistics under the null hypothesis is not straightforward and can only be approximated except in a few low-dimensional cases. An algorithm for the distribution of Roy's largest root under the null hypothesis has been derived, and its distribution under the alternative has also been studied.

The best-known approximation for Wilks' lambda was derived by C. R. Rao.
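
For reference, a common statement of Rao's F approximation, following standard multivariate texts; here $p$ is the number of dependent variables, $q$ the hypothesis degrees of freedom, and $v$ the error degrees of freedom:

```latex
F = \frac{1 - \Lambda^{1/t}}{\Lambda^{1/t}} \cdot \frac{wt - (pq - 2)/2}{pq},
\qquad
t = \sqrt{\frac{p^2 q^2 - 4}{p^2 + q^2 - 5}},
\qquad
w = v + q - \frac{p + q + 1}{2}
```

(with $t = 1$ when the radicand is undefined). Under the null hypothesis, $F$ is approximately distributed as $F\bigl(pq,\, wt - (pq - 2)/2\bigr)$; the approximation is reported to be exact when $p$ or $q$ is 1 or 2.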

In the case of two groups, all the statistics are equivalent and the test reduces to Hotelling's T-square.
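
A numerical check of that reduction (a sketch; with two groups, $A$ has a single nonzero root $\lambda$, and Hotelling's $T^2$ equals $(N - 2)\lambda$ for total sample size $N$):

```python
# Sketch: with two groups, Hotelling's T-square equals (N - 2) times the
# single nonzero eigenvalue of A = H @ inv(E). Data are simulated.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=(25, 2))
y = rng.normal(0.7, 1.0, size=(35, 2))
n1, n2 = len(x), len(y)
diff = x.mean(axis=0) - y.mean(axis=0)

# Pooled within-groups SSCP and the classical T-square statistic.
E = (x - x.mean(axis=0)).T @ (x - x.mean(axis=0)) \
    + (y - y.mean(axis=0)).T @ (y - y.mean(axis=0))
S_pooled = E / (n1 + n2 - 2)
t2 = (n1 * n2 / (n1 + n2)) * diff @ np.linalg.inv(S_pooled) @ diff

# The same quantity via the single nonzero MANOVA root.
H = (n1 * n2 / (n1 + n2)) * np.outer(diff, diff)
lam = np.linalg.eigvals(H @ np.linalg.inv(E)).real.max()
print(t2, (n1 + n2 - 2) * lam)  # should agree up to floating-point error
```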

Correlation of dependent variables

MANOVA's power is affected by the correlations of the dependent variables and by the effect sizes associated with those variables. For example, when there are two groups and two dependent variables, MANOVA's power is lowest when the correlation equals the ratio of the smaller to the larger standardized effect size.
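
A Monte Carlo sketch of that effect under assumed conditions (hypothetical standardized effect sizes $d_1 = 0.8$ and $d_2 = 0.2$, so power should dip near a correlation of $d_2/d_1 = 0.25$; the exact F transform of Hotelling's $T^2$ serves as the test):

```python
# Sketch: Monte Carlo power of the two-group, two-DV MANOVA as the
# correlation between dependent variables varies. Effect sizes are
# hypothetical (d1 = 0.8, d2 = 0.2), so power should be lowest near
# rho = d2 / d1 = 0.25.
import numpy as np
from scipy import stats

def manova_power(rho, d=(0.8, 0.2), n=20, reps=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    p, N = 2, 2 * n
    cov = np.array([[1.0, rho], [rho, 1.0]])
    crit = stats.f.ppf(1 - alpha, p, N - p - 1)
    hits = 0
    for _ in range(reps):
        x = rng.multivariate_normal(np.zeros(p), cov, size=n)
        y = rng.multivariate_normal(np.array(d), cov, size=n)
        diff = x.mean(axis=0) - y.mean(axis=0)
        E = (x - x.mean(axis=0)).T @ (x - x.mean(axis=0)) \
            + (y - y.mean(axis=0)).T @ (y - y.mean(axis=0))
        t2 = (n * n / N) * diff @ np.linalg.inv(E / (N - 2)) @ diff
        f = (N - p - 1) / (p * (N - 2)) * t2  # exact F transform of T-square
        hits += f > crit
    return hits / reps

for rho in (0.0, 0.25, 0.5):
    print(rho, manova_power(rho))
```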
