Support: $x \in [0, \infty)$
PDF: $\dfrac{2^{1-k/2} x^{k-1} e^{-x^2/2}}{\Gamma(k/2)}$
CDF: $P(k/2, x^2/2)$
Mean: $\mu = \sqrt{2}\,\dfrac{\Gamma((k+1)/2)}{\Gamma(k/2)}$
Mode: $\sqrt{k-1}$ for $k \geq 1$
In probability theory and statistics, the chi distribution is a continuous probability distribution. It is the distribution of the positive square root of the sum of squares of independent random variables each having a standard normal distribution, or equivalently, the distribution of the Euclidean distance of those random variables from the origin. The most familiar examples are the Rayleigh distribution, which is a chi distribution with 2 degrees of freedom, and the Maxwell distribution of (normalized) molecular speeds, which is a chi distribution with 3 degrees of freedom (one for each spatial coordinate). If $Z_1, \ldots, Z_k$ are $k$ independent random variables with standard normal distributions, then the statistic

$$Y = \sqrt{\sum_{i=1}^{k} Z_i^2}$$

is distributed according to the chi distribution. Accordingly, dividing by the mean of the chi distribution (scaled by the square root of $n-1$) yields the correction factor in the unbiased estimation of the standard deviation of the normal distribution. The chi distribution has one parameter, $k$, which specifies the number of degrees of freedom (i.e. the number of random variables $Z_i$).

Contents
- Probability density function
- Cumulative distribution function
- Moment generating function
- Characteristic function
- Properties
- Moments
- Entropy
- Related distributions
- References
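As a quick numerical sketch of this defining construction (function names here are illustrative, not from any particular library), the following Python snippet draws the Euclidean norm of $k$ standard normal variables and compares its sample mean with the closed-form chi mean $\sqrt{2}\,\Gamma((k+1)/2)/\Gamma(k/2)$:

```python
import math
import random

def chi_mean(k):
    # Mean of the chi distribution: sqrt(2) * Gamma((k+1)/2) / Gamma(k/2)
    return math.sqrt(2) * math.gamma((k + 1) / 2) / math.gamma(k / 2)

def sample_chi(k, rng):
    # Euclidean norm of k independent standard normal draws
    return math.sqrt(sum(rng.gauss(0, 1) ** 2 for _ in range(k)))

rng = random.Random(0)
k, n = 3, 200_000
empirical = sum(sample_chi(k, rng) for _ in range(n)) / n
# The sample mean and the closed-form mean should agree to about two decimals
print(empirical, chi_mean(k))
```

For $k = 2$ the formula reduces to the Rayleigh mean $\sqrt{\pi/2}$, which makes a convenient sanity check.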
Probability density function
The probability density function is

$$f(x; k) = \frac{x^{k-1} e^{-x^2/2}}{2^{k/2-1}\,\Gamma(k/2)}, \qquad x \geq 0,$$

where $\Gamma(z)$ is the gamma function.
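A small self-contained check of the density $f(x;k) = x^{k-1} e^{-x^2/2} / \left(2^{k/2-1}\Gamma(k/2)\right)$ (helper names are my own): evaluate it with `math.gamma` and confirm it integrates to 1 over the support:

```python
import math

def chi_pdf(x, k):
    # f(x; k) = x^(k-1) * exp(-x^2 / 2) / (2^(k/2 - 1) * Gamma(k/2)) for x >= 0
    if x < 0:
        return 0.0
    return x ** (k - 1) * math.exp(-x * x / 2) / (2 ** (k / 2 - 1) * math.gamma(k / 2))

# The density should integrate to essentially 1 (trapezoidal rule on [0, 20],
# beyond which the tail is negligible)
k, n, hi = 3, 100_000, 20.0
h = hi / n
area = (chi_pdf(0.0, k) + chi_pdf(hi, k)) * h / 2 + sum(chi_pdf(i * h, k) for i in range(1, n)) * h
print(area)
```

For $k = 2$ this reduces to the Rayleigh density $x e^{-x^2/2}$, so `chi_pdf(1.0, 2)` equals $e^{-1/2}$.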
Cumulative distribution function
The cumulative distribution function is given by:

$$F(x; k) = P(k/2, x^2/2),$$

where $P(k, x)$ is the regularized gamma function.
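One standard way to evaluate the regularized lower incomplete gamma function $P(s,x)$ is its power series $\gamma(s,x) = x^s e^{-x} \sum_{n \ge 0} x^n / \big(s(s+1)\cdots(s+n)\big)$. The sketch below (function names are mine) uses it to evaluate the chi CDF:

```python
import math

def reg_lower_gamma(s, x, terms=200):
    # Regularized lower incomplete gamma P(s, x) via the series
    # gamma(s, x) = x^s * e^-x * sum_{n>=0} x^n / (s * (s+1) * ... * (s+n))
    if x <= 0:
        return 0.0
    total, term = 0.0, 1.0 / s
    for n in range(terms):
        total += term
        term *= x / (s + n + 1)
    return total * x ** s * math.exp(-x) / math.gamma(s)

def chi_cdf(x, k):
    # F(x; k) = P(k/2, x^2 / 2)
    return reg_lower_gamma(k / 2, x * x / 2)

# For k = 2 the chi distribution is Rayleigh: F(x) = 1 - exp(-x^2/2)
print(chi_cdf(1.0, 2), 1 - math.exp(-0.5))
```

The fixed term count is adequate for moderate arguments; a production implementation would switch to a continued fraction for large $x$.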
Moment generating function
The moment generating function is given by:

$$M(t) = M\!\left(\frac{k}{2}, \frac{1}{2}, \frac{t^2}{2}\right) + t\sqrt{2}\,\frac{\Gamma((k+1)/2)}{\Gamma(k/2)}\, M\!\left(\frac{k+1}{2}, \frac{3}{2}, \frac{t^2}{2}\right),$$

where $M(a, b, z)$ is Kummer's confluent hypergeometric function.
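Kummer's function has the series $M(a,b,z) = \sum_n \frac{(a)_n}{(b)_n}\frac{z^n}{n!}$, which is easy to sketch directly (helper names are my own). A moment generating function must satisfy $M(0) = 1$ and $M'(0) = \mu$, which gives a cheap consistency check:

```python
import math

def kummer_m(a, b, z, terms=60):
    # Confluent hypergeometric M(a, b, z) = sum_n (a)_n / (b)_n * z^n / n!
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= (a + n) / (b + n) * z / (n + 1)
    return total

def chi_mgf(t, k):
    # M(t) = M(k/2, 1/2, t^2/2)
    #        + t * sqrt(2) * Gamma((k+1)/2)/Gamma(k/2) * M((k+1)/2, 3/2, t^2/2)
    c = math.sqrt(2) * math.gamma((k + 1) / 2) / math.gamma(k / 2)
    return kummer_m(k / 2, 0.5, t * t / 2) + t * c * kummer_m((k + 1) / 2, 1.5, t * t / 2)

# M(0) should be exactly 1, and M'(0) (central difference) should be the mean
h = 1e-5
mean_est = (chi_mgf(h, 3) - chi_mgf(-h, 3)) / (2 * h)
print(chi_mgf(0.0, 3), mean_est)
```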
Characteristic function
The characteristic function is given by:

$$\varphi(t; k) = M\!\left(\frac{k}{2}, \frac{1}{2}, \frac{-t^2}{2}\right) + it\sqrt{2}\,\frac{\Gamma((k+1)/2)}{\Gamma(k/2)}\, M\!\left(\frac{k+1}{2}, \frac{3}{2}, \frac{-t^2}{2}\right),$$

where again $M(a, b, z)$ is Kummer's confluent hypergeometric function.
Properties
Differential equation

The probability density function satisfies the differential equation

$$x f'(x) + \left(x^2 - k + 1\right) f(x) = 0.$$
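The chi density satisfies $x f'(x) + (x^2 - k + 1) f(x) = 0$, which follows by logarithmic differentiation of $f(x) \propto x^{k-1} e^{-x^2/2}$. A quick finite-difference check in Python (helper names are mine):

```python
import math

def chi_pdf(x, k):
    # Chi density f(x; k) = x^(k-1) * exp(-x^2/2) / (2^(k/2 - 1) * Gamma(k/2))
    return x ** (k - 1) * math.exp(-x * x / 2) / (2 ** (k / 2 - 1) * math.gamma(k / 2))

def ode_residual(x, k, h=1e-6):
    # Residual of x * f'(x) + (x^2 - k + 1) * f(x), with f' by central difference
    fprime = (chi_pdf(x + h, k) - chi_pdf(x - h, k)) / (2 * h)
    return x * fprime + (x * x - k + 1) * chi_pdf(x, k)

print(ode_residual(1.3, 4))  # residual is ~0 up to finite-difference error
```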
Moments
The raw moments are then given by:

$$\mu_j = 2^{j/2}\, \frac{\Gamma((k+j)/2)}{\Gamma(k/2)},$$

where $\Gamma(z)$ is the gamma function. The first few raw moments are:

$$\mu_1 = \sqrt{2}\, \frac{\Gamma((k+1)/2)}{\Gamma(k/2)}$$
$$\mu_2 = k$$
$$\mu_3 = 2\sqrt{2}\, \frac{\Gamma((k+3)/2)}{\Gamma(k/2)} = (k+1)\, \mu_1$$
$$\mu_4 = k(k+2)$$
$$\mu_5 = 4\sqrt{2}\, \frac{\Gamma((k+5)/2)}{\Gamma(k/2)} = (k+1)(k+3)\, \mu_1$$
$$\mu_6 = k(k+2)(k+4)$$

where the rightmost expressions are derived using the recurrence relationship for the gamma function:

$$\Gamma(x+1) = x\, \Gamma(x).$$
From these expressions we may derive the following relationships:
Mean:

$$\mu = \sqrt{2}\, \frac{\Gamma((k+1)/2)}{\Gamma(k/2)}$$

Variance:

$$\sigma^2 = k - \mu^2$$

Skewness:

$$\gamma_1 = \frac{\mu}{\sigma^3} \left(1 - 2\sigma^2\right)$$

Kurtosis excess:

$$\gamma_2 = \frac{2}{\sigma^2} \left(1 - \mu \sigma \gamma_1 - \sigma^2\right)$$
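These relationships can be verified numerically. The sketch below (function names are illustrative) computes the raw moments $\mu_j = 2^{j/2}\Gamma((k+j)/2)/\Gamma(k/2)$ and derives the mean, variance, skewness, and excess kurtosis from them:

```python
import math

def raw_moment(j, k):
    # mu_j = 2^(j/2) * Gamma((k + j)/2) / Gamma(k/2)
    return 2 ** (j / 2) * math.gamma((k + j) / 2) / math.gamma(k / 2)

def chi_stats(k):
    # Mean, variance, skewness, and excess kurtosis of the chi distribution
    mu = raw_moment(1, k)
    var = k - mu ** 2                      # sigma^2 = mu_2 - mu^2, with mu_2 = k
    sigma = math.sqrt(var)
    skew = mu / sigma ** 3 * (1 - 2 * var)
    kurt = 2 / var * (1 - mu * sigma * skew - var)
    return mu, var, skew, kurt

mu, var, skew, kurt = chi_stats(3)
print(mu, var, skew, kurt)
```

As a cross-check, the skewness formula agrees with the moment-based definition $\gamma_1 = (\mu_3 - 3\mu\sigma^2 - \mu^3)/\sigma^3$, and $\mu_2 = k$, $\mu_4 = k(k+2)$ fall out of the gamma recurrence.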
Entropy
The entropy is given by:

$$S = \ln \Gamma(k/2) + \frac{1}{2}\left(k - \ln 2 - (k-1)\, \psi_0(k/2)\right),$$

where $\psi_0(z)$ is the polygamma function of order 0 (the digamma function).
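A numerical sanity check of the entropy formula $S = \ln\Gamma(k/2) + \tfrac12\left(k - \ln 2 - (k-1)\psi_0(k/2)\right)$ (helper names are mine): approximate the digamma function by a central difference of `math.lgamma`, and compare against the defining integral $-\int f \ln f \, dx$:

```python
import math

def digamma(x, h=1e-5):
    # psi_0(x) approximated as the central difference of log-Gamma
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

def chi_entropy(k):
    # S = ln Gamma(k/2) + (1/2) * (k - ln 2 - (k - 1) * psi_0(k/2))
    return math.lgamma(k / 2) + 0.5 * (k - math.log(2) - (k - 1) * digamma(k / 2))

def chi_pdf(x, k):
    # Chi density, used for the direct numerical entropy integral
    return x ** (k - 1) * math.exp(-x * x / 2) / (2 ** (k / 2 - 1) * math.gamma(k / 2))

# Direct numerical evaluation of -integral of f * ln(f) over [0, 20]
k, n, hi = 3, 200_000, 20.0
step = hi / n
num = -sum(chi_pdf(x, k) * math.log(chi_pdf(x, k))
           for x in (i * step for i in range(1, n))) * step
print(chi_entropy(k), num)  # the two values should agree closely
```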