Support: $x\in [0,\infty )$
PDF: $f(x;\sigma )=\frac{\sqrt{2}}{\sigma\sqrt{\pi}}\exp\left(-\frac{x^{2}}{2\sigma^{2}}\right),\quad x>0$
CDF: $F(x;\sigma )=\operatorname{erf}\left(\frac{x}{\sigma\sqrt{2}}\right)$
Quantile: $Q(F;\sigma )=\sigma\sqrt{2}\,\operatorname{erf}^{-1}(F)$
Mean: $\frac{\sigma\sqrt{2}}{\sqrt{\pi}}$
In probability theory and statistics, the half-normal distribution is a special case of the folded normal distribution.
Let $X$ follow an ordinary normal distribution, $N(0,\sigma^{2})$. Then $Y=|X|$ follows a half-normal distribution. Thus, the half-normal distribution is a fold at the mean of an ordinary normal distribution with mean zero.
Properties
Using the $\sigma$ parametrization of the normal distribution, the probability density function (PDF) of the half-normal is given by

$f(y;\sigma)=\frac{\sqrt{2}}{\sigma\sqrt{\pi}}\exp\left(-\frac{y^{2}}{2\sigma^{2}}\right),\quad y\ge 0,$

where $E[Y]=\mu=\frac{\sigma\sqrt{2}}{\sqrt{\pi}}$.
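As a quick numerical check, the density above can be evaluated directly from the formula and compared against SciPy's `halfnorm` distribution, whose `scale` parameter plays the role of $\sigma$ (the use of SciPy here is illustrative, not part of the article):

```python
import math

from scipy.stats import halfnorm

def half_normal_pdf(x, sigma):
    """f(x; sigma) = sqrt(2)/(sigma*sqrt(pi)) * exp(-x^2 / (2 sigma^2))."""
    return math.sqrt(2.0) / (sigma * math.sqrt(math.pi)) * math.exp(-x**2 / (2.0 * sigma**2))

# sigma corresponds to SciPy's `scale` parameter for halfnorm
sigma = 1.5
for x in (0.0, 0.5, 1.0, 3.0):
    assert abs(half_normal_pdf(x, sigma) - halfnorm.pdf(x, scale=sigma)) < 1e-12
```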
Alternatively using a scaled precision (inverse of the variance) parametrization (to avoid issues if $\sigma$ is near zero), obtained by setting $\theta=\frac{\sqrt{\pi}}{\sigma\sqrt{2}}$, the probability density function is given by

$f(y;\theta)=\frac{2\theta}{\pi}\exp\left(-\frac{y^{2}\theta^{2}}{\pi}\right),\quad y\ge 0,$

where $E[Y]=\mu=\frac{1}{\theta}$.
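The two parametrizations describe the same density, which can be verified pointwise; a minimal sketch (function names are illustrative):

```python
import math

def pdf_sigma(x, sigma):
    # sigma parametrization: sqrt(2)/(sigma sqrt(pi)) * exp(-x^2/(2 sigma^2))
    return math.sqrt(2.0) / (sigma * math.sqrt(math.pi)) * math.exp(-x**2 / (2.0 * sigma**2))

def pdf_theta(x, theta):
    # scaled-precision parametrization: (2 theta / pi) * exp(-x^2 theta^2 / pi)
    return 2.0 * theta / math.pi * math.exp(-x**2 * theta**2 / math.pi)

sigma = 2.0
theta = math.sqrt(math.pi) / (sigma * math.sqrt(2.0))  # theta = sqrt(pi)/(sigma sqrt(2))
for x in (0.0, 0.7, 2.5):
    assert abs(pdf_sigma(x, sigma) - pdf_theta(x, theta)) < 1e-12

# mean in the theta parametrization is 1/theta, matching sigma*sqrt(2/pi)
assert abs(1.0 / theta - sigma * math.sqrt(2.0 / math.pi)) < 1e-12
```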
The cumulative distribution function (CDF) is given by

$F(y;\sigma)=\int_{0}^{y}\frac{1}{\sigma}\sqrt{\frac{2}{\pi}}\exp\left(-\frac{x^{2}}{2\sigma^{2}}\right)\,dx.$

Using the change-of-variables $z=x/(\sigma\sqrt{2})$, the CDF can be written as

$F(y;\sigma)=\frac{2}{\sqrt{\pi}}\int_{0}^{y/(\sigma\sqrt{2})}\exp\left(-z^{2}\right)\,dz=\operatorname{erf}\left(\frac{y}{\sigma\sqrt{2}}\right),$
where erf is the error function, a standard function in many mathematical software packages.
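Because erf is available in the Python standard library (`math.erf`), the CDF $F(y;\sigma)=\operatorname{erf}(y/(\sigma\sqrt{2}))$ needs no extra dependencies; a short sketch:

```python
import math

def half_normal_cdf(x, sigma):
    # F(x; sigma) = erf(x / (sigma * sqrt(2)))
    return math.erf(x / (sigma * math.sqrt(2.0)))

sigma = 1.0
assert half_normal_cdf(0.0, sigma) == 0.0
# |N(0,1)| falls below 1.96 with probability ~0.95
assert abs(half_normal_cdf(1.959964, sigma) - 0.95) < 1e-4
```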
The quantile function (or inverse CDF) is written:

$Q(F;\sigma)=\sigma\sqrt{2}\,\operatorname{erf}^{-1}(F),$

where $0\le F\le 1$ and $\operatorname{erf}^{-1}$ is the inverse error function.
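The quantile relation $Q(F;\sigma)=\sigma\sqrt{2}\,\operatorname{erf}^{-1}(F)$ can be checked by round-tripping through the CDF, using SciPy's `erfinv` (an illustrative choice, not part of the article):

```python
import math

from scipy.special import erfinv

def half_normal_quantile(F, sigma):
    # Q(F; sigma) = sigma * sqrt(2) * erfinv(F), for 0 <= F < 1
    return sigma * math.sqrt(2.0) * erfinv(F)

sigma = 1.0
# round-trip: CDF(Q(F)) should recover F
for F in (0.1, 0.5, 0.95):
    x = half_normal_quantile(F, sigma)
    assert abs(math.erf(x / (sigma * math.sqrt(2.0))) - F) < 1e-12
```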
The expectation is then given by

$E[Y]=\sigma\sqrt{\frac{2}{\pi}}.$

The variance is given by

$\operatorname{Var}(Y)=\sigma^{2}\left(1-\frac{2}{\pi}\right).$
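Both moments can be verified by Monte Carlo, folding a zero-mean normal sample as in the definition $Y=|X|$; a quick sketch using NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0

# Fold a zero-mean normal sample: Y = |X|, X ~ N(0, sigma^2)
y = np.abs(rng.normal(0.0, sigma, size=1_000_000))

mean_theory = sigma * np.sqrt(2.0 / np.pi)   # E[Y]
var_theory = sigma**2 * (1.0 - 2.0 / np.pi)  # Var(Y)

assert abs(y.mean() - mean_theory) < 0.01
assert abs(y.var() - var_theory) < 0.01
```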
Since this is proportional to the variance $\sigma^{2}$ of $X$, $\sigma$ can be seen as a scale parameter of the new distribution.
The differential entropy of the half-normal distribution is exactly one bit less than the entropy of a zero-mean normal distribution with the same second moment about 0. This can be understood intuitively since the magnitude operator reduces information by one bit (if the probability distribution at its input is even). Alternatively, since a half-normal distribution is always positive, the one bit it would take to record whether a standard normal random variable were positive (say, a 1) or negative (say, a 0) is no longer necessary. Thus,

$h(Y)=\frac{1}{2}\log_{2}\left(\frac{\pi e\sigma^{2}}{2}\right)\ \text{bits}.$
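The one-bit gap can be confirmed numerically: SciPy reports differential entropy in nats, so the difference between the normal and half-normal entropies at the same scale should be $\ln 2$ (this use of SciPy is an illustrative check, not part of the article):

```python
import math

from scipy.stats import halfnorm, norm

sigma = 3.0
# SciPy's entropy() returns differential entropy in nats; one bit = ln 2 nats
gap = float(norm(scale=sigma).entropy() - halfnorm(scale=sigma).entropy())
assert abs(gap - math.log(2.0)) < 1e-9
```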
The density functions satisfy the differential equations

$\sigma^{2}f'(y)+y\,f(y)=0,\qquad f(0)=\frac{\sqrt{2}}{\sigma\sqrt{\pi}},$

and

$\pi f'(y)+2\theta^{2}y\,f(y)=0,\qquad f(0)=\frac{2\theta}{\pi},$

in the $\sigma$ and $\theta$ parametrizations respectively.
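The first differential equation, $\sigma^{2}f'(y)+y\,f(y)=0$, can be checked with a central finite difference on the density; a minimal sketch:

```python
import math

def f(x, sigma):
    # half-normal density, sigma parametrization
    return math.sqrt(2.0) / (sigma * math.sqrt(math.pi)) * math.exp(-x**2 / (2.0 * sigma**2))

sigma = 1.3
h = 1e-6
for x in (0.5, 1.0, 2.0):
    fprime = (f(x + h, sigma) - f(x - h, sigma)) / (2.0 * h)  # central difference
    residual = sigma**2 * fprime + x * f(x, sigma)            # should vanish
    assert abs(residual) < 1e-6

# initial condition: f(0) = sqrt(2) / (sigma * sqrt(pi))
assert abs(f(0.0, sigma) - math.sqrt(2.0) / (sigma * math.sqrt(math.pi))) < 1e-15
```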
Parameter estimation
Given numbers $\{x_{i}\}_{i=1}^{n}$ drawn from a half-normal distribution, the unknown parameter $\sigma$ can be estimated by the method of maximum likelihood, giving

$\hat{\sigma}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}x_{i}^{2}}.$
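The maximum likelihood estimate $\hat{\sigma}=\sqrt{\tfrac{1}{n}\sum_i x_i^{2}}$ is a one-liner on a sample; a sketch using a synthetic half-normal draw (the sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_true = 2.5
x = np.abs(rng.normal(0.0, sigma_true, size=100_000))  # half-normal sample, Y = |X|

# maximum likelihood estimate: sigma_hat = sqrt( (1/n) * sum x_i^2 )
sigma_hat = np.sqrt(np.mean(x**2))

assert abs(sigma_hat - sigma_true) < 0.05
```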