Parameters: μ location (real), λ > 0 (real), α > 0 (real), β > 0 (real)
Support: x ∈ (−∞, ∞), τ ∈ (0, ∞)
PDF: f(x, τ | μ, λ, α, β) = (β^α √λ / (Γ(α) √(2π))) τ^(α − 1/2) e^(−βτ) e^(−λτ(x − μ)²/2)
Mean: E(X) = μ, E(T) = α/β
Mode: (μ, (α − 1/2)/β)
Variance: var(X) = β/(λ(α − 1)) for α > 1, var(T) = α/β²
In probability theory and statistics, the normal-gamma distribution (or Gaussian-gamma distribution) is a bivariate four-parameter family of continuous probability distributions. It is the conjugate prior of a normal distribution with unknown mean and precision.
Definition
For a pair of random variables, (X, T), suppose that the conditional distribution of X given T is given by

X | T = τ ~ N(μ, 1/(λτ)),
meaning that the conditional distribution is a normal distribution with mean μ and precision λτ, or equivalently with variance 1/(λτ).
Suppose also that the marginal distribution of T is given by

T ~ Gamma(α, β),

meaning that T has a gamma distribution with shape α and rate β. Here λ, α and β are parameters of the joint distribution.
Then (X, T) has a normal-gamma distribution, and this is denoted by

(X, T) ~ NormalGamma(μ, λ, α, β).
Probability density function
The joint probability density function of (X, T) is, for τ > 0,

f(x, τ | μ, λ, α, β) = (β^α √λ / (Γ(α) √(2π))) τ^(α − 1/2) e^(−βτ) e^(−λτ(x − μ)²/2).
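A minimal sketch of this density in Python, using only the standard library; the function name is illustrative, and the computation is done on the log scale for numerical stability:

```python
import math

def normal_gamma_pdf(x, tau, mu, lam, alpha, beta):
    """Joint density f(x, tau | mu, lambda, alpha, beta); zero for tau <= 0."""
    if tau <= 0:
        return 0.0
    # Log of the normalizing constant beta^alpha * sqrt(lambda) / (Gamma(alpha) * sqrt(2*pi)).
    log_const = (alpha * math.log(beta) + 0.5 * math.log(lam)
                 - math.lgamma(alpha) - 0.5 * math.log(2 * math.pi))
    log_pdf = (log_const + (alpha - 0.5) * math.log(tau)
               - beta * tau - 0.5 * lam * tau * (x - mu) ** 2)
    return math.exp(log_pdf)
```

By construction this equals the product of the conditional normal density N(x; μ, 1/(λτ)) and the gamma density Gamma(τ; α, β), which gives an easy numerical cross-check.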
Marginal distributions
By construction, the marginal distribution of τ is a gamma distribution, and the conditional distribution of x given τ is a normal distribution. The marginal distribution of x is a three-parameter non-standardized Student's t-distribution with ν = 2α degrees of freedom, location μ and scale √(β/(λα)).
Exponential family
The normal-gamma distribution is a four-parameter exponential family with natural parameters

(α − 1/2, −β − λμ²/2, λμ, −λ/2)

and natural statistics

(ln τ, τ, τx, τx²).
Moments of the natural statistics
The following moments can be easily computed using the moment generating function of the sufficient statistic:

E(ln T) = ψ(α) − ln β, where ψ(α) is the digamma function,
E(T) = α/β,
E(TX) = μ α/β,
E(TX²) = 1/λ + μ² α/β.
Scaling
If (X, T) ~ NormalGamma(μ, λ, α, β), then for any b > 0, (bX, bT) is distributed as NormalGamma(bμ, λ/b³, α, β/b). This follows because bT is gamma with shape α and rate β/b, while conditional on bT = s, bX is normal with mean bμ and variance b³/(λs).
Posterior distribution of the parameters
Assume that x is distributed according to a normal distribution with unknown mean μ and precision τ,

p(x | μ, τ) ∝ τ^(1/2) exp(−τ(x − μ)²/2),
and that the prior distribution on μ and τ, (μ, τ), has a normal-gamma distribution

(μ, τ) ~ NormalGamma(μ₀, λ₀, α₀, β₀),

for which the density π satisfies

π(μ, τ) ∝ τ^(α₀ − 1/2) exp(−β₀τ) exp(−λ₀τ(μ − μ₀)²/2).
Given a dataset X = (x₁, …, xₙ) of n independent and identically distributed samples from this normal distribution, the posterior distribution of μ and τ given X can be analytically determined by Bayes' theorem. Explicitly,

P(τ, μ | X) ∝ L(X | τ, μ) π(τ, μ),

where L is the likelihood of the data given the parameters.
Since the data are i.i.d., the likelihood of the entire dataset is equal to the product of the likelihoods of the individual data samples:

L(X | τ, μ) = ∏ᵢ₌₁ⁿ L(xᵢ | τ, μ) ∝ τ^(n/2) exp(−(τ/2) Σᵢ₌₁ⁿ (xᵢ − μ)²).
This expression can be simplified as follows:

Σᵢ (xᵢ − μ)² = Σᵢ (xᵢ − x̄ + x̄ − μ)² = Σᵢ (xᵢ − x̄)² + n(x̄ − μ)² = n(s + (x̄ − μ)²),

where s = (1/n) Σᵢ (xᵢ − x̄)² is the (biased) sample variance and x̄ = (1/n) Σᵢ xᵢ is the sample mean; the cross term vanishes because Σᵢ (xᵢ − x̄) = 0.
The posterior distribution of the parameters is proportional to the prior times the likelihood:

P(τ, μ | X) ∝ τ^(α₀ − 1/2) e^(−β₀τ) e^(−λ₀τ(μ − μ₀)²/2) × τ^(n/2) e^(−(τ/2)(n s + n(x̄ − μ)²))
= τ^(α₀ + n/2 − 1/2) e^(−τ(β₀ + n s/2)) e^(−(τ/2)(λ₀(μ − μ₀)² + n(x̄ − μ)²)).
The final exponential term is simplified by completing the square:

λ₀(μ − μ₀)² + n(x̄ − μ)² = (λ₀ + n)(μ − (λ₀μ₀ + n x̄)/(λ₀ + n))² + λ₀n(x̄ − μ₀)²/(λ₀ + n).
On inserting this back into the expression above,

P(τ, μ | X) ∝ τ^(α₀ + n/2 − 1/2) exp(−τ(β₀ + (1/2)(n s + λ₀n(x̄ − μ₀)²/(λ₀ + n)))) exp(−((λ₀ + n)τ/2)(μ − (λ₀μ₀ + n x̄)/(λ₀ + n))²).
This final expression is in exactly the same form as a normal-gamma density, i.e.,

P(τ, μ | X) = NormalGamma((λ₀μ₀ + n x̄)/(λ₀ + n), λ₀ + n, α₀ + n/2, β₀ + (1/2)(n s + λ₀n(x̄ − μ₀)²/(λ₀ + n))).
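The conjugate update can be sketched as a small Python helper; the function name and the list-based interface are illustrative, not part of any standard API:

```python
def posterior_update(mu0, lam0, alpha0, beta0, data):
    """Return the NormalGamma posterior hyperparameters given i.i.d. data."""
    n = len(data)
    xbar = sum(data) / n
    s = sum((x - xbar) ** 2 for x in data) / n  # biased sample variance
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    lam_n = lam0 + n
    alpha_n = alpha0 + n / 2
    beta_n = beta0 + 0.5 * (n * s + lam0 * n * (xbar - mu0) ** 2 / (lam0 + n))
    return mu_n, lam_n, alpha_n, beta_n
```

Because the family is conjugate, updating on the whole dataset at once agrees (up to floating-point rounding) with feeding in the observations one at a time, each posterior serving as the next prior.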
Interpretation of parameters
The interpretation of parameters in terms of pseudo-observations is as follows:

- The mean was estimated from λ pseudo-observations with sample mean μ.
- The precision was estimated from 2α pseudo-observations (possibly a different number, so that the variance of the mean and of the precision can be controlled separately) with sample mean μ and sum of squared deviations 2β.
- The posterior updates the numbers of pseudo-observations (λ₀ and 2α₀) by adding the number n of new observations to each.
- The new mean is a weighted average of the old pseudo-mean and the observed mean, weighted by the respective numbers of (pseudo-)observations.
- The new sum of squared deviations adds the prior and observed sums of squared deviations, plus an interaction term λ₀n(x̄ − μ₀)²/(λ₀ + n), which is needed because the two sums were computed relative to different means and their plain sum would underestimate the total squared deviation.
As a consequence, if one has a prior mean of μ₀ from n_μ samples and a prior precision of τ₀ from n_τ samples, the prior distribution over μ and τ is

(μ, τ) ~ NormalGamma(μ₀, n_μ, n_τ/2, n_τ/(2τ₀)),

and after observing n samples with mean x̄ and variance s, the posterior probability is

(μ, τ | x̄, s, n) ~ NormalGamma((n_μ μ₀ + n x̄)/(n_μ + n), n_μ + n, (n_τ + n)/2, (1/2)(n_τ/τ₀ + n s + n_μ n(x̄ − μ₀)²/(n_μ + n))).
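This correspondence can be illustrated with a tiny hypothetical helper (the name is an assumption) that builds the hyperparameters from pseudo-observation counts; by construction the prior mean of the precision, E(T) = α/β, recovers τ₀:

```python
def prior_from_pseudo_obs(mu0, n_mu, n_tau, tau0):
    """NormalGamma(mu, lambda, alpha, beta) hyperparameters for a prior mean
    mu0 backed by n_mu pseudo-observations and a prior precision tau0 backed
    by n_tau pseudo-observations."""
    return mu0, n_mu, n_tau / 2, n_tau / (2 * tau0)
```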
Note that in some programming languages, such as Matlab, the gamma distribution is implemented with the inverse definition of β (the scale rather than the rate), so the fourth argument of the normal-gamma distribution is 1/β.
Generating normal-gamma random variates
Generation of random variates is straightforward:
- Sample τ from a gamma distribution with parameters α and β.
- Sample x from a normal distribution with mean μ and variance 1/(λτ).
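These two steps can be sketched with the Python standard library (the function name is illustrative); note that `random.gammavariate` takes a shape and a *scale*, so the rate β enters as 1/β:

```python
import math
import random

def sample_normal_gamma(mu, lam, alpha, beta, rng=random):
    """Draw one (x, tau) pair from NormalGamma(mu, lam, alpha, beta)."""
    # gammavariate(shape, scale): pass scale = 1/beta for rate beta.
    tau = rng.gammavariate(alpha, 1.0 / beta)
    # Conditional on tau, x is normal with mean mu and variance 1/(lam * tau).
    x = rng.gauss(mu, 1.0 / math.sqrt(lam * tau))
    return x, tau
```

Sample moments from repeated draws should approach E(X) = μ and E(T) = α/β.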