
Sensitivity index


The sensitivity index or d' (pronounced 'dee-prime') is a statistic used in signal detection theory. It measures the separation between the means of the signal and noise distributions, relative to the spread of those distributions. For normally distributed signal and noise with means and standard deviations μ_S, σ_S and μ_N, σ_N, respectively, d' is defined as:

d' = (μ_S − μ_N) / √(½(σ_S² + σ_N²))
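The definition above translates directly into code. A minimal sketch (the function name is illustrative, not from the source):

```python
import math

def d_prime_from_params(mu_s, sigma_s, mu_n, sigma_n):
    """d' from the means and standard deviations of the signal
    and noise distributions: separation of the means divided by
    the root-mean-square of the two standard deviations."""
    return (mu_s - mu_n) / math.sqrt(0.5 * (sigma_s**2 + sigma_n**2))
```

For example, a signal distribution one unit above the noise distribution, with both standard deviations equal to 1, gives d' = 1.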

Note that by convention, d' assumes that the standard deviations for signal and noise are equal. An estimate of d' can also be found from measurements of the hit rate and false-alarm rate. It is calculated as:

d' = Z(hit rate) − Z(false-alarm rate)

where the function Z(p), p ∈ [0,1], is the inverse of the cumulative distribution function of the Gaussian distribution.
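The hit-rate estimate can be sketched with the standard library's inverse normal CDF (the function name here is illustrative):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Estimate d' as Z(hit rate) - Z(false-alarm rate), where Z is
    the inverse CDF of the standard Gaussian distribution."""
    z = NormalDist().inv_cdf  # Z(p) for the standard normal
    return z(hit_rate) - z(false_alarm_rate)
```

For instance, a hit rate of 0.8 with a false-alarm rate of 0.2 gives d' ≈ 1.68. Note that rates of exactly 0 or 1 make Z undefined, so in practice such values are usually adjusted slightly before applying the formula.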

d' is a dimensionless statistic. A higher d' indicates that the signal can be more readily detected.
