The signal-to-noise ratio (SNR) is used in imaging as a physical measure of the sensitivity of a (digital or film) imaging system. Industry standards express SNR in decibels (dB) by applying the 20 log rule to the "pure" SNR ratio (a ratio of 1:1 yields 0 dB, for instance), and in turn define sensitivity in terms of the ISO film speed equivalent: an SNR of 32.04 dB corresponds to excellent image quality and an SNR of 20 dB to acceptable image quality.
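As a worked check of the 20 log rule, the two decibel thresholds correspond to pure SNR ratios of 40:1 and 10:1 (a correspondence implied by, rather than stated in, the values above):

$$20\log_{10}(40) \approx 32.04\ \mathrm{dB} \qquad\text{and}\qquad 20\log_{10}(10) = 20\ \mathrm{dB}.$$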
Definition of SNR
Traditionally, SNR has been defined as the ratio of the average signal value $\mu_{\mathrm{sig}}$ to the standard deviation of the background $\sigma_{\mathrm{bg}}$:

$$\mathrm{SNR} = \frac{\mu_{\mathrm{sig}}}{\sigma_{\mathrm{bg}}}$$

However, when presented with a high-contrast scene, many imaging systems clamp the background to uniform black, forcing $\sigma_{\mathrm{bg}}$ toward zero and the ratio above toward infinity. In that case the SNR is better defined as the ratio of the average signal value to the standard deviation of the signal $\sigma_{\mathrm{sig}}$:

$$\mathrm{SNR} = \frac{\mu_{\mathrm{sig}}}{\sigma_{\mathrm{sig}}}$$

which gives a meaningful result in the presence of clamping.
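As an illustrative sketch (not taken from the source), both definitions can be computed directly from pixel samples of the two regions; the array values below are hypothetical:

```python
import numpy as np

# Hypothetical pixel samples taken from the signal and background regions.
signal = np.array([198.0, 201.0, 205.0, 199.0, 202.0])
background = np.array([10.0, 12.0, 9.0, 11.0, 8.0])

mu_sig = signal.mean()        # average signal value
sigma_bg = background.std()   # standard deviation of the background
sigma_sig = signal.std()      # standard deviation of the signal

snr_traditional = mu_sig / sigma_bg   # diverges if the background is clamped to a constant
snr_clamp_safe = mu_sig / sigma_sig   # stays finite even when the background is clamped

print(snr_traditional, snr_clamp_safe)
```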
Explanation
The line data are gathered from the arbitrarily defined signal and background regions and placed into an array (see the image at right). To calculate the average signal and background values, a second-order polynomial is fitted to the array of line data and subtracted from it; this removes any trends. Taking the mean of the detrended data over each region yields the average signal and background values. The net signal is the difference between the average signal and background values, and the RMS (root mean square) noise is defined from the background region. Finally, the SNR is determined as the ratio of the net signal to the RMS noise.
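A minimal sketch of this procedure, assuming the trend is removed with an ordinary least-squares second-order fit (numpy.polyfit) and treating the region indices as user-supplied; the function name and usage values are illustrative:

```python
import numpy as np

def snr_from_line_data(line, signal_idx, background_idx):
    """Estimate the SNR of a 1-D line profile, following the procedure
    described above: detrend with a second-order polynomial, average the
    signal and background regions, and divide the net signal by the RMS
    noise of the background."""
    x = np.arange(line.size)

    # Fit a second-order polynomial to the whole array of line data and
    # subtract it to remove any trends (assumed ordinary least squares).
    coeffs = np.polyfit(x, line, deg=2)
    detrended = line - np.polyval(coeffs, x)

    # Average signal and background values of the detrended data.
    mu_sig = detrended[signal_idx].mean()
    mu_bg = detrended[background_idx].mean()

    # Net signal: difference of the average signal and background values.
    net_signal = mu_sig - mu_bg

    # RMS noise, defined from the background region (taken here about the
    # background mean, i.e. its standard deviation -- an assumption).
    rms_noise = np.sqrt(np.mean((detrended[background_idx] - mu_bg) ** 2))

    return net_signal / rms_noise

# Hypothetical usage with made-up region indices:
# line = np.loadtxt("line_profile.txt")
# print(snr_from_line_data(line, np.arange(40, 60), np.arange(0, 20)))
```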
Polynomial and coefficients
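Assuming the fit is an ordinary least-squares fit over the pixel positions $x_i$ of the line data (the usual choice for this kind of detrending), the polynomial has the form

$$f(x) = a_0 + a_1 x + a_2 x^{2},$$

and the coefficients $a_0$, $a_1$, $a_2$ follow from the standard normal equations

$$\begin{pmatrix} N & \sum x_i & \sum x_i^{2} \\ \sum x_i & \sum x_i^{2} & \sum x_i^{3} \\ \sum x_i^{2} & \sum x_i^{3} & \sum x_i^{4} \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix} = \begin{pmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^{2} y_i \end{pmatrix},$$

where $y_i$ are the line-data values and $N$ is the number of points in the fitted region.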
Net signal, signal, and background
The second-order polynomial $f(x)$ is subtracted from the original line data to remove any trends, and the detrended data are then averaged over each region. This yields the average signal and background values:

$$\mu_{\mathrm{sig}} = \frac{1}{N_{\mathrm{sig}}} \sum_{x \in \mathrm{sig}} \left( \mathrm{data}(x) - f(x) \right), \qquad \mu_{\mathrm{bg}} = \frac{1}{N_{\mathrm{bg}}} \sum_{x \in \mathrm{bg}} \left( \mathrm{data}(x) - f(x) \right)$$

where $\mathrm{data}(x)$ is the original line data at pixel $x$, $f(x)$ is the fitted polynomial, and $N_{\mathrm{sig}}$ and $N_{\mathrm{bg}}$ are the numbers of pixels in the signal and background regions. Hence, the net signal value is

$$x_{\mathrm{net}} = \mu_{\mathrm{sig}} - \mu_{\mathrm{bg}}.$$
RMS noise and SNR
The RMS noise is defined from the detrended background region:

$$\sigma_{\mathrm{rms}} = \sqrt{ \frac{1}{N_{\mathrm{bg}}} \sum_{x \in \mathrm{bg}} \left( \mathrm{data}(x) - f(x) - \mu_{\mathrm{bg}} \right)^{2} }$$

The SNR is thus given by

$$\mathrm{SNR} = \frac{x_{\mathrm{net}}}{\sigma_{\mathrm{rms}}}$$

Using the industry standard 20 log rule, this is expressed in decibels as

$$\mathrm{SNR_{dB}} = 20 \log_{10} \left( \frac{x_{\mathrm{net}}}{\sigma_{\mathrm{rms}}} \right).$$
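For instance (a hypothetical sketch; the 32.04 dB and 20 dB thresholds are those quoted in the introduction), the conversion to decibels can be written as:

```python
import math

def snr_db(snr):
    """Convert a pure SNR ratio to decibels with the 20 log rule."""
    return 20 * math.log10(snr)

snr = 25.0           # hypothetical measured ratio (net signal / RMS noise)
print(snr_db(snr))   # about 27.96 dB: above the 20 dB "acceptable"
                     # threshold, below the 32.04 dB "excellent" one
```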