In signal processing, total variation denoising, also known as total variation regularization, is a noise removal process most often used in digital image processing. It is based on the principle that signals with excessive and possibly spurious detail have high total variation, that is, the integral of the absolute gradient of the signal is high. According to this principle, reducing the total variation of the signal, subject to it being a close match to the original signal, removes unwanted detail whilst preserving important details such as edges. The concept was pioneered by Rudin, Osher, and Fatemi in 1992 and is today known as the ROF model.
This noise removal technique has advantages over simple techniques such as linear smoothing or median filtering, which reduce noise but at the same time smooth away edges to a greater or lesser degree. By contrast, total variation denoising is remarkably effective at simultaneously preserving edges whilst smoothing away noise in flat regions, even at low signal-to-noise ratios.
Mathematical exposition for 1D digital signals
For a digital signal $y_n$, we can, for example, define the total variation as

$$V(y) = \sum_n |y_{n+1} - y_n|.$$

Given an input signal $x_n$, the goal of total variation denoising is to find an approximation, call it $y_n$, that has smaller total variation than $x_n$ but is "close" to $x_n$. One measure of closeness is the sum of square errors:

$$E(x, y) = \frac{1}{2} \sum_n (x_n - y_n)^2.$$

So the total variation denoising problem amounts to minimizing the following discrete functional over the signal $y_n$:

$$E(x, y) + \lambda V(y).$$

By differentiating this functional with respect to $y_n$, we can derive a corresponding Euler–Lagrange equation, which can be numerically integrated with the original signal $x_n$ as the initial condition. This was the original approach. Alternatively, since the functional is convex, techniques from convex optimization can be used to minimize it and find the solution $y_n$.
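As a concrete illustration of the convex optimization route, the following sketch (ours, not from the 1992 paper) minimizes the discrete functional above by projected gradient descent on its dual problem, a 1D analogue of Chambolle's projection algorithm; the name `tv_denoise_1d` and all parameter choices are our own:

```python
import numpy as np

def tv_denoise_1d(x, lam, n_iter=500):
    """Minimize 0.5 * sum((y - x)**2) + lam * sum(abs(diff(y)))
    by projected gradient descent on the dual problem."""
    x = np.asarray(x, dtype=float)

    def Dt(u):
        # Adjoint of the forward-difference operator D, where D y = diff(y).
        return np.concatenate(([-u[0]], u[:-1] - u[1:], [u[-1]]))

    u = np.zeros(x.size - 1)           # one dual variable per difference
    for _ in range(n_iter):
        y = x - Dt(u)                  # primal estimate implied by u
        # Dual gradient step (step 1/4 <= 1/||D D^T||), then project
        # each dual entry back onto the feasible interval [-lam, lam].
        u = np.clip(u + 0.25 * np.diff(y), -lam, lam)
    return x - Dt(u)

# Example: a noisy step signal; the edge survives, the noise does not.
rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0], 100)
noisy = clean + 0.1 * rng.standard_normal(clean.size)
denoised = tv_denoise_1d(noisy, lam=1.0)
```

Note that with `lam = 0` the dual variable stays at zero and the input is returned unchanged, matching the limiting behaviour discussed below.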
Regularization properties
The regularization parameter $\lambda$ plays a critical role in the denoising process. When $\lambda = 0$, there is no smoothing and the result is the same as minimizing the sum of squares. As $\lambda \to \infty$, however, the total variation term plays an increasingly strong role, forcing the result to have smaller total variation at the expense of being less like the input (noisy) signal. Thus, the choice of regularization parameter is critical to achieving just the right amount of noise removal.
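Both limits can be read off directly from the discrete functional defined above; the following short check is ours:

$$\lambda = 0: \quad \min_y \frac{1}{2} \sum_n (x_n - y_n)^2 \;\Rightarrow\; y_n = x_n \quad \text{(the noisy input is returned unchanged)},$$

$$\lambda \to \infty: \quad V(y) \to 0 \;\Rightarrow\; y_n = c \text{ for all } n, \quad \min_c \frac{1}{2} \sum_n (x_n - c)^2 \;\Rightarrow\; c = \frac{1}{N} \sum_n x_n,$$

so in the latter limit the entire signal collapses to its mean.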
2D digital signals
We now consider 2D signals $y$, such as images. The total variation norm proposed by the 1992 paper is

$$V(y) = \sum_{i,j} \sqrt{|y_{i+1,j} - y_{i,j}|^2 + |y_{i,j+1} - y_{i,j}|^2}$$

and is isotropic and not differentiable. A variation that is sometimes used, since it may sometimes be easier to minimize, is an anisotropic version:

$$V_{\text{aniso}}(y) = \sum_{i,j} \left( |y_{i+1,j} - y_{i,j}| + |y_{i,j+1} - y_{i,j}| \right).$$

The standard total variation denoising problem is still of the form

$$\min_y \left[ E(x, y) + \lambda V(y) \right],$$
where $E$ is the 2D $L_2$ norm. In contrast to the 1D case, solving this minimization is non-trivial; one algorithm that solves it is the primal-dual method.
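To make the two norms concrete, here is a small NumPy sketch (ours; both function names are hypothetical) that evaluates them on an image, summing only over pixel pairs where both neighbours exist:

```python
import numpy as np

def tv_isotropic(y):
    """Isotropic discrete TV: sum of local gradient magnitudes."""
    dx = np.diff(y, axis=0)[:, :-1]    # y[i+1, j] - y[i, j], cropped
    dy = np.diff(y, axis=1)[:-1, :]    # y[i, j+1] - y[i, j], cropped
    return np.sum(np.sqrt(dx**2 + dy**2))

def tv_anisotropic(y):
    """Anisotropic discrete TV: L1 norms of both difference fields."""
    return np.abs(np.diff(y, axis=0)).sum() + np.abs(np.diff(y, axis=1)).sum()

# A sharp edge has low TV; the same pixel values scrambled do not.
edge = np.repeat([[0.0, 1.0]], 8, axis=0).repeat(4, axis=1)
rng = np.random.default_rng(0)
scrambled = rng.permutation(edge.ravel()).reshape(edge.shape)
print(tv_isotropic(edge), tv_isotropic(scrambled))
```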
Due in part to much research in compressed sensing in the mid-2000s, there are many algorithms, such as the split-Bregman method, that solve variants of this problem.
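In practice, ready-made solvers for the 2D problem are available; for example, scikit-image ships both a Chambolle-type solver and a split-Bregman solver (the test image, noise level, and weights below are only illustrative):

```python
import numpy as np
from skimage import data
from skimage.restoration import denoise_tv_chambolle, denoise_tv_bregman

# A standard test image with synthetic Gaussian noise, scaled to [0, 1].
rng = np.random.default_rng(0)
image = data.camera() / 255.0
noisy = np.clip(image + 0.1 * rng.standard_normal(image.shape), 0.0, 1.0)

# Chambolle's projection algorithm; a larger weight means more smoothing.
smooth_chambolle = denoise_tv_chambolle(noisy, weight=0.1)

# Split Bregman solver; isotropic=True selects the isotropic TV norm.
smooth_bregman = denoise_tv_bregman(noisy, weight=10.0, isotropic=True)
```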