Bayes error rate

In statistical classification, the Bayes error rate is the lowest possible error rate for any classifier of a random outcome (into, for example, one of two categories) and is analogous to the irreducible error.

A number of approaches to the estimation of the Bayes error rate exist. One method seeks to obtain analytical bounds which are inherently dependent on distribution parameters, and hence difficult to estimate. Another approach focuses on class densities, while yet another method combines and compares various classifiers.
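
One way to make the density-based approach concrete is a plug-in estimate: the class-conditional densities are estimated from labelled samples, converted to posteriors with Bayes' rule, and the error of the resulting rule is approximated by averaging over points drawn from the data distribution. The sketch below is only illustrative; the two-Gaussian data, the equal priors, and the choice of a Gaussian kernel density estimator are assumptions, not part of the original text. For this assumed setup the true Bayes error is about 0.16, so the printed estimate can be checked against it.

# A minimal sketch of a density-based plug-in estimate of the Bayes error.
# The data, priors, and use of a Gaussian KDE are illustrative assumptions.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
priors = np.array([0.5, 0.5])

# Assumed training samples, one array per class (two unit-variance Gaussians).
train = [rng.normal(0.0, 1.0, size=2000), rng.normal(2.0, 1.0, size=2000)]
kdes = [gaussian_kde(s) for s in train]                          # p_hat(x | C_i)

# Fresh points drawn from the same equal-prior mixture for the Monte Carlo average.
x = np.concatenate([rng.normal(0.0, 1.0, 2500), rng.normal(2.0, 1.0, 2500)])

joint = np.vstack([p * kde(x) for p, kde in zip(priors, kdes)])  # P(C_i) p_hat(x | C_i)
posterior = joint / joint.sum(axis=0)                            # P_hat(C_i | x)

# A classifier built from the estimated posteriors errs at x with probability
# 1 - max_i P_hat(C_i | x); averaging over x estimates the Bayes error.
print(f"plug-in Bayes error estimate: {np.mean(1.0 - posterior.max(axis=0)):.3f}")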

The Bayes error rate finds important use in the study of pattern recognition and machine learning techniques.

Error determination

In machine learning and pattern classification, each of a set of random observations belongs to one of two or more classes. Each observation is called an instance, and the class it belongs to is its label. The Bayes error rate of the data distribution is the probability that an instance is misclassified by a classifier that knows the true class probabilities given the predictors. For a multiclass classifier, the Bayes error rate may be calculated as follows:

p = \int_{x \in H_i} \sum_{C_i \neq C_{\mathrm{max},x}} P(C_i \mid x)\, p(x)\, \mathrm{d}x

where x is an instance, C_i is a class into which an instance is classified, H_i is the region that the classifier function h assigns to class C_i, and C_max,x denotes the class with the largest posterior probability P(C_i | x) at x.
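
As a sanity check, the formula can be evaluated numerically when the true distributions are known. The sketch below assumes two unit-variance Gaussian class-conditional densities with equal priors (choices made only for illustration) and takes h to be the Bayes-optimal classifier, so each H_i is the set of points where C_i has the largest posterior; the result can be compared with the exact value for this setup, Phi(-1) ≈ 0.159.

# Numerical evaluation of the formula above for an assumed two-Gaussian problem.
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

priors = np.array([0.5, 0.5])
densities = [norm(loc=0.0, scale=1.0), norm(loc=2.0, scale=1.0)]   # p(x | C_i)

grid = np.linspace(-8.0, 10.0, 4001)
cond = np.vstack([d.pdf(grid) for d in densities])     # p(x | C_i)
joint = priors[:, None] * cond                         # P(C_i) p(x | C_i)
marginal = joint.sum(axis=0)                           # p(x)
posterior = joint / marginal                           # P(C_i | x)

# Bayes-optimal regions: H_i is where class i has the largest posterior.
h = posterior.argmax(axis=0)

# On each region, integrate the posterior mass of the classes other than
# the predicted one, weighted by p(x), then sum over regions.
bayes_error = 0.0
for i in range(len(priors)):
    in_region = h == i
    others = 1.0 - posterior[i, in_region]             # sum of the other posteriors
    bayes_error += trapezoid(others * marginal[in_region], grid[in_region])

print(f"numerical Bayes error: {bayes_error:.4f}")     # exact value: Phi(-1) ~ 0.1587

The same number could be obtained more compactly as the integral of p(x) minus max_i P(C_i | x) p(x); the loop is written to mirror the region-based form of the formula.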

The Bayes error is non-zero if the classification labels are not deterministic, i.e., there is a non-zero probability of a given instance belonging to more than one class.
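
As a concrete illustration with assumed numbers: if two classes have the fixed posteriors P(C_1 | x) = 0.7 and P(C_2 | x) = 0.3 at every x, the Bayes-optimal classifier always predicts C_1 and still errs on the 30% of instances that actually belong to C_2. For the Bayes-optimal classifier the formula above reduces to the expected posterior mass of the non-predicted classes:

p = \int \bigl(1 - \max_i P(C_i \mid x)\bigr)\, p(x)\, \mathrm{d}x = \int (1 - 0.7)\, p(x)\, \mathrm{d}x = 0.3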
