In machine learning, one-class classification, also known as unary classification, tries to identify objects of a specific class amongst all objects, by learning from a training set containing only the objects of that class. This is different from and more difficult than the traditional classification problem, which tries to distinguish between two or more classes with the training set containing objects from all the classes. An example is the classification of the operational status of a nuclear plant as 'normal': in this scenario, there are few, if any, examples of catastrophic system states; only the statistics of normal operation are known.

The term one-class classification was coined by Moya and Hush (1996), and many applications can be found in the scientific literature, for example outlier detection, anomaly detection, and novelty detection. A distinguishing feature of one-class classification is that it uses only samples from the target class, so a representative sample of the non-target classes is not strictly required.
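The idea can be illustrated with a minimal sketch: fit a probabilistic model to the "normal" class alone and flag points that the model finds too unlikely. The Gaussian model, the 95% quantile threshold, and the example data below are illustrative assumptions, not a prescribed method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training set contains ONLY the target ("normal") class.
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 2))

# Fit a Gaussian model of the normal class: mean and covariance.
mean = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def mahalanobis_sq(x):
    """Squared Mahalanobis distance from the fitted normal model."""
    d = np.asarray(x) - mean
    return np.einsum('...i,ij,...j->...', d, cov_inv, d)

# Threshold chosen (arbitrarily) so ~5% of training data would be flagged.
threshold = np.quantile(mahalanobis_sq(normal), 0.95)

def is_normal(x):
    return bool(mahalanobis_sq(x) <= threshold)

print(is_normal([0.1, -0.2]))  # near the training data: True
print(is_normal([8.0, 8.0]))   # far from anything seen: False
```

Note that no examples of the anomalous class were needed to build the decision rule; only the statistics of normal operation are used, matching the nuclear-plant scenario above.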
A similar problem is PU learning, in which a binary classifier is learned in a semi-supervised way from only positive and unlabeled samples.
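One common family of PU-learning approaches proceeds in two steps: first extract "reliable negatives" from the unlabeled set, then train an ordinary binary classifier on the positives and those reliable negatives. The sketch below assumes synthetic two-cluster data, a distance-quantile rule for picking reliable negatives, and a nearest-centroid classifier; all of these are illustrative choices, not a standard algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: true positives cluster near (2, 2),
# true negatives near (-2, -2). Only some positives are labeled.
pos = rng.normal(loc=2.0, size=(100, 2))  # labeled positive set
unlabeled = np.vstack([rng.normal(loc=2.0, size=(100, 2)),
                       rng.normal(loc=-2.0, size=(100, 2))])

# Step 1: treat the unlabeled points farthest from the positive
# centroid as "reliable negatives" (top 30% by distance, arbitrarily).
pos_centroid = pos.mean(axis=0)
dist = np.linalg.norm(unlabeled - pos_centroid, axis=1)
reliable_neg = unlabeled[dist >= np.quantile(dist, 0.7)]

# Step 2: train a binary classifier (here nearest-centroid) on the
# positives and the extracted reliable negatives.
neg_centroid = reliable_neg.mean(axis=0)

def predict_positive(x):
    x = np.asarray(x)
    return bool(np.linalg.norm(x - pos_centroid)
                < np.linalg.norm(x - neg_centroid))

print(predict_positive([2.5, 1.8]))    # near the positive cluster: True
print(predict_positive([-2.0, -2.2]))  # near the negative cluster: False
```

The semi-supervised character shows up in step 1: the unlabeled set is mined for structure rather than discarded or naively treated as all-negative.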
PU learning
In PU learning, two sets of examples are assumed to be available for training: the positive set