A probabilistic metric space is a generalization of metric spaces in which the distance between two points is no longer a nonnegative real number but a probability distribution function.
Let D+ be the set of all probability distribution functions F such that F(0) = 0 (F is a nondecreasing, left-continuous mapping from R into [0, 1] with sup{F(x) : x ∈ R} = 1).
The ordered pair (S, F) is said to be a probabilistic metric space if S is a nonempty set and F: S×S → D+ (F(p, q) is denoted by Fp,q for every (p, q) ∈ S × S) satisfies the following conditions:

1. Fu,v(x) = 1 for all x > 0 if and only if u = v (u, v ∈ S);
2. Fu,v = Fv,u for all u, v ∈ S;
3. if Fu,v(x) = 1 and Fv,w(y) = 1, then Fu,w(x + y) = 1 for all u, v, w ∈ S and x, y > 0.
Probability metric of random variables
A probability metric D between two random variables X and Y may be defined, for example, as

D(X, Y) = ∫∫ |x − y| F(x, y) dx dy
where F(x, y) denotes the joint probability density function of the random variables X and Y. Obviously, if X and Y are independent of each other, the equation above reduces to

D(X, Y) = ∫∫ |x − y| f(x) g(y) dx dy
where f(x) and g(y) are the probability density functions of X and Y, respectively.
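As a rough numerical check of the independent-variable formula above, the double integral can be approximated by a midpoint Riemann sum. The function names, grid bounds, and step count below are illustrative choices, not part of the definition; the variables are taken to be N(0, 1) and N(1, 1).

```python
import math

def normal_pdf(x, mu, sigma):
    # Density of a normal distribution N(mu, sigma^2).
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def probability_metric(f, g, lo=-10.0, hi=10.0, n=400):
    # Midpoint-rule approximation of D(X, Y) = integral of |x - y| f(x) g(y) dx dy
    # for independent X and Y, truncated to the square [lo, hi] x [lo, hi].
    h = (hi - lo) / n
    xs = [lo + (i + 0.5) * h for i in range(n)]
    return sum(abs(x - y) * f(x) * g(y) for x in xs for y in xs) * h * h

f = lambda x: normal_pdf(x, 0.0, 1.0)   # X ~ N(0, 1)
g = lambda y: normal_pdf(y, 1.0, 1.0)   # Y ~ N(1, 1)
print(probability_metric(f, g))          # ≈ 1.3993, i.e. E|X - Y| for these variables
```

For independent variables, D(X, Y) is exactly E|X − Y|; here X − Y is N(−1, 2), whose mean absolute value is about 1.3993, so the sum lands close to that value.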
One may easily show that such probability metrics do not in general satisfy the identity of indiscernibles (the first metric axiom); they satisfy it if, and only if, both of the arguments X, Y are certain events described by Dirac delta probability density functions. In this case

D(X, Y) = ∫∫ |x − y| δ(x − μx) δ(y − μy) dx dy = |μx − μy|

and the probability metric simply reduces to the metric between the expected values μx, μy of the variables X and Y.
For all other random variables X, Y the probability metric does not satisfy the identity of indiscernibles condition required of a metric; that is,

D(X, X) > 0.
Example
For example, if both probability distribution functions of random variables X and Y are normal distributions having the same standard deviation σ, integrating D(X, Y) yields

DNN(X, Y) = μxy + (2σ/√π) exp(−μxy² / (4σ²)) − μxy erfc(μxy / (2σ))

where

μxy = |μx − μy|,

and erfc(·) is the complementary error function.
In this case:

DNN(X, X) = 2σ/√π > 0.
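The closed form above can be checked directly; the helper name below is an illustrative choice. Setting μx = μy confirms that DNN(X, X) equals 2σ/√π rather than zero, and for distinct means the formula agrees with E|X − Y|.

```python
import math

def d_nn(mu_x, mu_y, sigma):
    # Closed-form DNN(X, Y) for two normal variables with means mu_x, mu_y
    # and the same standard deviation sigma.
    m = abs(mu_x - mu_y)                                    # mu_xy = |mu_x - mu_y|
    return (m
            + (2 * sigma / math.sqrt(math.pi)) * math.exp(-m ** 2 / (4 * sigma ** 2))
            - m * math.erfc(m / (2 * sigma)))

# Identity of indiscernibles fails: DNN(X, X) = 2*sigma/sqrt(pi) > 0.
print(d_nn(2.0, 2.0, 1.5))   # ≈ 1.6926 = 2*1.5/sqrt(pi)

# For X ~ N(0, 1), Y ~ N(1, 1): agrees with E|X - Y| ≈ 1.3993.
print(d_nn(0.0, 1.0, 1.0))
```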
Probability metric of random vectors
The probability metric of random variables may be extended to a metric D(X, Y) of random vectors X, Y by substituting |x − y| with any metric operator d(x, y):

D(X, Y) = ∫∫ d(x, y) F(x, y) dx dy
where F(x, y) denotes the joint probability density function of the random vectors X and Y. For example, substituting d(x, y) with the Euclidean metric and assuming the vectors X and Y to be mutually independent yields

D(X, Y) = ∫∫ √( Σi (xi − yi)² ) f(x) g(y) dx dy

where f and g are the probability density functions of X and Y, respectively.
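For independent vectors this integral is just the expectation E[d(X, Y)], so it can be sketched with a Monte Carlo estimate. The function name, dimension, sample count, and standard-normal components below are illustrative assumptions.

```python
import math
import random

def euclidean(x, y):
    # Euclidean metric d(x, y) on R^n.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def vector_metric_mc(dim=2, samples=200_000, seed=0):
    # Monte Carlo estimate of D(X, Y) = E[d(X, Y)] for independent random
    # vectors X, Y whose components are i.i.d. N(0, 1).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        x = [rng.gauss(0.0, 1.0) for _ in range(dim)]   # draw X
        y = [rng.gauss(0.0, 1.0) for _ in range(dim)]   # draw Y, independent of X
        total += euclidean(x, y)
    return total / samples

# For dim = 2, X - Y has i.i.d. N(0, 2) components, so ||X - Y|| is Rayleigh
# distributed with mean sqrt(pi) ≈ 1.772; the estimate should land near that.
print(vector_metric_mc())
```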