Renewal theory is the branch of probability theory that generalizes the Poisson process to arbitrary holding times. Applications include calculating the best strategy for replacing worn-out machinery in a factory (example below) and comparing the long-term benefits of different insurance policies.
Introduction
A renewal process is a generalization of the Poisson process. In essence, the Poisson process is a continuous-time Markov process on the positive integers (usually starting at zero) which has independent identically distributed holding times (exponentially distributed with a common rate) at each integer $i$ before advancing to the next integer, $i+1$. In a renewal process, the holding times need not have an exponential distribution; rather, they may have any distribution on the positive numbers, so long as the holding times are independent and identically distributed (IID) and have finite mean.
Formal definition
Let $S_1, S_2, S_3, \ldots$ be a sequence of positive independent identically distributed random variables such that $0 < \mathbb{E}[S_i] < \infty$.
We refer to the random variable $S_i$ as the "$i$-th holding time".
Define for each $n > 0$:
$$J_n = \sum_{i=1}^{n} S_i,$$
each $J_n$ being called the "$n$-th jump time", with the intervals $[J_n, J_{n+1}]$
being called renewal intervals.
Then the random variable $(X_t)_{t \geq 0}$ given by
$$X_t = \sum_{n=1}^{\infty} \mathbb{I}_{\{J_n \leq t\}} = \sup\{\, n : J_n \leq t \,\}$$
represents the number of jumps that have occurred by time $t$ and is called a renewal process,
where $\mathbb{I}_{\{J_n \leq t\}}$ is the indicator function of the event $\{J_n \leq t\}$.
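For illustration only, the following Python sketch simulates a renewal process by drawing IID holding times and counting the jump times that fall at or before a horizon $t$. The function name `renewal_count` and the uniform holding-time distribution are arbitrary choices, not part of the definition above.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def renewal_count(t, sample_holding_times, block=1000):
    """Return X_t, the number of jump times J_n = S_1 + ... + S_n with J_n <= t."""
    offset, count = 0.0, 0
    while True:
        s = sample_holding_times(block)          # draw a block of IID holding times
        jumps = offset + np.cumsum(s)            # partial sums are the jump times J_n
        if jumps[-1] > t:
            return count + int(np.searchsorted(jumps, t, side="right"))
        offset, count = jumps[-1], count + block

# Illustration: holding times uniform on (0, 2), so E[S_i] = 1.
sample = lambda n: rng.uniform(0.0, 2.0, size=n)
print(renewal_count(100.0, sample))              # number of renewals by time t = 100
```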
Interpretation
If one considers events occurring at random times, one may choose to think of the holding times $\{S_i : i \geq 1\}$ as the random times elapsed between consecutive events. For example, if the renewal process counts the successive breakdowns of a machine, then $S_i$ is the time between the $(i-1)$-th and the $i$-th breakdown, and $X_t$ is the number of breakdowns that have occurred by time $t$.
Renewal-reward processes
Let $W_1, W_2, \ldots$ be a sequence of IID random variables (rewards) satisfying $\mathbb{E}|W_i| < \infty$.
Then the random variable
$$Y_t = \sum_{i=1}^{X_t} W_i$$
is called a renewal-reward process. Note that unlike the holding times $S_i$, each $W_i$ may take negative values as well as positive values.
The random variable $Y_t$ is the total reward accumulated by time $t$. It depends on both the holding times $S_1, S_2, \ldots$ and the rewards $W_1, W_2, \ldots$; the two sequences need not be independent, and in particular $W_i$ may be a function of $S_i$.
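As a small illustrative sketch (not part of the definition), the following Python code builds $Y_t$ pathwise and lets each reward depend on its holding time; the exponential holding times and the cost function are arbitrary choices for this example.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def renewal_reward(t, sample_holding_time, reward):
    """Return (X_t, Y_t): the jump count and the total reward accumulated by time t."""
    elapsed, x_t, y_t = 0.0, 0, 0.0
    while True:
        s = sample_holding_time()
        if elapsed + s > t:                      # the next jump would land after t: stop
            return x_t, y_t
        elapsed += s
        x_t += 1
        y_t += reward(s)                         # W_i is allowed to depend on S_i

# Illustration: exponential holding times with mean 2; each renewal costs 100 + 50*S_i.
x, y = renewal_reward(10_000.0, lambda: rng.exponential(2.0), lambda s: -(100.0 + 50.0 * s))
print(x, y / 10_000.0)                           # Y_t / t is near E[W_1]/E[S_1] = -100
```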
Interpretation
In the context of the above interpretation of the holding times as the times between successive malfunctions of a machine, the "rewards" $W_1, W_2, \ldots$ (which in this case happen to be negative) may be viewed as the successive repair costs incurred as a result of the successive malfunctions.
An alternative analogy is that we have a magic goose which lays eggs at intervals (holding times) distributed as $S_i$. Sometimes it lays golden eggs of random weight, and sometimes it lays toxic eggs (also of random weight) which require responsible (and costly) disposal. The "rewards" $W_i$ are the successive (random) financial losses/gains resulting from the successive eggs ($i = 1, 2, 3, \ldots$), and $Y_t$ records the total financial "reward" at time $t$.
Properties of renewal processes and renewal-reward processes
We define the renewal function as the expected value of the number of jumps observed up to some time $t$:
$$m(t) = \mathbb{E}[X_t].$$
The elementary renewal theorem
The renewal function satisfies
$$\lim_{t \to \infty} \frac{1}{t} m(t) = \frac{1}{\mathbb{E}[S_1]}.$$
Proof
The strong law of large numbers for renewal processes (see the asymptotic properties below) tells us that
$$\lim_{t \to \infty} \frac{X_t}{t} = \frac{1}{\mathbb{E}[S_1]} \quad \text{(almost surely)}.$$
To prove the elementary renewal theorem, it is therefore sufficient to show that the family $\left(\frac{X_t}{t}\right)_{t > 0}$ is uniformly integrable, since uniform integrability together with the almost sure convergence above implies convergence of the expectations $\mathbb{E}\!\left[\frac{X_t}{t}\right] = \frac{m(t)}{t}$.
To do this, consider a truncated renewal process whose holding times are defined by $\overline{S}_n = a\,\mathbb{I}\{S_n > a\}$, where $a$ is chosen so that $0 < F_S(a) = p < 1$ (such an $a$ exists for every non-deterministic renewal process). Since $\overline{S}_n \leq S_n$, the truncated process $\overline{X}_t$ dominates $X_t$, its renewals can only occur on the lattice $\{na : n \in \mathbb{N}\}$, and the number of renewals at each lattice point is geometrically distributed. This yields a bound of the form $\mathbb{E}\bigl[\overline{X}_t^{\,2}\bigr] \leq Ct^2$ for all sufficiently large $t$, so $\left(\frac{X_t}{t}\right)_{t > 0}$ is bounded in $L^2$ and hence uniformly integrable.
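A quick numerical check of the theorem, as a sketch only: the holding-time distribution (Gamma with mean 3) and the helper name `estimate_renewal_function` below are arbitrary choices, and the renewal function is estimated by Monte Carlo rather than computed exactly.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

def estimate_renewal_function(t, sampler, runs=2000):
    """Monte Carlo estimate of the renewal function m(t) = E[X_t]."""
    counts = np.empty(runs)
    for r in range(runs):
        elapsed, n = 0.0, 0
        while True:
            elapsed += sampler()
            if elapsed > t:
                break
            n += 1
        counts[r] = n
    return counts.mean()

# Illustration: Gamma(2, 1.5) holding times, so E[S_1] = 3 and m(t)/t should approach 1/3.
sampler = lambda: rng.gamma(2.0, 1.5)
for t in (10.0, 50.0, 200.0):
    print(t, estimate_renewal_function(t, sampler) / t)
```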
The elementary renewal theorem for renewal reward processes
We define the reward function:
$$g(t) = \mathbb{E}[Y_t].$$
The reward function satisfies
$$\lim_{t \to \infty} \frac{1}{t} g(t) = \frac{\mathbb{E}[W_1]}{\mathbb{E}[S_1]}.$$
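The analogous check for the reward version, again only as an illustrative sketch: the uniform holding times, the reward $W_i = 5 S_i$, and the helper name `estimate_reward_rate` are assumptions of this example, not part of the theorem.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

def estimate_reward_rate(t, runs=2000):
    """Monte Carlo estimate of g(t)/t = E[Y_t]/t for one illustrative model."""
    totals = np.empty(runs)
    for r in range(runs):
        elapsed, y = 0.0, 0.0
        while True:
            s = rng.uniform(0.0, 2.0)            # holding time, E[S_1] = 1
            if elapsed + s > t:
                break
            elapsed += s
            y += 5.0 * s                         # reward W_i = 5 S_i, so E[W_1] = 5
        totals[r] = y
    return totals.mean() / t

print(estimate_reward_rate(200.0))               # should be close to E[W_1]/E[S_1] = 5
```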
The renewal equation
The renewal function satisfies
$$m(t) = F_S(t) + \int_0^t m(t - s)\, f_S(s)\, ds,$$
where $F_S$ is the cumulative distribution function of $S_1$ and $f_S$ is the corresponding probability density function.
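The renewal equation can be solved numerically by discretizing the integral on a grid. The sketch below (names such as `solve_renewal_equation` and the grid size are arbitrary) checks the scheme against the one case with a simple closed form: exponential holding times give a Poisson process, for which $m(t) = \lambda t$ exactly.

```python
import numpy as np

def solve_renewal_equation(F, f, t_max, n=2000):
    """Discretize m(t) = F(t) + int_0^t m(t - s) f(s) ds on a uniform grid."""
    h = t_max / n
    t = np.linspace(0.0, t_max, n + 1)
    m = np.zeros(n + 1)
    for i in range(1, n + 1):
        s = t[1:i + 1]                                # s = h, 2h, ..., i*h
        m[i] = F(t[i]) + h * np.sum(m[i - np.arange(1, i + 1)] * f(s))
    return t, m

lam = 0.5                                             # exponential holding times
t, m = solve_renewal_equation(lambda u: 1.0 - np.exp(-lam * u),
                              lambda u: lam * np.exp(-lam * u), t_max=10.0)
print(m[-1])                                          # close to lam * t_max = 5
```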
Proof of the renewal equation
We may iterate the expectation about the first holding time:
$$m(t) = \mathbb{E}[X_t] = \mathbb{E}\bigl[\mathbb{E}(X_t \mid S_1)\bigr].$$
But by the Markov property,
$$\mathbb{E}(X_t \mid S_1 = s) = \mathbb{I}_{\{t \geq s\}}\bigl(1 + \mathbb{E}[X_{t-s}]\bigr).$$
So
$$m(t) = \int_0^\infty \mathbb{E}(X_t \mid S_1 = s)\, f_S(s)\, ds = \int_0^t \bigl(1 + m(t - s)\bigr) f_S(s)\, ds = F_S(t) + \int_0^t m(t - s)\, f_S(s)\, ds,$$
as required.
Asymptotic properties
The renewal process $(X_t)_{t \geq 0}$ and the renewal-reward process $(Y_t)_{t \geq 0}$ satisfy
$$\lim_{t \to \infty} \frac{1}{t} X_t = \frac{1}{\mathbb{E}[S_1]} \qquad \text{and} \qquad \lim_{t \to \infty} \frac{1}{t} Y_t = \frac{\mathbb{E}[W_1]}{\mathbb{E}[S_1]}$$
almost surely.
Proof
First consider $(X_t)_{t \geq 0}$. By definition we have $J_{X_t} \leq t \leq J_{X_t + 1}$ for all $t \geq 0$, and so
$$\frac{J_{X_t}}{X_t} \leq \frac{t}{X_t} \leq \frac{J_{X_t + 1}}{X_t}$$
whenever $X_t > 0$. Since $0 < \mathbb{E}[S_1] < \infty$, we have $X_t \to \infty$ almost surely as $t \to \infty$, and by the strong law of large numbers $\frac{J_n}{n} \to \mathbb{E}[S_1]$ almost surely. Hence both the lower bound $\frac{J_{X_t}}{X_t}$ and the upper bound $\frac{J_{X_t+1}}{X_t} = \frac{J_{X_t+1}}{X_t+1}\cdot\frac{X_t+1}{X_t}$ converge almost surely to $\mathbb{E}[S_1]$, so $\frac{t}{X_t} \to \mathbb{E}[S_1]$ and $\frac{X_t}{t} \to \frac{1}{\mathbb{E}[S_1]}$. The statement for $Y_t$ follows by writing $\frac{Y_t}{t} = \left(\frac{1}{X_t}\sum_{i=1}^{X_t} W_i\right)\frac{X_t}{t}$ and applying the strong law of large numbers to the rewards.
The inspection paradox
A curious feature of renewal processes is that if we wait some predetermined time t and then observe how large the renewal interval containing t is, we should expect it to be typically larger than a renewal interval of average size.
Mathematically the inspection paradox states: for any $t > 0$ the renewal interval containing $t$ is stochastically larger than the first renewal interval. That is, for all $x > 0$ and for all $t > 0$:
$$\mathbb{P}(S_{X_t+1} > x) \geq \mathbb{P}(S_1 > x) = 1 - F_S(x),$$
where $S_{X_t+1}$ is the renewal interval containing $t$ and $F_S$ is the cumulative distribution function of the IID holding times $S_i$.
Proof of the inspection paradox
Observe that the last jump time before $t$ is $J_{X_t}$, and that the renewal interval containing $t$ is $S_{X_t+1}$. Then
$$\mathbb{P}(S_{X_t+1} > x) = \int_0^\infty \mathbb{P}(S_{X_t+1} > x \mid J_{X_t} = s)\, f_{J_{X_t}}(s)\, ds = \int_0^\infty \mathbb{P}(S_{X_t+1} > x \mid S_{X_t+1} > t - s)\, f_{J_{X_t}}(s)\, ds,$$
since, given $J_{X_t} = s$, the interval containing $t$ must exceed $t - s$. If $x \leq t - s$ the conditional probability equals $1 \geq 1 - F_S(x)$; if $x > t - s$ it equals $\frac{1 - F_S(x)}{1 - F_S(t - s)} \geq 1 - F_S(x)$. In either case the integrand is at least $1 - F_S(x)$, and integrating against $f_{J_{X_t}}$ gives
$$\mathbb{P}(S_{X_t+1} > x) \geq 1 - F_S(x) = \mathbb{P}(S_1 > x),$$
as required.
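The paradox is easy to see in simulation. In the sketch below (the helper name `interval_containing`, the uniform holding times, and the inspection time $t = 50$ are all arbitrary choices), the average length of the interval straddling $t$ comes out well above the mean holding time.

```python
import numpy as np

rng = np.random.default_rng(seed=4)

def interval_containing(t, sampler):
    """Length of the renewal interval that straddles the inspection time t."""
    elapsed = 0.0
    while True:
        s = sampler()
        if elapsed + s > t:
            return s
        elapsed += s

sampler = lambda: rng.uniform(0.0, 2.0)              # E[S_1] = 1
lengths = [interval_containing(50.0, sampler) for _ in range(20_000)]
print(np.mean(lengths))                              # noticeably above E[S_1] = 1
                                                     # (roughly 4/3 for this distribution)
```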
Superposition
The superposition of independent renewal processes is not in general a renewal process, but it can be described within a larger class of processes called Markov-renewal processes. However, the cumulative distribution function of the first inter-event time in the superposition process can be expressed in terms of the cumulative distribution functions $R_k(t)$ of the inter-event times and the arrival rates $\alpha_k > 0$ of the component processes $k = 1, \ldots, K$.
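A simulation sketch can make the first claim plausible (the uniform holding times and the lag-1 correlation diagnostic below are arbitrary illustrative choices, not a proof): merging two independent renewal processes with non-exponential holding times produces inter-event times whose successive values are correlated, so they are not IID.

```python
import numpy as np

rng = np.random.default_rng(seed=5)

def jump_times(n, sampler):
    """First n jump times of a renewal process with the given holding-time sampler."""
    return np.cumsum(sampler(n))

# Two independent renewal processes with non-exponential (uniform) holding times.
a = jump_times(50_000, lambda n: rng.uniform(0.5, 1.5, size=n))
b = jump_times(50_000, lambda n: rng.uniform(0.5, 1.5, size=n))

merged = np.sort(np.concatenate([a, b]))[:50_000]    # superposition, truncated to stay
gaps = np.diff(merged)                               # within the range covered by both
lag1 = np.corrcoef(gaps[:-1], gaps[1:])[0, 1]
print(lag1)                                          # clearly nonzero here: successive gaps
                                                     # are dependent, so the merged process
                                                     # is not itself a renewal process
```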
Example 1: use of the strong law of large numbers
Eric the entrepreneur has $n$ machines, each having an operational lifetime uniformly distributed between zero and two years. Eric may let each machine run until it fails, with replacement cost €2600; alternatively he may replace a machine at any time while it is still functional, at a cost of €200.
What is his optimal replacement policy?
Solution
The lifetimes of the $n$ machines can be modeled as $n$ independent concurrent renewal-reward processes, so it is sufficient to consider the case $n = 1$. Denote this process by $(Y_t)_{t \geq 0}$. The successive lifetimes $S$ of the replacement machines are independent and identically distributed, so the optimal policy is the same for every replacement machine in the process.
If Eric decides at the start of a machine's life to replace it at time $0 < t < 2$ but the machine happens to fail before that time, then the lifetime $S$ of the machine is uniformly distributed on $[0, t]$ and thus has expectation $\tfrac{t}{2}$. So the overall expected lifetime of the machine is:
$$\mathbb{E}[S] = \frac{t}{2}\,\mathbb{P}(\text{fails before } t) + t\,\mathbb{P}(\text{survives to } t) = \frac{t}{2}\cdot\frac{t}{2} + t\left(1 - \frac{t}{2}\right) = t - \frac{t^2}{4},$$
and the expected cost $W$ per machine is:
$$\mathbb{E}[W] = 2600\,\mathbb{P}(\text{fails before } t) + 200\,\mathbb{P}(\text{survives to } t) = 2600\cdot\frac{t}{2} + 200\left(1 - \frac{t}{2}\right) = 1200t + 200.$$
So by the strong law of large numbers, his long-term average cost per unit time is:
$$\frac{\mathbb{E}[W]}{\mathbb{E}[S]} = \frac{1200t + 200}{t - t^2/4} = \frac{4(1200t + 200)}{4t - t^2};$$
then differentiating with respect to $t$:
$$\frac{\partial}{\partial t}\,\frac{4(1200t + 200)}{4t - t^2} = \frac{4800(4t - t^2) - 4(1200t + 200)(4 - 2t)}{(4t - t^2)^2};$$
this implies that the turning points satisfy:
$$0 = 4800(4t - t^2) - 4(1200t + 200)(4 - 2t) = 4800t^2 + 1600t - 3200, \qquad\text{i.e.}\qquad 0 = 3t^2 + t - 2 = (3t - 2)(t + 1),$$
and thus
$$t = \frac{2}{3} \quad\text{or}\quad t = -1.$$
We take the only solution t in [0, 2]: t = 2/3. This is indeed a minimum (and not a maximum) since the cost per unit time tends to infinity as t tends to zero, meaning that the cost is decreasing as t increases, until the point 2/3 where it starts to increase.
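The calculus above can be double-checked numerically. The sketch below (the helper name `cost_rate` and the grid search are arbitrary choices) evaluates the long-run cost per unit time over a grid of replacement ages and locates its minimum.

```python
import numpy as np

def cost_rate(t):
    """Long-run cost per unit time when a working machine is replaced at age t."""
    expected_cost = 2600.0 * (t / 2) + 200.0 * (1 - t / 2)      # = 1200 t + 200
    expected_lifetime = (t / 2) * (t / 2) + t * (1 - t / 2)     # = t - t**2 / 4
    return expected_cost / expected_lifetime

ts = np.linspace(0.01, 2.0, 20_000)
best = ts[np.argmin(cost_rate(ts))]
print(best, cost_rate(best))      # minimum near t = 2/3, roughly 1800 euros per year
```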