In electrical engineering, computer science and information theory, channel capacity is the tight upper bound on the rate at which information can be reliably transmitted over a communications channel.
Contents
- Formal definition
- Shannon capacity of a graph
- Noisy-channel coding theorem
- Example application
- Channel capacity in wireless communications
- AWGN channel
- Frequency-selective channel
- Slow-fading channel
- Fast-fading channel
By the noisy-channel coding theorem, the channel capacity of a given channel is the limiting information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.
Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.
The notion of channel capacity has been central to the development of modern wireline and wireless communication systems; the advent of novel error-correction coding mechanisms has made it possible to achieve performance very close to the limits promised by channel capacity.
Formal definition
Let $X$ and $Y$ be the random variables representing the input and output of the channel, respectively. Let $p_{Y|X}(y \mid x)$ be the conditional distribution of $Y$ given $X$, which is an inherent fixed property of the communications channel. Then the choice of the marginal distribution $p_X(x)$ completely determines the joint distribution $p_{X,Y}(x, y)$ via the identity

$$p_{X,Y}(x, y) = p_{Y|X}(y \mid x)\, p_X(x),$$

which, in turn, induces a mutual information $I(X; Y)$. The channel capacity is defined as

$$C = \sup_{p_X(x)} I(X; Y),$$

where the supremum is taken over all possible choices of $p_X(x)$.
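To make the definition concrete, here is a minimal numerical sketch (mine, not from the article) for a binary symmetric channel with crossover probability $p$: it brute-forces the supremum of $I(X;Y)$ over input distributions on a grid and checks the result against the known closed form $C = 1 - H_b(p)$. The function names and grid resolution are arbitrary choices.

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits for input distribution p_x and
    channel[x, y] = p(y|x)."""
    p_xy = p_x[:, None] * channel          # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)                 # output marginal p(y)
    mask = p_xy > 0                        # avoid log(0) terms
    return np.sum(p_xy[mask] *
                  np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask]))

# Binary symmetric channel with crossover probability p.
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])

# Brute-force the supremum over input distributions (q, 1 - q).
qs = np.linspace(1e-6, 1 - 1e-6, 10001)
capacity = max(mutual_information(np.array([q, 1 - q]), bsc) for q in qs)

h_b = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))  # binary entropy H_b(p)
print(f"numerical C ≈ {capacity:.6f} bits/use")
print(f"closed form 1 - H_b(p) = {1 - h_b:.6f} bits/use")
```

By symmetry the maximizing input here is uniform, so the grid search is overkill; it is only meant to mirror the definition literally.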
Shannon capacity of a graph
If G is an undirected graph, it can be used to define a communications channel in which the symbols are the graph vertices, and two codewords may be confused with each other if their symbols in each position are equal or adjacent. The computational complexity of finding the Shannon capacity of such a channel remains open, but it can be upper bounded by another important graph invariant, the Lovász number.
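As an illustrative sketch (mine, not from the article): for the 5-cycle $C_5$, single symbols admit at most $\alpha(C_5) = 2$ pairwise non-confusable codewords, but two-letter words over the strong product admit 5, so the Shannon capacity satisfies $\Theta(C_5) \ge \sqrt{5}$; Lovász's bound shows this is tight. The brute-force search below verifies both independence numbers (all names are my own; this approach is feasible only for tiny cases).

```python
from itertools import combinations, product

def confusable(u, v, adj):
    """Two words are confusable if, in every position, the
    symbols are equal or adjacent in the graph."""
    return all(a == b or b in adj[a] for a, b in zip(u, v))

def max_code(n, k, adj):
    """Largest set of k-letter words over n symbols that are
    pairwise non-confusable (brute force)."""
    words = list(product(range(n), repeat=k))
    best = 1
    for size in range(2, len(words) + 1):
        found = any(
            all(not confusable(u, v, adj) for u, v in combinations(cand, 2))
            for cand in combinations(words, size)
        )
        if not found:
            break
        best = size
    return best

# 5-cycle C5: vertex i is adjacent to i +/- 1 (mod 5).
adj = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print("alpha(C5)     =", max_code(5, 1, adj))  # 2
print("alpha(C5 ^ 2) =", max_code(5, 2, adj))  # 5, so Theta(C5) >= sqrt(5)
```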
Noisy-channel coding theorem
The noisy-channel coding theorem states that for any ε > 0 and for any transmission rate R less than the channel capacity C, there is an encoding and decoding scheme transmitting data at rate R whose error probability is less than ε, for a sufficiently large block length. Also, for any rate greater than the channel capacity, the probability of error at the receiver goes to one as the block length goes to infinity.
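In symbols, a standard formal phrasing (paraphrased, not quoted from this article) is:

```latex
% Achievability: every rate below capacity is attainable.
\forall R < C,\ \forall \epsilon > 0:\ \text{for all sufficiently large } n,\
\exists\ \text{a } (2^{nR},\, n)\ \text{code with error probability } P_e^{(n)} < \epsilon.
% Converse: above capacity, errors become unavoidable.
\forall R > C:\ P_e^{(n)} \to 1\ \text{as}\ n \to \infty\
\text{for every sequence of } (2^{nR},\, n)\ \text{codes}.
```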
Example application
An application of the channel capacity concept to an additive white Gaussian noise (AWGN) channel with $B$ Hz bandwidth and signal-to-noise ratio $S/N$ is the Shannon–Hartley theorem:

$$C = B \log_2\!\left(1 + \frac{S}{N}\right).$$

$C$ is measured in bits per second if the logarithm is taken in base 2, or nats per second if the natural logarithm is used, assuming $B$ is in hertz. The signal and noise powers $S$ and $N$ are measured in watts or volts², so the signal-to-noise ratio here is expressed as a power ratio, not in decibels (dB); since figures are often cited in dB, a conversion may be needed. For example, 30 dB is a power ratio of $10^{30/10} = 10^3 = 1000$.
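As a quick numeric sketch (my own figures; the 3 kHz bandwidth is a hypothetical, telephone-style value):

```python
import math

def shannon_hartley(bandwidth_hz, snr_db):
    """Channel capacity C = B * log2(1 + S/N) in bits per second;
    the SNR is converted from dB to a linear power ratio first."""
    snr_linear = 10 ** (snr_db / 10)      # 30 dB -> 1000
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: 3 kHz bandwidth, 30 dB signal-to-noise ratio.
c = shannon_hartley(3000, 30)
print(f"C ≈ {c:.0f} bit/s")               # ≈ 29.9 kbit/s
```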
Channel capacity in wireless communications
This section focuses on the single-antenna, point-to-point scenario. For channel capacity in systems with multiple antennas, see the article on MIMO.
AWGN channel
If the average received power is $\bar{P}$ W, the total bandwidth is $B$ Hz, and the noise power spectral density is $N_0$ W/Hz, the AWGN channel capacity is

$$C_{\text{AWGN}} = B \log_2\!\left(1 + \frac{\bar{P}}{N_0 B}\right) \ \text{bits/s},$$

where $\bar{P}/(N_0 B)$ is the received signal-to-noise ratio (SNR).

When the SNR is large (SNR ≫ 0 dB), the capacity $C \approx B \log_2 \mathrm{SNR}$ is logarithmic in power and approximately linear in bandwidth. This is called the bandwidth-limited regime.

When the SNR is small (SNR ≪ 0 dB), the capacity $C \approx \bar{P}/(N_0 \ln 2)$ is linear in power but insensitive to bandwidth. This is called the power-limited regime.
The bandwidth-limited regime and power-limited regime are illustrated in the figure.
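The two regimes can also be checked numerically. The sketch below (my own; the power and noise values are hypothetical) compares the exact AWGN capacity against both regime approximations as the bandwidth sweeps from SNR ≫ 1 to SNR ≪ 1:

```python
import numpy as np

def awgn_capacity(p_avg, n0, bandwidth):
    """Exact AWGN capacity C = B * log2(1 + P / (N0 * B)) in bit/s."""
    snr = p_avg / (n0 * bandwidth)
    return bandwidth * np.log2(1 + snr)

p_avg, n0 = 1e-3, 1e-9      # hypothetical: 1 mW power, 1 nW/Hz noise PSD

for b in (1e3, 1e6, 1e9):   # sweep bandwidth across both regimes
    snr = p_avg / (n0 * b)
    exact = awgn_capacity(p_avg, n0, b)
    bw_limited = b * np.log2(snr)           # good when SNR >> 1
    pw_limited = p_avg / (n0 * np.log(2))   # good when SNR << 1
    print(f"B={b:.0e} Hz  SNR={snr:9.3g}  C={exact:12.4g}  "
          f"bw-approx={bw_limited:12.4g}  pw-approx={pw_limited:12.4g}")
```

At the smallest bandwidth the bandwidth-limited approximation matches the exact value, while at the largest bandwidth the capacity saturates at the power-limited value $\bar{P}/(N_0 \ln 2)$.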
Frequency-selective channel
The capacity of the frequency-selective channel is given by so-called water-filling power allocation,

$$C = \sum_{n=0}^{N-1} \log_2\!\left(1 + \frac{P_n^* \,|\bar{h}_n|^2}{N_0}\right),$$

where $P_n^* = \max\!\left(\frac{1}{\lambda} - \frac{N_0}{|\bar{h}_n|^2},\, 0\right)$, $|\bar{h}_n|^2$ is the gain of subchannel $n$, and $\lambda$ is chosen so that the power constraint $\sum_{n=0}^{N-1} P_n^* = P$ is met.
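A minimal water-filling sketch (mine; the subchannel gains and power budget are hypothetical) that solves for $\lambda$ by bisection on the water level $1/\lambda$ so that $\sum_n P_n^* = P$:

```python
import numpy as np

def water_filling(gains_sq, n0, p_total, iters=100):
    """Water-filling: P_n* = max(1/lam - N0/|h_n|^2, 0), with lam
    chosen by bisection so the powers sum to p_total."""
    floor = n0 / gains_sq                  # per-subchannel noise/gain "floor"
    lo, hi = 0.0, p_total + floor.max()    # bracket for the water level 1/lam
    for _ in range(iters):
        level = (lo + hi) / 2
        powers = np.maximum(level - floor, 0.0)
        if powers.sum() > p_total:
            hi = level                     # too much power: lower the level
        else:
            lo = level                     # too little power: raise the level
    powers = np.maximum((lo + hi) / 2 - floor, 0.0)
    capacity = np.sum(np.log2(1 + powers * gains_sq / n0))
    return powers, capacity

gains_sq = np.array([2.0, 1.0, 0.5, 0.1])  # hypothetical |h_n|^2 values
powers, c = water_filling(gains_sq, n0=1.0, p_total=4.0)
print("P_n* =", np.round(powers, 3), " C ≈", round(float(c), 3), "bit/symbol")
```

Note how the weakest subchannel receives no power at this budget: its floor sits above the water level, which is exactly the max(·, 0) clamp in the formula.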
Slow-fading channel
In a slow-fading channel, where the coherence time is greater than the latency requirement, there is no definite capacity, as the maximum rate of reliable communication supported by the channel, $\log_2(1 + |h|^2\, \mathrm{SNR})$, depends on the random channel gain $|h|^2$, which is unknown to the transmitter. If the transmitter encodes data at rate $R$ bits/s/Hz, there is a non-zero probability that the decoding error probability cannot be made arbitrarily small,

$$p_{\text{out}} = \mathbb{P}\!\left(\log_2(1 + |h|^2\, \mathrm{SNR}) < R\right),$$

in which case the system is said to be in outage. With a non-zero probability that the channel is in deep fade, the capacity of the slow-fading channel in the strict sense is zero. However, it is possible to determine the largest value of $R$ such that the outage probability $p_{\text{out}}$ is less than $\epsilon$. This value is known as the $\epsilon$-outage capacity.
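A Monte Carlo sketch (mine; unit-mean Rayleigh fading and the 10 dB SNR are assumptions) that estimates the $\epsilon$-outage capacity as the largest $R$ whose outage probability stays below $\epsilon$:

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 10.0                  # hypothetical average SNR (linear, = 10 dB)
eps = 0.01                  # allowed outage probability

# Rayleigh fading: |h|^2 is exponentially distributed with unit mean.
h_sq = rng.exponential(1.0, size=1_000_000)
rates = np.log2(1 + h_sq * snr)     # rate the channel supports per draw

# Largest R with P(rate < R) <= eps is the eps-quantile of the rates.
c_outage = np.quantile(rates, eps)
print(f"{eps:.0%}-outage capacity ≈ {c_outage:.3f} bit/s/Hz")

# Closed-form check for Rayleigh: P(|h|^2 < x) = 1 - exp(-x).
x = -np.log(1 - eps)                # eps-quantile of |h|^2
print(f"closed form            ≈ {np.log2(1 + x * snr):.3f} bit/s/Hz")
```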
Fast-fading channel
In a fast-fading channel, where the latency requirement is greater than the coherence time and the codeword length spans many coherence periods, one can average over many independent channel fades by coding over a large number of coherence time intervals. Thus, it is possible to achieve a reliable rate of communication of

$$\mathbb{E}\!\left[\log_2(1 + |h|^2\, \mathrm{SNR})\right] \ \text{bits/s/Hz},$$

and it is meaningful to speak of this value as the capacity of the fast-fading channel.
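A matching sketch for the fast-fading case (same assumptions: unit-mean Rayleigh fading, hypothetical SNR) estimates the ergodic capacity $\mathbb{E}[\log_2(1+|h|^2\,\mathrm{SNR})]$ by averaging over fades:

```python
import numpy as np

rng = np.random.default_rng(1)
snr = 10.0                                    # hypothetical average SNR (linear)

# Average the instantaneous rate over many independent fades.
h_sq = rng.exponential(1.0, size=1_000_000)   # Rayleigh: |h|^2 ~ Exp(1)
c_ergodic = np.mean(np.log2(1 + h_sq * snr))

c_awgn = np.log2(1 + snr)                     # unfaded channel at the same SNR
print(f"ergodic capacity ≈ {c_ergodic:.3f} bit/s/Hz")
print(f"AWGN capacity    = {c_awgn:.3f} bit/s/Hz")
```

By Jensen's inequality the ergodic capacity falls below the unfaded AWGN capacity at the same average SNR; the gap is the cost of fading even when coding can average over it.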