Channel capacity
Information-theoretical limit on transmission rate in a communication channel
Channel capacity, in electrical engineering, computer science, and information theory, is the theoretical maximum rate at which information can be reliably transmitted over a communication channel.
Following the terms of the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.[1][2]
Information theory, developed by Claude E. Shannon in 1948, defines the notion of channel capacity and provides a mathematical model by which it may be computed. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.[3]
The notion of channel capacity has been central to the development of modern wireline and wireless communication systems, with the advent of novel error-correction coding mechanisms that achieve performance very close to the limits promised by channel capacity.
The basic mathematical model for a communication system is the following:

$$W \xrightarrow{\;f_n\;} X^n \xrightarrow{\;p(y|x)\;} Y^n \xrightarrow{\;g_n\;} \hat{W}$$

where:
- $W$ is the message to be transmitted;
- $X$ is the channel input symbol ($X^n$ is a sequence of $n$ symbols) taken in an alphabet $\mathcal{X}$;
- $Y$ is the channel output symbol ($Y^n$ is a sequence of $n$ symbols) taken in an alphabet $\mathcal{Y}$;
- $\hat{W}$ is the estimate of the transmitted message;
- $f_n$ is the encoding function for a block of length $n$;
- $p(y|x) = p_{Y|X}(y|x)$ is the noisy channel, which is modeled by a conditional probability distribution; and,
- $g_n$ is the decoding function for a block of length $n$.
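To make the roles of $f_n$, the channel, and $g_n$ concrete, the following sketch simulates one block transmission. The choice of a binary symmetric channel with crossover probability 0.1, the block length $n = 5$, and the repetition encoder with majority-vote decoder are illustrative assumptions, not part of the formal definition.

```python
import random

# Illustrative parameters (assumptions for this sketch): a 1-bit message W,
# a repetition encoder f_n with block length n = 5, and a binary symmetric
# channel with crossover probability 0.1.
n = 5
crossover_prob = 0.1

def encoder(w):
    """f_n: map the message W (a single bit) to a channel input block X^n."""
    return [w] * n

def channel(x_block):
    """Noisy channel p(y|x): flip each input symbol independently with
    probability crossover_prob (a binary symmetric channel)."""
    return [bit ^ 1 if random.random() < crossover_prob else bit for bit in x_block]

def decoder(y_block):
    """g_n: estimate the message from the channel output Y^n by majority vote."""
    return 1 if sum(y_block) > n // 2 else 0

w = 1               # message W
x = encoder(w)      # channel input X^n
y = channel(x)      # channel output Y^n
w_hat = decoder(y)  # estimated message
print(w, x, y, w_hat)
```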
Let $X$ and $Y$ be modeled as random variables. Furthermore, let $p_{Y|X}(y|x)$ be the conditional probability distribution function of $Y$ given $X$, which is an inherent fixed property of the communication channel. Then the choice of the marginal distribution $p_X(x)$ completely determines the joint distribution $p_{X,Y}(x,y)$ due to the identity

$$p_{X,Y}(x,y) = p_{Y|X}(y|x)\,p_X(x)$$

which, in turn, induces a mutual information $I(X;Y)$. The channel capacity is defined as

$$C = \sup_{p_X(x)} I(X;Y)$$

where the supremum is taken over all possible choices of $p_X(x)$.
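As a minimal numerical sketch of this definition, the code below evaluates $I(X;Y)$ from the identity above and approximates the supremum by a grid search over the input distribution of a binary-input channel. The example channel (a binary symmetric channel with crossover probability 0.1) and the grid resolution are illustrative assumptions; for that channel the maximizing input distribution is uniform and the capacity is $1 - H_b(0.1) \approx 0.531$ bits per channel use, which the search reproduces.

```python
import numpy as np

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) in bits for input distribution p_x and channel matrix
    p_y_given_x[x, y] = P(Y = y | X = x)."""
    p_xy = p_x[:, None] * p_y_given_x   # joint p(x, y) = p_X(x) p_{Y|X}(y|x)
    p_y = p_xy.sum(axis=0)              # output marginal p_Y(y)
    mask = p_xy > 0                     # skip zero-probability terms
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])))

def capacity_binary_input(p_y_given_x, grid=1001):
    """Approximate C = sup_{p_X} I(X;Y) for a binary-input channel
    by searching over p_X(1) on a uniform grid."""
    best = 0.0
    for q in np.linspace(0.0, 1.0, grid):
        best = max(best, mutual_information(np.array([1.0 - q, q]), p_y_given_x))
    return best

# Illustrative example (an assumption, not taken from the article): a binary
# symmetric channel with crossover probability 0.1.
bsc = np.array([[0.9, 0.1],
                [0.1, 0.9]])
print(capacity_binary_input(bsc))  # ≈ 0.531 bits per channel use
```

A grid search is used here only because the input alphabet is binary; for larger alphabets the supremum is typically computed with the Blahut–Arimoto algorithm instead.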