Shannon–Hartley theorem

In information theory, the Shannon–Hartley theorem states the maximum rate at which error-free digital data (that is, information) can be transmitted over a communication link of a specified bandwidth in the presence of noise interference. The theorem is named after Claude Shannon and Ralph Hartley. The Shannon limit, or Shannon capacity, of a communications channel is the theoretical maximum information transfer rate of the channel.
Theorem
Proved by Claude Shannon in 1948, the theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption. The theorem does not describe how to construct such an error-correcting method; it only tells us how good the best possible method can be. Shannon's theorem has wide-ranging applications in both communications and data storage, and is of foundational importance to the modern field of information theory.
If we had an infinite-bandwidth, noise-free analog channel, we could transmit unlimited amounts of error-free data over it per unit of time. Real-world signals, however, are subject to both bandwidth and noise-interference limitations.
Shannon and Hartley asked: how do bandwidth and noise affect the rate at which information can be transmitted over an analog channel? Surprisingly, bandwidth limitations alone do not impose a cap on the maximum information rate. This is because it is still possible, at least in a thought-experiment model, for the signal to take on an infinite number of distinct voltage levels on each cycle, with each slightly different level assigned a different meaning or bit sequence. When both noise and bandwidth limitations are combined, however, there is a limit to the amount of information that can be transferred, even when clever multi-level encoding techniques are used. This is because the noise obliterates the fine differences that distinguish the various signal levels, limiting in practice the number of detection levels the scheme can use.
The Shannon theorem states that, for a channel with information capacity C over which information is transmitted at a rate R, if
 <math>R \le C</math>
there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small. This means that theoretically, it is possible to transmit information without error up to a limit, C.
The converse is also important. If
 <math>R > C</math>
the probability of error at the receiver cannot be made arbitrarily small. This implies that no useful information can be transmitted beyond the channel capacity.
Capacity of a channel with additive white Gaussian noise
Considering all possible multi-level and multi-phase encoding techniques, Shannon's theorem gives the theoretical maximum rate C of clean (or arbitrarily low bit-error-rate) data that can be sent, with a given average signal power, through an analog communication channel subject to additive white Gaussian-distributed noise interference:
 <math>C = BW \times \log_2(1+S/N)</math>
where
 C is the channel capacity in bits per second, inclusive of error correction;
 BW is the bandwidth of the channel in hertz; and
 S/N is the signal-to-noise ratio of the communication signal to the Gaussian noise interference, expressed as a linear power ratio (not in decibels).
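As a quick sketch (not part of the original article), the capacity formula can be evaluated directly; the function name here is illustrative:

```python
import math

def shannon_capacity(bw_hz, snr_linear):
    """Channel capacity C = BW * log2(1 + S/N), in bit/s.

    bw_hz: bandwidth in hertz; snr_linear: S/N as a linear power ratio.
    """
    return bw_hz * math.log2(1 + snr_linear)

# A 4 kHz telephone channel at 20 dB S/N (linear ratio 10**(20/10) = 100):
c = shannon_capacity(4000, 100)
print(c)  # roughly 26.6 kbit/s
```

Note that the S/N must be converted from decibels to a linear ratio (S/N = 10^(dB/10)) before applying the formula.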
For large or small signal-to-noise ratios, this formula can be approximated:
 If S/N >> 1, then C ≈ 0.332 · BW · S/N (with S/N expressed in dB).
 If S/N << 1, then C ≈ 1.44 · BW · S/N (with S/N as a linear power ratio).
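A small numerical check (a sketch, with illustrative function names) confirms both approximations against the exact formula:

```python
import math

def exact(bw, snr_linear):
    # Exact Shannon capacity: C = BW * log2(1 + S/N)
    return bw * math.log2(1 + snr_linear)

def approx_high(bw, snr_db):
    # Large S/N: log2(1 + S/N) ~ log2(S/N) = S/N(dB) / (10 * log10(2)) ~ 0.332 * S/N(dB)
    return 0.332 * bw * snr_db

def approx_low(bw, snr_linear):
    # Small S/N: log2(1 + x) ~ x / ln(2) ~ 1.44 * x
    return 1.44 * bw * snr_linear

bw = 1000.0
# S/N = 1000 (30 dB): exact ~9967 bit/s vs approximation 9960 bit/s
print(exact(bw, 1000), approx_high(bw, 30))
# S/N = 0.01 (-20 dB): exact ~14.36 bit/s vs approximation 14.4 bit/s
print(exact(bw, 0.01), approx_low(bw, 0.01))
```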
Simple schemes such as "send the message 3 times and use a best 2 out of 3 voting scheme if the copies differ" are inefficient users of bandwidth, and thus are far from the Shannon limit. Advanced techniques such as Reed–Solomon codes and, more recently, Turbo codes come much closer to reaching the theoretical Shannon limit, but at a cost of high computational complexity. With Turbo codes and the computing power in today's digital signal processors, it is now possible to reach within 1/10 of one decibel of the Shannon limit.
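The triple-repetition scheme above can be simulated; this sketch (not from the original article) models each copy of a bit being flipped independently with probability p and decodes by majority vote:

```python
import random

def majority_vote_error(p, n_bits=200_000, seed=1):
    """Estimate residual bit-error rate of 'send 3 copies, take majority'
    over a channel that flips each transmitted copy with probability p."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:  # majority of the three copies corrupted -> decode wrongly
            errors += 1
    return errors / n_bits

# At p = 0.1 the residual error rate is about 3p^2(1-p) + p^3 = 0.028,
# but the scheme triples the bandwidth used (rate 1/3) -- far from the Shannon limit.
print(majority_vote_error(0.1))
```

The error rate drops, but never to zero, and the bandwidth cost is fixed at 3x; Shannon's theorem says far better trade-offs exist.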
The V.34 modem standard advertises a rate of 33.6 kbit/s, and V.90 claims a rate of 56 kbit/s, apparently in excess of the Shannon limit (telephone bandwidth is 3.3 kHz). In fact, neither standard actually reaches the Shannon limit, but both closely approach it. The speed improvement of V.90 was made possible by eliminating an additional step of analog-to-digital conversion through the use of fully digital equipment at the other end of the modem connection. This improves the signal-to-noise ratio, which in turn provides the headroom needed to exceed 33.6 kbit/s, a rate that was otherwise near the Shannon limit.
Examples
 If the S/N is 20 dB and the available bandwidth is 4 kHz, which is appropriate for telephone communications, then C = 4 log_{2}(1 + 100) = 4 log_{2}(101) = 26.63 kbit/s. Note that the linear ratio of 100 corresponds to an S/N of 20 dB.
 If it is required to transmit at 50 kbit/s and a bandwidth of 1 MHz is used, then the minimum S/N required is given by 50 = 1000 log_{2}(1 + S/N), so S/N = 2^{C/BW} − 1 = 0.035, corresponding to an S/N of −14.5 dB. This shows that it is possible to transmit using signals which are actually much weaker than the background noise level, as in spread-spectrum communications.
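Both worked examples can be checked numerically; this sketch reproduces the arithmetic above:

```python
import math

# Example 1: 4 kHz telephone channel, S/N = 20 dB (linear ratio 10**(20/10) = 100)
c = 4000 * math.log2(1 + 100)
print(c / 1000)  # about 26.63 kbit/s

# Example 2: S/N needed to carry 50 kbit/s in 1 MHz, from S/N = 2**(C/BW) - 1
snr = 2 ** (50_000 / 1_000_000) - 1
snr_db = 10 * math.log10(snr)
print(snr, snr_db)  # about 0.035, i.e. roughly -14.5 dB (weaker than the noise)
```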
References
 C. E. Shannon, The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press, 1949 (reprinted 1998).
 Herbert Taub, Donald L. Schilling, "Principles of Communication Systems", McGraw-Hill, 1986.
External links
 On Shannon and Shannon's law (http://www.tele.ntnu.no/projects/beats/Documents/LarsTelektronikk02.pdf)
 The ShannonHartley Theorem (http://www.cs.man.ac.uk/~barry/mydocs/CS3282/lastyear/csdc7.pdf)
 The relationship between information, bandwidth and noise (http://www.cs.ucl.ac.uk/staff/S.Bhatti/D51notes/node6.html#equHartleyShannon)