In information theory, the Shannon-Hartley theorem is an application of the noisy-channel coding theorem. A very common case is that of a continuous-time analog communication channel subject to Gaussian noise.

The theorem establishes the Shannon channel capacity, an upper bound on the maximum amount of error-free digital data (that is, information) that can be transmitted over such a communications link with a specified bandwidth in the presence of noise interference.

Among the starting assumptions, for the correct application of the theorem, a limit on the signal power is assumed and, in addition, the Gaussian noise process is assumed to be characterized by a known power or power spectral density.

The law owes its name to Claude Shannon and Ralph Hartley.

Statement of the theorem

Considering all possible multi-level and multi-phase coding techniques, the Shannon-Hartley theorem states that the channel capacity C is:



C = B \log_2\left(1 + \frac{S}{N}\right)

where:

*B is the bandwidth of the channel.
*C is the capacity of the channel (information rate in bit/s).
*S is the power of the useful signal, which can be expressed in watts, milliwatts, etc.
*N is the power of the noise present in the channel (mW, \mu W, etc.), which tends to mask the useful signal.
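
For illustration, the capacity formula can be evaluated directly. The following minimal Python sketch (the function name and numerical values are illustrative, not part of the original statement) computes C for a hypothetical channel:

import math

def shannon_capacity(bandwidth_hz, signal_power, noise_power):
    # Shannon-Hartley capacity C = B * log2(1 + S/N), in bit/s
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Hypothetical example: a 3 kHz channel with a linear S/N of 1000 (30 dB)
print(shannon_capacity(3000, 1000, 1))  # roughly 29,900 bit/s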

Historical development

In the late 1920s, Harry Nyquist and Ralph Hartley developed a series of fundamental ideas related to the transmission of information, particularly in the context of the telegraph as a communications system. At the time, these concepts were far-reaching advances taken individually, but they did not form part of a comprehensive theory.

It was in the 1940s that Claude Shannon developed the concept of channel capacity, based in part on the ideas already proposed by Nyquist and Hartley, and then formulated a complete theory of information and its transmission over channels.

Nyquist rate

In 1927, Nyquist determined that the number of independent pulses that could be put through a telegraph channel per unit time is limited to twice the bandwidth of the channel.



f_p \le 2B

where f_p is the pulse frequency (in pulses per second) and B is the bandwidth (in hertz). The quantity 2B later came to be called the Nyquist rate, and transmitting at this limiting pulse rate of 2B pulses per second was referred to as signaling at the Nyquist rate.

Nyquist published his results in 1928 as part of his article "Certain Topics in Telegraph Transmission Theory".

Hartley's law

During that same year, Hartley formulated a way to quantify information and its transmission rate over a communication channel. This method, later known as Hartley's law, became an important precursor to Shannon's more sophisticated notion of channel capacity.

Hartley argued that the maximum number of distinct pulses that can be transmitted and received reliably over a communication channel is limited by the dynamic range of the signal amplitude and the precision with which the receiver can distinguish amplitude levels.

Specifically, if the amplitude of the transmitted signal is restricted to the range \pm A volts, and the precision of the receiver is \pm\Delta V volts, then the maximum number of distinct pulses M is given by:



M = 1 + \frac{A}{\Delta V}

Taking the information to be the logarithm of the number of distinct messages that could be sent, Hartley then constructed a measure of information proportional to the channel bandwidth and to the duration of its use. Sometimes only this proportionality is meant when Hartley's law is mentioned.

Hartley then combined Nyquist's observation with his own quantification of the quality or noise of a channel, expressed as the number of pulse levels that could be reliably distinguished and denoted by M, to arrive at a quantitative measure of the achievable information rate.

Hartley's law is usually stated quantitatively as the achievable information rate R in bits per second (bit/s):



R = 2B \log_2(M)
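
As an illustration of the two formulas above, the following short Python sketch (with hypothetical values for A, \Delta V and B) computes the number of levels M and the corresponding Hartley rate:

import math

def hartley_rate(bandwidth_hz, amplitude, precision):
    # Number of distinguishable levels: M = 1 + A / (Delta V)
    levels = 1 + amplitude / precision
    # Hartley's law: R = 2B * log2(M), in bit/s
    return 2 * bandwidth_hz * math.log2(levels)

# Hypothetical example: B = 3 kHz, A = 1 V, receiver precision 0.125 V (so M = 9)
print(hartley_rate(3000, 1.0, 0.125))  # about 19,000 bit/s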

Hartley did not work out precisely how the parameter M should depend on the noise statistics of the channel, or how the communication could be made reliable even when individual symbol pulses could not be reliably distinguished among the M levels; with Gaussian noise statistics, system designers had to choose a very conservative value of M to achieve a low error rate.

The concept of an error-free capacity had to wait until Claude Shannon built on Hartley's observations about the logarithmic measure of information and Nyquist's observations about the effect of bandwidth limitations on the channel.

Hartley's rate result can be viewed as the capacity of an error-free M-ary channel of 2B symbols per second. Some authors refer to it as a capacity. But such an error-free channel is an idealization, and the result is necessarily less than the Shannon capacity of a noisy channel of bandwidth B, which is the Hartley-Shannon result considered later.

Noisy-channel coding theorem and capacity

Claude Shannon's development of information theory during World War II provided the next great step toward understanding how much information could be communicated reliably and without error through channels corrupted by Gaussian noise.

Building on Hartley's ideas, Shannon's noisy-channel coding theorem (1948) describes the maximum possible efficiency of error-correction methods versus the levels of noise interference and data corruption. The proof of the theorem shows that a randomly constructed error-correcting code is essentially as good as the best possible code; the theorem is proved through the statistics of such random codes.

Shannon's theorem shows how to compute the channel capacity from a statistical description of the channel, and establishes that, given a noisy channel with capacity C and information transmitted at a rate R, then if

R < C \,

a coding technique exists that allows the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information almost without error at any rate up to a limit near C bit/s.

The converse is also important. If

R > C \,

the probability of error at the receiver increases without bound as the rate is increased. In this way, no useful information can be transmitted above the channel capacity. The theorem does not address the rare case in which the rate and the capacity are equal.
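
As a simple illustration of this criterion, the sketch below (assuming the Gaussian-noise capacity formula given earlier and hypothetical link parameters) checks whether a target rate lies below the channel capacity:

import math

def is_rate_achievable(rate_bps, bandwidth_hz, snr_linear):
    # Compare the target rate with C = B * log2(1 + S/N)
    capacity = bandwidth_hz * math.log2(1 + snr_linear)
    return rate_bps < capacity

# Hypothetical link: 1 MHz bandwidth, linear S/N of 0.1 (-10 dB), capacity about 137 kbit/s
print(is_rate_achievable(50e3, 1e6, 0.1))   # True: reliable coding exists in principle
print(is_rate_achievable(200e3, 1e6, 0.1))  # False: above capacity, error probability grows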

Shannon-Hartley theorem

The Shannon-Hartley theorem establishes the channel capacity for a finite-bandwidth, continuous-time channel subject to Gaussian noise. It connects Hartley's result with Shannon's channel capacity theorem in a form that is equivalent to specifying M in Hartley's information rate formula in terms of the signal-to-noise ratio, but achieving reliability through error-correcting coding rather than through reliably distinguishable pulse levels.

If such a thing as a noise-free analog channel with infinite bandwidth existed, one could transmit unlimited amounts of error-free data over it per unit of time. However, real communication channels are subject to the limitations imposed by finite bandwidth and noise.

So how do bandwidth and noise affect the rate at which information can be transmitted over an analog channel?

Surprising as it may seem, bandwidth limitations alone do not impose restrictions on the maximum information rate. This is because it remains possible for the signal to take an infinitely large number of different voltage values for each symbol pulse, with each level slightly different from the previous one and representing a particular meaning or bit sequence. However, if we combine both factors, that is, the noise and the bandwidth limitations, we find a limit on the amount of information that can be transferred by a signal of limited power, even when multi-level coding techniques are used.

In the channel considered by the Shannon-Hartley theorem, noise and signal are added. That is, the receiver measures a signal equal to the sum of the signal encoding the desired information and a continuous random variable representing the noise. This addition creates uncertainty about the value of the original signal.

If the receiver has some information about the random process generating the noise, the information of the original signal can, in principle, be recovered by considering all possible states of the noise process. In the case of the Shannon-Hartley theorem, it is assumed that the noise is generated by a Gaussian process with a known variance. Since the variance of a Gaussian process is equivalent to its power, this variance is usually called the noise power.

Such a channel is called an additive white Gaussian noise (AWGN) channel, because Gaussian noise is added to the signal; white means an equal amount of noise at all frequencies within the channel bandwidth.
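
A minimal simulation of this additive-noise model, assuming an antipodal unit-power signal and a hypothetical noise variance of 0.1, might look as follows (all names and values are illustrative):

import numpy as np

rng = np.random.default_rng(0)

n_samples = 1000
signal = rng.choice([-1.0, 1.0], size=n_samples)        # unit-power signal samples
noise = rng.normal(0.0, np.sqrt(0.1), size=n_samples)   # Gaussian noise, variance (power) 0.1

received = signal + noise                                # what the receiver actually measures

snr = np.mean(signal**2) / np.mean(noise**2)             # empirical signal-to-noise ratio
print(snr)                                               # close to 10, i.e. about 10 dB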

Implications of the theorem

Comparison of Shannon's capacity with Hartley's law

Comparing the channel capacity with the information rate of Hartley's law, we can find the effective number of distinguishable levels M:



2B \log_2(M) = B \log_2\left(1 + \frac{S}{N}\right)



M = \sqrt{1 + \frac{S}{N}}

The square root effectively converts the power ratio back into a voltage ratio, so the number of levels is approximately proportional to the ratio between the root-mean-square amplitude of the signal and the standard deviation of the noise.

This similarity between Shannon's capacity and Hartley's law should not be interpreted to mean that M pulse levels can literally be sent without any confusion. More levels are needed to allow for redundant coding and error correction, but the net data rate that can be approached with coding is equivalent to using that M in Hartley's law.
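
The effective number of levels can be computed from the signal-to-noise ratio with a small helper such as the following sketch (the function name and example value are illustrative):

import math

def effective_levels(snr_linear):
    # M = sqrt(1 + S/N), the effective number of distinguishable levels
    return math.sqrt(1 + snr_linear)

# Hypothetical example: S/N = 100 (20 dB) gives roughly 10 levels per pulse
print(effective_levels(100))  # about 10.05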

Alternative forms

Frequency-dependent case (colored noise)

In the simple version above, the signal and noise are completely uncorrelated, and in that case S + N is the total power of the received signal and noise together. A generalization of the above equation for the case where the additive noise is not white (that is, the S/N ratio is not constant with frequency over the bandwidth) can be obtained by treating the channel as many narrow, independent Gaussian channels in parallel:



C = \int_{0}^{B} \log_2\left(1 + \frac{S(f)}{N(f)}\right) df

where:

*C is the capacity of the channel in bit/s
*B is the bandwidth of the channel in hertz
*S(f) is the power spectrum of the signal
*N(f) is the power spectrum of the noise
*f is the frequency in hertz
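
The integral above can be approximated numerically by summing over narrow sub-bands, treating each as an independent Gaussian channel. The sketch below assumes hypothetical signal and noise spectra over a 4 kHz band:

import numpy as np

def capacity_colored_noise(signal_psd, noise_psd, bandwidth_hz, n_bands=10000):
    # Riemann-sum approximation of C = integral from 0 to B of log2(1 + S(f)/N(f)) df
    freqs = np.linspace(0.0, bandwidth_hz, n_bands, endpoint=False)
    df = bandwidth_hz / n_bands
    return np.sum(np.log2(1.0 + signal_psd(freqs) / noise_psd(freqs))) * df

# Hypothetical spectra: flat signal PSD, noise PSD rising linearly with frequency (W/Hz)
S = lambda f: np.full_like(f, 1e-3)
N = lambda f: 1e-5 * (1.0 + f / 4000.0)
print(capacity_colored_noise(S, N, 4000.0))  # capacity in bit/s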

Note: the theorem applies only to noise that is a stationary Gaussian process. The way this formula introduces frequency-dependent noise does not describe all continuous-time noise processes. For example, consider a noise process consisting of a random wave whose amplitude is 1 or -1 at any point in time, and a channel that adds this wave to the original signal. The frequency components of such a wave are highly dependent. Although this noise can have a high power, it is fairly easy to transmit a continuous signal with much less power than would be needed if the underlying noise were a sum of independent noises in each frequency band.

Approximations

For large or small and constant signal-to-noise ratios, the capacity formula can be approximated:

  • If S/N >> 1, then





C \approx 0.332 \cdot B \cdot \mathrm{SNR\ (in\ dB)}

where



\mathrm{SNR\ (in\ dB)} = 10 \log_{10}\frac{S}{N}

  • Similarly, if S/N << 1, then





C \approx 1.44 \cdot B \cdot \frac{S}{N}

In this low-SNR (signal-to-noise ratio) approximation, the capacity is independent of the bandwidth if the noise is white, with spectral density N_0 watts per hertz (W/Hz). In this case the total noise power is B \cdot N_0.



C \approx 1.44 \cdot \frac{S}{N_0}
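
The two approximations can be compared against the exact formula with a short sketch like the one below (bandwidth and S/N values are hypothetical):

import math

def capacity_exact(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1 + snr_linear)

def capacity_high_snr(bandwidth_hz, snr_db):
    return 0.332 * bandwidth_hz * snr_db        # valid when S/N >> 1

def capacity_low_snr(bandwidth_hz, snr_linear):
    return 1.44 * bandwidth_hz * snr_linear     # valid when S/N << 1

# Hypothetical 1 MHz channel
print(capacity_exact(1e6, 100), capacity_high_snr(1e6, 20))     # ~6.66e6 vs 6.64e6 bit/s
print(capacity_exact(1e6, 0.01), capacity_low_snr(1e6, 0.01))   # ~1.44e4 bit/s in both cases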

Numerical examples

  • If the SNR is 20 dB, and the available bandwidth is 4 kHz, appropriate for telephone communications, then C = 4 log2(1 + 100) = 4 log2(101) = 26.63 kbit/s. Note that the value S/N = 100 is equivalent to an SNR of 20 dB.

  • If it is required to transmit at 50 kbit/s, and the bandwidth used is 1 MHz, then the minimum required S/N ratio is given by 50 = 1000 log2(1 + S/N), so that S/N = 2^{C/B} - 1 = 0.035, corresponding to an SNR of -14.5 dB. This shows that it is possible to transmit with signals much weaker than the background noise level, as in spread-spectrum communications (see the sketch after this list).
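
Both examples can be checked with a few lines of Python (values rounded):

import math

# First example: B = 4 kHz, S/N = 100 (20 dB)
print(4e3 * math.log2(1 + 100))       # about 26,633 bit/s, i.e. 26.63 kbit/s

# Second example: B = 1 MHz, target rate C = 50 kbit/s
snr = 2 ** (50e3 / 1e6) - 1           # minimum required linear S/N
print(snr, 10 * math.log10(snr))      # about 0.035, i.e. roughly -14.5 dB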
