Connexions module: m0098
Capacity of a Channel
Don Johnson
This work is produced by The Connexions Project and licensed under the Creative Commons Attribution License.

Abstract
Shannon derived the maximum datarate of a source coder's output that can be transmitted through a bandlimited additive white noise channel with no error.
In addition to the Noisy Channel Coding Theorem and its converse [1], Shannon also derived the capacity for a bandlimited (to W Hz) additive white noise channel. For this case, the signal set is unrestricted, even to the point that more than one bit can be transmitted each "bit interval." Instead of constraining channel code efficiency, the revised Noisy Channel Coding Theorem states that some error-correcting code exists such that, as the block length increases, error-free transmission is possible if the source coder's datarate, B(A)R, is less than capacity.

C = W log2 (1 + SNR)  bits/s  (1)

This result sets the maximum datarate of the source coder's output that can be transmitted through the bandlimited channel with no error.[2]
Shannon's proof of his theorem was very clever, and did not indicate
what this code might be; it has never been found. Codes such as the Hamming code work quite well in
practice to keep error rates low, but they remain greater than zero. Until the "magic" code is found, more
important in communication system design is the converse. It states that if your data rate exceeds capacity,
errors will overwhelm you no matter what channel coding you use. For this reason, capacity calculations are
made to understand the fundamental limits on transmission rates.
Exercise 1 (Solution on p. 3.)
The first definition of capacity applies only for binary symmetric channels, and represents the number of bits/transmission. The second result states capacity more generally, having units of bits/second. How would you convert the first definition's result into units of bits/second?
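One way to see the conversion the exercise asks about: a per-use capacity in bits/transmission becomes bits/second when multiplied by the number of transmissions per second. A minimal sketch, assuming the standard binary symmetric channel capacity C = 1 - H(pe) from the earlier module (the symbol rate is a parameter we introduce here, not a value given in the text):

```python
import math

def bsc_capacity(pe: float) -> float:
    """Capacity of a binary symmetric channel in bits/transmission:
    C = 1 - H(pe), where H is the binary entropy function."""
    if pe in (0.0, 1.0):
        return 1.0
    entropy = -pe * math.log2(pe) - (1 - pe) * math.log2(1 - pe)
    return 1.0 - entropy

def capacity_bits_per_second(pe: float, transmissions_per_second: float) -> float:
    """Convert bits/transmission to bits/second by multiplying by the
    transmission (symbol) rate."""
    return bsc_capacity(pe) * transmissions_per_second

# e.g. a BSC with crossover probability 0.01 used 1000 times per second
print(capacity_bits_per_second(0.01, 1000.0))
```

The assumed rate of 1000 transmissions/second is illustrative only; any symbol rate scales the per-use capacity the same way.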
Example 1
The telephone channel has a bandwidth of 3 kHz and a signal-to-noise ratio exceeding 30 dB (at
least they promise this much). The maximum data rate a modem can produce for this wireline
channel and hope that errors will not become rampant is the capacity.
C = 3 × 10^3 log2 (1 + 10^3) = 29.901 kbps  (2)
Thus, the so-called 33 kbps modems operate right at the capacity limit.
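Example 1's arithmetic can be checked directly from equation (1). A minimal sketch (the function name and dB conversion helper are ours, not the module's):

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity of a bandlimited additive white noise channel:
    C = W * log2(1 + SNR), in bits/s. SNR is a linear ratio, not dB."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Telephone channel from Example 1: W = 3 kHz, SNR = 30 dB
snr = 10 ** (30 / 10)          # convert dB to a linear power ratio: 10^3
c = channel_capacity(3e3, snr)
print(f"{c / 1e3:.1f} kbps")   # → 29.9 kbps
```

Note that the 30 dB figure must be converted to a linear ratio (10^3) before it enters the formula; plugging in 30 directly is a common mistake.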
Version 2.13: Jul 15, 2007 5:43 pm GMT-5
Creative Commons Attribution License: http://creativecommons.org/licenses/by/1.0
[1] "Noisy Channel Coding Theorem" <http://cnx.org/content/m0073/latest/>
[2] The bandwidth restriction arises not so much from channel properties, but from spectral regulation, especially for wireless channels.
http://cnx.org/content/m0098/2.13/

Note that the data rate allowed by the capacity can exceed the bandwidth when the signal-to-noise ratio exceeds 0 dB. Our results for BPSK and FSK indicated that the bandwidth they require exceeds 1/T. What kind of signal sets might be used to achieve capacity? Modem signal sets send more than one bit/transmission using a number of techniques, one of the most popular of which is multi-level signaling. Here, we can transmit several bits during one transmission interval by representing the bit pattern by a signal's amplitude. For example, two bits can be sent with a signal set comprised of a sinusoid with amplitudes of ±A and ±(A/2).
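The multi-level signaling idea above can be sketched in a few lines. This is a minimal illustration, not the module's own scheme: the particular bit-to-amplitude mapping and the choice A = 1 are assumptions made here for concreteness.

```python
# Minimal sketch of 4-level (multi-level) signaling: each pair of bits
# selects one of four sinusoid amplitudes {-A, -A/2, +A/2, +A}, so one
# transmission interval carries two bits.
# The specific bit-to-level assignment below is an illustrative assumption.
A = 1.0
LEVELS = {"00": -A, "01": -A / 2, "11": A / 2, "10": A}

def modulate(bits: str) -> list:
    """Map a bit string (length a multiple of 2) to a sequence of
    amplitude levels, one level per transmission interval."""
    assert len(bits) % 2 == 0, "need an even number of bits"
    return [LEVELS[bits[i:i + 2]] for i in range(0, len(bits), 2)]

print(modulate("001011"))  # six bits sent in three transmission intervals
```

Doubling the number of amplitude levels adds one bit per interval but halves the spacing between levels, which is why the achievable number of levels, and hence the rate, is ultimately limited by the signal-to-noise ratio, consistent with equation (1).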