
Shannon’s Information Theory

What is Information Theory?

Information theory deals with the measurement and
transmission of information through a channel.

A fundamental work in this area is Shannon’s
Information Theory, which provides many useful
tools based on measuring information in terms of
bits or, more generally, in terms of the minimal
amount of complexity of the structures needed to
encode a given piece of information.
NOISE
Noise can be considered data without
meaning; that is, data that is not
being used to transmit a signal, but
is simply produced as an unwanted
by-product of other activities. Noise
is still considered information, in the
sense of Information Theory.
Information Theory Cont…

Shannon’s ideas

• Form the basis for the field of Information Theory
• Provide the yardsticks for measuring the efficiency
  of communication systems
• Identify problems that had to be solved to get to
  what he described as ideal communication systems
Information
In defining information, Shannon identified the
critical relationships among the elements of a
communication system:
• the power at the source of a signal
• the bandwidth or frequency range of an information
  channel through which the signal travels
• the noise of the channel, such as unpredictable static
  on a radio, which will alter the signal by the time it
  reaches the last element of the system
• the receiver, which must decode the signal.
General Model Of Communication
◼ Signal bandwidth:
• We can divide signals into two categories: the pure tone signal
  (a sinusoidal wave, consisting of one frequency component), and
  complex signals that are composed of several components, or
  sinusoids of various frequencies.
• Since voice signals are also composed of several components
  (pure tones) of various frequencies, the bandwidth of a voice
  signal is taken to be the difference between the highest and
  lowest frequencies, which are 3000 Hz and (close to) 0 Hz.
• Although other frequency components above 3000 Hz exist (they
  are more prominent in the male voice), an acceptable degradation
  of voice quality is achieved by disregarding the higher frequency
  components, accepting the 3 kHz bandwidth as a standard for
  voice communications.
◼ Channel bandwidth:
• The bandwidth of a channel (medium) is defined to be the range
  of frequencies that the medium can support. Bandwidth is
  measured in Hz.
• With each transmission medium, there is a frequency range of
  electromagnetic waves that can be transmitted:
  ◼ Twisted pair: 0 to 10^9 Hz (bandwidth: 10^9 Hz)
  ◼ Coax cable: 0 to 10^10 Hz (bandwidth: 10^10 Hz)
  ◼ Optical fiber: 10^14 to 10^16 Hz (bandwidth: 10^16 - 10^14 = 9.9x10^15 Hz)
• Optical fibers have the highest bandwidth (they can support
  electromagnetic waves with very high frequencies, such as light
  waves).
• The bandwidth of the channel dictates the information-carrying
  capacity of the channel.
• This is calculated using Shannon’s channel capacity formula.
Information Theory Cont.
To get a high-level understanding of his theory,
a few basic points should be made.
First, words are symbols that carry information
between people. If one says to an American,
“Let’s go!”, the command is immediately understood.
But if we give the command in Russian, “Pustim v
xod!”, we only get a quizzical look.
Russian is the wrong code for an American.
Information Theory Cont.
Second, all communication involves three steps:

❖ Coding a message at its source
❖ Transmitting the message through a communications channel
❖ Decoding the message at its destination.
Information Theory Cont….

In the first step, the message has to be put into
some kind of symbolic representation – words,
musical notes, icons, mathematical equations, or bits.
When we write “Hello,” we encode a greeting.
When we write a musical score, it’s the same
thing – only we’re encoding sounds.
Information Theory Cont.
Third, for any code to be useful it has to be transmitted
to someone or, in a computer’s case, to something.

Transmission can be by voice, a letter, a billboard, a
telephone conversation, or a radio or television broadcast.

At the destination, someone or something has to receive
the symbols, and then decode them by matching them against
his or her own body of information to extract the data.
Information Theory Cont….
Fourth, there is a distinction between a communications
channel’s designed symbol rate of so many bits per second
and its actual information capacity. Shannon defines
channel capacity as how many kilobits per second of user
information can be transmitted over a noisy channel with
as small an error rate as possible, which can be less than
the channel’s “raw” symbol rate.
EXAMPLE
Suppose we are watching cars going past on a
highway. For simplicity, suppose 50% of the cars
are black, 25% are white, 12.5% are red, and
12.5% are blue. Consider the flow of cars as an
information source with four words: black, white,
red, and blue. A simple way of encoding this source
into binary symbols would be to associate each
color with two bits, that is:
black = 00, white = 01, red = 10, and blue = 11, an
average of 2.00 bits per color.
A Better Code Using Information Theory
A better encoding can be constructed by allowing for the
frequency of certain symbols, or words:
black = 0, white = 10, red = 110, blue = 111.

How is this encoding better?

0.50 black x 1 bit = 0.500
0.25 white x 2 bits = 0.500
0.125 red x 3 bits = 0.375
0.125 blue x 3 bits = 0.375
Average: 1.750 bits per car
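
To make the comparison concrete, here is a minimal Python sketch (the probabilities and code words come from the example above; the function name avg_code_length and the overall structure are just illustrative):

probabilities = {"black": 0.50, "white": 0.25, "red": 0.125, "blue": 0.125}

fixed_code    = {"black": "00", "white": "01", "red": "10",  "blue": "11"}
variable_code = {"black": "0",  "white": "10", "red": "110", "blue": "111"}

def avg_code_length(code):
    # Expected number of bits per symbol under the given code.
    return sum(probabilities[sym] * len(word) for sym, word in code.items())

print(avg_code_length(fixed_code))     # 2.0 bits per car
print(avg_code_length(variable_code))  # 1.75 bits per car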
ENTROPY

A quantitative measure of the disorder of a system,
inversely related to the amount of energy available
to do work in an isolated system. The more energy
has become dispersed, the less work it can perform
and the greater the entropy.
Information Theory Cont..

Furthermore, Information Theory tells us that the
entropy of this information source is 1.75 bits per
car, and thus no encoding scheme will do better than
the scheme we just described.
In general, an efficient code for a source will not
represent single letters, as in our example above,
but will represent strings of letters or words.
If we see three black cars, followed by a white car,
a red car, and a blue car, the sequence would be
encoded as 00010110111, and the original sequence of
cars can readily be recovered from the encoded
sequence.
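
As a sketch of this claim, the following Python fragment (names are illustrative) computes the source entropy H = -sum(p * log2 p), encodes the car sequence described above, and decodes it again using the fact that the code is prefix-free:

import math

probabilities = {"black": 0.50, "white": 0.25, "red": 0.125, "blue": 0.125}
code = {"black": "0", "white": "10", "red": "110", "blue": "111"}

# Source entropy: H = -sum(p * log2(p)) = 1.75 bits per car.
entropy = -sum(p * math.log2(p) for p in probabilities.values())
print(entropy)  # 1.75

# Encode the sequence from the slides: three black cars, then white, red, blue.
cars = ["black", "black", "black", "white", "red", "blue"]
encoded = "".join(code[c] for c in cars)
print(encoded)  # 00010110111

# Decode by reading bits until they match a code word (unambiguous,
# because no code word is a prefix of another).
reverse = {word: sym for sym, word in code.items()}
decoded, buffer = [], ""
for bit in encoded:
    buffer += bit
    if buffer in reverse:
        decoded.append(reverse[buffer])
        buffer = ""
print(decoded == cars)  # True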
Shannon’s Theorem

Shannon's theorem, proved by Claude Shannon in 1948,
describes the maximum possible efficiency of
error-correcting methods versus levels of noise
interference and data corruption.
Shannon’s theorem
The theory doesn't describe how to construct the
error-correcting method; it only tells us how good
the best possible method can be. Shannon's theorem
has wide-ranging applications in both communications
and data storage.
SHANNON’S LAW

Shannon's law is any statement defining the
theoretical maximum rate at which error-free digits
can be transmitted over a bandwidth-limited channel
in the presence of noise:

C = B log2(1 + S/N)

where
C is the post-correction effective channel capacity
in bits per second;
B is the raw channel capacity in hertz (the
bandwidth); and
S/N is the signal-to-noise ratio of the communication
signal to the Gaussian noise interference, expressed
as a straight power ratio (not as decibels).
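
A minimal Python sketch of this formula (the function name and parameters are illustrative; snr_linear is assumed to be a straight power ratio, not decibels):

import math

def channel_capacity(bandwidth_hz, snr_linear):
    # Shannon capacity C = B * log2(1 + S/N), in bits per second.
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed example values: a 3000 Hz channel with a power ratio of 1000 (30 dB).
print(channel_capacity(3000, 1000))  # roughly 29,900 bit/s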
Shannon’s Theorem Cont..

Channel capacity, shown often as "C" in
communication formulas, is the amount of discrete
information bits that a defined area or segment in
a communications medium can hold.
Shannon Theorem Cont..

The phrase signal-to-noise ratio, often abbreviated
SNR or S/N, is an engineering term for the ratio
between the magnitude of a signal (meaningful
information) and the magnitude of background noise.
Because many signals have a very wide dynamic range,
SNRs are often expressed in terms of the logarithmic
decibel scale.
Signal-to-Noise Ratio

⚫ S/N is normally measured in dB (decibels). It is a
  relationship between the signal we want and the
  noise that we do not want, which is in the medium.
⚫ It can be thought of as a fractional relationship
  (that is, before we take the logarithm).
⚫ 1000 W of signal power versus 20 W of noise power
  is either:
  − 1000/20 = 50 (unitless!)
  − or: about 17 dB ==> 10 log10(1000/20) = 16.9897 dB
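
The same arithmetic as a small Python sketch (variable names are illustrative):

import math

signal_watts, noise_watts = 1000, 20
snr_ratio = signal_watts / noise_watts   # 50 (unitless)
snr_db = 10 * math.log10(snr_ratio)      # 16.9897... dB, i.e. about 17 dB
print(snr_ratio, round(snr_db, 4))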
Example
If the SNR is 20 dB, and the bandwidth available is 4 kHz,
which is appropriate for telephone communications, then
C = 4 log2(1 + 100) = 4 log2(101) = 26.63 kbit/s.
Note that the value of 100 is the power ratio corresponding
to an SNR of 20 dB.
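
A quick check of this example in Python (the 20 dB figure is converted to a power ratio of 100 explicitly):

import math

bandwidth_hz = 4000                  # 4 kHz
snr_ratio = 10 ** (20 / 10)          # 20 dB -> power ratio of 100
capacity = bandwidth_hz * math.log2(1 + snr_ratio)
print(round(capacity / 1000, 2))     # 26.63 kbit/s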
Example

If it is required to transmit at 50 kbit/s, and a bandwidth
of 1 MHz is used, then the minimum SNR required is given by
50 = 1000 log2(1 + S/N), so S/N = 2^(C/B) - 1 = 0.035,
corresponding to an SNR of -14.5 dB. This shows that it is
possible to transmit using signals which are actually much
weaker than the background noise level.
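
And the reverse calculation, solving C = B log2(1 + S/N) for the required S/N (values taken from this example; names are illustrative):

import math

target_rate = 50_000       # 50 kbit/s
bandwidth_hz = 1_000_000   # 1 MHz
snr_ratio = 2 ** (target_rate / bandwidth_hz) - 1   # ~0.035
snr_db = 10 * math.log10(snr_ratio)                 # ~-14.5 dB
print(round(snr_ratio, 3), round(snr_db, 1))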
Communication Systems
We recall the components of a communication
system:
◼Input transducer: The device that converts the
physical signal from the source into an electrical,
mechanical, or electromagnetic signal more suitable
for communicating
◼Transmitter: The device that sends the transduced
signal
◼Transmission channel: The physical medium on
which the signal is carried
◼Receiver: The device that recovers the transmitted
signal from the channel
◼Output transducer: The device that converts the
received signal back into a useful quantity
