Information and Digital
Transmission
Haykin chapter 9
Carlson chapter 16
Information
What is the amount of information that a message “A” with probability of occurrence P_A provides?

I_A = \log_2 \frac{1}{P_A} \ \text{bits}

For n equiprobable messages, each with probability P_i = 1/n, this is:

I_i = \log_2 \frac{1}{P_i} = \log_2 n \ \text{bits}

A memoryless source is one that produces statistically independent messages.
The average information carried by a message from a memoryless source is defined as:

H = I_{\text{average}} = E\{I_i\} = \sum_{i=1}^{n} P_i \log_2 \frac{1}{P_i} \ \text{bits}

where H is known as the entropy.
For equiprobable messages, each binary pulse (bit) representing the signal is equivalent to one full bit of information, and the entropy is maximum: H = H_max = \log_2 n.
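These definitions are easy to check numerically; a minimal Python sketch (helper names are illustrative, not from the text):

import math

def self_information(p):
    """Information (in bits) carried by a message of probability p."""
    return math.log2(1 / p)

def entropy(probs):
    """Average information H (bits/message) of a memoryless source."""
    return sum(p * math.log2(1 / p) for p in probs)

# Four equiprobable messages: H = log2(4) = 2 bits
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0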
Information transmission rate
The average length of a code word representing a message is:

\bar{L} = \sum_{i=1}^{n} P_i L_i

where L_i is the code word length of message i.
The efficiency (E) of a code is:

E = \frac{H}{\bar{L}} \le 1

Note that you cannot make the average code word length smaller than H.
The information transmission rate is defined as:

R = r H \ \text{bits/second}

where r is the symbol (message) rate.
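Continuing the sketch above, these quantities can be computed directly (the symbol rate r is an assumed value; the code is Code 5 from the table later in these notes):

# Code 5: a -> 1, b -> 00, c -> 01
probs = [0.73, 0.25, 0.02]
lengths = [1, 2, 2]

L_bar = sum(p * l for p, l in zip(probs, lengths))  # average code word length
H = entropy(probs)                                  # source entropy
E = H / L_bar                                       # code efficiency, <= 1

r = 1000                   # assumed symbol rate, symbols/second
R = r * H                  # information transmission rate, bits/second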
Example 1
The four symbols A, B, C, D occur with probabilities 1/2, 1/4, 1/8, 1/8, respectively.
a. Compute the information in the three-symbol message X = BDA, assuming that the symbols are statistically independent.
b. Compute the average information per symbol.
c. Compute the average information per symbol if the symbols are equiprobable.
Solution
a. Because the symbols are independent:

I_X = \log_2 \frac{1}{P_B} + \log_2 \frac{1}{P_D} + \log_2 \frac{1}{P_A} = \log_2 4 + \log_2 8 + \log_2 2 = 2 + 3 + 1 = 6 \ \text{bits}

b. H = \sum_{i=1}^{n} P_i \log_2 \frac{1}{P_i} \ \text{bits}

H = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 + \frac{1}{8}\log_2 8 = \frac{1}{2} + \frac{1}{2} + \frac{3}{8} + \frac{3}{8} = 1.75 \ \text{bits}

c. H = 4 \times \frac{1}{4}\log_2 4 = 2 \ \text{bits}
Example 2
Consider a discrete memoryless source with source alphabet S₀, S₁, S₂ and respective probabilities P₀ = 1/4, P₁ = 1/4, P₂ = 1/2.
a. Determine the entropy (H) of the source.
b. If two source symbols are combined to form a set of new symbols (the extended source), determine the entropy H⁽²⁾ of the extended source.
Solution
a. H = \sum_{i=1}^{3} P_i \log_2 \frac{1}{P_i} = \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 + \frac{1}{2}\log_2 2 = \frac{3}{2} \ \text{bits}

b. The output of the extended source, with message probabilities (the product of the two symbol probabilities), is shown below.

Message      S0S0  S0S1  S0S2  S1S0  S1S1  S1S2  S2S0  S2S1  S2S2
Probability  1/16  1/16  1/8   1/16  1/16  1/8   1/8   1/8   1/4

H^{(2)} = \sum_{i=1}^{9} P_i \log_2 \frac{1}{P_i} = 3 \ \text{bits}

Note that for a memoryless source H⁽ⁿ⁾ = nH.
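Both examples can be checked with the entropy helper from the earlier sketch:

from itertools import product

# Example 1b: H = 1.75 bits
print(entropy([1/2, 1/4, 1/8, 1/8]))

# Example 2: pair probabilities of the extended source multiply,
# so H(2) = 2H = 3 bits for a memoryless source
probs = [1/4, 1/4, 1/2]
pair_probs = [p * q for p, q in product(probs, probs)]
print(entropy(probs), entropy(pair_probs))  # 1.5  3.0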
Channel Capacity

C = B \log_2 \left(1 + \frac{S}{N}\right) \ \text{bits/second}

C = maximum channel capacity
B = channel bandwidth
S/N = signal-to-noise ratio of the channel

1. This is the most important equation determining the maximum transmission rate on a Gaussian channel.
2. It is known as the Hartley-Shannon equation.
3. Shannon proved it is impossible to transmit information on a Gaussian channel without error if R > C.
4. However, it is proved that for R < C there is a coding technique by which information can be transmitted free of error, or with an arbitrarily small amount of errors.
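A minimal sketch of the Hartley-Shannon equation in Python:

import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Hartley-Shannon capacity C = B*log2(1 + S/N) in bits/second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# 3.4 kHz telephone channel at 30 dB SNR (see Example 4 below)
print(channel_capacity(3.4e3, 10**(30/10)))  # ~33.9e3 bits/second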
Channel with unlimited bandwidth
Although the Hartley-Shannon equation relates C to B and S/N, there is a limit which C cannot exceed even as the bandwidth approaches infinity.

C = B \log_2 \left(1 + \frac{S}{N}\right) = B \log_2 \left(1 + \frac{S}{N_0 B}\right)

C = \lim_{B \to \infty} B \log_2 \left(1 + \frac{S}{N_0 B}\right) = \lim_{B \to \infty} \frac{S}{N_0} \cdot \frac{N_0 B}{S} \log_2 \left(1 + \frac{S}{N_0 B}\right) = \lim_{B \to \infty} \frac{S}{N_0} \log_2 \left(1 + \frac{S}{N_0 B}\right)^{\frac{N_0 B}{S}}

Let x = \frac{S}{N_0 B}; then:

C = \lim_{B \to \infty} \frac{S}{N_0} \log_2 (1 + x)^{1/x}

As B \to \infty, x \to 0, and:

C = \frac{S}{N_0} \lim_{x \to 0} \log_2 (1 + x)^{1/x}

But \lim_{x \to 0} (1 + x)^{1/x} = e, and so:

C = \frac{S}{N_0} \log_2 e = 1.44 \frac{S}{N_0} \ \text{bits/second}

Conclusion: the maximum C for fixed S and infinite bandwidth is 1.44 S/N₀ bits/second.
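A numeric check of this limit (the S/N₀ value is an assumption for illustration):

import math

S_over_N0 = 1.0  # assumed S/N0, watts per watt/Hz
for B in (1e3, 1e6, 1e9):
    C = B * math.log2(1 + S_over_N0 / B)
    print(B, C)  # approaches log2(e) = 1.4427 bits/second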
Example 3
A black-and-white television picture may be considered as composed of approximately 3×10⁵ picture elements. Assume that each picture element is equiprobable among 10 distinguishable brightness levels. Thirty picture frames are transmitted per second. Calculate the minimum bandwidth required to transmit the video signal, assuming that a 30 dB signal-to-noise ratio is necessary for satisfactory picture representation.
Solution

Information per picture element: I_i = \log_2 \frac{1}{P_i} = \log_2 10 = 3.32 \ \text{bits}

Information per picture frame: 3.32 \times 3 \times 10^5 = 9.96 \times 10^5 \ \text{bits}

Information rate: R = 30 \times 9.96 \times 10^5 = 29.9 \times 10^6 \ \text{bps}

C = B \log_2 \left(1 + \frac{S}{N}\right)

But R \le C, so:

B = \frac{C}{\log_2(1 + S/N)} = \frac{29.9 \times 10^6}{\log_2(1 + 10^3)} \approx 3 \times 10^6 = 3 \ \text{MHz}

Practical black-and-white TV uses a bandwidth of 4 MHz.
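The same calculation in Python:

import math

pixels = 3e5
levels = 10
frames_per_sec = 30
snr = 10**(30/10)  # 30 dB

R = frames_per_sec * pixels * math.log2(levels)  # ~2.99e7 bps
B_min = R / math.log2(1 + snr)                   # ~3.0e6 Hz = 3 MHz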
Example 4
A voice-grade channel of the telephone network has a bandwidth of 3.4 kHz.
(a) Calculate the information capacity of the telephone channel for a signal-to-noise ratio of 30 dB.
(b) Calculate the minimum signal-to-noise ratio required to support information transmission through the telephone channel at the rate of 9,600 b/s.
Solution
(a) Channel bandwidth B = 3.4 kHz.
Received signal-to-noise ratio SNR = 10³ = 30 dB.
Hence the channel capacity is:

C = B \log_2 \left(1 + \frac{S}{N}\right) = 3.4 \times 10^3 \times \log_2(1 + 10^3) \approx 33.9 \times 10^3 \ \text{bits/second}

(b) 9600 = 3.4 \times 10^3 \log_2(1 + \text{SNR})

Solving for the unknown SNR, we get:

SNR = 6.08 (7.8 dB)
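Both parts can be checked numerically:

import math

B = 3.4e3
# (a) capacity at 30 dB
C = B * math.log2(1 + 1e3)    # ~33.9e3 bits/second
# (b) SNR needed for 9600 b/s: invert C = B*log2(1 + SNR)
snr = 2**(9600 / B) - 1       # ~6.08
snr_db = 10 * math.log10(snr) # ~7.8 dB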
Ideal Demodulator
For an ideal demodulator C_o = C_i.

[Block diagram: input S_i, n_i, B_i, C_i → ideal demodulator → output S_o, n_o, B_o, C_o]

B_i \log_2 \left(1 + \frac{S_i}{n_i}\right) = B_o \log_2 \left(1 + \frac{S_o}{n_o}\right)

\log_2 \left(1 + \frac{S_o}{n_o}\right) = \frac{B_i}{B_o} \log_2 \left(1 + \frac{S_i}{n_i}\right) = \log_2 \left(1 + \frac{S_i}{n_i}\right)^{B_i / B_o}

Or:

\left(1 + \frac{S_o}{n_o}\right) = \left(1 + \frac{S_i}{n_i}\right)^{B_i / B_o}

\frac{S_o}{n_o} = \left(1 + \frac{S_i}{n_i}\right)^{B_i / B_o} - 1

But usually \frac{S_o}{n_o} \gg 1 and \frac{S_i}{n_i} \gg 1, so:

\frac{S_o}{n_o} \approx \left(\frac{S_i}{n_i}\right)^{B_i / B_o}
Example 5
An engineer claims to have designed an analog communication system that has: (a) S₀/N₀ = 60 dB, (b) Si/Ni = 5 dB and Bi = 10 B₀. Do you believe this claim?
Solution

\frac{S_o}{n_o} = \left(1 + \frac{S_i}{n_i}\right)^{B_i / B_o} - 1 = (1 + 10^{0.5})^{10} - 1 = (1 + 3.16)^{10} - 1 = 1\,560\,667

\frac{S_o}{n_o} = 10 \log_{10}(1\,560\,667) = 61.93 \ \text{dB} \quad \left(\text{maximum } \frac{S_o}{n_o}\right)

The claimed 60 dB is barely within the ideal maximum of 61.93 dB, so the claim is highly doubtful.
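A quick numeric check of this bound, with the values from the example:

import math

si_ni = 10**(5/10)                 # input SNR of 5 dB
so_no_max = (1 + si_ni)**10 - 1    # exchange relation with Bi/Bo = 10
print(10 * math.log10(so_no_max))  # ~61.9 dB, the ideal maximum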
Introduction to Coding
1. Most messages are not compatible with transmission channels, so some type of coding is necessary.
2. Channels are analog or digital and are specified by various characteristics:
   a. Bandwidth
   b. Transmitted power
   c. Distortion
   d. Error rate
3. The information should be transmitted as fast as possible.
4. The BER should be as small as possible.
5. High-speed transmission requires low-redundancy codes.
6. Low bit error rate requires codes with redundant information.
7. Source coding can be used to increase the transmission rate.
8. Channel coding can be used to reduce the error rate.
Source Coding for digital data

X   P(x)   Code 1   Code 2   Code 3   Code 4   Code 5   Code 6
a   0.73   00       00       0        1        1        1
b   0.25   00       01       1        10       00       01
c   0.02   11       10       11       100      01       11

Properties of the codes
1. Uniquely decodable: codes which allow us to invert the mapping back to the original symbol alphabet.
   a. Code 1 is not uniquely decodable. Why?
      Because a and b are assigned the same binary sequence.
   b. Codes 3 and 6 are not uniquely decodable. Why?
      Decode the sequence 10111 in code 3:
      the sequence is either babbb, or babc, or bacb.
      Decode 10111 in code 6:
      the sequence is either abc, or abaa.
2. Prefix-free property: a sufficient (but not necessary) condition to ensure that a code is uniquely decodable is that no codeword be the prefix of any other codeword.
A prefix-free code has the property that it is instantaneously decodable.
An instantaneously decodable code is one for which the boundary of the present codeword can be identified from the end of the present codeword, without waiting for later symbols.
The advantages of an instantaneous code are its simplicity and that it requires no additional space to separate symbols.

X   P(x)   Code 1   Code 2   Code 3   Code 4   Code 5   Code 6
a   0.73   00       00       0        1        1        1
b   0.25   00       01       1        10       00       01
c   0.02   11       10       11       100      01       11

Codes 2 and 5 are prefix-free codes: they are uniquely and instantaneously decodable.
Code 4 is not a prefix-free code; it is uniquely decodable but not instantaneously decodable: when 10 is received, the receiver cannot determine whether this is the whole codeword for symbol b or a partial codeword for symbol c. A prefix-free test of these codes is sketched below.
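A minimal sketch of such a prefix-free test (the helper name is illustrative, not from the text):

def is_prefix_free(codewords):
    """True if no codeword is a prefix of another (sufficient for unique decodability)."""
    for i, u in enumerate(codewords):
        for j, v in enumerate(codewords):
            if i != j and v.startswith(u):
                return False
    return True

codes = {2: ["00", "01", "10"], 4: ["1", "10", "100"], 5: ["1", "00", "01"]}
for name, cw in codes.items():
    print(name, is_prefix_free(cw))  # code 2: True, code 4: False, code 5: True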
Some Coding Techniques
1. Huffman Code
A Huffman code is a prefix-free, variable-length code that provides an organized technique for finding the best-length code for a given set of messages.
Example 6: Suppose that we wish to code five messages S1, S2, S3, S4 and S5 having the probabilities 1/16, 1/8, 1/4, 1/16 and 1/2, respectively, using Huffman's coding technique.
Solution: The process follows these steps:
First: List the messages along with their probabilities in descending order of occurrence.

Message   Probability
S5        1/2
S3        1/4
S2        1/8
S1        1/16
S4        1/16
Second: Merge the two entries with the lowest probabilities to form a new branch carrying their composite probability, assigning 0 and 1 to the two merged branches. Continue this process until a single branch of probability 1 remains; each codeword is then read off along the path from the root to the message.

[Huffman tree: S1 and S4 (1/16 each) merge to 1/8; with S2 (1/8) to 1/4; with S3 (1/4) to 1/2; with S5 (1/2) to 1.]

The code is given by: S1 → 1110, S2 → 110, S3 → 10, S4 → 1111, S5 → 0.

\bar{L} = \sum_{i=1}^{5} L_i P_i = 4 \cdot \frac{1}{16} + 4 \cdot \frac{1}{16} + 3 \cdot \frac{1}{8} + 2 \cdot \frac{1}{4} + 1 \cdot \frac{1}{2} = \frac{15}{8} \ \text{bits}

H = \sum_{i=1}^{5} P_i \log_2 \frac{1}{P_i} = \frac{15}{8} \ \text{bits}

Here \bar{L} = H, so the code efficiency is 100%. A sketch of the construction in code follows.
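A compact Huffman construction in Python using heapq (a sketch; function and variable names are illustrative, and the 0/1 assignment at each merge is arbitrary, so codewords may differ from the notes while the lengths match):

import heapq

def huffman_code(probs):
    """Build a Huffman code. probs: dict message -> probability.
    Returns dict message -> codeword string."""
    # Each heap entry: (probability, tie-breaker, {message: partial codeword})
    heap = [(p, i, {m: ""}) for i, (m, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # lowest probability
        p1, _, c1 = heapq.heappop(heap)  # second lowest
        # Prepend a branch bit to every codeword in each merged group
        merged = {m: "0" + w for m, w in c0.items()}
        merged.update({m: "1" + w for m, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

# Example 6: expect codeword lengths 4, 3, 2, 4, 1 and average length 15/8 bits
code = huffman_code({"S1": 1/16, "S2": 1/8, "S3": 1/4, "S4": 1/16, "S5": 1/2})
print(code)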
Example 7: Find the Huffman code for the following seven messages with the probabilities indicated.

Message       S1     S2     S3    S4     S5     S6    S7
Probability   0.05   0.15   0.2   0.05   0.15   0.3   0.1
[Huffman merges: S1 (0.05) + S4 (0.05) → 0.1; + S7 (0.1) → 0.2; S2 (0.15) + S5 (0.15) → 0.3; S3 (0.2) + 0.2 → 0.4; S6 (0.3) + 0.3 → 0.6; 0.6 + 0.4 → 1.0.]
[Huffman tree with 0/1 branch labels.]

The code is given by: S1 → 1110, S2 → 010, S3 → 10, S4 → 1111, S5 → 011, S6 → 00, S7 → 110.

\bar{L} = \sum_{i=1}^{7} L_i P_i = 4(0.05 + 0.05) + 3(0.15 + 0.15 + 0.1) + 2(0.2 + 0.3) = 2.6 \ \text{bits}

H = 2.57 \ \text{bits}

E = \frac{H}{\bar{L}} = \frac{2.57}{2.6} = 0.9885
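This result can be checked with the huffman_code sketch above (again, individual codewords may differ bit-for-bit, but the lengths and average length match):

probs7 = {"S1": 0.05, "S2": 0.15, "S3": 0.2, "S4": 0.05,
          "S5": 0.15, "S6": 0.3, "S7": 0.1}
code7 = huffman_code(probs7)
L_bar = sum(probs7[m] * len(w) for m, w in code7.items())
print(code7, L_bar)  # average length 2.6 bits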
Shannon-Fano coding
The Shannon-Fano technique is similar to Huffman's technique except that the operations are performed in a forward direction: the list of messages, sorted in descending order of probability, is repeatedly split into two groups of (as nearly as possible) equal probability, with 0 assigned to one group and 1 to the other.
Example 8: Solve the following problem using the Shannon-Fano technique. Suppose that we wish to code five messages S1, S2, S3, S4 and S5 having the probabilities 1/16, 1/8, 1/4, 1/16 and 1/2, respectively.
Step:       1  2  3  4     Codeword
S5  1/2     0           →  S5: 0
S3  1/4     1  0        →  S3: 10
S2  1/8     1  1  0     →  S2: 110
S1  1/16    1  1  1  0  →  S1: 1110
S4  1/16    1  1  1  1  →  S4: 1111
Example 9: Find the Shannon-Fano code for the following seven messages with the probabilities indicated.

Message       S1     S2     S3    S4     S5     S6    S7
Probability   0.05   0.15   0.2   0.05   0.15   0.3   0.1
Step:       1  2  3  4     Codeword
S6  0.3     0  0        →  S6: 00
S3  0.2     0  1        →  S3: 01
S2  0.15    1  0  0     →  S2: 100
S5  0.15    1  0  1     →  S5: 101
S7  0.1     1  1  0     →  S7: 110
S1  0.05    1  1  1  0  →  S1: 1110
S4  0.05    1  1  1  1  →  S4: 1111
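A recursive sketch of the Shannon-Fano split (illustrative helper; the split point is chosen so the two group probabilities are as nearly equal as possible):

def shannon_fano(items):
    """items: list of (message, probability), sorted descending.
    Returns dict message -> codeword."""
    if len(items) == 1:
        return {items[0][0]: ""}
    total = sum(p for _, p in items)
    # Find the split index where the running sum is closest to half the total
    best, split, run = float("inf"), 1, 0.0
    for k in range(1, len(items)):
        run += items[k - 1][1]
        diff = abs(total - 2 * run)
        if diff < best:
            best, split = diff, k
    code = {m: "0" + w for m, w in shannon_fano(items[:split]).items()}
    code.update({m: "1" + w for m, w in shannon_fano(items[split:]).items()})
    return code

# Example 9: expect S6:00, S3:01, S2:100, S5:101, S7:110, S1:1110, S4:1111
items = [("S6", 0.3), ("S3", 0.2), ("S2", 0.15), ("S5", 0.15),
         ("S7", 0.1), ("S1", 0.05), ("S4", 0.05)]
print(shannon_fano(items))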
Example 10:
A moving vehicle equipped with a monochrome TV camera is proposed to continue the exploration of the surface of Mars. The TV pictures will be digitized for transmission back to Earth. We want to estimate the time required to transmit one picture, given the following information.
A digitized picture will consist of p = 400 × 300 pixels (picture elements), each pixel having one of 16 possible brightness levels.
The Mars-to-Earth link is a microwave radio system with carrier frequency fc = 2 GHz and path length 3 × 10⁸ km (path loss L = 268 dB). The transmitter on the vehicle delivers ST = 20 W signal power to a dish antenna one meter in diameter (GT = 26 dB). The Earth station has a 30-m dish antenna (Gr = 56 dB) and a low-noise receiver with noise temperature Tr = 58 K.
Solution

[Link diagram: TX (ST = 20 W, GT = 26 dB) → path loss L = 268 dB → RX (Gr = 56 dB, Tr = 58 K).]

S_r = \frac{S_T G_T G_r}{L} = \frac{20 \times 10^{2.6} \times 10^{5.6}}{10^{26.8}} = 5 \times 10^{-18} \ \text{W}

N_r = k B_r T_r

\frac{N_r}{B_r} = N_0 = k T_r = 1.38 \times 10^{-23} \times 58 = 8 \times 10^{-21} \ \text{W/Hz}

No transmission bandwidth is given, so assume B ≫ R (the infinite-bandwidth limit applies).
For the ideal system:

R \le C \approx 1.44 \, \frac{S_r}{N_0} = 1.44 \times \frac{5 \times 10^{-18}}{8 \times 10^{-21}} = 900 \ \text{bits/second}

Bits per picture: R_T = 400 \times 300 \times \log_2 16 = 400 \times 300 \times 4 = 480\,000 \ \text{bits}

T = 480\,000 / 900 \approx 533 \ \text{seconds} \approx 9 \ \text{minutes}
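The link budget in Python, using the values from the example:

import math

k = 1.38e-23                             # Boltzmann constant, J/K
Sr = 20 * 10**2.6 * 10**5.6 / 10**26.8   # received power, ~5e-18 W
N0 = k * 58                              # noise density, ~8e-21 W/Hz
C = 1.44 * Sr / N0                       # infinite-bandwidth capacity, ~900 b/s
bits = 400 * 300 * math.log2(16)         # 480000 bits per picture
T = bits / C                             # ~533 seconds per picture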
Example 11:
A keyboard machine with 64 different symbols is connected to a voice telephone channel having B = 3 kHz and S/N₀B = 30 dB.
(a) Calculate the maximum possible symbol rate for errorless transmission.
(b) Assuming B can be changed while the other parameters are fixed, find the symbol rate with B = 1 kHz and with B → ∞.
Solution
a. R \le B \log_2(1 + S/N) = 3 \times 10^3 \times \log_2(1 + 10^3) \approx 30\,000 \ \text{bits/second}

r \log_2 64 = 30\,000, so r = 30\,000 / 6 = 5\,000 \ \text{symbols/second}

b. With B = 3 kHz and S/N₀B = 10³, the fixed ratio is S/N₀ = 3 × 10⁶.
With B = 1 kHz: S/N₀B = 3 × 10³, so

C = 10^3 \log_2(1 + 3 \times 10^3) \approx 1.2 \times 10^4 \ \text{bits/second}, \quad r = 1.2 \times 10^4 / 6 = 2\,000 \ \text{symbols/second}

For B → ∞:

C = 1.44 \, S/N_0 = 4.32 \times 10^6 \ \text{bits/second}, \quad r = 4.32 \times 10^6 / 6 = 720\,000 \ \text{symbols/second}
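The three cases in Python (results match the rounded values above):

import math

S_over_N0 = 3e6                    # fixed, from S/(N0*B) = 1000 at B = 3 kHz
bits_per_symbol = math.log2(64)    # 6 bits per keyboard symbol

for B in (3e3, 1e3):
    C = B * math.log2(1 + S_over_N0 / B)
    print(B, C / bits_per_symbol)  # ~5000 and ~2000 symbols/second

C_inf = 1.44 * S_over_N0           # B -> infinity limit
print(C_inf / bits_per_symbol)     # ~720000 symbols/second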