
MODULE 1 - CHAPTER 1

Information Theory and Source Coding

University Prescribed Syllabus


Block diagram of digital communication system, information content of a source symbol, source entropy, average information rate, AWGN channel, and Shannon-Hartley channel capacity theorem. Introduction to source coding, Huffman code, Shannon-Fano code.

1.1 Introduction
1.1.1 Transmitter Section
1.1.2 Receiver Section
1.2 Reading Purpose
1.3 Measure of Information
1.4 Entropy and its Properties
1.4.1 Derivation of Maximum Entropy
1.4.2 Examples on Entropy
UEx. 1.4.1 (MU - Q. 1(a), May 17, 5 Marks)
1.5 Important Types of Channels
1.5.1 Examples on Information Rate, Channel Diagram
1.6 Channel Capacity
1.7 Channel Capacity Theorem / Shannon-Hartley Theorem
1.7.1 Examples of Channel Capacity
UEx. 1.7.1 (MU - Q. 1(a), May 15, 5 Marks)
1.8 Channel Coding Theorem / Shannon's Second Theorem (Reading purpose)
1.8.1 Concept of Channel Coding
1.8.2 Statement of Channel Coding Theorem
1.8.3 Application of Channel Coding Theorem to Binary Symmetric Channels
1.9 Shannon's Source Coding Theorem (Reading purpose)
1.10 Data Compaction
1.1 INTRODUCTION

Digital communication is the physical transfer of data over a point-to-point or point-to-multipoint communication channel. It is the transfer of discrete messages.

Working Principle

The information signal is often in analog form. To digitally modulate this information signal, it must be converted into binary bits. Formatting converts the source information into bits and makes the signal compatible with the network.
Following are the blocks of the transmitter section and the receiver section.

Blocks of transmitter section :
1. Format  2. Source encode  3. Encrypt  4. Channel encode  5. Multiplex  6. Pulse modulate  7. Bandpass modulate  8. Frequency spread  9. Multiple access

Blocks of receiver section :
1. Multiple access  2. Frequency despread  3. Demodulate and sample  4. Detect  5. Demultiplex  6. Channel decode  7. Decrypt  8. Source decode  9. Format
Fig. 1.1.1 : Block diagram of a typical digital communication system (DCS transmitter : format, source encode, encrypt, channel encode, multiplex, pulse modulate, bandpass modulate, frequency spread, multiple access; channel; DCS receiver : the corresponding inverse blocks. Some blocks are essential and the others are optional.)


Refer Fig. 1.1.1. It shows the signal processing in a digital communication system.

1.1.1 Transmitter Section

The steps followed at the transmitter side are :
1. Format  2. Source encode  3. Encrypt  4. Channel encode  5. Multiplex  6. Pulse modulate  7. Bandpass modulate  8. Frequency spread  9. Multiple access

1. Format
Input : analog signal x(t)
Process : formatting
Output : digital bits
The information signal is often in analog form. To digitally modulate it, it must be converted into binary bits. Formatting converts the source information into bits and makes the signal compatible with the network.
Analog signals are converted into digital form by three processes : sampling, quantization and coding (refer Fig. 1.1.2).
- Sampling : It is the process of converting a continuous time-domain signal into a discrete time-domain signal.
- Quantization : It is the process of approximating the sampled values to a finite set of levels.
- Coding : Each quantized level is allocated a binary code.

Fig. 1.1.2 : Conversion of an analog signal to a digital signal (sampling, quantizing, encoding)

If the data is already in digital format, the formatting process is bypassed and the signal is given directly to the pulse modulation block.

2. Encode
Input : quantized levels
Process : encoding the quantized levels into digital bits
Output : digital bit stream
Source coding converts the formatted data into an efficient digital representation. It removes redundant (extra) bits, if any, present in the data stream, and therefore reduces the required transmission bandwidth. A DCS uses either formatting or source coding mechanisms. The basic prerequisite processes for encoding are sampling and quantization. For making the waveform compatible with the channel, line coding may also be used in the DCS.

3. Encrypt
Input : digitally encoded bits
Process : encryption
Output : secured encrypted bits
To maintain the authentication and security of the information, encryption of the data is carried out. Encryption prevents any unauthorized receiver from disturbing the signal and also protects the data from spurious signals entering the channel.

4. Channel encode
Input : digital bit stream
Process : application of error control mechanisms
Output : digital code words with redundant error control bits
Channel encoders add redundancy to the binary sequences in the form of error detection and correction bits. Channel encoding therefore reduces the probability of error, but it also increases the decoder complexity. As redundant bits are added, the required transmission bandwidth also changes after channel coding.

5. Multiplex
Input : multiple signals
Process : multiplexing
Output : multiplexed signal
Multiplexing is used to combine different signals, which may have originated from different sources. Multiplexing allows sharing of communication resources like frequency, time, etc. on a local basis. This allocation of resources is on a long-term basis. It can be done ahead of channel coding or before the modulation process.

6. Pulse modulation
Modulation is a process of converting data bits into waveforms that are compatible with the channel. After pulse modulation, a digital baseband signal is obtained. Various pulse modulation methods are available in digital communication systems, such as PCM (Pulse Code Modulation), DM (Delta Modulation) and ADM (Adaptive Delta Modulation).

7. Bandpass modulation
When the channel does not support digital pulse-like waveforms, bandpass modulation is used instead of pulse modulation.
Output : modulated signal

8. Frequency spread
Spreading was originally developed for military applications to maintain secrecy of the data, but it is also used in multiple access techniques. In the process of spreading, the input signal spectrum is multiplied with a wideband signal and the output obtained is a spread-spectrum signal. It provides interference protection and also maintains privacy of the data.

9. Multiple access
It is a process of sharing a communication resource on a remote basis. This allocation of resources is on a temporary basis; resource allocation is done as per the demand from the user. Multiple access allows sharing of frequency, time, space or codes. Depending on the sharing parameter, there are four different multiple access techniques :
(i) Frequency Division Multiple Access (FDMA)
(ii) Time Division Multiple Access (TDMA)
(iii) Code Division Multiple Access (CDMA)
(iv) Space Division Multiple Access (SDMA)

1.1.2 Receiver Section

The steps followed at the receiver side are exactly the reverse of those at the transmitter. They are :
1. Multiple access  2. Frequency despread  3. Demodulate and sample  4. Detect  5. Demultiplex  6. Channel decode  7. Decrypt  8. Source decode  9. Format

1. Multiple access
As in the transmitter section, multiple access techniques are used for sharing the communication resources.

2. Frequency despread
Input : received spread-spectrum data
Process : multiplication of the received data with the wideband signal
Output : despread signal
If the frequency is spread at the transmitter, then despreading is done at the receiver. For this, a synchronous wideband signal is needed at the receiver; this wideband signal is the same as the one used at the transmitter.

3. Demodulate and sample
Definition : Demodulation is the recovery of a baseband pulse waveform.
Input : received waveform plus noise
Process : demodulation, i.e. frequency down-conversion
Output : sampled pulse waveform
The received waveform is generally not in proper shape because noise may be present in it. During travel, the signal may also undergo inter-symbol interference (ISI, i.e. interference or overlapping between adjacent symbols). ISI distorts the signal and hence needs to be tackled at the receiver before the detection phase. The equalization circuit is an important block in demodulators; equalization reduces or removes (in some cases) the distortions present in the pulse.
4. Detect
Definition : Detection is the decision-making process on the received sample, i.e. deciding whether the sample value corresponds to logic '0' or logic '1'.
Input : sampled pulse waveform from the demodulator
Process : estimation of the message symbol
Output : message symbols / digital bit stream
The detection block makes the received data compatible for further processing.

5. Demultiplex
Demultiplexing separates the bit streams of the different channels that were combined by the multiplexer at the transmitter. If multiplexing is not used at the transmitter, this block is not needed.

6. Channel decode
Input : received code words
Process : error control decoding, i.e. removal of the redundant bits added during channel coding
Output : decoded bit stream

7. Decrypt
Input : encrypted bit stream
Process : decryption (the inverse of the encryption done at the transmitter)
Output : decrypted bit stream

8. Source decode
Input : decoded bit stream
Process : source decoding, i.e. conversion of the digital bit stream back into the original information form
Output : information symbols

9. Format
Formatting at the receiver converts the information symbols into a signal suitable for the information sink, e.g. a continuous time-domain (analog) signal where required. Ideally the received message is exactly the same as the transmitted information, i.e. without error.

Fig. 1.1.3 shows the basic chain of a digital communication system (format, source encode, encrypt, channel encode, modulate, transmit, channel, receive, demodulate and detect, channel decode, decrypt, source decode, format), distinguishing the essential blocks from the optional ones. The optional blocks (encryption, multiplexing, spreading, multiple access) are used at the receiver only when the corresponding operation is performed at the transmitter.
1.2 READING PURPOSE

Every message (signal) generated by a source contains some information, but some messages convey more information than others. Hence it is necessary to quantify (measure) the information content of a message.

An information source can be analog or discrete :
- Analog information source : It is continuous valued in amplitude and time. Example : a microphone actuated by a voice (speech) signal. An analog source can be transformed into a discrete source by sampling and quantization.
- Discrete information source : It emits symbols from a finite (discrete) set, such as the letters of an alphabet, at discrete intervals of time.

1.3 MEASURE OF INFORMATION

GQ. Write a short note on measure of information.

Consider the statements in Table 1.3.1.

Table 1.3.1
Sr. No. | Statement | Certainty | Information conveyed
1. | Today is a cold day in Zurich. | Almost certain | Conveys very little information.
2. | Tomorrow is a sunny day in Zurich. | Less predictable (uncertain) | Conveys some information.
3. | There is a possibility of snowfall today in Zurich. | Very rare event | Conveys much more information than the others.

A statement that is almost certain conveys almost no information, while a rare (improbable) statement conveys a great deal of information. The amount of information is therefore inversely related to the probability of occurrence of the event, and this intuitive notion needs to be quantified.
Generally, the amount of information contained in a message (symbol) depends on its probability of occurrence rather than on its actual content.

Definition : The amount of information I(m_k) received from a message m_k is inversely related to the probability P_k of its occurrence.

Suppose a source emits M possible messages m_1, m_2, ..., m_M with probabilities of occurrence P_1, P_2, ..., P_M, where
P_1 + P_2 + ... + P_M = 1
Then the information content of the k-th message is given as
I(m_k) = log (1/P_k)

The unit of information depends on the base of the logarithm, as shown in Table 1.3.2.

Table 1.3.2 : Units of information
Sr. No. | Base of log | Unit of information
1. | 2 | bit (binit)
2. | e | nat
3. | 10 | decit or Hartley
4. | not specified | base 2 is assumed, i.e. the unit is the bit

Therefore, with base 2,
I(m_k) = log2 (1/P_k) bits

Properties of Information

Property 1 :
Statement : Information is always positive, i.e. I(m_k) >= 0.
For a symbol m_k with probability of occurrence P_k (0 <= P_k <= 1), the amount of information associated with it is always non-negative.

Property 2 :
Statement : As P_k approaches 1, I(m_k) approaches 0.
A symbol whose occurrence is certain conveys no information; as its probability approaches 1, its information content approaches 0.

Property 3 :
Statement : For two symbols m_k and m_j with probabilities P_k < P_j, I(m_k) > I(m_j).
The symbol with the lower probability of occurrence conveys the higher amount of information, and the symbol with the higher probability conveys the lower amount of information.

Property 4 :
Statement : For two statistically independent symbols m_k and m_j with probabilities P_k and P_j respectively, the joint information is the sum of the individual information contents :
I(m_k m_j) = I(m_k) + I(m_j)

Ex. 1.3.1 : A symbol m_k occurs with probability 0.75. Determine the information content I(m_k) in bits, nats and Hartleys.
Soln. :
Given : P_k = 0.75
I(m_k) = log2 (1/0.75) = 0.415 bit
I(m_k) = ln (1/0.75) = 0.2876 nat
I(m_k) = log10 (1/0.75) = 0.1249 Hartley
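The three results of Ex. 1.3.1 can be checked with a short program. The following is a minimal Python sketch (not part of the original text); the function name information_content is my own choice.

```python
import math

def information_content(p, base=2):
    """Information conveyed by an event of probability p, I = log_base(1/p)."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return math.log(1.0 / p, base)

# A certain event (p = 1) carries no information; rarer events carry more.
for p in (1.0, 0.75, 0.5, 0.01):
    print(f"p = {p:5}: {information_content(p):.4f} bits, "
          f"{information_content(p, math.e):.4f} nats, "
          f"{information_content(p, 10):.4f} Hartleys")
```

For p = 0.75 this prints 0.415 bit, 0.2876 nat and 0.1249 Hartley, as computed above.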
Ex. 1.3.2 : A source emits one of five possible messages m_1, m_2, m_3, m_4 and m_5 with probabilities 1/2, 1/4, 1/8, 1/16 and 1/16 respectively. Find the information content of each message.
Soln. :
I_1 = log2 (1/P_1) = log2 2 = 1 bit
I_2 = log2 4 = 2 bits
I_3 = log2 8 = 3 bits
I_4 = I_5 = log2 16 = 4 bits

1.4 ENTROPY AND ITS PROPERTIES

UQ. Define the entropy of a source. (MU - Q. 1(b), May 19; Q. 1(c), Dec. 15)
GQ. Explain the concept of source entropy in detail.

Concept :
- The information content of an individual symbol changes from symbol to symbol as the probabilities change. For characterizing a source, the important parameter is therefore the average information content per symbol.
- While defining this average, the following is considered :
(1) The instantaneous flow of information in a system changes (fluctuates) with time.
(2) The source symbols are assumed to be statistically independent (the source is memoryless); statistical dependence between the symbols in the emitted sequence would alter the average information content.
- The average information content per symbol of a source is called the 'source entropy', denoted by H.

UQ. Derive the equation for maximum entropy. (MU - Q. 1(b), May 16)
UQ. Show that the entropy is maximum when all the symbols are equiprobable. (MU - Q. 1(c), Dec. 18)
UQ. Prove that H_max = log2 M for a discrete memoryless source. (MU - Q. 1(c), Dec. 19)
UQ. When is the entropy of a source maximum? Give its properties. (MU - Dec. 15)

Determination of entropy (average information content per symbol)

Let a discrete memoryless source emit M symbols s_1, s_2, ..., s_M with probabilities of occurrence P_1, P_2, ..., P_M, where
P_1 + P_2 + ... + P_M = 1    ...(1.4.1)
Consider a long message containing k symbols (k large). The symbol s_i will occur approximately k P_i times, and each occurrence of s_i contributes log2(1/P_i) bits of information. Hence all the occurrences of s_i together contribute k P_i log2(1/P_i) bits, and the total information content of the message is
I_total = sum over i = 1 to M of k P_i log2(1/P_i) bits
The average information per symbol is obtained by dividing the total information by the number of symbols k :
H = sum over i = 1 to M of P_i log2(1/P_i) bits/symbol    ...(1.4.2)
This average information content per symbol is the source entropy.

1.4.1 Derivation of Maximum Entropy

The entropy H is a function of the probabilities P_1, P_2, ..., P_M subject to the constraint P_1 + P_2 + ... + P_M = 1. Express one probability in terms of the others, say
P_M = 1 - (P_1 + P_2 + ... + P_{M-1})
and differentiate H with respect to any P_i (i = 1, 2, ..., M-1). Noting that dP_M/dP_i = -1,
dH/dP_i = [log2(1/P_i) - log2 e] - [log2(1/P_M) - log2 e] = log2(P_M / P_i)    ...(1.4.3)
For maximum entropy, dH/dP_i = 0, which gives
log2(P_M / P_i) = 0, i.e. P_i = P_M for every i    ...(1.4.4)
Hence the entropy is maximum when all the symbols are equiprobable, i.e. P_i = 1/M for all i. Substituting in Equation (1.4.2),
H_max = sum over i of (1/M) log2 M = log2 M bits/symbol    ...(1.4.5)
Therefore the entropy of a discrete memoryless source is bounded as
0 <= H <= log2 M
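The entropy formula and the bound H <= log2 M can be verified numerically. The following is a minimal Python sketch (not part of the original text); the function name entropy is illustrative.

```python
import math

def entropy(probs):
    """Source entropy H = sum of P_i * log2(1/P_i), in bits/symbol."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

M = 4
equiprobable = [1.0 / M] * M
skewed = [0.5, 0.25, 0.125, 0.125]

print("H (equiprobable) =", entropy(equiprobable), "bits/symbol")  # log2(4) = 2
print("H (skewed)       =", entropy(skewed), "bits/symbol")        # 1.75 < 2
print("upper bound log2(M) =", math.log2(M))
```

The equiprobable case reaches the upper bound log2 M exactly, while any other distribution gives a smaller entropy, as proved above.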
Properties of Entropy

Property 1 :
H(M) = 0 if and only if P_k = 1 for some k and the remaining probabilities in the set are all zero. H = 0 is the lower bound on entropy and it corresponds to no uncertainty in the event.
Proof :
For no uncertainty, P_k tends to 1 for some k. Since
H = sum over k of P_k log2(1/P_k) >= 0
and P_k log2(1/P_k) = 0 if and only if P_k = 0 or P_k = 1, we get H = 0 if and only if every probability is either 0 or 1, i.e. one symbol is certain and the rest are impossible.
Hence proved.

Property 2 :
H(M) = log2 M if and only if P_k = 1/M for all k, i.e. all the symbols in the alphabet M are equiprobable. This is the upper bound on entropy and it corresponds to maximum uncertainty.
Proof : Given in Section 1.4.1 (derivation of H_max).
Therefore 0 <= H <= log2 M.

A further property : if the source has memory (the generated symbols are statistically related), its entropy is less than that of a memoryless source with the same symbol probabilities. Also, as the probability of one message approaches 1, the probabilities of the remaining messages approach 0 and the entropy decreases towards 0.

1.4.2 Examples on Entropy

UEx. 1.4.1 (MU - Q. 1(a), May 17, 5 Marks) : Prove that the entropy of an extremely likely and of an extremely unlikely message is zero.
Soln. :
1. Extremely likely message : Let the message be m_k with probability P_k. As the message is extremely likely, P_k = 1.
H = P_k log2(1/P_k) = 1 x log2 1 = 0
2. Extremely unlikely message : Here P_k tends to zero, and
H = P_k log2(1/P_k) tends to 0 as P_k tends to 0.
Hence in both cases H = 0.

Ex. 1.4.2 : A discrete memoryless source is capable of transmitting three distinct symbols m_0, m_1 and m_2 with probabilities 1/2, 1/4 and 1/4 respectively. Calculate the source entropy.
Soln. :
Given : P_0 = 1/2, P_1 = 1/4, P_2 = 1/4
H = (1/2) log2 2 + (1/4) log2 4 + (1/4) log2 4 = 0.5 + 0.5 + 0.5 = 1.5 bits/symbol

Extension of a discrete memoryless source

Let the n-th extension of a source with M distinct symbols be the source whose symbols are blocks of n original symbols. Since the source is memoryless, the symbols are statistically independent, so the probability of an extended-source symbol equals the product of the probabilities of the n source symbols in the block.
Example : Let n = 2 and M = 2. Then M^n = 2^2 = 4 distinct blocks are present: 00, 01, 10 and 11, with probabilities
P(00) = P(0) x P(0), P(01) = P(0) x P(1), P(10) = P(1) x P(0), P(11) = P(1) x P(1)
because P(AB) = P(A) P(B) for independent symbols. Therefore the entropy of the extended source is
H(X^n) = n H(X)

Ex. 1.4.3 : Find the entropy of the second-order extended source of the discrete memoryless source of Ex. 1.4.2 (symbols m_0, m_1, m_2 with probabilities 1/2, 1/4, 1/4).
Soln. :
Given : n = 2. Each block contains two symbols out of m_0, m_1 and m_2, giving nine possible blocks with probabilities P(S_i) = P(m_j) x P(m_k) :
Block :       m0m0  m0m1  m0m2  m1m0  m1m1  m1m2  m2m0  m2m1  m2m2
Probability :  1/4   1/8   1/8   1/8  1/16  1/16   1/8  1/16  1/16
H(X^2) = (1/4) log2 4 + 4 x (1/8) log2 8 + 4 x (1/16) log2 16 = 0.5 + 1.5 + 1 = 3 bits/symbol
which agrees with H(X^2) = n H(X) = 2 x 1.5 = 3 bits/symbol.

1.5 IMPORTANT TYPES OF CHANNELS

UQ. Describe how channels can be classified; briefly explain each. (MU - Q. 1(c), Dec. 17, 5 Marks)

(i) Discrete memoryless channel  (ii) Binary Symmetric Channel (BSC)

(i) Discrete memoryless channel
Discrete channel : A channel is said to be discrete when the alphabets X and Y have finite sizes.
Memoryless channel : When the current output symbols depend only on the current input symbols and not on the previous ones, the channel is called 'memoryless'.
Definition : A discrete memoryless channel is a statistical model with an input X and an output Y, where the output Y is assumed to be a noisy version of X. Both X and Y are random variables. Hence a discrete memoryless channel has both properties :
1. The alphabets X and Y are of finite sizes.
2. The current output value of Y depends only on the current input value of X.
A discrete memoryless channel can therefore be described as
X = {x_0, x_1, ..., x_{J-1}}    ...(input alphabet)
Y = {y_0, y_1, ..., y_{K-1}}    ...(output alphabet)
with transition probabilities
P(y_k | x_j) = P(Y = y_k | X = x_j), for all j and k
where 0 <= P(y_k | x_j) <= 1 for all j and k. If J is not equal to K, the sizes of the input and output alphabets differ. The transition probabilities can be arranged in a J x K matrix

P = [ P(y_0|x_0)      P(y_1|x_0)      ...  P(y_{K-1}|x_0)
      P(y_0|x_1)      P(y_1|x_1)      ...  P(y_{K-1}|x_1)
      ...
      P(y_0|x_{J-1})  P(y_1|x_{J-1})  ...  P(y_{K-1}|x_{J-1}) ]

This matrix is known as the 'channel matrix'. In the channel matrix,
Row : fixed channel input
Column : fixed channel output

Fundamental property of the channel matrix
Statement : The sum of the elements along any row of the channel matrix is always equal to one, i.e.
sum over k = 0 to K-1 of P(y_k | x_j) = 1, for all j.
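The relation H(X^n) = n H(X) used in Ex. 1.4.3 can be verified by enumerating all n-symbol blocks. The following is a minimal Python sketch (not part of the original text), using the source of Ex. 1.4.2 with probabilities 1/2, 1/4, 1/4.

```python
import math
from itertools import product

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

base = [0.5, 0.25, 0.25]   # source of Ex. 1.4.2, H = 1.5 bits/symbol
n = 2

# For a memoryless source, P(block) is the product of the symbol probabilities.
block_probs = [math.prod(combo) for combo in product(base, repeat=n)]

print("H(X)     =", entropy(base))         # 1.5
print("H(X^2)   =", entropy(block_probs))  # 3.0
print("n * H(X) =", n * entropy(base))     # 3.0
```

The nine block probabilities generated here are exactly those listed in the table of Ex. 1.4.3.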
Joint probability distribution of X and Y
It is given as
P(x_j, y_k) = P(X = x_j, Y = y_k) = P(Y = y_k | X = x_j) P(X = x_j) = P(y_k | x_j) P(x_j)

Marginal probability distribution of Y
It is given as
P(y_k) = P(Y = y_k) = sum over j = 0 to J-1 of P(y_k | x_j) P(x_j), for k = 0, 1, ..., K-1
where P(x_j), j = 0, 1, ..., J-1, are the a priori probabilities of the input symbols. Hence, if the input probabilities P(x_j) and the channel matrix are given, the probabilities of the various output symbols P(y_k) can be determined.

(ii) Binary symmetric channel
It is a special case of the discrete memoryless channel with J = K = 2. Let the channel have two inputs x_0 = 0, x_1 = 1 and two outputs y_0 = 0, y_1 = 1. The channel is called symmetric because
P(y_1 = 1 received when x_0 = 0 sent) = P(y_0 = 0 received when x_1 = 1 sent)
This conditional probability of error is denoted by P. The transition probability diagram is shown in Fig. 1.5.1.

Fig. 1.5.1 : Transition probability diagram of a binary symmetric channel

Information Rate
If the source generates r messages per second, then the information rate is given as
R = r H
where r = number of messages/second and H = source entropy. The unit of R is bits/second.

1.5.1 Examples on Information Rate, Channel Diagram

Ex. 1.5.1 : Consider a telegraph source having two symbols, dot and dash. The dot duration is 0.2 seconds and the dash duration is 3 times the dot duration. The probability of the dot occurring is twice that of the dash, and the time between symbols is 0.2 seconds. Calculate the information rate of the telegraph source.
Soln. :
Given : T_dot = 0.2 s, T_dash = 3 x 0.2 = 0.6 s, P(dot) = 2 P(dash), T_space = 0.2 s
1. Symbol probabilities
P(dot) + P(dash) = 1    ...[since P(sample space) = 1]
2 P(dash) + P(dash) = 1, hence P(dash) = 1/3 and P(dot) = 2/3
2. Source entropy
H = P(dot) log2[1/P(dot)] + P(dash) log2[1/P(dash)]
  = (2/3) log2(3/2) + (1/3) log2 3 = 0.3899 + 0.5283 = 0.9182 bits/symbol
3. Symbol rate
Average time per symbol T_s = T_dot P(dot) + T_dash P(dash) + T_space
  = (0.2)(2/3) + (0.6)(1/3) + 0.2 = 0.5333 s
r = 1/T_s = 1.875 symbols/second
4. Information rate
R = r H = 1.875 x 0.9182 = 1.72 bits/second

Ex. 1.5.2 : An analog signal is bandlimited to 4 kHz. It is sampled at the Nyquist rate and the samples are quantized into 4 levels. The quantization levels Q_1, Q_2, Q_3 and Q_4 are independent messages and have the probabilities P_1 = P_4 = 1/8 and P_2 = P_3 = 3/8. Find the information rate of the source.
Soln. :
Given : P_1 = P_4 = 1/8, P_2 = P_3 = 3/8, f_m = 4 kHz
1. Source entropy
H = 2 x (1/8) log2 8 + 2 x (3/8) log2(8/3) = 0.75 + 1.0613 = 1.8112 bits/message
2. Rate of message generation (Nyquist criterion)
r = 2 f_m = 2 x 4000 = 8000 messages/second
3. Information rate
R = r H = 8000 x 1.8112 = 14490 bits/second

Ex. 1.5.3 : Consider that the messages Q_1, Q_2, Q_3 and Q_4 are to be transmitted using binary PCM, with the binary codes and message probabilities as listed in Table P. 1.5.3 (P_1 = P_4, P_2 = P_3). Assume f_m = 8 kHz. Calculate the information rate.
Soln. :
Rate of message generation : r = 2 f_m = 2 x 8000 = 16000 messages/second
Since binary PCM with four levels uses 2 bits per message, the transmitted bit rate is 16000 x 2 = 32000 bits/second.
Source entropy evaluated from the given probabilities : H = 1.6225 bits/message
Information rate : R = r H = 16000 x 1.6225 = 25.96 kbps, which is less than the transmitted bit rate of 32 kbps.
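The step-by-step calculation of Ex. 1.5.1 can be reproduced directly. The following is a minimal Python sketch (not part of the original text); the variable names are my own.

```python
import math

# Telegraph source of Ex. 1.5.1: dot and dash.
p_dash = 1.0 / 3.0                          # from P(dot) = 2 P(dash), P(dot) + P(dash) = 1
p_dot = 2.0 / 3.0
t_dot, t_dash, t_space = 0.2, 0.6, 0.2      # seconds

H = p_dot * math.log2(1 / p_dot) + p_dash * math.log2(1 / p_dash)
T_avg = t_dot * p_dot + t_dash * p_dash + t_space   # average time per symbol
r = 1.0 / T_avg                                      # symbols per second
R = r * H                                            # information rate

print(f"H = {H:.4f} bits/symbol, r = {r:.3f} symbols/s, R = {R:.2f} bits/s")
# Expected: H ~ 0.918, r ~ 1.875, R ~ 1.72
```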
Ex. 1.5.4 : An analog signal is band-limited to 8 Hz, sampled at the Nyquist rate and quantized into 5 levels with probabilities 0.5, 0.125, 0.0625, 0.25 and 0.0625. Calculate the entropy and the information rate.
Soln. :
Given : f_m = 8 Hz; P(x_1) = 0.5, P(x_2) = 0.125, P(x_3) = 0.0625, P(x_4) = 0.25, P(x_5) = 0.0625
(i) Source entropy
H = 0.5 log2 2 + 0.125 log2 8 + 0.0625 log2 16 + 0.25 log2 4 + 0.0625 log2 16
  = 0.5 + 0.375 + 0.25 + 0.5 + 0.25 = 1.875 bits/message
(ii) The signal is sampled at the Nyquist rate : r = 2 f_m = 2 x 8 = 16 messages/second
(iii) Information rate : R = r H = 16 x 1.875 = 30 bits/second

Ex. 1.5.5 : Consider the five messages given by the probabilities 1/2, 1/4, 1/8, 1/16, 1/16. Calculate the average information and the information rate if r = 1 message per second.
Soln. :
H = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/16)(4) + (1/16)(4) = 1.875 bits/message
R = r H = 1 x 1.875 = 1.875 bits/second

1.6 CHANNEL CAPACITY

GQ. Define channel capacity with one example.

Consider a discrete memoryless channel with input alphabet X, output alphabet Y and transition probabilities P(y_k | x_j), j = 0, 1, ..., J-1 and k = 0, 1, ..., K-1. The mutual information of the channel is
I(X; Y) = sum over j sum over k of P(x_j, y_k) log2 [ P(y_k | x_j) / P(y_k) ]
where P(x_j, y_k) = P(y_k | x_j) P(x_j) and P(y_k) = sum over j of P(y_k | x_j) P(x_j).
Hence, for the determination of the mutual information, the input probability distribution {P(x_j), j = 0, 1, ..., J-1} is necessary. The mutual information of a channel therefore depends not only on the channel but also on the way in which the channel is used. The input distribution {P(x_j)} is independent of the channel, so I(X; Y) can be maximized with respect to {P(x_j)}; from this the channel capacity is obtained.

Definition : The channel capacity of a discrete memoryless channel is the maximum average mutual information I(X; Y) in any signaling interval of the channel, where the maximization is over all possible input probability distributions {P(x_j)} on X. It is denoted by C :
C = max over {P(x_j)} of I(X; Y)
The unit of channel capacity is bits per channel use.

Example : Channel capacity of a binary symmetric channel
Consider a binary symmetric channel described by the transition probabilities of Fig. 1.5.1, with conditional probability of error P. The entropy H(X) is maximum when the input symbols are equiprobable, i.e.
P(x_0) = P(x_1) = 1/2, where x_0 = 0 and x_1 = 1
From Fig. 1.5.1,
P(y_1 | x_0) = P(y_0 | x_1) = P and P(y_0 | x_0) = P(y_1 | x_1) = 1 - P
Substituting these transition probabilities (with J = K = 2) into the expression for I(X; Y), it can be shown that
C = 1 + P log2 P + (1 - P) log2 (1 - P)
The entropy function of a binary memoryless source is
H(P) = -P log2 P - (1 - P) log2 (1 - P)
Hence C = 1 - H(P).

Fig. 1.6.1 : Variation of the channel capacity of a binary symmetric channel with the transition probability P

Observations / Results
(1) When the channel is noise-free, i.e. P = 0, the channel capacity achieves its maximum value C = 1 bit per channel use, the same as the information content of each channel input. At this value of C, H(P) = 0.
(2) When the noise in the channel makes P = 1/2, H(P) tends to 1 and C tends to 0, the minimum value of the capacity. In this case the channel is of no use.

1.7 CHANNEL CAPACITY THEOREM / SHANNON-HARTLEY THEOREM

UQ. Describe the Shannon-Hartley capacity theorem. (MU - Q. 1(a), May 16, 5 Marks)
UQ. Explain, with the required diagrams, the Shannon-Hartley theorem for channel capacity. (MU - Q. 6(b), May 19, 10 Marks)

It is also known as the information capacity theorem. It is described for band-limited, power-limited Gaussian channels.

Statement : The information capacity of a continuous channel of bandwidth B Hertz, perturbed by additive white Gaussian noise of power spectral density N_0/2 (so that the noise power in the bandwidth B is N = N_0 B), is given by
C = B log2 (1 + S/N) = B log2 (1 + P/(N_0 B)) bits per second
where S = signal power, N = N_0 B = noise power, P = average transmitted power and B = bandwidth in Hz. This is Shannon's third theorem.

Derivation of channel capacity
GQ. Derive the channel capacity equation.
Let X(t) be a zero-mean stationary process band-limited to B Hz. Let X_k, k = 1, 2, ..., K, denote the continuous random variables obtained by sampling X(t) at the Nyquist rate of 2B samples per second. Suppose these samples are transmitted in T seconds over a noisy channel, also band-limited to B Hz. Hence the number of samples is
K = 2BT    ...(1.7.1)
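Both capacity formulas introduced above are easy to evaluate numerically. The following is a minimal Python sketch (not part of the original text) computing C = 1 - H(P) for a binary symmetric channel and C = B log2(1 + S/N) for the AWGN channel; the function names are my own.

```python
import math

def bsc_capacity(p_err):
    """Capacity of a binary symmetric channel, C = 1 - H(p), in bits per channel use."""
    if p_err in (0.0, 1.0):
        return 1.0
    Hp = -p_err * math.log2(p_err) - (1 - p_err) * math.log2(1 - p_err)
    return 1.0 - Hp

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

print("BSC, P = 0   :", bsc_capacity(0.0), "bit/use")   # noise-free channel -> 1
print("BSC, P = 0.5 :", bsc_capacity(0.5), "bit/use")   # useless channel -> 0
print("AWGN, B = 3 kHz, S/N = 10 dB:",
      awgn_capacity(3000, 10 ** (10 / 10)), "bits/s")   # about 10378 bits/s
```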
(1) Additive white Gaussian noise is added in the channel. Let N_k, k = 1, 2, ..., K, be the noise samples; the received samples are
Y_k = X_k + N_k, k = 1, 2, ..., K    ...(1.7.2)
The noise samples N_k are Gaussian random variables with zero mean and variance
sigma^2 = N_0 B    ...(1.7.3)
where N_0 is the noise power spectral density.

Fig. 1.7.1 : Discrete-time memoryless Gaussian channel model

(2) The transmitter is power limited. By assigning a cost to each channel input,
E[X_k^2] = P, k = 1, 2, ..., K    ...(1.7.4)
where P = average transmitted power.
(3) The information capacity of the channel is the maximum of the mutual information between X_k and Y_k, the maximization being over all input distributions f_X(x) that satisfy the power constraint :
C = max { I(X_k ; Y_k) : E[X_k^2] = P }    ...(1.7.5)
Using the properties of mutual information,
I(X_k ; Y_k) = h(Y_k) - h(Y_k | X_k)    ...(1.7.6)
As X_k and N_k are independent random variables, h(Y_k | X_k) = h(N_k), so
I(X_k ; Y_k) = h(Y_k) - h(N_k)    ...(1.7.7)
Since h(N_k) is fixed, maximizing I(X_k ; Y_k) requires maximizing the differential entropy h(Y_k). h(Y_k) is maximum when Y_k is a Gaussian random variable, which in turn requires the transmitted signal X_k to be Gaussian; the variance of Y_k is then P + sigma^2.
The differential entropy of a Gaussian random variable of variance sigma^2 is
h(N_k) = (1/2) log2 (2 pi e sigma^2)    ...(1.7.8)
and similarly
h(Y_k) = (1/2) log2 [2 pi e (P + sigma^2)]    ...(1.7.9)
Substituting Equations (1.7.8) and (1.7.9) in (1.7.7),
C = (1/2) log2 (1 + P/sigma^2) bits per channel use    ...(1.7.10)
The channel is used K = 2BT times in T seconds, so the capacity per unit time is
C = (K/T) x (1/2) log2 (1 + P/sigma^2) = B log2 (1 + P/(N_0 B)) bits per second    ...(1.7.11)
which is the Shannon-Hartley channel capacity formula.

Implications of the channel capacity theorem
GQ. Describe the implications of the channel capacity theorem and also explain the bandwidth efficiency diagram.

(1) Shannon's limit
Let an ideal system transmit data at a bit rate R_b equal to the capacity C. If E_b is the transmitted signal energy per bit, the average transmitted power is
P = E_b C    ...(1.7.12)
Substituting Equation (1.7.12) in the capacity formula,
C/B = log2 [1 + (E_b/N_0)(C/B)]
or equivalently
E_b/N_0 = (2^(C/B) - 1) / (C/B)    ...(1.7.13)
If the bandwidth B tends to infinity, C/B tends to 0 and E_b/N_0 approaches its limiting value
(E_b/N_0)_min = limit as C/B tends to 0 of (2^(C/B) - 1)/(C/B) = ln 2 = 0.693, i.e. -1.6 dB    ...(1.7.14)
This is known as Shannon's limit. Correspondingly, the capacity for infinite bandwidth is
C_infinity = limit as B tends to infinity of B log2(1 + P/(N_0 B)) = (P/N_0) log2 e    ...(1.7.15)

(2) Bandwidth efficiency diagram
A plot of R_b/B versus E_b/N_0 is known as the 'bandwidth efficiency diagram' (Fig. 1.7.2).
Capacity boundary : it is the curve for the critical bit rate R_b = C, which separates the region where error-free transmission is possible (R_b < C) from the region where it is not possible (R_b > C). As R_b/B tends to 0, the boundary approaches E_b/N_0 = -1.6 dB, which is Shannon's limit. Shannon's limit can therefore also be stated as: for an ideal system operating at R_b = C, error-free transmission requires E_b/N_0 > ln 2; if E_b/N_0 < ln 2, error-free transmission is not possible whatever the bandwidth.

Fig. 1.7.2 : Bandwidth efficiency diagram

(3) Trade-off between bandwidth and SNR
The capacity formula shows that there is always a trade-off among bandwidth, signal-to-noise ratio and probability of error: bandwidth can be traded for SNR and vice versa while keeping the capacity the same. Fig. 1.7.3 shows the corresponding error rate diagram (probability of error versus E_b/N_0), in which the boundary at -1.6 dB again corresponds to Shannon's limit.

Fig. 1.7.3 : Error rate diagram

1.7.1 Examples of Channel Capacity

UEx. 1.7.1 (MU - Q. 1(a), May 15, 5 Marks) : Consider an extremely noisy channel having a bandwidth of 1 kHz. What could be the channel capacity?
Soln. :
Given : B = 1 kHz. As the channel is extremely noisy, S/N = 0.
C = B log2 (1 + S/N) = 10^3 x log2 (1 + 0) = 0 bits/second
Hence the capacity of an extremely noisy channel is zero.

Ex. 1.7.2 : Calculate the capacity of a low-pass channel for the two cases (1) B = 4 kHz, S/N = 7 and (2) B = 3 kHz, S/N = 15.
Soln. :
Case (1) : C = 4 x 10^3 log2 (1 + 7) = 4 x 10^3 x 3 = 12 x 10^3 bits/second
Case (2) : C = 3 x 10^3 log2 (1 + 15) = 3 x 10^3 x 4 = 12 x 10^3 bits/second
Both combinations give the same capacity, which illustrates the trade-off between bandwidth and SNR.

Ex. 1.7.3 : A CRT terminal is used to enter alphanumeric data into a computer. The CRT is connected to the computer through a voice-grade telephone line having a usable bandwidth of 3000 Hz and an output S/N of 10 dB. Assume that the terminal has 128 characters and that the data sent from the terminal consist of independent sequences of equiprobable characters.
(a) Find the capacity of the channel.
(b) Find the maximum (theoretical) rate at which data can be transmitted from the terminal to the computer without errors.
Soln. :
(a) Given : B = 3000 Hz, S/N = 10 dB = 10
C = B log2 (1 + S/N) = 3000 log2 (11) = 10378 bits/second
(b) Since the 128 characters are equiprobable,
H = log2 128 = 7 bits/character
For errorless transmission the information rate R = r H must not exceed C :
r x 7 <= 10378, hence r <= 1482 characters/second
Hence data can be transmitted without errors at a maximum rate of 1482 characters/second.

1.8 CHANNEL CODING THEOREM / SHANNON'S SECOND THEOREM (READING PURPOSE)

1.8.1 Concept of Channel Coding
Because of channel noise, it is not possible to transmit information over the channel and reconstruct it with an arbitrarily small probability of error unless channel coding is used. Channel coding should be done to optimize the overall effectiveness and reliability of the communication system.
- The channel encoder adds redundancy, which improves reliability.
- The channel decoder exploits (reduces) this redundancy to reconstruct the transmitted data, which improves the effectiveness of the system.
The mapping of the incoming data sequence into a channel input sequence, and the inverse mapping of the channel output sequence into an output data sequence, are performed so that the overall effect of the channel noise is minimized. The mapping and the inverse mapping are the responsibilities of the channel encoder and the channel decoder respectively.

Fig. 1.8.1 : Block diagram of a digital communication system for the channel coding theorem (source, channel encoder, discrete memoryless channel, channel decoder, destination)

The channel coding theorem sets the fundamental limit on the rate at which reliable (error-free) transmission over a discrete memoryless channel can be achieved.

1.8.2 Statement of Channel Coding Theorem
The channel coding theorem for a discrete memoryless channel is stated in two parts :
(i) Let a discrete memoryless source with alphabet M have entropy H(M) and produce symbols once every T_s seconds. Let a discrete memoryless channel have capacity C and be used once every T_c seconds. Then, if
H(M)/T_s <= C/T_c
there exists a coding scheme for which the source output can be transmitted over the channel and reconstructed with an arbitrarily small probability of error.
(ii) Converse : if H(M)/T_s > C/T_c, it is not possible to transmit the information over the channel and reconstruct it with an arbitrarily small probability of error.

1.8.3 Application of Channel Coding Theorem to Binary Symmetric Channels
Let a discrete memoryless source emit equally likely binary symbols (0s and 1s) once every T_s seconds. Since the symbols are equally likely, the source entropy is H = 1 bit per symbol, so the information rate of the source is 1/T_s bits per second. The source sequence is applied to a channel encoder with code rate r. The channel encoder produces one encoded symbol every T_c seconds, i.e. the encoded symbol transmission rate is 1/T_c symbols per second, and the channel encoder utilizes the channel once every T_c seconds. Hence the channel capacity per unit time is C/T_c bits per second.
By the channel coding theorem, if
1/T_s <= C/T_c
the probability of error can be made arbitrarily small by a suitable channel coding scheme. The ratio T_c/T_s equals the code rate r of the channel encoder, so the condition becomes r <= C. Hence, for r <= C, there exists a code capable of achieving an arbitrarily small probability of error.
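The figures of Ex. 1.7.3 and the Shannon limit of Equation (1.7.14) can both be reproduced numerically. The following is a minimal Python sketch (not part of the original text).

```python
import math

# Ex. 1.7.3: voice-grade line, B = 3000 Hz, S/N = 10 dB, 128 equiprobable characters.
B = 3000
snr = 10 ** (10 / 10)          # 10 dB -> linear ratio of 10
C = B * math.log2(1 + snr)     # channel capacity, bits/s
H = math.log2(128)             # 7 bits per character
r_max = C / H                  # maximum error-free character rate
print(f"C = {C:.0f} bits/s, H = {H:.0f} bits/character, r_max = {r_max:.0f} characters/s")

# Shannon's limit: Eb/N0 = (2^(C/B) - 1)/(C/B) tends to ln 2 as C/B -> 0.
for rho in (4.0, 1.0, 0.1, 0.01):
    ebno = (2 ** rho - 1) / rho
    print(f"C/B = {rho:5}: Eb/N0 = {ebno:.4f} = {10 * math.log10(ebno):6.2f} dB")
print("ln 2 =", math.log(2), "->", round(10 * math.log10(math.log(2)), 2), "dB")
```

The first part prints approximately 10378 bits/s and 1482 characters/s; the second part shows E_b/N_0 approaching -1.6 dB as the spectral efficiency decreases.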
1.9 SHANNON'S SOURCE CODING THEOREM (READING PURPOSE)

It is also known as 'Shannon's first theorem'.

Concept of source coding
In a digital communication system, it is important to represent the data in an efficient format; the source encoder performs this task. For better efficiency of the source encoder, prior knowledge of the statistics of the source symbols is important. This is because a few source symbols occur more often than others: the source encoder can assign short codewords to such frequently occurring symbols and long codewords to rarely occurring symbols.
Variable length code : the code in which the source encoder assigns codewords of different lengths to the source symbols is known as a 'variable length code', e.g. Morse code.

Requirements of an efficient source encoder / need and objective of source coding
1. The codewords generated by the source encoder should be in binary form.
2. The source code should be uniquely decodable, so that the original data sequence can be perfectly reconstructed from the encoded binary sequence.
3. The encoding and decoding of the symbols must be unambiguous.
4. Less frequently occurring symbols are given codewords with more bits; more frequently occurring symbols are given codewords with fewer bits, keeping the average codeword length small.

Concept of average codeword length
As shown in Fig. 1.9.1 (source encoder), a discrete memoryless source generates symbols S_k which the source encoder converts into binary codewords b_k. Let the probability of symbol S_k be P_k, k = 0, 1, ..., K-1, and let the codeword assigned to S_k have length l_k bits (for example, the codeword 010 has l_k = 3 bits). The average codeword length is
L = sum over k = 0 to K-1 of P_k l_k
i.e. average codeword length = sum of (probability of symbol S_k) x (codeword length assigned to S_k).

Fig. 1.9.1 : Source encoder

Statement of the source coding theorem
Given a discrete memoryless source of entropy H, the average codeword length L for any distortionless source encoding is bounded as
L >= H
Hence the entropy sets a fundamental limit on the average number of bits per source symbol.

Coding efficiency
It is given as
eta = L_min / L, where L_min = H
Hence eta = H / L <= 1; the closer eta is to 1, the more efficient the source encoder.
Redundancy of the code : gamma = 1 - eta = 1 - (code efficiency).

1.10 DATA COMPACTION

GQ. Explain data compaction in brief.

Redundant information is always present in the signals generated by physical sources. It has to be removed before transmission. This operation is known as data compaction or lossless data compression.
Advantages of data compaction :
(1) The code obtained is efficient in terms of the average number of bits per symbol.
(2) The original data can be reconstructed from it without loss of information.
(3) Data compaction is achieved by assigning short codes to frequently occurring symbols and long codes to rarely occurring symbols.

Prefix coding
For a source encoder to be efficient, the code should be uniquely decodable. This is possible only when each symbol generated by the source is assigned a different codeword. This condition is satisfied by prefix coding.
Prefix : any sequence made up of the initial part of a codeword is called a 'prefix of the codeword'.
Prefix code, Definition : it is a code in which no codeword is the prefix of any other codeword.

1.11 SHANNON-FANO CODING

Shannon-Fano algorithm [stepwise procedure]
Step 1 : Arrange the probabilities of the symbols in decreasing order.
Step 2 : Divide the symbols into two partition groups such that the sums of the probabilities of the two groups are equal or as close to each other as possible.
Step 3 : Assign bit 0 to all the symbols in the upper partition group and bit 1 to all the symbols in the lower partition group.
Step 4 : Repeat Step 2 and Step 3 inside each group until every message is alone in its partition group.
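The stepwise procedure above translates directly into a short recursive routine. The following is a minimal Python sketch (not part of the original text); the split rule simply picks the partition point whose two sums are closest, as in Step 2, and the function name is my own.

```python
def shannon_fano(probs):
    """Return Shannon-Fano codewords for symbols 0..n-1.

    Assumes the probabilities are already arranged in decreasing order (Step 1)."""
    codes = {i: "" for i in range(len(probs))}

    def split(symbols):
        if len(symbols) <= 1:
            return
        total = sum(probs[s] for s in symbols)
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(symbols)):          # Step 2: find the most balanced split
            running += probs[symbols[i - 1]]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_diff, best_i = diff, i
        upper, lower = symbols[:best_i], symbols[best_i:]
        for s in upper:                           # Step 3: bit 0 to the upper group,
            codes[s] += "0"
        for s in lower:                           # bit 1 to the lower group
            codes[s] += "1"
        split(upper)                              # Step 4: repeat inside each group
        split(lower)

    split(list(range(len(probs))))
    return codes

p = [1/2, 1/8, 1/8, 1/16, 1/16, 1/16, 1/32, 1/32]   # illustrative example below
print(shannon_fano(p))
# Expected codewords: 0, 100, 101, 1100, 1101, 1110, 11110, 11111
```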
Illustrative Example : Construct the Shannon-Fano code for the given messages and their probabilities.
Messages : m_1, m_2, m_3, m_4, m_5, m_6, m_7, m_8
Probabilities : 1/2, 1/8, 1/8, 1/16, 1/16, 1/16, 1/32, 1/32
Soln. :
Step 1 : The probabilities are already arranged in decreasing order.
Step 2 : Divide the symbols into two partition groups with equal (or nearly equal) probability sums : group 1 = {m_1} (sum = 1/2) and group 2 = {m_2, ..., m_8} (sum = 1/2).
Step 3 : Assign bit 0 to the upper group and bit 1 to the lower group.
Step 4 : Repeat Steps 2 and 3 inside each group until every message is alone in its group. The resulting codewords are :

Symbol | Probability | Codeword | No. of bits
m_1 | 1/2  | 0     | 1
m_2 | 1/8  | 100   | 3
m_3 | 1/8  | 101   | 3
m_4 | 1/16 | 1100  | 4
m_5 | 1/16 | 1101  | 4
m_6 | 1/16 | 1110  | 4
m_7 | 1/32 | 11110 | 5
m_8 | 1/32 | 11111 | 5

Average codeword length (also known as the average number of bits per message)
L = sum over k of P_k x (number of bits in the codeword of m_k)
  = (1/2)(1) + (1/8)(3) + (1/8)(3) + (1/16)(4) + (1/16)(4) + (1/16)(4) + (1/32)(5) + (1/32)(5)
  = 2.3125 bits/message
Source entropy
H = sum over k of P_k log2(1/P_k) = 2.3125 bits/message
Code efficiency
eta = H / L = 2.3125 / 2.3125 = 1, i.e. 100 %.
Note : In this example the probabilities are exact powers of 1/2, so the partition sums at every stage are exactly equal and the code achieves 100 % efficiency. When the probabilities are such that the sums are not exactly equal, the division is made as close to equal as possible, as in the following examples.

Ex. 1.11.1 : A discrete memoryless source has an alphabet of five symbols m_1 to m_5 with probabilities 0.4, 0.19, 0.16, 0.15 and 0.10 respectively. Construct the Shannon-Fano code for the source and calculate the code efficiency.
Soln. :
Symbol | Probability | Codeword | Length
m_1 | 0.40 | 00  | 2
m_2 | 0.19 | 01  | 2
m_3 | 0.16 | 10  | 2
m_4 | 0.15 | 110 | 3
m_5 | 0.10 | 111 | 3

Average codeword length
L = (0.4)(2) + (0.19)(2) + (0.16)(2) + (0.15)(3) + (0.10)(3) = 0.8 + 0.38 + 0.32 + 0.45 + 0.30 = 2.25 bits/symbol
Source entropy
H = 0.4 log2(1/0.4) + 0.19 log2(1/0.19) + 0.16 log2(1/0.16) + 0.15 log2(1/0.15) + 0.10 log2(1/0.10)
  = 0.5288 + 0.4552 + 0.4230 + 0.4105 + 0.3322 = 2.1497 bits/symbol
Code efficiency
eta = H / L = 2.1497 / 2.25 = 0.9554, i.e. about 95.5 %
Redundancy
gamma = 1 - eta = 4.5 % (approximately)
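The efficiency figures quoted in Ex. 1.11.1 follow from the two sums H and L. The following is a minimal Python sketch (not part of the original text) for that symbol set.

```python
import math

probs   = [0.4, 0.19, 0.16, 0.15, 0.10]   # Ex. 1.11.1
lengths = [2, 2, 2, 3, 3]                 # Shannon-Fano codeword lengths found above

H = sum(p * math.log2(1 / p) for p in probs)      # source entropy, bits/symbol
L = sum(p * l for p, l in zip(probs, lengths))    # average codeword length, bits/symbol
eta = H / L

print(f"H = {H:.4f}, L = {L:.2f}, efficiency = {100 * eta:.2f}%, "
      f"redundancy = {100 * (1 - eta):.2f}%")
# Expected: H ~ 2.15, L = 2.25, efficiency ~ 95.5%
```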
Ex. 1.11.2 : A discrete memoryless source has an alphabet of five symbols with probabilities 0.40, 0.19, 0.15, 0.15 and 0.11 as shown. Construct the Shannon-Fano code and determine the entropy, the average codeword length, the code efficiency and the redundancy.
Soln. :
Symbol | Probability | Codeword | Length
m_1 | 0.40 | 00  | 2
m_2 | 0.19 | 01  | 2
m_3 | 0.15 | 10  | 2
m_4 | 0.15 | 110 | 3
m_5 | 0.11 | 111 | 3
Average codeword length : L = (0.40)(2) + (0.19)(2) + (0.15)(2) + (0.15)(3) + (0.11)(3) = 0.80 + 0.38 + 0.30 + 0.45 + 0.33 = 2.26 bits/symbol
Source entropy : H = 0.5288 + 0.4552 + 0.4105 + 0.4105 + 0.3503 = 2.1554 bits/symbol
Code efficiency : eta = H / L = 2.1554 / 2.26 = 95.4 % (approximately)
Redundancy : gamma = 1 - eta = 4.6 % (approximately)

Ex. 1.11.3 : A discrete memoryless source Y emits five symbols with probabilities 0.35, 0.23, 0.16, 0.16 and 0.10. Using the Shannon-Fano code, find the entropy, the average codeword length and the code efficiency.
Soln. :
Symbol | Probability | Codeword | Length
s_1 | 0.35 | 00  | 2
s_2 | 0.23 | 01  | 2
s_3 | 0.16 | 10  | 2
s_4 | 0.16 | 110 | 3
s_5 | 0.10 | 111 | 3
Average codeword length : L = (0.35)(2) + (0.23)(2) + (0.16)(2) + (0.16)(3) + (0.10)(3) = 0.70 + 0.46 + 0.32 + 0.48 + 0.30 = 2.26 bits/symbol
Source entropy : H = 0.5301 + 0.4877 + 0.4230 + 0.4230 + 0.3322 = 2.1960 bits/symbol
Code efficiency : eta = H / L = 2.1960 / 2.26 = 97.2 % (approximately)

Ex. 1.11.4 : Using Shannon-Fano coding, a discrete memoryless source emits six symbols with probabilities 0.31, 0.23, 0.16, 0.13, 0.11 and 0.06. Find the entropy, the average codeword length, the code efficiency and the redundancy.
Soln. :
Symbol | Probability | Codeword | Length
s_1 | 0.31 | 00  | 2
s_2 | 0.23 | 01  | 2
s_3 | 0.16 | 100 | 3
s_4 | 0.13 | 101 | 3
s_5 | 0.11 | 110 | 3
s_6 | 0.06 | 111 | 3
Average codeword length : L = (0.31)(2) + (0.23)(2) + (0.16)(3) + (0.13)(3) + (0.11)(3) + (0.06)(3) = 0.62 + 0.46 + 0.48 + 0.39 + 0.33 + 0.18 = 2.46 bits/symbol
Source entropy : H = 0.5238 + 0.4877 + 0.4230 + 0.3826 + 0.3503 + 0.2435 = 2.4109 bits/symbol
Code efficiency : eta = H / L = 2.4109 / 2.46 = 98.0 % (approximately)
Redundancy : gamma = 1 - eta = 2.0 % (approximately)

Ex. 1.11.5 : A discrete memoryless source emits five symbols with probabilities 0.38, 0.22, 0.19, 0.11 and 0.10. Construct the Shannon-Fano code and find the average codeword length, the code efficiency and the redundancy.
Soln. :
Symbol | Probability | Codeword | Length
m_1 | 0.38 | 00  | 2
m_2 | 0.22 | 01  | 2
m_3 | 0.19 | 10  | 2
m_4 | 0.11 | 110 | 3
m_5 | 0.10 | 111 | 3
Average codeword length : L = (0.38)(2) + (0.22)(2) + (0.19)(2) + (0.11)(3) + (0.10)(3) = 0.76 + 0.44 + 0.38 + 0.33 + 0.30 = 2.21 bits/symbol
Source entropy : H = 0.5304 + 0.4806 + 0.4552 + 0.3503 + 0.3322 = 2.1487 bits/symbol
Code efficiency : eta = H / L = 2.1487 / 2.21 = 97.2 % (approximately)
Redundancy : gamma = 1 - eta = 2.8 % (approximately)
average Shannon-Fano S ß Digital
UQ,
gives I
Huffman's It HUFFMAN
CODING
1.12Redundancy efficiency
Code(n) Source Average Soln. :
is Explain Communicatton
Sylabus word
type
Jength. better the entropy S SynbolS
of codeword
w.e.f code
prefix Huffman of
academic it results
HenceBVen is (MU-
MU 1-n the %n =0.19 :H (H) Probability
known coding 0.10 0.19
encoding 0.11 0.22 0.38
as -Q. code Llength
Sem.
entropy,
known is
compared =1- -fx = =0.76
0.44 + = =XP,
k=l
-,k=l log: H
year as that 2(a), symbol
2.1484
bits/ (0.38
optimum (v) 5-
procedure. (97.21%) (L)
21-22) performs Dec. =x100 100 log: x Stage E&TC)
'optimum as it
provides 1 1 1 0
to
2) (length x
17, 2.1484 016+0.22 +0.38+0.33 (0.22 + I
(M5-69) Shannon-Fano code?
10 =
better 2.79% Stage
minimum Marks x of 1 1 1
code'. 2) each II
than log: + ----------.

+0.3 (0.19codeword) Stage


97.219% = 023+
x 0
= 2)
0.1 2.21 IIMCodeword (Intomation
Theory &
Step and 1. Step procedure]in
Step Hutfman +
Tech-Neo [1f list Stepobtain probabilities decreasing
order.
log:o+ bits/symbol (0.11
the in
4 1 111 110 10 00
such 2 x3)
: a :3 :
new Then new :Arrange coding 0.38
ications.A a Combine +
probability
way [last The (0.10x
probab1lity new log: Length
that 2two thealgorithm
probability probabilities]
these
088+0.11log: 0 1 3) Coding)..Page
SourceNo
decreasing probabilities
source
value. of
(lowest)to 3 3 2 2 each2
SACHIN value probahilities
value symbols (stepwise codeword
order are
is of
SHAH equal is is assigned given
placed
maintained with
Verture o messages
in bilowest
ts (1-27)
he the
0
Modul
1
Dlgltal
N ep w Step repeat
step.probabion
lity
Communkcation
Step deveasing Example
det. Step whah
lustratlve LSBpresent Stpartiular this, and Step toptoStep Horemaining
Syobus in 3 find 2 and 1 to 5: top
4:Plae Shannon In flnd Repeat priority
1 the in vnskler out Fdow
the already
: last Wnimeasage
te bot o m
Arange We ndewrd. the codeword
fin al ste p in (MU
wef the mbalProbabilty S Fano stage al
wil proab1lities, hat ,
3 tpresent
cdernit new is
doWn has the bits putthhe stage, Sem.
aassigned and group,
pnha1ty message consrder de MSR the prhability frm hit for 4
is contnhution. until th5-enE&TC)
yoar ymbola encrated. and
codeword tolast 0each
the
for new
that trheespectively. and only
1-22) value
probabilities same values stage message? bit value convenience)
Probablttoe
of prohubilities.
such l
two
on fi rst to is
04569 example where frst values is
the stage that assigned placed
list in are and
the for is bit
that For stage
maintaining
3tage 1
1.12 g,
eo A deeasing new Step symbols. Step
(Intormation
probability 2
conventencepelory
tor Top symbol
followed
paths
and
positions
byShows 3:
onder. Combine Probability
Symbol Assign Theory
m, m
Probability
Symbol
s.. M &
value bit Source
'0"
alowest
nd lowest and
1/16 1/16 1/8 1/32 \/32 1/16 1/16 Coding).
1/32 1/32 /16 1'
probabilities to
HIN Pago
0 No
probabit
SHAH o (l
Ve u
(New Sinilarly,
code Average Digital
euch
infomationHenve, Similarly
Symbols
etieiency m M7 ma Communication
academic
Syllabus m,- 00O00m,- m
code 00001 00
m, - -010Codewords
-0010 -m0001m, m,
H L -011 m, -1
m, step
n word Probabltties
w.e.f givingbinary =12.3125
meSsage
2.312S
bits 5
H M 1/1 1/16. 1/16 1/8 1/8 112 is
LSB carried (MU
100 dgit bit Px(Number length No.
message of out Sem.
year
efticiency. on bits
1/16 1/16 1/8 1/2
Stage and 5
average 3 1/8
21-22) per -E&TC)
= of all
Entropy, symbol I
bits stages
carries per
(M5-69) Stage
message) are
and 1/16 178 1/8. 1/2
determined
l
bit I hence (2BFig. contributed)
of (hasm,
StageI I
average
message. and
Consider
the(MU-0.20) 1.12.1
UEx. 1.12.2 8
Symbol 1/8 1/2 as
M m, follows.
By Huttman's their (Intormation
Tech-Neo | theProbabitity intomation 1.12.1
Message pobabilities
Also ìve StageIV'
rvaure
Stage l
0.1 02 02 04 source 1/8 1/2
algonthm
ind May Examples 18 1/4
ations..A m m m m m m Theory
er 15, Codewords
Q8sig.
L121 P. of Table
1.121 P. the sy has(mg
m, acssage as Ma
mhols yy Stage V &
Stage ll
Huffman's
04 average shown to 0000100000 0011 0010 0001 01010 Source
02 16, on /2
tind of contributed) 14
02 m, Huffman's
SACHIN discrete
in May symboper bitsNo
of Coding)..Page
Stagel11 code code the Table 17. Stage V1
02010 code 4 4 3
we word memryics 10 12
SHAH get, wonds P. Marks) Code STMSB
1.12I backward
Staye I\ length No.
Venture 06 fv path
Followsoure (1-29)
and ech

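The procedure above is the standard Huffman construction. The following is a minimal heap-based Python sketch (not part of the original text and not the book's tabular stage method, but it produces the same codeword lengths); the function name is my own.

```python
import heapq

def huffman_codes(probs):
    """Prefix codewords for symbols 0..n-1, built by repeatedly merging the two lowest probabilities."""
    heap = [(p, [i]) for i, p in enumerate(probs)]   # (probability, symbols under this subtree)
    codes = {i: "" for i in range(len(probs))}
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, syms0 = heapq.heappop(heap)              # lowest probability  -> bit 0
        p1, syms1 = heapq.heappop(heap)              # next lowest         -> bit 1
        for s in syms0:
            codes[s] = "0" + codes[s]
        for s in syms1:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p0 + p1, syms0 + syms1))
    return codes

print(huffman_codes([0.4, 0.2, 0.2, 0.1, 0.1]))
# One valid Huffman code for UEx. 1.12.1: lengths 2, 2, 2, 3, 3 -> average 2.2 bits/message
```

Different tie-breaking choices give different bit patterns, but the codeword lengths and hence the average codeword length are the same as in the worked examples that follow.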
Modu
(Information Theory & Source Coding).
Stage III
. StPaage No.(1-h
ge 1Vo
(MU- Sem. 5-E&TC) Meesagee
Digital Communication 0.30030 0.30
The Huffman coding is carried out as shown in Fig. P.1.12.2. From Fig. P.1.12.2 the codewords are as follows :

Message      m1     m2     m3     m4     m5
Probability  0.4    0.2    0.2    0.1    0.1
Codeword     00     10     11     010    011

Average codeword length
L = Σ Pk × (number of bits in the codeword)
  = (0.4 × 2) + (0.2 × 2) + (0.2 × 2) + (0.1 × 3) + (0.1 × 3)
  = 2.2 bits/message

Entropy of the source
H = Σ Pk log2(1/Pk)
  = 0.4 log2(1/0.4) + 0.2 log2(1/0.2) + 0.2 log2(1/0.2) + 0.1 log2(1/0.1) + 0.1 log2(1/0.1)
  = 0.5287 + 0.4643 + 0.4643 + 0.3321 + 0.3321
  = 2.1215 bits/message

Source code efficiency
η = (H/L) × 100 = (2.1215/2.2) × 100 = 96.43 %
Ex. 1.12.2 : Consider a discrete memoryless source X with six symbols x1, x2, ..., x6. Find a compact code for every symbol if the probability distribution is as follows :
p(x1) = 0.30, p(x2) = 0.25, p(x3) = 0.20, p(x4) = 0.12, p(x5) = 0.08, p(x6) = 0.05
Calculate the entropy of the source, the average length of the code, the efficiency and the redundancy of the code.

Soln. :
The Huffman coding gives the following codewords :

Message      x1     x2     x3     x4     x5      x6
Probability  0.30   0.25   0.20   0.12   0.08    0.05
Codeword     00     10     11     011    0100    0101

Average codeword length
L = Σ Pk × (number of bits in the codeword)
  = (0.30 × 2) + (0.25 × 2) + (0.20 × 2) + (0.12 × 3) + (0.08 × 4) + (0.05 × 4)
  = 0.6 + 0.5 + 0.4 + 0.36 + 0.32 + 0.2 = 2.38 bits/message

Entropy of the source
H = Σ Pk log2(1/Pk)
  = 0.5210 + 0.5 + 0.4643 + 0.3670 + 0.2915 + 0.2161 ≈ 2.36 bits/message

Code efficiency
η = (H/L) × 100 = (2.36/2.38) × 100 ≈ 99.2 %

Code redundancy
γ = 1 − η ≈ 0.8 %

Ex. 1.12.3 : A discrete memoryless source has an alphabet of seven symbols with probabilities for its output as described in Table P.1.12.3. Compute the Huffman code and show that the computed source code has an efficiency of 100 percent.

Table P.1.12.3
Symbol       S0      S1      S2      S3      S4      S5       S6
Probability  0.25    0.25    0.125   0.125   0.125   0.0625   0.0625

Soln. :
The Huffman tree is constructed as shown in Fig. P.1.12.3 and yields the codewords :

Symbol   Probability   Codeword   No. of bits in codeword
S0       0.25          10         2
S1       0.25          11         2
S2       0.125         001        3
S3       0.125         010        3
S4       0.125         011        3
S5       0.0625        0000       4
S6       0.0625        0001       4

Average codeword length
L = Σ Pk × (number of bits per codeword)
  = (0.25 × 2) + (0.25 × 2) + (0.125 × 3) + (0.125 × 3) + (0.125 × 3) + (0.0625 × 4) + (0.0625 × 4)
  = 0.5 + 0.5 + 0.375 + 0.375 + 0.375 + 0.25 + 0.25 = 2.625 bits/message

Source entropy
H = Σ Pk log2(1/Pk) = 2 × (0.25 × 2) + 3 × (0.125 × 3) + 2 × (0.0625 × 4)
  = 1 + 1.125 + 0.5 = 2.625 bits/message

Code efficiency
η = (H/L) × 100 = (2.625/2.625) × 100 = 100 %

Since all the probabilities are negative integer powers of 2, L = H and the computed Huffman code is 100 % efficient.
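The three quantities computed repeatedly in these examples, the source entropy H, the average codeword length L and the efficiency η = H/L, are easy to script. The short sketch below is not from the text; the helper names entropy, avg_length and efficiency are assumptions. It checks the 100 % efficiency result of Ex. 1.12.3.

```python
from math import log2

def entropy(probs):
    """Source entropy H = sum of pk * log2(1/pk), in bits/symbol."""
    return sum(p * log2(1 / p) for p in probs)

def avg_length(probs, codewords):
    """Average codeword length L = sum of pk * lk, in bits/symbol."""
    return sum(p * len(c) for p, c in zip(probs, codewords))

def efficiency(probs, codewords):
    """Code efficiency eta = H / L (multiply by 100 for percent)."""
    return entropy(probs) / avg_length(probs, codewords)

# Ex. 1.12.3 : dyadic probabilities and the Huffman codewords tabulated above
p = [0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625]
c = ["10", "11", "001", "010", "011", "0000", "0001"]
print(entropy(p))                      # 2.625 bits/message
print(avg_length(p, c))                # 2.625 bits/message
print(100 * efficiency(p, c), "%")     # 100.0 %
```

For non-dyadic probability sets (such as Ex. 1.12.2) the same helpers give L slightly larger than H, and the redundancy follows as γ = 1 − η.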
UEx. 1.12.4 : MU - Dec. 15, 10 Marks
Consider a source S = {S1, S2, ..., S7} with probabilities 0.40, 0.25, 0.15, 0.10, 0.05, 0.03 and 0.02. Find the average codeword length and the code efficiency using Huffman's algorithm.

Soln. :
Huffman's algorithm is carried out as shown in Fig. P.1.12.4. The resulting codewords are :

Symbol   Probability   Codeword   No. of bits per codeword
S1       0.40          1          1
S2       0.25          01         2
S3       0.15          001        3
S4       0.10          0001       4
S5       0.05          00001      5
S6       0.03          000001     6
S7       0.02          000000     6

Average codeword length
L = Σ Pk × (number of bits per codeword)
  = (0.40 × 1) + (0.25 × 2) + (0.15 × 3) + (0.10 × 4) + (0.05 × 5) + (0.03 × 6) + (0.02 × 6)
  = 0.4 + 0.5 + 0.45 + 0.4 + 0.25 + 0.18 + 0.12 = 2.3 bits/symbol

Source entropy
H = Σ Pk log2(1/Pk)
  = 0.5287 + 0.5 + 0.4105 + 0.3321 + 0.2160 + 0.1517 + 0.1128 = 2.2518 bits/symbol

Code efficiency
η = (H/L) × 100 = (2.2518/2.3) × 100 = 97.90 %

UEx. 1.12.5 : MU - Dec. 16, Dec. 17, 10 Marks
A discrete memoryless source has an alphabet of five symbols M1, M2, M3, M4 and M5 with probabilities 0.4, 0.19, 0.16, 0.15 and 0.10 respectively. Construct the Huffman code and calculate the average codeword length, the entropy of the source, the code efficiency and the redundancy.

Soln. :
The Huffman coding is shown in Fig. P.1.12.5. The codewords are :

Symbol   Probability   Codeword   No. of bits
M1       0.40          1          1
M2       0.19          000        3
M3       0.16          001        3
M4       0.15          010        3
M5       0.10          011        3

Average codeword length
L = Σ Pk × (number of bits per codeword)
  = (0.4 × 1) + (0.19 × 3) + (0.16 × 3) + (0.15 × 3) + (0.10 × 3)
  = 0.4 + 0.57 + 0.48 + 0.45 + 0.30 = 2.2 bits/symbol

Source entropy
H = Σ Pk log2(1/Pk) = 0.5287 + 0.4552 + 0.4230 + 0.4105 + 0.3321 = 2.1495 bits/symbol

Code efficiency
η = (H/L) × 100 = (2.1495/2.2) × 100 = 97.70 %

Code redundancy
γ = 1 − η = 1 − (97.70 %) = 2.3 %

UEx. 1.12.6 : MU - Dec. 18, 10 Marks
The nine symbols A1, A2, A3, ..., A9 of a discrete memoryless source have probabilities of occurrence 0.12, 0.2, 0.08, 0.2, 0.02, 0.04, 0.06, 0.13 and 0.15 respectively. Construct the Huffman code, and calculate the average codeword length, the entropy of the source and the coding efficiency.

Soln. :
The symbols are first arranged in decreasing order of probability and the Huffman reduction is carried out stage by stage as shown in Fig. P.1.12.6. Reading the backward path of each symbol gives its codeword; the two symbols of largest probability (0.2 each) receive the shortest codewords, while the least probable symbols receive the longest ones (for example A5, of probability 0.02, and A6, of probability 0.04, receive the 5-bit codewords 01001 and 01000, and A7, of probability 0.06, the 4-bit codeword 0101). The average codeword length L = Σ Pk × (number of bits per codeword), the entropy H = Σ Pk log2(1/Pk) and the efficiency η = H/L are then computed exactly as in the previous examples.
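Before moving on, the earlier sketches can be combined to cross-check UEx. 1.12.5. This block assumes the hypothetical helpers huffman_codes and entropy defined in the previous code sketches; only the codeword lengths (not the exact bit patterns) are guaranteed to match the table above.

```python
# Cross-check of UEx. 1.12.5 using the assumed helpers defined earlier.
probs = {"M1": 0.40, "M2": 0.19, "M3": 0.16, "M4": 0.15, "M5": 0.10}

codes = huffman_codes(probs)                 # e.g. {'M1': '1', 'M2': '000', ...}
L = sum(p * len(codes[s]) for s, p in probs.items())
H = entropy(probs.values())

print("L =", L)                              # 2.2 bits/symbol
print("H =", round(H, 4))                    # approx. 2.1497 bits/symbol
print("efficiency =", round(100 * H / L, 2), "%")   # approx. 97.7 %
```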
UEx. 1.12.7 : MU - May 18, 10 Marks
Consider the alphabet of a discrete memoryless source having seven symbols with their respective probabilities :

Message      M1     M2     M3     M4     M5     M6     M7
Probability  0.40   0.20   0.12   0.08   0.08   0.08   0.04

(i) Create the Huffman coding tree for the given source symbols.
(ii) Determine the codeword for each message using the standard Huffman algorithm.
(iii) Determine the average codeword length.
(iv) Determine the entropy and the coding efficiency.

Soln. :
The Huffman tree is constructed as shown in Fig. P.1.12.7. The codewords and their lengths are :

Message   Probability   Codeword   Length of codeword
M1        0.40          1          1
M2        0.20          011        3
M3        0.12          000        3
M4        0.08          0010       4
M5        0.08          0011       4
M6        0.08          0100       4
M7        0.04          0101       4

Average codeword length
L = Σ Pk × (length of codeword for each message)
  = (0.40 × 1) + (0.20 × 3) + (0.12 × 3) + (0.08 × 4) + (0.08 × 4) + (0.08 × 4) + (0.04 × 4)
  = 0.4 + 0.6 + 0.36 + 0.32 + 0.32 + 0.32 + 0.16 = 2.48 bits/symbol

Source entropy
H = Σ Pk log2(1/Pk)
  = 0.5287 + 0.4643 + 0.3670 + 0.8745 + 0.1857 = 2.4202 bits/symbol

Coding efficiency
η = (H/L) × 100 = (2.4202/2.48) × 100 = 97.59 %

Ex. 1.12.8 : A discrete memoryless source has an alphabet of five symbols with their probabilities as shown below.

Symbol       S1     S2     S3     S4     S5
Probability  0.40   0.19   0.15   0.15   0.11

Construct the Huffman code for each symbol and determine the entropy, the average codeword length, the code efficiency and the code redundancy.

Soln. :
The Huffman coding is shown in Fig. P.1.12.8. The codewords are :

Symbol   Probability   Codeword   Length of the codeword
S1       0.40          1          1
S2       0.19          000        3
S3       0.15          001        3
S4       0.15          010        3
S5       0.11          011        3

Average codeword length
L = Σ Pk × (length of codeword for each message)
  = (0.40 × 1) + (0.19 × 3) + (0.15 × 3) + (0.15 × 3) + (0.11 × 3)
  = 0.4 + 0.57 + 0.45 + 0.45 + 0.33 = 2.2 bits/message

Source entropy
H = Σ Pk log2(1/Pk) = 0.5287 + 0.4552 + 0.4105 + 0.4105 + 0.3502 ≈ 2.1551 bits/message

Code efficiency
η = (H/L) × 100 = (2.1551/2.2) × 100 ≈ 97.95 %

Code redundancy
γ = 1 − η = 1 − (97.95 %) = 2.05 %

Ex. 1.12.9 : A discrete memoryless source has an alphabet of five symbols with probabilities :

Symbol       S1     S2     S3     S4     S5
Probability  0.35   0.23   0.16   0.10   0.16

(i) Construct the Huffman code. Find the entropy and the average length of the code.
(ii) Calculate the code efficiency and the redundancy of the code.

Soln. :
The Huffman coding is shown in Fig. P.1.12.9. The codewords are :

Symbol   Probability   Codeword   Length of the codeword
S1       0.35          00         2
S2       0.23          10         2
S3       0.16          11         2
S4       0.10          011        3
S5       0.16          010        3

Average codeword length
L = Σ Pk × (length of each codeword)
  = (0.35 × 2) + (0.23 × 2) + (0.16 × 2) + (0.10 × 3) + (0.16 × 3)
  = 0.7 + 0.46 + 0.32 + 0.30 + 0.48 = 2.26 bits/symbol

Source entropy
H = Σ Pk log2(1/Pk) = 0.5301 + 0.4876 + 0.4230 + 0.3321 + 0.4230 = 2.1958 bits/symbol

Code efficiency
η = (H/L) × 100 = (2.1958/2.26) × 100 ≈ 97.15 %

Code redundancy
γ = 1 − η = 1 − (97.15 %) = 2.85 %

Ex. 1.12.10 : A discrete memoryless source emits six messages with their probabilities as shown below. Using Huffman's algorithm, obtain the compact binary code and find the average codeword length. Determine the entropy of the source, the code efficiency and the code redundancy.

Symbol       S1     S2     S3     S4     S5     S6
Probability  0.31   0.23   0.16   0.13   0.11   0.06

Soln. :
The Huffman coding stages are shown in Fig. P.1.12.10. The codewords obtained are :

Symbol   Probability   Codeword   Length of the codeword
S1       0.31          00         2
S2       0.23          10         2
S3       0.16          010        3
S4       0.13          011        3
S5       0.11          110        3
S6       0.06          111        3

Average codeword length
L = Σ Pk × (length of each codeword)
  = (0.31 × 2) + (0.23 × 2) + (0.16 × 3) + (0.13 × 3) + (0.11 × 3) + (0.06 × 3)
  = 0.62 + 0.46 + 0.48 + 0.39 + 0.33 + 0.18 = 2.46 bits/symbol

Source entropy
H = Σ Pk log2(1/Pk)
  = 0.5238 + 0.4876 + 0.4230 + 0.3826 + 0.3502 + 0.2435 ≈ 2.4106 bits/symbol

Code efficiency
η = (H/L) × 100 = (2.4106/2.46) × 100 = 97.99 %

Code redundancy
γ = 1 − η = 1 − (97.99 %) = 2.01 %

Ex. 1.12.11 : A discrete memoryless source has an alphabet of five symbols with the probabilities :

Symbol       S1     S2     S3     S4     S5
Probability  0.30   0.24   0.20   0.16   0.10

Construct the Huffman code, find the entropy and the average length of the code, and calculate the code efficiency and the redundancy of the code.

Soln. :
The Huffman coding is carried out as shown in Fig. P.1.12.11. The codewords are :

Symbol   Probability   Codeword   Length of the codeword
S1       0.30          00         2
S2       0.24          10         2
S3       0.20          11         2
S4       0.16          010        3
S5       0.10          011        3

Average codeword length
L = Σ Pk × (length of each codeword)
  = (0.30 × 2) + (0.24 × 2) + (0.20 × 2) + (0.16 × 3) + (0.10 × 3)
  = 0.6 + 0.48 + 0.4 + 0.48 + 0.3 = 2.26 bits/symbol

Source entropy
H = Σ Pk log2(1/Pk) = 0.5210 + 0.4941 + 0.4643 + 0.4230 + 0.3321 = 2.2345 bits/symbol

Code efficiency
η = (H/L) × 100 = (2.2345/2.26) × 100 = 98.87 %

Code redundancy
γ = 1 − η = 1 − (98.87 %) = 1.13 %

Ex. 1.12.12 : A discrete memoryless source has an alphabet of five symbols with their probabilities as shown below :

Symbol       S1     S2     S3     S4     S5
Probability  0.38   0.22   0.19   0.11   0.10

Construct the Huffman code and determine the following parameters : entropy, average codeword length, code efficiency and code redundancy.

Soln. :
The Huffman coding is shown in Fig. P.1.12.12. The codewords are :

Symbol   Probability   Codeword   Length of the codeword
S1       0.38          00         2
S2       0.22          01         2
S3       0.19          11         2
S4       0.11          100        3
S5       0.10          101        3

Average codeword length
L = Σ Pk × (length of each codeword)
  = (0.38 × 2) + (0.22 × 2) + (0.19 × 2) + (0.11 × 3) + (0.10 × 3)
  = 0.76 + 0.44 + 0.38 + 0.33 + 0.30 = 2.21 bits/symbol

Source entropy
H = Σ Pk log2(1/Pk) = 0.5304 + 0.4805 + 0.4552 + 0.3502 + 0.3321 = 2.1484 bits/symbol

Code efficiency
η = (H/L) × 100 = (2.1484/2.21) × 100 = 97.21 %

Code redundancy
γ = 1 − η = 1 − (97.21 %) = 2.79 %

UEx. 1.12.13 : MU - Dec. 19, 10 Marks
A discrete memoryless source has an alphabet of six symbols with their probabilities as shown :

Symbol       M1     M2     M3     M4     M5     M6
Probability  1/2    1/4    1/8    1/16   1/32   1/32

(i) Determine the codewords using the minimum-variance Huffman code and the Shannon-Fano code.
(ii) Compare the average codeword lengths of both and hence find the entropy of the system.
(iii) Verify and comment on the results obtained using both algorithms.

Soln. :
(i) Entropy of the source
H = Σ Pk log2(1/Pk)
  = (1/2 × 1) + (1/4 × 2) + (1/8 × 3) + (1/16 × 4) + (1/32 × 5) + (1/32 × 5)
  = 1/2 + 1/2 + 3/8 + 1/4 + 5/32 + 5/32 = 1.9375 bits/message

(ii) Shannon-Fano coding (Fig. P.1.12.13) : the ordered list is repeatedly divided into two groups of (nearly) equal total probability, bit '0' being assigned to one group and bit '1' to the other. This gives :

Message   Probability   Codeword   No. of bits per message
M1        1/2           1          1
M2        1/4           01         2
M3        1/8           001        3
M4        1/16          0001       4
M5        1/32          00001      5
M6        1/32          00000      5

Average codeword length (Shannon-Fano)
L = Σ Pk × (number of bits in the codeword)
  = (1/2 × 1) + (1/4 × 2) + (1/8 × 3) + (1/16 × 4) + (1/32 × 5) + (1/32 × 5) = 1.9375 bits/message

(iii) Huffman coding of the same source gives codewords of exactly the same lengths (1, 2, 3, 4, 5 and 5 bits), so the average codeword length is again L = 1.9375 bits/message.

Comment : The average codeword lengths of both algorithms are the same and are equal to the entropy, i.e. L = H = 1.9375 bits/message, so both codes are 100 % efficient with zero redundancy. Because every probability is a negative integer power of 2, Huffman and Shannon-Fano coding achieve the same compression for this source.

UEx. 1.12.14 : MU - May 19, 10 Marks
A discrete memoryless source has an alphabet of six symbols with their probabilities as shown :

Symbol       M1     M2     M3     M4     M5     M6
Probability  0.30   0.25   0.15   0.12   0.10   0.08

(i) Determine the codewords using the Huffman code and the Shannon-Fano code.
(ii) Compare the average codeword lengths of both and hence find the entropy of the system.
(iii) Verify and comment on the results obtained using both algorithms.

Soln. :
(i) Entropy of the source
H = Σ Pk log2(1/Pk)
  = 0.5210 + 0.5 + 0.4105 + 0.3670 + 0.3321 + 0.2915 = 2.4221 bits/message

(ii) Huffman coding (Fig. P.1.12.14) gives the codewords :

Message   Probability   Codeword   No. of bits per message
M1        0.30          00         2
M2        0.25          10         2
M3        0.15          010        3
M4        0.12          011        3
M5        0.10          110        3
M6        0.08          111        3

Average codeword length (Huffman)
L = (0.30 × 2) + (0.25 × 2) + (0.15 × 3) + (0.12 × 3) + (0.10 × 3) + (0.08 × 3)
  = 0.6 + 0.5 + 0.45 + 0.36 + 0.3 + 0.24 = 2.45 bits/message

(iii) Shannon-Fano coding of the same source gives the codewords :

Message   Probability   Codeword   No. of bits per message
M1        0.30          00         2
M2        0.25          01         2
M3        0.15          100        3
M4        0.12          101        3
M5        0.10          110        3
M6        0.08          111        3

Average codeword length (Shannon-Fano)
L = (0.30 × 2) + (0.25 × 2) + (0.15 × 3) + (0.12 × 3) + (0.10 × 3) + (0.08 × 3) = 2.45 bits/message

Comment : For this probability set the number of bits per message is the same in both cases, so the average codeword length, the efficiency (η = (2.4221/2.45) × 100 ≈ 98.86 %) and the redundancy are identical for the Huffman and Shannon-Fano codes; the compression achieved by the two codes is the same.
1.13 COMPARISON OF HUFFMAN CODING AND SHANNON-FANO CODING

Table 1.13.1 : Comparison of Huffman coding and Shannon-Fano coding

1. Principle of algorithm :
   Huffman coding - The code tree is constructed from bottom to top, by repeatedly combining the two symbols of lowest probability.
   Shannon-Fano coding - The code tree is constructed from top to bottom, by repeatedly dividing the ordered symbol set into two groups of nearly equal probability.
2. Optimality :
   Huffman coding - Always produces an optimal prefix code, i.e. the shortest possible average codeword length for the given probabilities.
   Shannon-Fano coding - Suboptimal; it does not always produce the shortest set of codewords.
3. Efficiency and redundancy :
   Huffman coding - Efficiency is high and redundancy is low.
   Shannon-Fano coding - Efficiency is lower than, or at best equal to, that of the Huffman code.
4. Requirement :
   Huffman coding - The probabilities of all the symbols must be known and arranged in decreasing order before coding starts.
   Shannon-Fano coding - The probabilities of all the symbols must also be known; the ordered list is then divided into groups instead of being combined.
5. Result for dyadic probabilities :
   When all the probabilities are negative integer powers of 2 (as in UEx. 1.12.13), both codes give the same codeword lengths and 100 % efficiency.

1.14 COMPARISON OF SOURCE CODING AND CHANNEL CODING

Table 1.14.1 : Comparison of source coding and channel coding

1. Definition :
   Source coding - Encoding the information symbols so that the redundancy present in the source is removed and the data is represented in the minimum possible number of bits.
   Channel coding - Adding controlled redundancy to the data so that transmission errors can be detected and corrected at the receiver.
2. Redundancy :
   Source coding - Redundancy is removed.
   Channel coding - Redundancy is added.
3. Size of data :
   Source coding - The data is compressed; fewer bits need to be transmitted or stored.
   Channel coding - The data is expanded; extra check bits are transmitted along with the information bits.
4. Correlation :
   Source coding - It decorrelates the data.
   Channel coding - It correlates the data, since the added bits depend on the information bits.
5. Use :
   Source coding - Used to make efficient use of the channel and of storage space.
   Channel coding - Used to make error detection and correction possible over a noisy channel.
6. Stage in the communication system :
   Source coding - Carried out by the source encoder, after formatting and before channel encoding.
   Channel coding - Carried out by the channel encoder, i.e. applied after source coding and before modulation.
7. Example :
   Source coding - Huffman code, Shannon-Fano code.
   Channel coding - Hamming codes and other error-correcting codes.
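To complement Table 1.13.1, the sketch below shows the top-down splitting used by Shannon-Fano coding, in contrast to the bottom-up combining of the earlier huffman_codes sketch. It is an illustrative implementation only; the function name shannon_fano and the splitting rule (minimising the difference between the two group probabilities) are assumptions.

```python
def shannon_fano(probs):
    """Return {symbol: codeword} using top-down Shannon-Fano splitting.

    probs : dict mapping each symbol to its probability.
    At every step the sorted list is split into two parts whose total
    probabilities are as nearly equal as possible; the upper part gets
    bit '0' and the lower part gets bit '1'.
    """
    codes = {}

    def split(items, prefix):
        if len(items) == 1:
            codes[items[0][0]] = prefix or "0"   # single-symbol corner case
            return
        total = sum(p for _, p in items)
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(items)):           # try every split point
            running += items[i - 1][1]
            diff = abs(2 * running - total)      # |upper total - lower total|
            if diff < best_diff:
                best_diff, best_i = diff, i
        split(items[:best_i], prefix + "0")      # upper group -> bit '0'
        split(items[best_i:], prefix + "1")      # lower group -> bit '1'

    ordered = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    split(ordered, "")
    return codes

if __name__ == "__main__":
    # Probabilities of UEx. 1.12.14
    p = {"M1": 0.30, "M2": 0.25, "M3": 0.15, "M4": 0.12, "M5": 0.10, "M6": 0.08}
    sf = shannon_fano(p)
    L = sum(p[s] * len(sf[s]) for s in p)
    print(sf)          # e.g. {'M1': '00', 'M2': '01', 'M3': '100', ...}
    print("L =", L)    # 2.45 bits/message, the same as the Huffman code here
```

For this particular source the Shannon-Fano and Huffman codeword lengths coincide (compare UEx. 1.12.14); for other probability sets the Shannon-Fano average length can be strictly larger, which is the point of rows 2 and 3 of Table 1.13.1.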
1.15 AWGN CHANNEL

Refer Fig. 1.15.1 and Fig. 1.15.2.

(Fig. 1.15.1 : AWGN channel - the transmitted signal s_i(t) is added to the white Gaussian noise w(t) to give the received signal x(t))

(Fig. 1.15.2 : Conversion of the continuous AWGN channel into a vector channel using a bank of correlators)

Let w(t) be the white Gaussian noise process. It has zero mean and its power spectral density is N0/2. Here s_i(t) is the transmitted signal and w(t) is the AWGN sample function.

According to the AWGN channel equivalent model we have,

x(t) = s_i(t) + w(t),    0 ≤ t ≤ T,  i = 1, 2, ..., M     ...(1.15.1)

where x(t) = output of the channel (AWGN).

Now let X_j be the random variable at the output of correlator j. The output of correlator j (a product-integrator combination) is the sample value x_j of the random variable X_j. It is given by,

x_j = ∫_0^T x(t) φ_j(t) dt,    j = 1, 2, ..., N     ...(1.15.2)

x_j = s_ij + w_j,    j = 1, 2, ..., N     ...(1.15.3)

where s_ij = quantity contributed by s_i(t) and w_j = sample value of the random variable W_j due to the noise. Here s_ij is a deterministic quantity.

As we know,

s_ij = ∫_0^T s_i(t) φ_j(t) dt     ...(1.15.4)

and  w_j = ∫_0^T w(t) φ_j(t) dt     ...(1.15.5)
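The vector-channel model of Equations (1.15.1) to (1.15.5) can be illustrated numerically. The sketch below is an illustrative example, not part of the text; the use of numpy and the chosen symbol duration, sampling rate and noise level are assumptions. It transmits one basis-function pulse over an AWGN channel and recovers the correlator output x_j = s_ij + w_j.

```python
import numpy as np

# Assumed simulation parameters (illustrative only)
T = 1.0                  # symbol duration in seconds
fs = 1000                # samples per second
N0_over_2 = 0.05         # noise power spectral density N0/2
rng = np.random.default_rng(0)

t = np.arange(0, T, 1 / fs)
dt = 1 / fs

# One orthonormal basis function phi_1(t): a unit-energy rectangular pulse
phi1 = np.ones_like(t) / np.sqrt(T)

# Transmitted signal s_i(t) = s_i1 * phi_1(t), with coefficient s_i1 = 2 (say)
s_i1 = 2.0
s_t = s_i1 * phi1

# AWGN sample function w(t): zero mean, scaled so its PSD approximates N0/2
w_t = rng.normal(0.0, np.sqrt(N0_over_2 / dt), size=t.size)

# Channel: x(t) = s_i(t) + w(t)                               ... Eq. (1.15.1)
x_t = s_t + w_t

# Correlator: x_1 = integral over [0, T] of x(t) * phi_1(t)   ... Eq. (1.15.2)
x_1 = np.sum(x_t * phi1) * dt
s_i1_hat = np.sum(s_t * phi1) * dt        # deterministic part s_i1, Eq. (1.15.4)
w_1 = x_1 - s_i1_hat                      # noise part w_1,         Eq. (1.15.5)

print("s_i1 =", round(s_i1_hat, 3))       # exactly 2.0
print("x_1  =", round(x_1, 3))            # 2.0 plus a small Gaussian term
print("w_1  =", round(w_1, 3))            # zero-mean, variance about N0/2
```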
Chapter Ends...
