Module 1

CHAPTER 1

Information Theory and Source Coding

(Fig. : Block diagram of a DCS receiver. The received digital bandpass waveform is demodulated and sampled, detected, frequency despread / separated by multiple access, demultiplexed, channel decoded, decrypted (optional), source decoded and finally formatted, so that the message symbols are delivered to the information sink and to other destinations. The blocks are marked as essential or optional.)
(Fig. : Formatting of an analog message - the quantized signal is applied to a digital encoder (sampling, quantizing and encoding), which delivers the digital data stream 11...1100 at its output.)

Multiplexing and multiple access

Multiplexing is used to combine different signals; these signals may have originated from different sources. Its output is a single multiplexed signal. Multiplexing allows sharing of a communication resource such as frequency or time on a local basis, and this allocation of the resource is on a long-term basis. It can be done ahead of channel coding or before the modulation process. The common multiple-access techniques are :
(i) Frequency Division Multiple Access (FDMA)
(ii) Time Division Multiple Access (TDMA)
(iii) Code Division Multiple Access (CDMA)
(iv) Space Division Multiple Access (SDMA)

1.1.2 Receiver Section

The steps followed at the receiver side are exactly the reverse of those in the transmitter section. They are :
1. Demodulate and sample : The received bandpass waveform is converted back into a baseband pulse stream and sampled.
2. Detect : A decision is taken on every sample so that the transmitted bit stream is recovered from the received pulse waveform.
3. Frequency despread / multiple access : The spreading and multiple-access operations used at the transmitter are undone.
4. Demultiplex : The combined stream is separated into the individual streams meant for different destinations.
5. Channel decode : The controlled redundancy added by the channel encoder is used for error detection and correction and is then removed.
   Process : channel decoding. Input : received bit stream (possibly with errors). Output : information bit stream, exactly as at the transmitter if decoding is successful.
6. Decrypt : If encryption was used at the transmitter, the bit stream is decrypted. This block is optional and is not needed when no encryption is used.
   Process : decryption. Input : encrypted bit stream. Output : decrypted bit stream.
7. Source decode : The compressed bit stream is expanded back into the original information bits of the message.
   Process : source decoding. Input : decrypted (compressed) bit stream. Output : information bit stream.
8. Format : The information bits are converted into the form required by the information sink, e.g. digital-to-analog conversion when the message is an analog signal.
   Process : formatting. Input : information bit stream. Output : message delivered to the information sink.

As in the transmitter block diagram, some of these blocks (such as encryption/decryption and spreading/despreading) are optional, while the remaining blocks are essential to every digital communication system.
1.2 Information Source

Sr. No. | (A) Analog information source | (B) Discrete information source
1. | It produces a continuous-amplitude, continuous-time signal; every portion of such a signal carries information. | It emits symbols from a discrete (finite) set, generated at discrete intervals of time; every emitted symbol contains information.
2. | It can be transformed into a discrete source by sampling and quantizing the signal amplitude. | No sampling is necessary; the symbols themselves carry the information.
3. | Example : a microphone actuated by a voice signal. | Example : a source emitting the letters (alphabets) of a text message.

1.3 MEASURE OF INFORMATION

Information cannot be measured intuitively; a quantitative measure is needed. The amount of information conveyed by a message depends on how predictable (certain) the message is. Consider the following short statements and the amount of information each of them conveys :

Sr. No. | Statement | Approximate probability
1. | Today is a cold day in Zurich. | 0.98
2. | Tomorrow is a sunny day in Zurich. | 0.02
3. | There is a possibility of snowfall today in Zurich. | 0.01

Table 1.3.1

Statement 1 is almost certain; it is very predictable, so it conveys practically no information. Statement 2 is much less certain, hence it conveys some information. Statement 3 refers to a very rare event; being the least predictable, it conveys the greatest amount of information. Hence the information carried by a message is large when the message is uncertain (rare) and small when the message is almost certain, i.e. it increases as the probability of occurrence of the message decreases.
1.3.1 Amount of Information

Suppose a source emits one of m possible messages m_1, m_2, ..., m_m with respective probabilities of occurrence P_1, P_2, ..., P_m, where

P_1 + P_2 + ... + P_m = 1

From the discussion above, the amount of information contained in a message depends only on its probability of occurrence rather than on its actual content : the lower the probability (the more uncertain the event), the larger the information. Therefore the amount of information I(m_k) received from the message m_k is inversely related to the likelihood of m_k and is defined in a logarithmic way.

Definition / formula :

I(m_k) = log_b (1 / P_k)

where P_k is the probability of occurrence of message m_k and b is the base of the logarithm. The base decides the unit of information, as shown in Table 1.3.2.

Sr. No. | Base of the logarithm | Unit of information
1. | 2 | bit (also called binit)
2. | e | nat
3. | 10 | Hartley or decit
4. | Not specified | generally taken as base 2, i.e. bit

Table 1.3.2

When the base is not specified, base 2 is used and the amount of information is expressed in bits; for binary (two-symbol) messages the bit is the natural unit.
1.3.2 Properties of Information

Property 1 : Statement : The more the uncertainty about a message, the more the information carried by it. For the symbol with the lowest probability, the information content has the highest value.

Property 2 : Statement : If the message is certain, i.e. its probability of occurrence approaches 1, the amount of information it conveys approaches zero. Therefore, as P_k -> 1, I(m_k) -> 0.

Property 3 : Statement : The information content should always be positive, i.e. I(m_k) >= 0 for 0 <= P_k <= 1. A symbol with a lower probability conveys more information than a symbol with a higher probability : if P_k < P_j then I(m_k) > I(m_j).

Property 4 : Statement : For two statistically independent symbols m_k and m_j, the total (joint) information is the sum of their individual information contents :

I(m_k and m_j) = I(m_k) + I(m_j)

Ex. 1.3.1 : Determine the amount of information associated with a symbol whose probability of occurrence is 0.75. Express it in bits, nats and Hartleys.

Soln. :
Given : P_k = 0.75

(i) In bits : I = log2(1/0.75) = 0.415 bit
(ii) In nats : I = log_e(1/0.75) = 0.2876 nat
(iii) In Hartleys : I = log10(1/0.75) = 0.124 Hartley
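The unit conversions above are easy to check numerically. The short Python sketch below (illustrative only; the function name is mine, not the book's) evaluates I(m_k) = log_b(1/P_k) for the three bases of Table 1.3.2 and reproduces the values of Ex. 1.3.1.

    import math

    def information(p, base=2):
        """Amount of information I = log_b(1/p) carried by a message of probability p."""
        if not 0 < p <= 1:
            raise ValueError("probability must lie in (0, 1]")
        return math.log(1.0 / p, base)

    p = 0.75
    print(round(information(p, 2), 4))        # 0.415  bit
    print(round(information(p, math.e), 4))   # 0.2877 nat
    print(round(information(p, 10), 4))       # 0.1249 Hartley (decit)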
Ex. 1.3.2 : A source emits five messages with probabilities P_1 = 1/2, P_2 = 1/4, P_3 = 1/8, P_4 = 1/16 and P_5 = 1/16. Find the information content of each message.

Soln. :
I_1 = log2 2 = 1 bit
I_2 = log2 4 = 2 bits
I_3 = log2 8 = 3 bits
I_4 = I_5 = log2 16 = 4 bits

The rarer the message, the larger its information content.

1.4 ENTROPY AND ITS PROPERTIES

UQ. Define entropy. Explain the concept of source entropy in detail.
(MU - Q. 1(b), May 19; Q. 1(c), Dec. 15; Dec. 18)

Let a discrete source emit a sequence selected from M possible symbols s_1, s_2, ..., s_M with respective probabilities P_1, P_2, ..., P_M, the successive symbols being statistically independent (no statistical dependence between them). The information content of each symbol is different, and as the symbol probabilities change the information content also changes. Hence, instead of the instantaneous information of a single symbol, the average information content per symbol of the source is considered. This average is called the 'source entropy' and it is the most important parameter describing the information flow of a source.
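As a quick check of this definition, the Python sketch below (illustrative; the helper name is mine) computes H = Σ P_k log2(1/P_k) for a discrete memoryless source.

    import math

    def entropy(probs):
        """Source entropy H = sum(P * log2(1/P)) in bits/symbol."""
        assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    # Source of Ex. 1.3.2 : H is the probability-weighted average of the
    # individual information contents 1, 2, 3, 4 and 4 bits.
    print(entropy([0.5, 0.25, 0.125, 0.0625, 0.0625]))   # 1.875 bits/symbol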
UQ. Derive the expression for the entropy of a discrete memoryless source. When is the entropy maximum ? Derive the equation for maximum entropy.
(MU - Q. 1(b), May 19; Q. 1(c), Dec. 19; Q. 1(c), Dec. 16; Dec. 18)
UQ. Show that the entropy is maximum when all the symbols are equiprobable, and state the properties of entropy.
(MU - Q. 1(c), May 16; Dec. 18)

1.4.1 Derivation of the Expression for Entropy

Consider a discrete memoryless source emitting M symbols s_1, s_2, ..., s_M with probabilities of occurrence P_1, P_2, ..., P_M, where

P_1 + P_2 + ... + P_M = Σ (i = 1 to M) P_i = 1                                ...(1.4.1)

Let the source emit a long sequence of k symbols (k very large). Symbol s_i will then occur approximately k P_i times, and each occurrence of s_i contributes log2(1/P_i) bits of information. The total information content of the k-symbol message is therefore

I_total = Σ (i = 1 to M) k P_i log2(1/P_i)  bits                              ...(1.4.2)

The average information content per symbol is obtained by dividing the total information by the number of symbols k. This average is the source entropy :

H = Σ (i = 1 to M) P_i log2(1/P_i)  bits/symbol                               ...(1.4.3)

Maximum value of entropy

Entropy is a function of the symbol probabilities. To find its maximum, differentiate H with respect to P_i subject to the constraint Σ P_i = 1; one probability, say P_M, is dependent on the others, P_M = 1 - P_1 - P_2 - ... - P_(M-1). Setting

dH/dP_i = log2(1/P_i) - log2(1/P_M) = 0      ...(for any i)                   ...(1.4.4)

gives P_i = P_M for every i, i.e. all the probabilities become equal :

P_1 = P_2 = ... = P_M = 1/M                                                    ...(1.4.5)

Substituting P_i = 1/M in the expression for H,

H_max = Σ (i = 1 to M) (1/M) log2 M = log2 M  bits/symbol

Hence the entropy of a discrete memoryless source is maximum when all M symbols are equiprobable, and its maximum value is log2 M.

Properties of entropy

1. The entropy is bounded as 0 <= H <= log2 M.
2. H = 0 if and only if P_k = 1 for some k (all the other probabilities being zero); a source that always produces the same symbol has no uncertainty and conveys no information.
3. H = H_max = log2 M when all the M symbols are equiprobable, i.e. P_i = 1/M for every i.
4. If the source has memory (the generated symbols are correlated), its entropy decreases; the entropy of a source with memory is less than that of the corresponding memoryless source.
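Properties 1 and 3 are easy to verify numerically. The minimal sketch below (assumptions: plain Python, names mine) shows that H equals log2 M for equiprobable symbols and never exceeds it for any other distribution.

    import math, random

    def entropy(probs):
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    M = 4
    print(entropy([1.0 / M] * M), math.log2(M))    # both 2.0 -> H_max = log2 M

    for _ in range(3):
        w = [random.random() for _ in range(M)]
        p = [x / sum(w) for x in w]
        assert 0.0 <= entropy(p) <= math.log2(M) + 1e-12   # property 1 : 0 <= H <= log2 M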
Ex. 1.4.2 : A discrete memoryless source is capable of transmitting three distinct symbols m0, m1 and m2. The probabilities are 1/2, 1/4 and 1/4. Find the source entropy.

Soln. :
H = Σ P_k log2(1/P_k) = (1/2) log2 2 + (1/4) log2 4 + (1/4) log2 4 = 0.5 + 0.5 + 0.5 = 1.5 bits/symbol

Property 1 of entropy states that H(M) = 0 if and only if P_k -> 1 for some k; in every other case H > 0.

Extension of a discrete memoryless source : if blocks of n successive source symbols are treated as single symbols of an extended source, the entropy of the extended source is n times the entropy of the original source.

Channel description : A discrete memoryless channel has an input alphabet X = {x_0, x_1, ..., x_(J-1)} and an output alphabet Y = {y_0, y_1, ..., y_(K-1)}. The channel is characterised by its transition probabilities

P(y_k | x_j) = P(Y = y_k | X = x_j)      ...for all j and k

Since some output must occur for every input,

Σ (k = 0 to K-1) P(y_k | x_j) = 1      ...for all j
For such a channel the following distributions are used later.

Joint probability distribution of X and Y : It is given as,
P(x_j, y_k) = P(X = x_j, Y = y_k) = P(Y = y_k | X = x_j) P(X = x_j) = P(y_k | x_j) P(x_j)

Marginal probability distribution of Y : It is given as,
P(y_k) = P(Y = y_k) = Σ (j = 0 to J-1) P(y_k | x_j) P(x_j)

1.5 INFORMATION RATE

If the source generates r messages per second, the information rate is given as

R = r H  bits/second

where, r = number of messages per second and H = source entropy in bits/message. The unit of R is bits/second.

1.5.1 Examples on Information Rate

Ex. 1.5.1 : Consider a telegraph source having two symbols, dot and dash. The dot duration is 0.2 second and the dash duration is 3 times the dot duration. The probability of the dot occurring is twice that of the dash, and the time between symbols is 0.2 second. Calculate the information rate of the telegraph source.

Soln. :
(1) Probabilities : P(dot) = 2/3 and P(dash) = 1/3.
(2) Source entropy :
H = P(dot) log2(1/P(dot)) + P(dash) log2(1/P(dash)) = 0.3899 + 0.5283 = 0.9182 bits/symbol
(3) Symbol rate : the average time per symbol is
T_avg = T_dot P(dot) + T_dash P(dash) + T_space = (0.2)(2/3) + (0.6)(1/3) + 0.2 = 0.5333 second
r = 1 / T_avg = 1.875 symbols/second
(4) Information rate :
R = r H = 1.875 x 0.9182 = 1.72 bits/second

Ex. 1.5.2 : An analog signal is bandlimited to 4 kHz. It is sampled at the Nyquist rate and the samples are quantized into 4 levels. The quantization levels Q1, Q2, Q3 and Q4 are independent messages and have the probabilities P1 = P4 = 1/8 and P2 = P3 = 3/8. Find the information rate of the source.

Soln. :
(1) Entropy : H = 2 x (1/8) log2 8 + 2 x (3/8) log2 (8/3) = 1.8112 bits/message
(2) Message rate : Nyquist rate r = 2 x 4000 = 8000 messages/second
(3) Information rate : R = r H = 8000 x 1.8112 = 14490 bits/second

Ex. 1.5.3 : Consider that the messages Q1, Q2, Q3 and Q4 are to be transmitted using binary PCM. The probabilities of the messages (with P1 = P4 and P2 = P3) and their binary codes are as given in Table P. 1.5.3; assume f_s = 8 kHz. Calculate the information rate.

Soln. : Proceeding exactly as in Ex. 1.5.2, the entropy of the messages works out to H = 1.6225 bits/message and the corresponding information rate to R = r H, approximately 26 kbits/second.

Ex. 1.5.4 : An analog signal is band-limited to 8 Hz, sampled at the Nyquist rate and quantized at 5 levels with probabilities 0.5, 0.125, 0.0625, 0.25 and 0.0625. Calculate the entropy and the information rate.

Soln. :
Given : (1) f_m = 8 Hz, (2) P(x1) = 0.5, P(x2) = 0.125, P(x3) = 0.0625, P(x4) = 0.25, P(x5) = 0.0625

(i) Source entropy :
H = 0.5 log2 2 + 0.125 log2 8 + 0.0625 log2 16 + 0.25 log2 4 + 0.0625 log2 16
  = 0.5 + 0.375 + 0.25 + 0.5 + 0.25 = 1.875 bits/sample

(ii) Information rate : with r equal to the Nyquist rate, r = 2 x 8 = 16 samples/second,
R = r H = 16 x 1.875 = 30 bits/second
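The same numbers can be reproduced with a few lines of Python (a sketch; the helper names are mine), using R = r H.

    import math

    def entropy(probs):
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    def information_rate(rate_per_sec, probs):
        """R = r * H, with r in messages/second and H in bits/message."""
        return rate_per_sec * entropy(probs)

    # Ex. 1.5.2 : 4 kHz signal sampled at the Nyquist rate (8000 messages/s),
    # 4 levels with probabilities 1/8, 3/8, 3/8, 1/8.
    print(information_rate(2 * 4000, [1/8, 3/8, 3/8, 1/8]))   # ~14490 bits/second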
1.6 CHANNEL CAPACITY

GQ. Define channel capacity with one example.

Consider a discrete memoryless channel with input alphabet X and output alphabet Y. The channel capacity C is the maximum of the average mutual information I(X ; Y) between input and output, the maximum being taken over all possible input probability distributions :

C = max I(X ; Y)   bits per channel use

Example : binary symmetric channel

Consider a binary symmetric channel described by the transition probabilities shown in Fig. 1.6.1. The conditional probability of error is P. Let H(X) be the entropy of the channel input; it is maximum when the input symbols are equiprobable,

P(x_0) = P(x_1) = 1/2 ,   where x_0 = 0 and x_1 = 1

From Fig. 1.6.1 we can say that,

P(y_1 | x_0) = P(y_0 | x_1) = P   and   P(y_0 | x_0) = P(y_1 | x_1) = 1 - P

Hence the capacity of the binary symmetric channel is written as,

C = 1 - H(P)   bits per channel use,   where H(P) = P log2(1/P) + (1 - P) log2(1/(1 - P))

Observations / Results

(1) When the channel is noise-free, P = 0 and H(P) = 0; the channel capacity achieves its maximum value C = 1 bit per channel use, which is the same as the information carried by each channel input.
(2) When the noise present in the channel is maximum, P = 1/2, H(P) -> 1 and C -> 0. This is the minimum value of the capacity; in this case the channel is of no use.
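A small sketch (function names mine, not the book's) of the BSC capacity C = 1 - H(P), reproducing the two limiting observations above :

    import math

    def binary_entropy(p):
        """H(p) = p log2(1/p) + (1-p) log2(1/(1-p)), with H(0) = H(1) = 0."""
        if p in (0.0, 1.0):
            return 0.0
        return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

    def bsc_capacity(p):
        """Capacity of a binary symmetric channel with error probability p."""
        return 1.0 - binary_entropy(p)

    print(bsc_capacity(0.0))              # 1.0 bit/use (noise-free channel)
    print(bsc_capacity(0.5))              # 0.0         (channel of no use)
    print(round(bsc_capacity(0.1), 3))    # 0.531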
1.7 CHANNEL CAPACITY THEOREM / SHANNON-HARTLEY THEOREM

UQ. Describe the Shannon-Hartley capacity theorem.  (MU - Q. 1(a), May 16, 5 Marks)
UQ. Explain, with the required diagrams, the Shannon channel capacity theorem and the bandwidth efficiency diagram.

Consider a channel that is both band-limited (bandwidth B) and power-limited, and which adds white Gaussian noise (AWGN) to the transmitted signal. If the channel is used over an interval of T seconds, the number of independent samples involved is

K = 2BT                                                                        ...(1.7.1)

Discrete-time memoryless Gaussian channel model (Fig. 1.7.1)

Let X_k be the transmitted samples, N_k the noise samples and Y_k the continuous received samples :

Y_k = X_k + N_k ,   k = 1, 2, 3, ..., K                                        ...(1.7.2)

The noise samples are due to AWGN; they are Gaussian random variables with zero mean and variance

sigma^2 = N_0 B                                                                ...(1.7.3)

where N_0 is the noise power spectral density. The transmitter is power limited; the cost assigned to each channel input is

E[X_k^2] = P ,   k = 1, 2, ..., K                                              ...(1.7.4)

where P = average transmitted power.

Channel capacity

The channel capacity is the maximum of the average mutual information between X_k and Y_k over all input distributions that satisfy the power constraint :

C = max I(X_k ; Y_k)                                                            ...(1.7.5)

By using the properties of mutual information we get,

I(X_k ; Y_k) = h(Y_k) - h(Y_k | X_k) = h(Y_k) - h(N_k)                          ...(1.7.6)

since X_k and N_k are independent random variables. Maximizing I(X_k ; Y_k) therefore requires maximizing the differential entropy h(Y_k); h(Y_k) is maximum only if Y_k is a Gaussian random variable, and for this the transmitted signal X_k also has to be Gaussian. The differential entropy of a Gaussian random variable of variance sigma^2 is (1/2) log2(2 pi e sigma^2), and the variance of Y_k is P + sigma^2. Hence, per channel use,

C = (1/2) log2 [2 pi e (P + sigma^2)] - (1/2) log2 [2 pi e sigma^2] = (1/2) log2 (1 + P / sigma^2)

Since the channel is used 2B times per second (K = 2BT samples in T seconds), the capacity in bits per second is

C = B log2 (1 + P / (N_0 B)) = B log2 (1 + S/N)   bits per second

where S is the signal power and N = N_0 B is the noise power in the channel bandwidth. This is the Shannon-Hartley channel capacity theorem : it gives the maximum rate at which information can be transmitted without error over a band-limited Gaussian channel of a given signal-to-noise ratio.

Observations from the capacity equation

(1) Shannon's limit : If the bandwidth is made infinite, the capacity does not grow without bound but approaches the limiting value

C_inf = lim (B -> infinity) B log2 (1 + P / (N_0 B)) = (P / N_0) log2 e = 1.44 P / N_0

Let an ideal system transmit data at the bit rate R_b = C with signal energy per bit E_b, so that P = E_b C. Substituting in the capacity equation,

C/B = log2 [1 + (E_b / N_0)(C/B)] ,   i.e.   E_b / N_0 = (2^(C/B) - 1) / (C/B)

As C/B -> 0 (infinite bandwidth) this ratio approaches its limiting value

E_b / N_0 = ln 2 = 0.693 ,   i.e.   E_b / N_0 = -1.6 dB                         ...(1.7.16)

This is known as Shannon's limit : error-free transmission is not possible, however large the bandwidth, if E_b / N_0 falls below -1.6 dB.

(2) Trade-off between bandwidth and SNR : For a given capacity, bandwidth can be exchanged for signal-to-noise ratio and vice versa; there is always a trade-off between the two.

(3) Bandwidth efficiency diagram : A plot of R_b / B versus E_b / N_0 is known as the bandwidth efficiency diagram (Fig. 1.7.2). The curve R_b = C is the capacity boundary; it separates the region for which error-free transmission is possible (R_b < C) from the region for which it is not possible (R_b > C), and it meets the E_b / N_0 axis at -1.6 dB, which is Shannon's limit. Fig. 1.7.3 shows the corresponding error-rate diagram (probability of error versus E_b / N_0).

1.7.1 Examples of Channel Capacity

UEx. 1.7.1 (MU - Q. 1(a), May 15, 5 Marks)
Given : (1) B = 3000 Hz, (2) S/N = 10 dB = 10.
Soln. :
Channel capacity, C = B log2 (1 + S/N) = 3000 log2 (1 + 10)
C = 10378 bits/second

Ex. 1.7.2 : Consider an extremely noisy channel having a bandwidth of 1 kHz. What could be the channel capacity ?
Soln. :
Given : (1) B = 1 kHz; (2) as the channel is extremely noisy, S/N = 0.
C = B log2 (1 + S/N) = 1000 log2 (1 + 0) = 1000 log2 1 = 0
Hence the capacity of an extremely noisy channel is zero; it is not possible to transmit information over it.

Ex. 1.7.3 : A CRT terminal is used to enter alphanumeric data into a computer. The CRT is connected to the computer through a voice-grade telephone line having a usable bandwidth of 3000 Hz and an output S/N of 10 dB. Assume that the terminal has 128 characters and that the data sent from the terminal consist of independent sequences of equiprobable characters.
(a) Find the capacity of the channel.
(b) Find the maximum (theoretical) rate at which data can be transmitted from the terminal to the computer without errors.
Soln. :
(a) C = B log2 (1 + S/N) = 3000 log2 (1 + 10) = 10378 bits/second
(b) Entropy per character (128 equiprobable characters) : H = log2 128 = 7 bits/character.
Maximum error-free character rate = C / H = 10378 / 7 = 1482 characters/second

1.8 CHANNEL CODING THEOREM / SHANNON'S SECOND THEOREM (READING PURPOSE)

1.8.1 Concept of Channel Coding

Fig. 1.8.1 shows the block diagram of a digital communication system to which channel coding is applied : Source -> Channel encoder -> Discrete memoryless channel -> Channel decoder -> Destination, with noise added in the channel between transmitter and receiver.

The channel encoder adds controlled redundancy to the incoming data sequence, which improves the reliability of transmission; the channel decoder exploits this redundancy to detect and correct transmission errors and then removes it, which improves the effectiveness of the overall communication system. The encoder performs a mapping of the incoming data sequence into a channel input sequence and the decoder performs the inverse mapping, both operations being chosen so as to optimize the overall system.

1.8.2 Statement of the Theorem

Let a discrete memoryless source with entropy H bits/symbol emit one symbol every T_s seconds, so that its information rate is H / T_s bits/second. Let the source sequence be applied to a channel encoder with code rate r; the encoder produces one encoded symbol every T_c seconds, so the channel (of capacity C bits per channel use) is used once every T_c seconds, i.e. its capacity is C / T_c bits/second. The channel coding theorem (Shannon's second theorem) states :

(i) If H / T_s <= C / T_c, there exists a coding scheme for which the source output can be transmitted over the noisy channel and reconstructed at the destination with an arbitrarily small probability of error.
(ii) Conversely, if H / T_s > C / T_c, it is not possible to transmit the information over the channel and reconstruct it without errors.

The critical rate is R = C : error-free transmission is possible for R <= C and is not possible for R > C. The channel coding theorem thus sets a fundamental limit on the rate at which reliable transmission can take place over a noisy discrete memoryless channel.

1.8.3 Application of Channel Coding

Let a discrete memoryless source emit equally likely binary symbols (0s and 1s) once every T_s seconds; its entropy is H = 1 bit/symbol, so its information rate is 1 / T_s bits/second. The sequence is applied to a channel encoder of code rate r which produces one encoded symbol every T_c seconds, and the encoder utilizes the channel once every T_c seconds. For reliable (error-free) transmission the theorem requires

1 / T_s <= C / T_c ,   i.e.   r = T_c / T_s <= C

Hence the code rate of the channel encoder must not exceed the channel capacity.
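The capacity formula and the numbers of the CRT-terminal example are easy to reproduce in Python (a sketch; names and rounding are mine) :

    import math

    def awgn_capacity(bandwidth_hz, snr_db):
        """Shannon-Hartley capacity C = B log2(1 + S/N) in bits/second."""
        snr = 10 ** (snr_db / 10)              # convert dB to a power ratio
        return bandwidth_hz * math.log2(1 + snr)

    c = awgn_capacity(3000, 10)                # voice-grade line, S/N = 10 dB
    print(round(c))                            # ~10378 bits/second
    print(round(c / math.log2(128), 1))        # ~1482.6 characters/second (128 equiprobable characters)

    # Shannon's limit : Eb/N0 must stay above ln 2, i.e. about -1.6 dB.
    print(round(10 * math.log10(math.log(2)), 2))   # -1.59 dB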
Variable length code : a code in which different symbols are represented by codewords of different lengths is known as a 'variable length code', e.g. the Morse code.

Prefix code - Definition : It is a code in which no codeword is the prefix of any other codeword.

Requirements of an efficient source encoder / need and objective of source coding :
1. The codewords generated by the source encoder should be in binary form.
2. The source code should be uniquely decodable; due to this, the original data sequence can be perfectly decoded at the receiver.
3. The encoding and decoding of the symbols should be carried out accurately and efficiently.
4. Less frequently occurring symbols should be assigned more bits, and highly frequently occurring symbols should be assigned fewer bits.

The average codeword length is L = Σ P_k x (number of bits in the codeword of symbol m_k). By Shannon's source coding theorem, for any distortionless source encoding the average codeword length is bounded as

L >= H

Hence the entropy sets a fundamental limit on the average number of bits per source symbol.

Coding efficiency

It is given as,
eta = H / L ,   with eta <= 1

The closer eta is to 1, the more efficient the source encoder.

1.11 SHANNON-FANO CODING

Shannon-Fano algorithm [stepwise procedure] :
Step 1 : Arrange the probabilities of the symbols in decreasing order.
Step 2 : Divide the symbols into two partitions (sets) such that the sums of the probabilities of the two partition groups are almost the same, i.e. as close to each other as possible.
Step 3 : Assign bit 0 to all the symbols in the upper partition set and bit 1 to all the symbols in the other partition group.
Step 4 : Repeat steps 2 and 3 within each partition group until every group contains only one symbol; the bits assigned along the way, read in order, form the codeword of that symbol.

Illustration :

Symbol | Probability | Partition
m1 | 1/2  | Partition group 1 (sum = 1/2)
m2 | 1/8  | Partition group 2
m3 | 1/8  | Partition group 2
m4 | 1/16 | Partition group 2
m5 | 1/16 | Partition group 2
m6 | 1/16 | Partition group 2
m7 | 1/32 | Partition group 2
m8 | 1/32 | Partition group 2 (sum = 1/2, equal to the sum of group 1)

For this set of probabilities the procedure gives codeword lengths 1, 3, 3, 4, 4, 4, 5 and 5, so that L = Σ P_k x (number of bits) = 2.3125 bits/message, which equals the source entropy H = 2.3125 bits/message; the code is 100 % efficient because all the probabilities are negative powers of 2.
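A compact recursive implementation of these four steps (an illustrative Python sketch, not the book's listing) :

    def shannon_fano(symbols):
        """symbols: dict {symbol: probability}. Returns {symbol: codeword}."""
        items = sorted(symbols.items(), key=lambda kv: kv[1], reverse=True)  # Step 1
        codes = {s: "" for s, _ in items}

        def split(group):
            if len(group) <= 1:
                return
            total = sum(p for _, p in group)
            running, cut = 0.0, 1
            for i, (_, p) in enumerate(group[:-1], start=1):
                running += p
                # Step 2: choose the cut that makes the two partial sums closest.
                if abs(2 * running - total) <= abs(2 * (running + group[i][1]) - total):
                    cut = i
                    break
                cut = i + 1
            upper, lower = group[:cut], group[cut:]
            for s, _ in upper:   # Step 3: 0 for the upper set ...
                codes[s] += "0"
            for s, _ in lower:   # ... and 1 for the lower set.
                codes[s] += "1"
            split(upper)         # Step 4: repeat within each partition.
            split(lower)

        split(items)
        return codes

    probs = {"m1": 1/2, "m2": 1/8, "m3": 1/8, "m4": 1/16,
             "m5": 1/16, "m6": 1/16, "m7": 1/32, "m8": 1/32}
    code = shannon_fano(probs)
    print(code)
    print(sum(probs[s] * len(code[s]) for s in probs))   # average length L = 2.3125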
1.11.1 Examples Based on Shannon-Fano Coding

Ex. 1.11.1 : A discrete memoryless source has an alphabet of five symbols m1 to m5 with probabilities 0.4, 0.19, 0.16, 0.15 and 0.10 respectively. Construct the Shannon-Fano code for the source and calculate the code efficiency.

Soln. : Arranging the probabilities in decreasing order and partitioning stage by stage gives :

Symbol | Probability | Codeword | Length
m1 | 0.40 | 00  | 2
m2 | 0.19 | 01  | 2
m3 | 0.16 | 10  | 2
m4 | 0.15 | 110 | 3
m5 | 0.10 | 111 | 3

Average codeword length :
L = Σ P_k x (length of each codeword) = (0.4 x 2) + (0.19 x 2) + (0.16 x 2) + (0.15 x 3) + (0.10 x 3) = 2.25 bits/message

Source entropy :
H = Σ P_k log2(1/P_k) = 0.5287 + 0.4552 + 0.4230 + 0.4105 + 0.3321 = 2.1495 bits/message

Code efficiency : eta = (H / L) x 100 = (2.1495 / 2.25) x 100 = 95.53 %
Redundancy : gamma = 1 - eta = 4.47 %

Note : Unlike the illustration of Section 1.11, the probabilities here are not exact negative powers of 2, so the partition sums cannot be made exactly equal and the efficiency is less than 100 %.

Ex. 1.11.2 : Construct the Shannon-Fano code for a discrete memoryless source with five symbols of probabilities 0.40, 0.19, 0.15, 0.15 and 0.11, and determine the code efficiency and redundancy.

Soln. : The procedure gives codewords 00, 01, 10, 110 and 111 (lengths 2, 2, 2, 3, 3).
L = 0.8 + 0.38 + 0.3 + 0.45 + 0.33 = 2.26 bits/symbol ;  H = 2.1551 bits/symbol
eta = (2.1551 / 2.26) x 100 = 95.35 % ;  gamma = 1 - eta = 4.65 %

Ex. 1.11.3 : A discrete memoryless source has an alphabet of five symbols with probabilities 0.35, 0.23, 0.16, 0.16 and 0.10. Using Shannon-Fano coding find the entropy, the average codeword length and the code efficiency.

Soln. : Codewords : 00, 01, 10, 110, 111 (lengths 2, 2, 2, 3, 3).
L = 0.7 + 0.46 + 0.32 + 0.48 + 0.3 = 2.26 bits/symbol ;  H = 2.1958 bits/symbol
eta = (2.1958 / 2.26) x 100 = 97.15 % ;  gamma = 2.85 %

Ex. 1.11.4 : A discrete memoryless source emits six messages with probabilities 0.31, 0.23, 0.16, 0.13, 0.11 and 0.06. Construct the Shannon-Fano code and determine the average codeword length, the code efficiency and the redundancy.

Soln. : Codewords : 00, 01, 100, 101, 110, 111 (lengths 2, 2, 3, 3, 3, 3).
L = 0.62 + 0.46 + 0.48 + 0.39 + 0.33 + 0.18 = 2.46 bits/symbol ;  H = 2.4106 bits/symbol
eta = (2.4106 / 2.46) x 100 = 97.99 % ;  gamma = 2.01 %

Ex. 1.11.5 : A discrete memoryless source emits five symbols with probabilities 0.38, 0.22, 0.19, 0.11 and 0.10. Construct the Shannon-Fano code and determine the entropy, average codeword length, code efficiency and redundancy.

Soln. : Codewords : 00, 01, 10, 110, 111 (lengths 2, 2, 2, 3, 3).
L = 0.76 + 0.44 + 0.38 + 0.33 + 0.30 = 2.21 bits/symbol ;  H = 2.1484 bits/symbol
eta = (2.1484 / 2.21) x 100 = 97.21 % ;  gamma = 2.79 %

1.12 HUFFMAN CODING

UQ. Explain Huffman coding. Why is it known as an optimum code, and why does it give a better average codeword length than Shannon-Fano coding ?  (MU - Q. 2(a), Dec. 17, 10 Marks)

Huffman coding is a type of prefix code. It is known as an 'optimum code' (compact code) because, for a given set of symbol probabilities, it provides the minimum possible average codeword length; it therefore results in a code efficiency at least as good as that of Shannon-Fano coding. Like Shannon-Fano coding, it is an entropy (source) coding technique.
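For reference, a minimal Huffman coder in Python (an illustrative sketch of the standard algorithm; it is not the book's listing) :

    import heapq
    from itertools import count

    def huffman(probs):
        """probs: dict {symbol: probability}. Returns {symbol: codeword}."""
        tick = count()                       # tie-breaker so the heap never compares dicts
        heap = [(p, next(tick), {s: ""}) for s, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)  # the two least probable groups ...
            p2, _, c2 = heapq.heappop(heap)
            for s in c1:                     # ... get prefixed with 0 and 1 respectively
                c1[s] = "0" + c1[s]
            for s in c2:
                c2[s] = "1" + c2[s]
            heapq.heappush(heap, (p1 + p2, next(tick), {**c1, **c2}))
        return heap[0][2]

    probs = {"m1": 0.4, "m2": 0.19, "m3": 0.16, "m4": 0.15, "m5": 0.10}
    code = huffman(probs)
    print(code)
    print(sum(probs[s] * len(code[s]) for s in probs))   # 2.2 bits/symbol (see Ex. 1.12.5)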
Example : For a source with message probabilities 0.4, 0.2, 0.2, 0.1 and 0.1, the Huffman procedure gives an average codeword length of

L = Σ P_k x (length of each codeword) = 2.2 bits/message

Entropy of the source :
H = 0.4 log2(1/0.4) + 0.2 log2(1/0.2) + 0.2 log2(1/0.2) + 0.1 log2(1/0.1) + 0.1 log2(1/0.1)
  = 0.5287 + 0.4643 + 0.4643 + 0.3321 + 0.3321 = 2.1215 bits/message

Example : For a source with message probabilities 0.30, 0.25, 0.20, 0.12, 0.08 and 0.05, combining the two least probable messages at every stage (Stages I to VI) gives the codewords 00, 10, 11, 011, 0100 and 0101, so that

L = (0.3 x 2) + (0.25 x 2) + (0.20 x 2) + (0.12 x 3) + (0.08 x 4) + (0.05 x 4)
  = 0.6 + 0.5 + 0.4 + 0.36 + 0.32 + 0.2 = 2.38 bits/message

Ex. 1.12.3 : Construct the Huffman code for a discrete memoryless source with seven symbols S0 to S6 having probabilities 0.25, 0.25, 0.125, 0.125, 0.125, 0.0625 and 0.0625 (Fig. P. 1.12.3 shows the successive stages). Find the average codeword length and the source code efficiency.

Soln. : The Huffman procedure gives the codewords listed in Table P. 1.12.3.

Symbol | Probability | Code word | No. of bits in codeword
S0 | 0.25   | 10   | 2
S1 | 0.25   | 11   | 2
S2 | 0.125  | 001  | 3
S3 | 0.125  | 010  | 3
S4 | 0.125  | 011  | 3
S5 | 0.0625 | 0000 | 4
S6 | 0.0625 | 0001 | 4

Table P. 1.12.3

L = (0.25 x 2) + (0.25 x 2) + (0.125 x 3) + (0.125 x 3) + (0.125 x 3) + (0.0625 x 4) + (0.0625 x 4)
  = 0.5 + 0.5 + 0.375 + 0.375 + 0.375 + 0.25 + 0.25 = 2.625 bits/message

Since all the probabilities are negative powers of 2, the entropy is also H = 2.625 bits/message, and the source code efficiency is eta = H / L = 100 %.
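The bookkeeping steps (L, H and eta) are the same in every example, so they are worth collecting in one helper (a sketch; the names are mine) :

    import math

    def code_performance(probs, codewords):
        """probs and codewords are dicts keyed by symbol. Returns (L, H, efficiency %)."""
        L = sum(probs[s] * len(codewords[s]) for s in probs)        # average codeword length
        H = sum(p * math.log2(1.0 / p) for p in probs.values())     # source entropy
        return L, H, 100.0 * H / L

    # Ex. 1.12.3 : a dyadic source is coded with 100 % efficiency.
    probs = {"S0": 0.25, "S1": 0.25, "S2": 0.125, "S3": 0.125,
             "S4": 0.125, "S5": 0.0625, "S6": 0.0625}
    codes = {"S0": "10", "S1": "11", "S2": "001", "S3": "010",
             "S4": "011", "S5": "0000", "S6": "0001"}
    print(code_performance(probs, codes))   # (2.625, 2.625, 100.0)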
UEx. 1.12.4 (MU - Q., Dec. 15, 10 Marks) : Consider a source S = (S1, S2, ..., S7) with probabilities (0.40, 0.25, 0.15, 0.10, 0.05, 0.03, 0.02). Code the source using Huffman's algorithm and find the average code length and the code efficiency.

Soln. : Huffman's algorithm is carried out as shown in Fig. P. 1.12.4; the resulting codewords are :

Symbol | Probability | Codeword | No. of bits per codeword
S1 | 0.40 | 1      | 1
S2 | 0.25 | 01     | 2
S3 | 0.15 | 001    | 3
S4 | 0.10 | 0001   | 4
S5 | 0.05 | 00001  | 5
S6 | 0.03 | 000001 | 6
S7 | 0.02 | 000000 | 6

Average codeword length :
L = (0.40 x 1) + (0.25 x 2) + (0.15 x 3) + (0.10 x 4) + (0.05 x 5) + (0.03 x 6) + (0.02 x 6)
  = 0.4 + 0.5 + 0.45 + 0.4 + 0.25 + 0.18 + 0.12 = 2.3 bits per codeword

Source entropy : H = 2.2518 bits/symbol
Code efficiency : eta = (H / L) x 100 = (2.2518 / 2.3) x 100 = 97.90 %

UEx. 1.12.5 (MU - Q. 2(a), Dec. 16, 10 Marks) : A discrete memoryless source has an alphabet of five symbols M1 to M5 with probabilities 0.4, 0.19, 0.16, 0.15 and 0.10. Construct the Huffman code and calculate its efficiency and redundancy.

Soln. : Combining the two least probable symbols at every stage gives :

Symbol | Probability | Codeword | No. of bits
M1 | 0.40 | 1   | 1
M2 | 0.19 | 000 | 3
M3 | 0.16 | 001 | 3
M4 | 0.15 | 010 | 3
M5 | 0.10 | 011 | 3

L = (0.4 x 1) + (0.19 + 0.16 + 0.15 + 0.10) x 3 = 0.4 + 1.8 = 2.2 bits/symbol
H = 0.5287 + 0.4552 + 0.4230 + 0.4105 + 0.3321 = 2.1495 bits/symbol
eta = (2.1495 / 2.2) x 100 = 97.70 % ;  Redundancy, gamma = 1 - eta = 2.30 %

(Compare with Ex. 1.11.1 : the Shannon-Fano code for the same source gives L = 2.25 bits/symbol and eta = 95.53 %, so the Huffman code is the more efficient of the two.)

UEx. 1.12.6 (MU - Q. 2(a), Dec. 18, 10 Marks) : The nine symbols A1, A2, A3, ..., A9 have the probabilities of occurrence 0.25, 0.20, 0.13, 0.12, 0.10, 0.08, 0.06, 0.04 and 0.02. Determine the Huffman code words, the corresponding codeword lengths, the entropy, the average codeword length and the coding efficiency.

Soln. : Following the Huffman algorithm (Fig. P. 1.12.6), the two least probable symbols are combined repeatedly and the codewords are read back from the final stage. The codeword lengths work out to 2, 2, 3, 3, 4, 4, 4, 5 and 5, giving

L = Σ P_k x (number of bits per codeword) = 2.91 bits/message ,  H = 2.88 bits/message ,  eta = (2.88 / 2.91) x 100 = 99 % (approximately).

UEx. 1.12.7 (MU - Q. 2(a), May 18, 10 Marks) : Consider a discrete memoryless source having an alphabet of seven symbols m1, m2, ..., m7 with respective probabilities 0.40, 0.20, 0.12, 0.08, 0.08, 0.08 and 0.04.
(i) Create the Huffman coding tree (the standard Huffman algorithm is followed for the source symbols).
(ii) Determine the codeword for each message.
(iii) Determine the average codeword length.
(iv) Determine the entropy of the source and the coding efficiency.

Soln. :
(i)-(ii) Combining the two least probable symbols at every stage (Fig. P. 1.12.7) gives codewords of lengths 1, 3, 3, 4, 4, 4 and 4, for example :

Symbol | Probability | Codeword | Length
m1 | 0.40 | 1    | 1
m2 | 0.20 | 011  | 3
m3 | 0.12 | 001  | 3
m4 | 0.08 | 0100 | 4
m5 | 0.08 | 0101 | 4
m6 | 0.08 | 0000 | 4
m7 | 0.04 | 0001 | 4

(iii) Average codeword length :
L = (0.4 x 1) + (0.2 x 3) + (0.12 x 3) + (0.08 x 4) + (0.08 x 4) + (0.08 x 4) + (0.04 x 4)
  = 0.4 + 0.6 + 0.36 + 0.32 + 0.32 + 0.32 + 0.16 = 2.48 bits/symbol

(iv) Source entropy : H = 2.4202 bits/symbol
Coding efficiency : eta = (H / L) x 100 = (2.4202 / 2.48) x 100 = 97.6 %
Code redundancy : gamma = 1 - eta = 2.4 %
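A short variant of the earlier Huffman sketch that tracks only the codeword lengths reproduces the figures of Ex. 1.12.7 (an illustrative sketch of the standard algorithm; names are mine) :

    import heapq, math

    # Huffman coding of the source of Ex. 1.12.7.
    probs = [0.40, 0.20, 0.12, 0.08, 0.08, 0.08, 0.04]
    # Heap entries: (group probability, tie-breaker, [(leaf probability, current depth), ...])
    heap = [(p, i, [(p, 0)]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, g1 = heapq.heappop(heap)            # merge the two least probable groups,
        p2, _, g2 = heapq.heappop(heap)            # pushing every leaf one level deeper
        merged = [(p, d + 1) for p, d in g1 + g2]
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    leaves = heap[0][2]
    L = sum(p * d for p, d in leaves)              # average codeword length
    H = sum(p * math.log2(1 / p) for p in probs)   # source entropy
    print(round(L, 2), round(H, 4), round(100 * H / L, 1))   # 2.48  2.4205  97.6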
Ex. 1.12.8 : A discrete memoryless source has an alphabet of five symbols S1 to S5 with probabilities 0.40, 0.19, 0.15, 0.15 and 0.11 as shown below. Construct the Huffman code for each symbol and determine the entropy, average codeword length, code efficiency and code redundancy.

Soln. : Combining the two least probable symbols at every stage (Fig. P. 1.12.8) gives :

Symbol | Probability | Codeword | Length of the codeword
S1 | 0.40 | 1   | 1
S2 | 0.19 | 000 | 3
S3 | 0.15 | 001 | 3
S4 | 0.15 | 010 | 3
S5 | 0.11 | 011 | 3

Average codeword length : L = (0.40 x 1) + (0.19 + 0.15 + 0.15 + 0.11) x 3 = 0.4 + 1.8 = 2.2 bits/symbol
Source entropy : H = 2.1551 bits/symbol
Code efficiency : eta = (2.1551 / 2.2) x 100 = 97.96 %
Code redundancy : gamma = 1 - eta = 2.04 %

Ex. 1.12.9 : A discrete memoryless source has an alphabet of five symbols with probabilities :

Symbol      S1    S2    S3    S4    S5
Probability 0.35  0.23  0.16  0.10  0.16

(i) Construct the Huffman code. Find the entropy and the average length of the code.
(ii) Calculate the code efficiency and the redundancy of the code.

Soln. : The Huffman stages (Fig. P. 1.12.9 : 0.35, 0.23, 0.16, 0.16, 0.10 -> 0.35, 0.26, 0.23, 0.16 -> 0.39, 0.35, 0.26 -> 0.61, 0.39) give :

Symbol | Probability | Codeword | Length of the codeword
S1 | 0.35 | 00  | 2
S2 | 0.23 | 10  | 2
S3 | 0.16 | 11  | 2
S4 | 0.10 | 011 | 3
S5 | 0.16 | 010 | 3

L = (0.35 x 2) + (0.23 x 2) + (0.16 x 2) + (0.10 x 3) + (0.16 x 3) = 2.26 bits/symbol
H = 2.1958 bits/symbol
eta = (2.1958 / 2.26) x 100 = 97.15 % ;  gamma = 2.85 %
Ex. 1.12.12 : A discrete memoryless source has an alphabet of five symbols with probabilities 0.30, 0.24, 0.20, 0.16 and 0.10 as shown below. The Huffman algorithm is followed as shown; determine the code and the following parameters : entropy, average codeword length, code efficiency and code redundancy.

Soln. : Combining the two least probable symbols at every stage gives codewords 00, 10, 11, 010 and 011 (lengths 2, 2, 2, 3, 3).

Average codeword length :
L = (0.30 x 2) + (0.24 x 2) + (0.20 x 2) + (0.16 x 3) + (0.10 x 3) = 0.6 + 0.48 + 0.4 + 0.48 + 0.3 = 2.26 bits/symbol

Source entropy :
H = 0.5210 + 0.4941 + 0.4643 + 0.4230 + 0.3321 = 2.2345 bits/symbol

Code efficiency : eta = (2.2345 / 2.26) x 100 = 98.87 %
Code redundancy : gamma = 1 - eta = 1.13 %

(For the source of Ex. 1.11.5, with probabilities 0.38, 0.22, 0.19, 0.11 and 0.10, the Huffman code likewise gives L = 2.21 bits/symbol, H = 2.1484 bits/symbol, eta = 97.21 % and gamma = 2.79 %, the same figures as the Shannon-Fano code of that example.)

UEx. 1.12.13 (MU - Q. 2(a), Dec. 19, 10 Marks) : A discrete memoryless source has an alphabet of six symbols with their probabilities as shown :

Symbol      M1   M2   M3   M4    M5    M6
Probability 1/2  1/4  1/8  1/16  1/32  1/32

(i) Determine the entropy of the system and the (minimum-variance) Huffman code-words, and hence find the average code-word length.
(ii) Compare the average code-word length of the Huffman code with that obtained using Shannon-Fano coding, verify both values and comment on the results.

Soln. :

(i) Entropy :
H = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/16)(4) + (1/32)(5) + (1/32)(5) = 1.9375 bits/message

Minimum-variance Huffman coding gives the code-words :

Message | Probability | Codeword | No. of bits per message
M1 | 1/2  | 0     | 1
M2 | 1/4  | 10    | 2
M3 | 1/8  | 110   | 3
M4 | 1/16 | 1110  | 4
M5 | 1/32 | 11110 | 5
M6 | 1/32 | 11111 | 5

Average codeword length (Huffman) :
L = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/16)(4) + (1/32)(5) + (1/32)(5) = 1.9375 bits/message

(ii) Shannon-Fano coding (Stage I to Stage V) produces code-words of exactly the same lengths (1, 2, 3, 4, 5, 5), so its average codeword length is also L = 1.9375 bits/message.

Comment : From (i) and (ii), the average codeword length is the same for both algorithms in this case and it equals the entropy. Thus, for the given probability values (all negative powers of 2), Huffman coding works with the same efficiency (100 %) as the Shannon-Fano code, and the compression achieved by the two is identical.

UEx. 1.12.14 (MU - Q. 2(a), May 19, 10 Marks) : A discrete memoryless source has an alphabet of six symbols with their probabilities as shown :

Symbol      M1    M2    M3    M4    M5    M6
Probability 0.30  0.25  0.15  0.12  0.10  0.08

(i) Determine the entropy and the (minimum-variance) Huffman code-words, and hence find the average code-word length.
(ii) Compare the average code-word length of the Huffman code with that of the Shannon-Fano code for the same source, verify both values and comment on the results.

Soln. :

(i) Entropy :
H = 0.5210 + 0.5 + 0.4105 + 0.3670 + 0.3321 + 0.2915 = 2.4221 bits/message

Huffman coding gives code-words 00, 10, 010, 011, 110 and 111 (lengths 2, 2, 3, 3, 3, 3), so

L (Huffman) = (0.30 x 2) + (0.25 x 2) + (0.15 x 3) + (0.12 x 3) + (0.10 x 3) + (0.08 x 3)
            = 0.6 + 0.5 + 0.45 + 0.36 + 0.30 + 0.24 = 2.45 bits/message

(ii) Shannon-Fano coding : partitioning the ordered probabilities (Stage I to Stage IV) gives the code-words 00, 01, 100, 101, 110 and 111, i.e. the same lengths, so

L (Shannon-Fano) = 2.45 bits/message

Code efficiency (both codes) : eta = (2.4221 / 2.45) x 100 = 98.86 % ;  redundancy gamma = 1.14 %

Comment : For the given probability values both algorithms again produce the same average codeword length; the compression achieved by Huffman coding here is the same as that of the Shannon-Fano code, although in general Huffman coding never does worse.
1.13 COMPARISON OF HUFFMAN CODING AND SHANNON-FANO CODING

Sr. No. | Parameter | Huffman coding | Shannon-Fano coding
1. | Principle of code generation | The code tree is constructed from the bottom to the top : the two least probable symbols are combined first. | The code tree is constructed from the top to the bottom : the ordered symbol set is partitioned repeatedly.
2. | Codeword length | It always produces an optimal code, i.e. the shortest possible average codeword length for the given probabilities. | It is suboptimal; it does not always produce the shortest codewords.
3. | Efficiency | High. | Comparatively low; it equals that of Huffman coding only in special cases (e.g. probabilities that are negative powers of 2).
4. | Complexity of the encoder / decoder | Comparatively complex; the codeword table (code tree) must be generated and stored. | Simple and easy to implement.
5. | Adaptability to changing inputs | The tree has to be reconstructed whenever the symbol probabilities change. | The partitioning has to be repeated whenever the probabilities change.

Table 1.13.1

Comments : From the solved examples and Table 1.13.1 it can be said that the average codeword length is shortest when Huffman coding is used; Huffman coding always produces an optimal set of codewords, whereas Shannon-Fano coding may not, although in several of the cases considered the two give the same result.

1.14 COMPARISON OF SOURCE CODING AND CHANNEL CODING

Sr. No. | Parameter | Source coding | Channel coding
1. | Definition | It is the process of eliminating (reducing) the redundancy present in the source data. | It is the process of adding controlled redundancy to the data.
2. | Purpose | It makes the transmission efficient by compressing the data. | It makes the transmission reliable by enabling error detection and correction at the receiver.
3. | Size of the encoder output | The data are compressed, so the encoded sequence is smaller than the input. | The data are expanded, so the encoded sequence is larger than the input.
4. | Position in the system | It is carried out after formatting and before channel coding. | It is applied after source coding and before modulation (baseband signal processing).
5. | Effect on correlation | It decorrelates the data (removes the statistical dependence between symbols). | It deliberately correlates the data, since the added check bits depend on the information bits.
6. | Error correction | Error correction is not possible by a source code. | Error detection and correction are possible.
7. | Examples | Huffman coding, Shannon-Fano coding. | Hamming codes and other error-correcting codes.

Table 1.14.1