KUMARAGURU
COLLEGE OF TECHNOLOGY
COIMBATORE – 641 049
B.E-ELECTRONICS AND COMMUNICATION ENGINEERING
U18ECI6201T-COMMUNICATION ENGINEERING-II
Sixth Semester
ASSIGNMENT-2
Post date: 11.03.2023 Submission Date: 24.03.2023
Course Outcomes Assessed:
CO1: Demonstrate the digital communication system and the estimation techniques used in the
receiver (K2, S3).
CO2: Apply and verify source coding techniques (K4, S3).
Assignment Rubrics:
Depth of Understanding-3
Prompt Submission and Neatness-3
Correct Solution-4
Assignment Questions:
1. Construct orthonormal basis functions for the following three signals using the Gram-Schmidt
orthogonalization procedure and express each of these signals in terms of the set of basis
functions. Also draw the signal space diagram.
2. Obtain the orthonormal basis functions for the following signals using the Gram-Schmidt
orthogonalization procedure and express each of these signals in terms of the set of basis
functions. Also draw the signal space diagram.
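The waveforms for Questions 1 and 2 appear in figures that are not reproduced here. As an illustration of the procedure only, the sketch below applies Gram-Schmidt orthonormalization to three assumed unit-amplitude rectangular pulses (the pulse positions, duration T = 3 s, and sample count are arbitrary choices, not the figure's signals):

```python
# Gram-Schmidt orthonormalization on discretized signals.
# The three rectangular pulses below are assumed examples, NOT the
# signals from the assignment's figure.
N, dt = 300, 0.01                      # 300 samples over [0, 3] s
s1 = [1.0] * 200 + [0.0] * 100         # pulse on [0, 2)
s2 = [0.0] * 100 + [1.0] * 200         # pulse on [1, 3)
s3 = [1.0] * 100 + [0.0] * 100 + [1.0] * 100   # pulses on [0,1) and [2,3)

def inner(x, y):
    """Approximate the integral of x(t)*y(t) over [0, T]."""
    return sum(a * b for a, b in zip(x, y)) * dt

def gram_schmidt(signals):
    basis = []
    for s in signals:
        # Subtract the projections onto the basis functions found so far
        g = list(s)
        for phi in basis:
            c = inner(s, phi)
            g = [gi - c * pi for gi, pi in zip(g, phi)]
        norm = inner(g, g) ** 0.5
        if norm > 1e-9:                # keep only independent components
            basis.append([gi / norm for gi in g])
    return basis

basis = gram_schmidt([s1, s2, s3])
# Expansion coefficients s_i = sum_j coeffs[i][j] * phi_j
coeffs = [[inner(s, phi) for phi in basis] for s in (s1, s2, s3)]
print(len(basis), [[round(c, 3) for c in row] for row in coeffs])
```

The coefficient rows give the signal-space coordinates used to draw the constellation diagram by hand.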
3. Verify whether the following pairs of signals are orthogonal over one period of the
lower-frequency signal:
(a) x1(t) = cos 2πft and x2(t) = sin 2πft
(b) x1(t) = sin 4πft and x2(t) = −cos(πft − π/6)
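The hand calculation for Question 3 can be checked numerically. The sketch below approximates the inner product ∫ x1(t)·x2(t) dt with a Riemann sum; f = 1 Hz is an arbitrary choice, and the second pair is integrated over 2/f, the period of the lower-frequency signal:

```python
import math

def inner_product(x, y, T, n=20000):
    """Riemann-sum approximation of the integral of x(t)*y(t) over [0, T]."""
    dt = T / n
    return sum(x(k * dt) * y(k * dt) for k in range(n)) * dt

f = 1.0  # arbitrary; orthogonality does not depend on the choice of f

# Pair (a): both signals have frequency f, so the period is 1/f
pa = inner_product(lambda t: math.cos(2 * math.pi * f * t),
                   lambda t: math.sin(2 * math.pi * f * t), 1 / f)

# Pair (b): the lower-frequency signal has frequency f/2, so T = 2/f
pb = inner_product(lambda t: math.sin(4 * math.pi * f * t),
                   lambda t: -math.cos(math.pi * f * t - math.pi / 6), 2 / f)

print(round(pa, 6), round(pb, 6))  # both inner products come out ~0
```

Both inner products vanish, so each pair is orthogonal over the stated interval.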
4. Consider the signal s(t) as shown in the figure. Determine the impulse response of the filter
matched to the signal and sketch the signal. Plot the matched filter output as a function of
time. What is the peak value of the output?
5. Derive the optimum receiver for the signals transmitted over AWGN channel.
6. Derive the impulse response of the filter matched to the given signal.
7. A high-resolution black-and-white TV picture consists of about 2 × 10^6 picture elements and
16 different brightness levels. Pictures are repeated at the rate of 32 per second. All picture
elements are assumed to be independent, and all levels have equal likelihood of occurrence.
Calculate the average rate of information conveyed by this TV picture source.
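Since the 16 levels are equiprobable, each picture element carries log2 16 = 4 bits, and the information rate is (elements per frame) × (bits per element) × (frames per second). A quick check of the arithmetic in Question 7:

```python
import math

elements = 2e6          # picture elements per frame
levels = 16             # equiprobable brightness levels
frames = 32             # frames per second

bits_per_element = math.log2(levels)          # 4 bits, since levels are equally likely
rate = elements * bits_per_element * frames   # average information rate, bits/s
print(rate / 1e6, "Mb/s")                     # 256.0 Mb/s
```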
8. Encode the following source using the Shannon-Fano and Huffman coding procedures and
calculate the entropy of the source, the average code length, efficiency, redundancy, and
variance. Compare the results.
X x1 x2 x3 x4 x5 x6 x7
p(X) 0.4 0.2 0.12 0.08 0.08 0.08 0.04
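The Huffman part of Question 8 can be sketched as below (the Shannon-Fano construction is left by hand; note also that the code variance can differ between equally optimal Huffman trees depending on how ties are broken, though the average length cannot):

```python
import heapq, math

probs = {"x1": 0.40, "x2": 0.20, "x3": 0.12, "x4": 0.08,
         "x5": 0.08, "x6": 0.08, "x7": 0.04}

def huffman(p):
    """Return a Huffman code {symbol: bitstring} for probability map p."""
    # Heap entries are (weight, tie-break counter, partial codebook)
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(p.items())]
    heapq.heapify(heap)
    n = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two least probable nodes
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, n, merged))
        n += 1
    return heap[0][2]

code = huffman(probs)
H = -sum(q * math.log2(q) for q in probs.values())      # source entropy
L = sum(probs[s] * len(code[s]) for s in probs)         # average code length
var = sum(probs[s] * (len(code[s]) - L) ** 2 for s in probs)
print(f"H={H:.4f} bits, L={L:.2f}, efficiency={H/L:.3f}, "
      f"redundancy={1 - H/L:.3f}, variance={var:.3f}")
```

For this source the average length works out to 2.48 bits/symbol against an entropy of about 2.42 bits, an efficiency near 97.6%.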
9. Construct the Shannon-Fano code for an alphabet containing six symbols with probabilities
0.125, 0.125, 0.125, 0.125, 0.25 and 0.25, and calculate its efficiency.
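In Question 9 every probability is a negative power of two, so the Shannon-Fano code lengths match −log2 p exactly and the efficiency is 100%. A sketch of the sort-split-recurse procedure (the symbol names a–f are placeholders):

```python
import math

def shannon_fano(items):
    """Shannon-Fano code: split the (descending-sorted) list into two
    nearly equal-probability halves and recurse. items = [(symbol, prob)]."""
    if len(items) == 1:
        return {items[0][0]: ""}
    total = sum(p for _, p in items)
    run, split, best = 0.0, 1, float("inf")
    for i in range(1, len(items)):
        run += items[i - 1][1]
        diff = abs(2 * run - total)        # |left-half prob - right-half prob|
        if diff < best:
            best, split = diff, i
    code = {s: "0" + c for s, c in shannon_fano(items[:split]).items()}
    code.update({s: "1" + c for s, c in shannon_fano(items[split:]).items()})
    return code

probs = [("a", 0.25), ("b", 0.25), ("c", 0.125),
         ("d", 0.125), ("e", 0.125), ("f", 0.125)]   # sorted descending
code = shannon_fano(probs)
H = -sum(p * math.log2(p) for _, p in probs)
L = sum(p * len(code[s]) for s, p in probs)
print(code, f"H={H}, L={L}, efficiency={H / L}")
```

Here H = L = 2.5 bits/symbol, confirming 100% efficiency for this dyadic source.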
10. Consider a discrete memoryless source which emits five possible symbols Xi = {1, 2, 3, 4, 5}
with equal probabilities. Construct the Huffman code and calculate the coding
efficiency.
11. Consider a discrete memoryless source with source alphabet X = {x0, x1, x2} and source
statistics {0.7, 0.15, 0.15}. Calculate the entropy of the second-order extension of the source.
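Because the source is memoryless, the symbols of the second-order extension are independent ordered pairs whose probabilities multiply, so H(X²) = 2H(X). A numerical check for Question 11:

```python
import math

p = {"x0": 0.70, "x1": 0.15, "x2": 0.15}

def entropy(dist):
    return -sum(q * math.log2(q) for q in dist.values())

# Second-order extension: all ordered pairs; probabilities multiply (DMS)
p2 = {a + b: pa * pb for a, pa in p.items() for b, pb in p.items()}

H1, H2 = entropy(p), entropy(p2)
print(round(H1, 4), round(H2, 4))   # H(X^2) = 2 * H(X)
```

The extension entropy comes out to about 2.363 bits, twice the single-symbol entropy of about 1.181 bits.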
12. Consider the four codes given below.
Identify the prefix codes and construct their decision trees.
13. State and prove the properties of mutual information.
14. Find the mutual information of the channel shown in the figure, given P(x1) = 0.6 and
P(x2) = 0.4.
15. Derive the channel capacity of Binary Symmetric Channel.
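The derivation in Question 15 leads to C = 1 − H_b(p) bits per channel use, where p is the crossover probability and H_b is the binary entropy function. A sketch that evaluates this closed-form result:

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H_b(p), where H_b is the binary entropy function."""
    if p in (0.0, 1.0):
        return 1.0                    # noiseless (or deterministically inverted)
    hb = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - hb

for p in (0.0, 0.1, 0.5):
    print(p, round(bsc_capacity(p), 4))
```

As expected, the capacity is 1 bit per use for p = 0 and drops to 0 at p = 0.5, where the output is independent of the input.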
16. Calculate the capacity of an AWGN channel with a bandwidth of 1 MHz and S/N ratio of 40 dB.
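Question 16 is a direct application of the Shannon-Hartley theorem, C = B log2(1 + S/N), after converting 40 dB into a power ratio of 10^4:

```python
import math

B = 1e6                       # bandwidth in Hz
snr_db = 40.0
snr = 10 ** (snr_db / 10)     # 40 dB -> power ratio of 10,000

C = B * math.log2(1 + snr)    # Shannon-Hartley capacity, bits/s
print(round(C / 1e6, 2), "Mb/s")   # about 13.29 Mb/s
```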
17. Find the bandwidth of the TV picture signal, given the following data: the TV picture
consists of 3 × 10^5 small picture elements, there are 10 distinguishable brightness levels
which are equally likely to occur, and the number of frames transmitted per second is 30.
18. Consider a DMS X with two symbols x1 and x2 and P(x1) = 0.9, P(x2) = 0.1. Symbols x1 and
x2 are encoded as follows:
Find the efficiency and the redundancy of this code. Determine the probability distribution
for the extended source of order 3, design a Huffman encoder, and compare the efficiency with
the previous case.
****************