IET Faculty
Dr. Wassim Alexan
COMM1003 Information Theory Grade:
Quiz 1 Duration: 30 minutes
20 multiple choice questions (MCQs) follow. You are required to solve all of them by making a single
choice out of the available answers, and to fill in your choices in the table provided by shading the
corresponding cells:
MCQ # | A | B | C | D      MCQ # | A | B | C | D
  1   |   |   |   |         11   |   |   |   |
  2   |   |   |   |         12   |   |   |   |
  3   |   |   |   |         13   |   |   |   |
  4   |   |   |   |         14   |   |   |   |
  5   |   |   |   |         15   |   |   |   |
  6   |   |   |   |         16   |   |   |   |
  7   |   |   |   |         17   |   |   |   |
  8   |   |   |   |         18   |   |   |   |
  9   |   |   |   |         19   |   |   |   |
 10   |   |   |   |         20   |   |   |   |
1. The entropy of a discrete random variable X is defined by
A) H(X) = - (1/2) ∑x p(x) log p(x)
B) H(X) = - (1/x) ∑x p(x) log p(x)
C) H(X) = ∑x p(x) log p(x)
D) H(X) = - ∑x p(x) log p(x)
2. For a given event with entropy Hₑ(X) = 1.56, the units should be
A) bits
B) 4-its
C) e-its
D) nats
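The subscript e indicates that this entropy was computed with natural logarithms, so the value is in nats; one nat equals 1/ln 2 bits. A quick illustrative Python sketch of the conversion:

```python
import math

# Entropy value from the question, computed with the natural log (units: nats)
h_nats = 1.56

# Converting units: H_bits = H_nats / ln(2)
h_bits = h_nats / math.log(2)
print(round(h_bits, 4))  # 2.2506
```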
3. The entropy for a fair coin toss is exactly
A) 5 bits
B) 3 bits
C) 2 bits
D) 1 bit
HofCT = - 0.5 Log[2, 0.5] - 0.5 Log[2, 0.5]
1.
4. The following statements describe which quantity?
“It is the amount of information one random variable contains about another random variable”
“It is a measure of dependence between two random variables”
“It is the reduction in uncertainty of one random variable due to another random variable”
A) I(X; Y)
B) H(X, Y)
C) I(X | Y)
D) H(X | Y)
5. The following information diagram is missing some quantities. These are...
A) ♣ = H(X, Y), ☺ = H(X | Y), ♦ = I(X; Y)
B) ♣ = I(X; Y), ☺ = H(X, Y), ♦ = H(X | Y)
C) ♣ = H(X | Y), ☺ = I(X; Y), ♦ = H(X, Y)
D) ♣ = H(X | Y), ☺ = H(Y | X), ♦ = I(X; Y)
6. For which value(s) of p is the binary entropy function H(p) maximized?
A) 0
B) 0.5
C) 1
D) 1.2
7. The capacity of the binary symmetric channel is given by
A) C = log2 13 bits per channel use
B) C = 1 - H(p) bits per transmission
C) C = 1 - α bits per channel use
D) C = 1 - 3 H(p) bits per transmission
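Questions 6 and 7 both rest on the binary entropy function H(p). A short illustrative Python sketch, using the standard definition, confirming that H(p) peaks at p = 0.5 and evaluating C = 1 - H(p) for a sample crossover probability:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2(p) - (1 - p) log2(1 - p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# H(p) is maximized at p = 0.5, where it equals exactly 1 bit (question 6)
print(binary_entropy(0.5))  # 1.0

# BSC capacity C = 1 - H(p) (question 7); p = 0.1 is just a sample value
print(round(1 - binary_entropy(0.1), 4))  # 0.531
```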
8. A certain communication channel is characterized by a transition matrix as follows

             ⎡ 0.3  0.2  0.5 ⎤
  p(y | x) = ⎢ 0.5  0.3  0.2 ⎥
             ⎣ 0.2  0.5  0.3 ⎦

and thus, its capacity is
A) C = 0.09948 bits per transmission
B) C = 0.9812 bits per transmission
C) C = 0.07871 bits per channel use
D) C = 0.7871 bits per transmission
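Since every row of this matrix is a permutation of (0.3, 0.2, 0.5), the channel is symmetric and its capacity is C = log₂|Y| - H(row), achieved by a uniform input. A quick Python check:

```python
import math

# One row of the transition matrix from question 8; all rows are permutations of it
row = [0.3, 0.2, 0.5]

# Entropy of a row in bits
h_row = -sum(p * math.log2(p) for p in row)

# Capacity of a symmetric channel: C = log2(|Y|) - H(row), here |Y| = 3
capacity = math.log2(3) - h_row
print(round(capacity, 5))  # ≈ 0.0995, matching option A
```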
9. Nowadays, most cryptanalysis is carried out at ...
A) universities, academic institutions and research centers
B) intelligence agencies and international organized crime teams
C) scientific recreation centers worldwide
D) international crypto conferences
10. Symmetric cryptography is generally divided into three subclasses; these are:
A) Modern, mathematical and automated
B) Substitution, mechanical and automated
C) Automated, classical and modern
D) Classical, mechanical and modern
11. Some examples of mono–alphabetic ciphers are
A) The Caesar cipher, the Atbash cipher, the Pigpen cipher
B) The Homophonic cipher, the Atbash cipher, the Caesar cipher
C) The Railfence cipher, the Homophonic cipher, the Atbash cipher
D) The Pigpen cipher, the Baconian cipher, the Vigenère cipher
12. The ciphertext XLNN was obtained using the Atbash cipher. Its plaintext is
A) cell
B) roll
C) comm
D) luck
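The Atbash cipher maps each letter to its mirror image in the alphabet (a ↔ z, b ↔ y, ...), which makes it its own inverse. A short Python sketch decrypting the ciphertext above:

```python
# Atbash maps the letter at index i to the letter at index 25 - i.
# Because the mapping is an involution, the same function encrypts and decrypts.
def atbash(text):
    return "".join(
        chr(ord("a") + 25 - (ord(c) - ord("a"))) if c.isalpha() else c
        for c in text.lower()
    )

print(atbash("XLNN"))  # comm  (question 12, option C)
```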
13. The ciphertext VXFFHVV was obtained using the Caesar cipher. Its plaintext is
A) repress
B) enhance
C) depress
D) success
14. The Affine cipher is based on a mathematical formula, given by
A) E(x) = (a x + b) mod a
B) E(x) = (a x + b) mod b
C) E(x) = (a m + b) mod x
D) E(x) = (a x + b) mod m
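With E(x) = (a x + b) mod m, decryption requires the modular inverse of a, which exists only when gcd(a, m) = 1. A minimal Python sketch over m = 26; the key pair a = 5, b = 8 is just an illustrative choice:

```python
# Affine cipher over a 26-letter lowercase alphabet: E(x) = (a*x + b) mod m
M = 26

def affine_encrypt(text, a, b):
    # Assumes text contains only letters a-z (after lowercasing)
    return "".join(chr((a * (ord(c) - 97) + b) % M + 97) for c in text.lower())

def affine_decrypt(text, a, b):
    # D(y) = a^(-1) * (y - b) mod m; pow(a, -1, M) needs Python 3.8+
    a_inv = pow(a, -1, M)
    return "".join(chr((a_inv * (ord(c) - 97 - b)) % M + 97) for c in text.lower())

ct = affine_encrypt("comm", a=5, b=8)
print(ct, affine_decrypt(ct, a=5, b=8))  # saqq comm
```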
15. The Data Processing Inequality, for a Markov chain X → Y → Z, is given by
A) I(X; Z) ≥ I(X; Y) and its main idea is that processing destroys information
B) I(X; Z) ≥ I(Y; Z) and its main idea is that processing enhances information
C) I(X; Y) ≥ I(X; Z) and its main idea is that processing destroys information
D) I(X; Y) ≥ I(Y; Z) and its main idea is that processing destroys information
16. A noisy typewriter channel uses the German alphabet (which is basically the English alphabet
with four extra letters: ä, ö, ü and ß). Its capacity is approximately
A) 3.7 bits per channel use
B) 2.7 bits per channel use
C) 3.91 bits per channel use
D) 2.91 bits per channel use
Log[2, 30] - 1 // N
3.90689
17. Considering the provided table of values, the Kullback–Leibler distance between p1 and p3 is
given by
        p1    p2    p3
COMM   0.3   0.7   0.6
NETW   0.1   0.2   0.1
ELCT   0.6   0.1   0.3
A) 0.08717
B) 0.1537
C) 0.5108
D) 0.2359
THIS QUESTION IS CANCELLED FROM GRADING, BECAUSE IT HAS A MISTAKE!
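For reference, with the standard definition D(p ∥ q) = ∑ p log₂(p/q), the distance between p1 and p3 works out to 0.3, which indeed matches none of the listed options. A quick Python check:

```python
import math

# Column distributions p1 and p3 from the table in question 17
p1 = [0.3, 0.1, 0.6]
p3 = [0.6, 0.1, 0.3]

# Kullback-Leibler divergence D(p || q) = sum_i p_i * log2(p_i / q_i)
kl = sum(p * math.log2(p / q) for p, q in zip(p1, p3))
print(round(kl, 4))  # 0.3
```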
18. Which of the following is NOT representative of the mutual information?
A) I(X; Y) = H(X) - H(X | Y)
B) I(X; Y) = I(Y; X)
C) I(X; Y) = H(Y) - H(X | Y)
D) I(X; Y) = H(X) + H(Y) - H(X, Y)
19. The Rotokas language is spoken by 4,320 people living on the island of Bougainville, which is
part of Papua New Guinea. This language has an alphabet consisting of only 12 letters. A letter
frequency analysis was carried out on a short body of text written in Rotokas, shown next. From
these frequencies, the entropy of the Rotokas language comes out as
Letter % Letter %
a 23.2 p 5.8
o 13.2 t 4.7
r 12.3 e 4.3
u 11.4 s 1.9
v 11.4 k 0.5
i 10.9 g 0.4
A) H(Rotokas) = 4.721533
B) H(Rotokas) = 3.11711
C) H(Rotokas) = 2.55962
D) H(Rotokas) = 4.44111
-(0.232 Log[2, 0.232] + 0.132 Log[2, 0.132] + 0.123 Log[2, 0.123] + 2 × 0.114 Log[2, 0.114] +
  0.109 Log[2, 0.109] + 0.058 Log[2, 0.058] + 0.047 Log[2, 0.047] + 0.043 Log[2, 0.043] +
  0.019 Log[2, 0.019] + 0.005 Log[0.005] + 0.004 Log[2, 0.004]) // N
3.11711
20. The input source to a noisy communication channel is a random variable X over the three
symbols a, b and c. The output from this channel is a random variable Y over these same three
symbols. The joint distribution of these two variables is as follows. What is the value of H(Y | X)?
        x = a   x = b   x = c
y = a     0      1/6     1/6
y = b    1/6      0      1/6
y = c    1/6     1/6      0
A) H(Y | X) = 1.05664
B) H(Y | X) = 1.52832
C) H(Y | X) = 2.58496
D) H(Y | X) = 1.75821