
Shri Vaishnav Vidyapeeth Vishwavidyalaya,
Indore

Laboratory Manual of
Fundamentals of Information
Theory and Coding
(BTDSE321M)

Prepared By
Dr. Swapnil Jain

July-Dec, 2024


INFORMATION THEORY AND CODING LABORATORY


OBJECTIVES:
The student should be made to:
o Be exposed to information theory and its coding techniques.
o Learn to implement coding algorithms.
o Learn to use and apply information theory and coding algorithms.

LIST OF EXPERIMENTS:
1. Implement the following coding algorithms:
o APPLIED THE ENCODING
o DISCRETE ENTROPY FOR PROBABILITY
o IMPLEMENT ENTROPY FOR PARTS OF MESSAGE
o COMPUTE THE ENTROPY OF MESSAGE/TEXT
o NOISELESS (NO NOISE) BINARY CHANNEL
o BINARY SYMMETRIC CHANNEL (BSC) CAPACITY
o SHANNON-FANO CODING ALGORITHM
o HUFFMAN CODING ALGORITHM

OUTCOMES:

At the end of the course, the student should be able to:


o Define information theory concepts and the types of coding.
o Define the algorithms used in coding.
o Implement information theory techniques.
o Compute the capacity of various types of channels.
o Develop the various coding algorithms.
o Use OOP and open-source tools (C/C++) for information theory and coding.

LIST OF EQUIPMENT

SOFTWARE:
C/C++ or an equivalent compiler/IDE such as Dev-C++.

HARDWARE:
Standalone desktops/laptops.


INDEX

S.NO   EXPERIMENT NAME
1      DISCRETE ENTROPY FOR PROBABILITY
2      IMPLEMENT ENTROPY FOR PARTS OF MESSAGE
3      COMPUTE THE ENTROPY OF MESSAGE/TEXT
4      NOISELESS (NO NOISE) BINARY CHANNEL
5      BINARY SYMMETRIC CHANNEL (BSC) CAPACITY
6      BINARY SYMMETRIC CHANNEL CAPACITY: PRIVATE CASE
7      SHANNON-FANO CODING ALGORITHM
8      HUFFMAN CODING ALGORITHM

DISCRETE ENTROPY FOR PROBABILITY

AIM: Develop a program to compute the entropy of a discrete probability distribution.

ALGORITHM DESCRIPTION:

Find the entropy term for each of the following probabilities:


o The 0.1 Probability
o The 0.15 Probability
o The 0.2 Probability
o The 0.25 Probability
o The 0.3 Probability

Then find the total entropy of the whole message.

PROGRAM:

#include <iostream>
#include <cmath>
using namespace std;

int main() {
    int i;
    float p, sum = 0, it;
    p = 0.05;
    for (i = 1; i <= 5; i++) {
        p = 0.05 + p;
        it = p * (log(1 / p)) / log(2);
        sum = sum + it;
        cout << "probability=" << p << " Entropy=" << it << endl;
    }
    cout << "The all Entropy=" << sum << endl;
}

stdin:
Standard input is empty

stdout:
Probability =0.1 entropy =0.332193
Probability =0.15 entropy =0.410545
Probability =0.2 entropy =0.464386
Probability =0.25 entropy =0.5
Probability =0.3 entropy =0.52109
The all Entropy =2.22821

RESULT:
Thus the program was executed and verified successfully.


Viva Questions:

Q.1) What is Entropy? Explain it.


Ans. Entropy is a measure from information theory that tells how much uncertainty or information
content is present in a message or symbol.
Entropy is a key measure that quantifies the average amount of information, uncertainty, or surprise associated
with a random variable or a data source. Introduced by Claude Shannon in 1948 (and therefore often called Shannon
entropy), it plays a foundational role in understanding data compression, coding, and communication systems.
Mathematical definition:
For a discrete random variable X that can take values x1, x2, x3, ..., xn with probabilities p1, p2, p3, ..., pn, the
entropy H(X) is defined as:

H(X) = -∑ p(x) log2 p(x)
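
As an illustration (not part of the manual's experiments), a small helper that evaluates this formula for an arbitrary discrete distribution could look like the sketch below; the function name entropy is only illustrative.

#include <iostream>
#include <cmath>
using namespace std;

// Entropy H(X) = -sum p(x) log2 p(x), in bits; terms with p = 0 contribute nothing.
double entropy(const double p[], int n) {
    double H = 0.0;
    for (int i = 0; i < n; i++)
        if (p[i] > 0) H -= p[i] * log2(p[i]);
    return H;
}

int main() {
    double p[] = {0.1, 0.2, 0.3, 0.4};   // the same distribution as in the next experiment
    cout << "H = " << entropy(p, 4) << " bits" << endl;   // prints about 1.84644
}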

Q.2) In the given program, why is the maximum entropy value around 0.5 and the minimum around 0.3?
Ans. In the given program we are calculating, for each probability value p, only the single term
Entropy term = p * log2(1/p)
rather than the entropy of a full distribution. Over the range used (p = 0.1 to 0.3) this term is increasing,
because it only reaches its peak at p = 1/e ≈ 0.368:
o The minimum term (≈ 0.33, i.e. "around 0.3") occurs at the least likely value, p = 0.1.
o The maximum term (≈ 0.52, i.e. "around 0.5") occurs at the most likely value, p = 0.3.
o As the event becomes more likely (from p = 0.1 up to p = 0.3), its contribution grows, so the printed
entropy values increase from about 0.33 to about 0.52 in the output.
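
A short supplementary sketch (not part of the lab program) that tabulates the term p * log2(1/p) and shows the peak near p = 1/e:

#include <iostream>
#include <cmath>
using namespace std;

int main() {
    // Tabulate the single-symbol term p*log2(1/p) for p = 0.05, 0.10, ..., 0.95.
    for (int k = 1; k <= 19; k++) {
        double p = 0.05 * k;
        cout << "p=" << p << "  p*log2(1/p)=" << p * log2(1.0 / p) << endl;
    }
    // The printed values rise until p is near 1/e (about 0.368) and fall afterwards.
}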


IMPLEMENT ENTROPY FOR PARTS OF MESSAGE

AIM: Develop a program to compute the entropy of a message consisting of 4 parts.

ALGORITHM DESCRIPTION:

Suppose you have 4 messages, each of which has the following probability:
Enter The 1 Probability : 0.1
Enter The 2 Probability : 0.2
Enter The 3 Probability : 0.3
Enter The 4 Probability : 0.4
Find the information content of these messages.

PROGRAM:

#include <iostream>
#include <cmath>
using namespace std;

int main() {
    int i;
    float p, sum = 0, it;
    p = 0.0;
    for (i = 1; i <= 4; i++) {
        p = 0.1 + p;
        it = p * (log(1 / p)) / log(2);
        sum = sum + it;
        cout << "probability=" << p << " Entropy=" << it << endl;
    }
    cout << "The all Entropy=" << sum << endl;
}

Stdin:
The Input is Empty

OUTPUT:

Probability =0.1 entropy =0.332193
Probability =0.2 entropy =0.464386
Probability =0.3 entropy =0.52109
Probability =0.4 entropy =0.528771
The all Entropy =1.84644

RESULT:
Thus the program was executed and verified successfully.

Viva Questions:

Q.1) Write the output of the following line if p = 0.2.

Ans. For p = 0.2 the line prints: probability=0.2 Entropy=0.464386

Q.2) What will be the value of ‘p’ during the 3rd iteration of the loop in the given program?
Ans. The loop adds 0.1 to p on every pass, starting from p = 0.0, so after i increments p = 0.1 * i.
Since i = 3 in the third iteration:
p = 0.1 * 3 = 0.3
So the value of p is 0.3 during the 3rd iteration.

Q.3) What is the significance of log(2) in the entropy calculation? What happens if we don't use it?
Ans. Dividing by log(2) ensures the result is in bits. In information theory, entropy is typically measured in bits,
which represent the minimum number of binary (0/1) decisions required, on average, to encode a random variable.
This is why the formula is written as:
it = p * (log(1/p) / log(2));
Here, dividing by log(2) converts the natural logarithm (base e) into a base-2 logarithm, so the result
represents bits of information.
If we don't divide by log(2):
o The computed entropy values will be in nats and will be about 1.44 times smaller, since 1 nat ≈ 1.44 bits.
o The mathematical structure of entropy is unchanged; only the units change.
o Using log(2) makes the result interpretable in the context of digital (binary) data storage and
transmission.
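
A small illustrative snippet (not part of the lab program) showing the unit conversion for a single term with p = 0.25:

#include <iostream>
#include <cmath>
using namespace std;

int main() {
    double p = 0.25;
    double nats = p * log(1.0 / p);               // natural log only: result in nats
    double bits = p * log(1.0 / p) / log(2.0);    // divided by log(2): result in bits
    cout << "term in nats = " << nats << endl;        // about 0.3466
    cout << "term in bits = " << bits << endl;        // exactly 0.5
    cout << "bits / nats  = " << bits / nats << endl; // about 1.4427 = 1 / ln 2
}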

Q.4) Suppose we change the probabilities to 0.25, 0.25, 0.25, 0.25. What would be the total entropy. Justify
your answer theoretically and modify the code.
Ans. If the probabilities are changed to 0.25, 0.25, 0.25, 0.25 (i.e. a uniform distribution over 4 outcomes), the total
entropy will be 2 bits.
 Theoretical Justification :
Shannon entropy for a discrete random variable X with n equally likely outcomes (pi = 1/n) is given by:
H(X) = -∑ pi log2(pi)
Substituting pi = 0.25 and n = 4;
H(X) = -4 * 0.25 * log2 (0.25)
Since log2(0.25) = -2
H(X) = -4 * 0.25 * (-2)
= 4 * 0.25 * 2
= 2 bits
This is the maximum entropy for 4 outcomes since all events are equally likely, representing the maximum
uncertainty or information content.
 Modified code :
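The original screenshot of the modified code is not reproduced here; a minimal sketch in the style of the lab program, with all four probabilities fixed at 0.25, could be:

#include <iostream>
#include <cmath>
using namespace std;

int main() {
    int i;
    float p = 0.25, sum = 0, it;
    for (i = 1; i <= 4; i++) {
        // each of the four parts now has the same probability, 0.25
        it = p * (log(1 / p)) / log(2);
        sum = sum + it;
        cout << "probability=" << p << " Entropy=" << it << endl;
    }
    cout << "The all Entropy=" << sum << endl;   // prints 2 (bits)
}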


COMPUTE THE ENTROPY OF MESSAGE/TEXT

AIM:

To write a program in C++ to find the entropy of a given message.

ALGORITHM DESCRIPTION:
o Entropy is the expected value of the measure of information content in a system. In general, the
Shannon entropy of a variable X is defined as:

H(X) = -∑ P(xi) logb P(xi)

o where the information content is I(x) = -logb P(x). If the base of the logarithm is b = 2, the result is
expressed in bits, a unit of information. Therefore, given a string S of length n where P(si) is the
relative frequency of each character, the entropy of the string in bits is:

H(S) = -∑ P(si) log2 P(si)

o For this task, use "1223334444" as an example. The result should be around 1.84644 bits; a worked
check is given after this list.
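
Worked check (from the character frequencies of the example string): "1223334444" has length n = 10 with counts 1 -> 1, 2 -> 2, 3 -> 3 and 4 -> 4, so the relative frequencies are 0.1, 0.2, 0.3 and 0.4, and

H(S) = -(0.1 log2 0.1 + 0.2 log2 0.2 + 0.3 log2 0.3 + 0.4 log2 0.4)
     = 0.332193 + 0.464386 + 0.521090 + 0.528771 ≈ 1.84644 bits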

PROGRAM:

#include <iostream>
#include <string>
#include <cmath>
using namespace std;

double calcEntropy(string str) {
    int freqs[256] = {0};
    int i;
    double entropy = 0.0;
    double size = str.size();
    for (i = 0; i < str.size(); i++) freqs[str[i]]++;
    for (i = 0; i < 256; i++) {
        if (freqs[i] != 0) entropy += (freqs[i] / size) * log2(freqs[i] / size);
    }
    entropy = -entropy;
    cout << "String = " << str << "\n";
    cout << "Entropy = " << entropy << "\n";
    cout << "\n";
    return entropy;
}

int main() {
    string inputs[] = {
        "1223334444"
    };
    for (int i = 0; i < sizeof(inputs) / sizeof(string); i++) {
        calcEntropy(inputs[i]);
    }
    return 0;
}


stdin:
Standard input is empty
stdout:

Executing the program…

String = 1223334444
Entropy = 1.84644

RESULT:
Thus the program was executed and verified successfully.


NOISELESS (NO NOISE) BINARY CHANNEL


AIM:

Develop and implement a program to compute the capacity of a noiseless binary channel.

DESCRIPTION:
Consider a binary symmetric communication channel, whose input source is the alphabet X = {0, 1}
with probabilities {0.5, 0.5}; whose output alphabet is Y = {0, 1}; and whose channel matrix is

    1 − ε      ε
      ε      1 − ε

where ε is the probability of transmission error.

o What is the entropy of the source, H(X)?
o What is the probability distribution of the outputs, p(Y), and the entropy of this output
distribution, H(Y)?
o What is the joint probability distribution for the source and the output, p(X, Y), and what
is the joint entropy, H(X, Y)?
o What is the mutual information of this channel, I(X; Y)?
o How many values of ε make the mutual information of this channel maximal? What are
those values, and what then is the capacity of such a channel in bits?
o For what value of ε is the capacity of this channel minimal? What is the channel capacity
in that case?
Solving the problem:
o The entropy of the source is H(X) = 1 bit.
o The output probabilities are p(y = 0) = (0.5)(1 − ε) + (0.5)ε = 0.5 and p(y = 1) = (0.5)ε + (0.5)(1 − ε) = 0.5.
The entropy of this distribution is H(Y) = 1 bit, just as for the entropy H(X) of the input distribution.
o The joint probability distribution p(X, Y) is

    0.5(1 − ε)     0.5ε
      0.5ε       0.5(1 − ε)

and the entropy of this joint distribution is
H(X, Y) = −∑x,y p(x, y) log2 p(x, y)
        = −(1 − ε) log(0.5(1 − ε)) − ε log(0.5ε)
        = (1 − ε) − (1 − ε) log(1 − ε) + ε − ε log(ε)
        = 1 − ε log(ε) − (1 − ε) log(1 − ε)
o The mutual information is I(X; Y) = H(X) + H(Y) − H(X, Y), which we can evaluate
from the quantities above as: 1 + ε log(ε) + (1 − ε) log(1 − ε).
o In the two cases ε = 0 and ε = 1 (perfect transmission, and perfectly erroneous
transmission), the mutual information reaches its maximum of 1 bit and this is also then
the channel capacity.
o If ε = 0.5, the channel capacity is minimal and equal to 0 (a numerical check is given below).
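
A supplementary sketch (not part of the original manual program) that evaluates I(X;Y) = 1 + ε log2(ε) + (1 − ε) log2(1 − ε) for a few values of ε; the function name mutualInfo is only illustrative:

#include <iostream>
#include <cmath>
using namespace std;

// Mutual information of a BSC with equiprobable inputs, in bits.
// The terms e*log2(e) and (1-e)*log2(1-e) are taken as 0 when their argument is 0.
double mutualInfo(double e) {
    double t1 = (e > 0) ? e * log2(e) : 0.0;
    double t2 = (e < 1) ? (1 - e) * log2(1 - e) : 0.0;
    return 1.0 + t1 + t2;
}

int main() {
    double eps[] = {0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0};
    for (int i = 0; i < 7; i++)
        cout << "epsilon=" << eps[i] << "  I(X;Y)=" << mutualInfo(eps[i]) << endl;
    // The output shows I = 1 bit at epsilon = 0 and 1, and I = 0 at epsilon = 0.5.
}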
PROGRAM:
#include <iostream>
#include <cmath>
using namespace std;

int main() {
    int i;
    float p, it;
    p = 0.5;
    for (i = 1; i <= 2; i++) {   // 1st: if 1 then 0; 2nd: if 0 then 1
        it = p * (log(1 / p)) / log(2) + (1 - p) * (log(1 / (1 - p))) / log(2);
        cout << "Probability=" << p << " Capacity=" << it << endl;
    }
}


Stdin:
The input is Empty
Stdout:

Executing the program…

Probability =0.5 Capacity=1


Probability =0.5 Capacity=1

RESULT:
Thus the program was executed and verified successfully.


BINARY SYMMETRIC CHANNEL (BSC) CAPACITY

AIM:
The channel capacity can be computed from the binary entropy function as follows:
C = 1 − H(p)
Write a program for a BSC with p = 0.1 and find H(p) = 0.468 ≈ 0.47 and the capacity
C = 0.531 ≈ 0.53.

ALGORITHM:
1. The binary entropy function (used for the channel capacity) can be computed as
H2(p) = H(p, 1 − p)
2. Then we can expand it as
H2(p) = H(p, 1 − p) = p log2(1/p) + (1 − p) log2(1/(1 − p))
3. The channel (transition) matrix of the BSC is

    P = [ 1 − p     p   ]
        [   p     1 − p ]

4. The binary symmetric channel capacity is
C = 1 − H(p)
PROGRAM:

#include <iostream>
#include <cmath>
using namespace std;

int main() {
    float p, Hp, C;
    p = 0.1;
    // 1st: if 1 then 0; 2nd: if 0 then 1
    Hp = p * (log(1 / p)) / log(2) + (1 - p) * (log(1 / (1 - p))) / log(2);
    C = 1 - Hp;
    cout << "Probability of p=" << p << " Binary Entropy Function of Hp=" << Hp << endl;
    cout << "1-Hp -->Capacity=" << C << endl;
}
Stdin:
The input is empty
Stdout:

Executing the program…


Probability of p=0.1 Binary Entropy Function of Hp=0.468996
1-Hp -->Capacity=0.531004

RESULT:
Thus the program was executed and verified successfully.


BINARY SYMMETRIC CHANNEL CAPACITY: PRIVATE CASE

AIM:
Compute the capacity of a BSC in a particular (private) case study, as follows:

Write a program for this BSC case study to compute its capacity.

DESCRIPTION:
Consider the binary symmetric channel again, with f = 0.15 and PX: (p0 = 0.9, p1 = 0.1). We already evaluated the
marginal probabilities P(y) implicitly above: P(y=0) = 0.78, P(y=1) = 0.22. The mutual information is:

I(X;Y) = H(Y) − H(Y|X), where H(Y|X) is the weighted sum over x of H(Y|x); but H(Y|x) is the same for each value
of x: H(Y|x=0) is H2(0.15) and H(Y|x=1) is H2(0.15). So

I(X;Y) = H(Y) − H(Y|X)
       = H2(0.22) − H2(0.15)
       = 0.76 − 0.61 = 0.15 bits.

This may be compared with the entropy of the source, H(X) = H2(0.1) = 0.47 bits. (A supplementary sketch that
derives these quantities directly from f and the input distribution follows.)
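
A supplementary sketch (not part of the original manual) that derives the marginal probabilities and the mutual information directly from f = 0.15 and the input distribution (0.9, 0.1); the helper name H2 is only illustrative:

#include <iostream>
#include <cmath>
using namespace std;

// Binary entropy function H2(p) in bits (valid for 0 < p < 1).
double H2(double p) {
    return p * log2(1.0 / p) + (1.0 - p) * log2(1.0 / (1.0 - p));
}

int main() {
    double f = 0.15;               // crossover (error) probability of the BSC
    double p0 = 0.9, p1 = 0.1;     // input distribution P(x=0), P(x=1)

    // Marginal output probabilities.
    double py0 = p0 * (1 - f) + p1 * f;   // = 0.78
    double py1 = p0 * f + p1 * (1 - f);   // = 0.22

    // I(X;Y) = H(Y) - H(Y|X); H(Y|X) = H2(f) because it is the same for every x.
    double I = H2(py1) - H2(f);

    cout << "P(y=0)=" << py0 << " P(y=1)=" << py1 << endl;
    cout << "I(X;Y)=" << I << " bits" << endl;    // about 0.15 bits
}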

PROGRAM:
#include <iostream>
#include <cmath>
#include <cstdlib>   // for system()
using namespace std;

int main() {
    float py, pyx, Hy, Hyx, C;

    py = 0.22;
    pyx = 0.15;
    // 1st: if 1 then 0; 2nd: if 0 then 1
    Hy = py * (log(1 / py)) / log(2) + (1 - py) * (log(1 / (1 - py))) / log(2);
    Hyx = pyx * (log(1 / pyx)) / log(2) + (1 - pyx) * (log(1 / (1 - pyx))) / log(2);
    C = Hy - Hyx;
    cout << "Probability of PY=" << py << " Probability of PYx=" << pyx << " Capacity=" << C << endl;
    system("pause");
}


Stdin:
The input is Empty

Stdout:
Executing the program
Probability of PY = 0.22 Probability of PYx =0.15 Capacity = 0.150327

RESULT:
Thus the program was executed and verified successfully.


SHANNON-FANO CODING ALGORITHM

AIM:
This is a basic information-theoretic algorithm. A simple example will be used to
illustrate the Shannon-Fano algorithm:

DESCRIPTION:
Symbol   A    B    C    D    E
Count    15   7    6    6    5

Encoding for the Shannon-Fano Algorithm:

A top-down approach:
1. Sort symbols according to their frequencies/probabilities, e.g., ABCDE.
2. Recursively divide into two parts, each with approximately the same number of counts.

Symbol   Count   log(1/p)   Code   Subtotal (# of bits)
A        15      1.38       00     30
B        7       2.48       01     14
C        6       2.70       10     12
D        6       2.70       110    18
E        5       2.96       111    15
                            TOTAL (# of bits): 89
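
A supplementary sketch (not part of the original manual) that checks the efficiency of this example code, using the counts and code lengths from the table above:

#include <iostream>
#include <cmath>
using namespace std;

int main() {
    int counts[]  = {15, 7, 6, 6, 5};    // symbols A, B, C, D, E
    int codeLen[] = {2, 2, 2, 3, 3};     // lengths of the codes 00, 01, 10, 110, 111
    int total = 15 + 7 + 6 + 6 + 5;      // 39 symbols in all

    double entropy = 0, avgLen = 0;
    int bits = 0;
    for (int i = 0; i < 5; i++) {
        double p = (double)counts[i] / total;
        entropy += p * log2(1.0 / p);        // source entropy in bits/symbol
        avgLen  += p * codeLen[i];           // average code length in bits/symbol
        bits    += counts[i] * codeLen[i];   // total encoded bits, should be 89
    }
    cout << "Entropy             ~ " << entropy << " bits/symbol" << endl;  // about 2.19
    cout << "Average code length ~ " << avgLen  << " bits/symbol" << endl;  // about 2.28
    cout << "Total bits = " << bits << endl;                                // 89
}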

PROGRAM:
#include <stdio.h>
#include <conio.h>
#include <string.h>

struct node
{
    char sym[10];
    float pro;
    int arr[20];
    int top;
} s[20];
typedef struct node node;

void prints(int l, int h, node s[])
{
    int i;
    for (i = l; i <= h; i++)
    {
        printf("\n%s\t%f", s[i].sym, s[i].pro);
    }
}

void shannon(int l, int h, node s[])
{
    float pack1 = 0, pack2 = 0, diff1 = 0, diff2 = 0;
    int i, d, k, j;
    if ((l + 1) == h || l == h || l > h)
    {
        if (l == h || l > h)
            return;

        s[h].arr[++(s[h].top)] = 0;
        s[l].arr[++(s[l].top)] = 1;
        return;
    }
    else
    {
        for (i = l; i <= h - 1; i++)
            pack1 = pack1 + s[i].pro;
        pack2 = pack2 + s[h].pro;
        diff1 = pack1 - pack2;
        if (diff1 < 0)
            diff1 = diff1 * -1;
        j = 2;
        while (j != h - l + 1)
        {
            k = h - j;
            pack1 = pack2 = 0;
            for (i = l; i <= k; i++)
                pack1 = pack1 + s[i].pro;
            for (i = h; i > k; i--)
                pack2 = pack2 + s[i].pro;
            diff2 = pack1 - pack2;
            if (diff2 < 0)
                diff2 = diff2 * -1;
            if (diff2 >= diff1)
                break;
            diff1 = diff2;
            j++;
        }
        k++;
        for (i = l; i <= k; i++)
            s[i].arr[++(s[i].top)] = 1;
        for (i = k + 1; i <= h; i++)
            s[i].arr[++(s[i].top)] = 0;
        shannon(l, k, s);
        shannon(k + 1, h, s);
    }
}

int main()
{
    int n, i, j;
    float x, total = 0;
    char ch[10];
    node temp;
    printf("Enter How Many Symbols Do You Want To Enter\t: ");
    scanf("%d", &n);
    for (i = 0; i < n; i++)
    {
        printf("Enter symbol %d ---> ", i + 1);
        scanf("%s", ch);
        strcpy(s[i].sym, ch);
    }
    for (i = 0; i < n; i++)
    {
        printf("\n\tEnter probability for %s ---> ", s[i].sym);
        scanf("%f", &x);
        s[i].pro = x;
        total = total + s[i].pro;
        if (total > 1)
        {
            printf("\t\tThis probability is not possible.Enter new probability");
            total = total - s[i].pro;
            i--;
        }
    }
    s[i].pro = 1 - total;
    for (j = 1; j <= n - 1; j++)

    {
        for (i = 0; i < n - 1; i++)
        {
            if ((s[i].pro) > (s[i + 1].pro))
            {
                temp.pro = s[i].pro;
                strcpy(temp.sym, s[i].sym);
                s[i].pro = s[i + 1].pro;
                strcpy(s[i].sym, s[i + 1].sym);
                s[i + 1].pro = temp.pro;
                strcpy(s[i + 1].sym, temp.sym);
            }
        }
    }
    for (i = 0; i < n; i++)
        s[i].top = -1;
    shannon(0, n - 1, s);
    printf(" ");
    printf("\n\n\n\tSymbol\tProbability\tCode");
    for (i = n - 1; i >= 0; i--)
    {
        printf("\n\t%s\t%f\t", s[i].sym, s[i].pro);
        for (j = 0; j <= s[i].top; j++)
            printf("%d", s[i].arr[j]);
    }
    printf("\n ");
    getch();
    return 0;
}

Stdin:
Enter How Many Symbols Do You Want To Enter : 6
Enter symbol 1 ---> a
Enter symbol 2 ---> b
Enter symbol 3 ---> c
Enter symbol 4 ---> d
Enter symbol 5 ---> e
Enter symbol 6 ---> f

Enter probability for a ---> 0.3


Enter probability for b ---> 0.25
Enter probability for c ---> 0.20
Enter probability for d ---> 0.12
Enter probability for e ---> 0.08
Enter probability for f ---> 0.05
Stdout:
Symbol Probability Code
a 0.300000 00
b 0.250000 01
c 0.200000 10
d 0.120000 110
e 0.080000 1110
f 0.050000 1111

RESULT:
Thus the program was executed and verified successfully.


THE HUFFMAN CODING ALGORITHM

AIM:
This is a basic information-theoretic algorithm. A simple example of the Huffman coding
algorithm will be programmed in C++:

DESCRIPTION:
The Huffman coding scheme takes each symbol and its weight (or frequency of occurrence), and
generates proper encodings for each symbol taking account of the weights of each symbol, so
that higher-weighted symbols have fewer bits in their encoding. (See the Wikipedia article on
Huffman coding for more information.)
A Huffman encoding can be computed by first creating a tree of nodes:

1. Create a leaf node for each symbol and add it to the priority queue.
2. While there is more than one node in the queue:
   o Remove the node of highest priority (lowest probability) twice to get two nodes.
   o Create a new internal node with these two nodes as children and with probability equal to
     the sum of the two nodes' probabilities.
   o Add the new node to the queue.
3. The remaining node is the root node and the tree is complete.

Traverse the constructed binary tree from root to leaves, assigning and accumulating a '0' for one
branch and a '1' for the other at each node. The accumulated zeros and ones at each leaf
constitute a Huffman encoding for those symbols and weights. (A short decoding sketch,
illustrating the prefix property of such codes, is given below.)
Using the characters and their frequencies from the string "this is an example for huffman encoding", create a
program to generate a Huffman encoding for each character as a table.
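
A supplementary illustration (not part of the original manual): because no codeword is a prefix of another, a bit stream can be decoded without separators. The code table below is a hypothetical example, not the exact codes printed by the program that follows.

#include <iostream>
#include <map>
#include <string>
using namespace std;

int main() {
    // Hypothetical prefix-free code table (for illustration only).
    map<string, char> code;
    code["0"]   = 'a';
    code["10"]  = 'b';
    code["110"] = 'c';
    code["111"] = 'd';

    string bits = "0101101110";   // "abcda" encoded with no separators: 0 10 110 111 0
    string decoded, buffer;

    // Greedy decoding: extend the buffer one bit at a time and emit a symbol
    // as soon as the buffer matches a codeword (possible because the code is prefix-free).
    for (size_t i = 0; i < bits.size(); i++) {
        buffer += bits[i];
        map<string, char>::iterator it = code.find(buffer);
        if (it != code.end()) {
            decoded += it->second;
            buffer.clear();
        }
    }
    cout << decoded << endl;   // prints "abcda"
}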

PROGRAM:
#include <iostream>
#include <queue>
#include <map>
#include <vector>     // needed for std::vector (HuffCode and the priority queue container)
#include <climits>    // for CHAR_BIT
#include <iterator>
#include <algorithm>

const int UniqueSymbols = 1 << CHAR_BIT;
const char* SampleString = "this is an example for huffman encoding";

typedef std::vector<bool> HuffCode;
typedef std::map<char, HuffCode> HuffCodeMap;

class INode
{
public:
    const int f;

    virtual ~INode() {}

protected:
    INode(int f) : f(f) {}
};

class InternalNode : public INode
{
public:
    INode *const left;
    INode *const right;

    InternalNode(INode* c0, INode* c1) : INode(c0->f + c1->f), left(c0), right(c1) {}
    ~InternalNode()
    {
        delete left;
        delete right;
    }
};

class LeafNode : public INode
{
public:
    const char c;

    LeafNode(int f, char c) : INode(f), c(c) {}
};

struct NodeCmp
{
    bool operator()(const INode* lhs, const INode* rhs) const { return lhs->f > rhs->f; }
};

INode* BuildTree(const int (&frequencies)[UniqueSymbols])
{
    std::priority_queue<INode*, std::vector<INode*>, NodeCmp> trees;
    for (int i = 0; i < UniqueSymbols; ++i)
    {
        if (frequencies[i] != 0)
            trees.push(new LeafNode(frequencies[i], (char)i));
    }
    while (trees.size() > 1)
    {
        INode* childR = trees.top();
        trees.pop();
        INode* childL = trees.top();
        trees.pop();
        INode* parent = new InternalNode(childR, childL);
        trees.push(parent);
    }
    return trees.top();
}

void GenerateCodes(const INode* node, const HuffCode& prefix, HuffCodeMap& outCodes)
{
    if (const LeafNode* lf = dynamic_cast<const LeafNode*>(node))
    {
        outCodes[lf->c] = prefix;
    }
    else if (const InternalNode* in = dynamic_cast<const InternalNode*>(node))
    {
        HuffCode leftPrefix = prefix;
        leftPrefix.push_back(false);
        GenerateCodes(in->left, leftPrefix, outCodes);
        HuffCode rightPrefix = prefix;
        rightPrefix.push_back(true);

        GenerateCodes(in->right, rightPrefix, outCodes);
    }
}

int main()
{
    // Build frequency table
    int frequencies[UniqueSymbols] = {0};
    const char* ptr = SampleString;
    while (*ptr != '\0')
        ++frequencies[*ptr++];

    INode* root = BuildTree(frequencies);
    HuffCodeMap codes;
    GenerateCodes(root, HuffCode(), codes);
    delete root;

    for (HuffCodeMap::const_iterator it = codes.begin(); it != codes.end(); ++it)
    {
        std::cout << it->first << " ";
        std::copy(it->second.begin(), it->second.end(),
                  std::ostream_iterator<bool>(std::cout));
        std::cout << std::endl;
    }
    return 0;
}
Stdin:

The Input is Empty


Stdout:
(space) 110
a 1001
c 101010
d 10001
e 1111
f 1011
g 101011
h 0101
i 1110
l 01110
m 0011
n 000
o 0010
p 01000
r 01001
s 0110
t 01111
u 10100
x 10000
RESULT:
Thus the program was executed and verified successfully.
