Chapter 5 Greedy Algorithm Final

Greedy Algorithms

1
Introduction
• Suppose you wish to travel from city A to city B.
• Options: walk, take a car, take a train, or take a flight.
Optimization problems
• An optimization problem is one in which you want to find not just a solution, but the best solution.
• A "greedy algorithm" sometimes works well for optimization problems.
• A greedy algorithm works in phases. At each phase:
– You take the best you can get right now, without regard for future consequences.
– You hope that by choosing a local optimum at each step, you will end up at a global optimum.
Example: Counting money
• Suppose you want to count out a certain amount of money, using the fewest possible bills and coins.
• A greedy algorithm for this would be: at each step, take the largest possible bill or coin that does not overshoot.
– Example: To make $6.39, you would choose:
• a $5 bill
• a $1 bill, to make $6
• a 25¢ coin, to make $6.25
• a 10¢ coin, to make $6.35
• four 1¢ coins, to make $6.39
• For US money, the greedy algorithm always gives the optimum solution.
A failure of the greedy algorithm
• Suppose coins come in 1-rupee, 7-rupee, and 10-rupee denominations.
• Using a greedy algorithm to count out 15 rupees, you would get:
– a 10-rupee coin
– five 1-rupee coins, for a total of 15 rupees
– This requires six coins.
• A better solution would be to use two 7-rupee coins and one 1-rupee coin.
– This requires only three coins.
• The greedy algorithm results in a solution, but not in an optimal solution.
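Both behaviours can be seen in a short sketch of the greedy change-making rule, using the amounts from the examples above (US amounts in cents):

```python
def greedy_change(amount, denominations):
    """Greedy change-making: repeatedly take the largest
    denomination that does not overshoot the remaining amount."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            amount -= d
            coins.append(d)
    return coins

# US-style denominations (in cents): greedy is optimal here.
print(greedy_change(639, [500, 100, 25, 10, 5, 1]))
# → [500, 100, 25, 10, 1, 1, 1, 1]

# The {1, 7, 10} system: greedy needs 6 coins for 15,
# although two 7s and one 1 (3 coins) would be better.
print(greedy_change(15, [1, 7, 10]))
# → [10, 1, 1, 1, 1, 1]
```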
• A Generic Greedy Algorithm:
(1) Initialize C to be the set of candidate solutions.
(2) Initialize a set S = ∅ (the set S is the optimal solution we are constructing).
(3) While C ≠ ∅ and S is (still) not a solution do
    (3.1) select x from set C using a greedy strategy
    (3.2) delete x from C
    (3.3) if S ∪ {x} is a feasible solution, then S = S ∪ {x} (i.e., add x to set S)
(4) if S is a solution then return S
(5) else return failure

• In general, a greedy algorithm is efficient because it makes a sequence of (local) decisions and never backtracks. The solution is not always optimal, however.
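The generic algorithm (1)–(5) can be written as a small reusable skeleton; the instantiation below (making 15 from a hypothetical supply of 1-, 7- and 10-unit coins) is only for illustration:

```python
def generic_greedy(candidates, is_solution, is_feasible, select):
    """Generic greedy skeleton following steps (1)-(5):
    repeatedly select the most promising candidate, keep it only
    if the partial solution stays feasible, and never backtrack."""
    C = list(candidates)
    S = []
    while C and not is_solution(S):
        x = select(C)              # (3.1) greedy choice
        C.remove(x)                # (3.2)
        if is_feasible(S + [x]):   # (3.3)
            S.append(x)
    return S if is_solution(S) else None

# Hypothetical instantiation: count out 15 with coins {1, 7, 10}.
target = 15
result = generic_greedy(
    [10, 7, 7, 1, 1, 1, 1, 1],
    is_solution=lambda S: sum(S) == target,
    is_feasible=lambda S: sum(S) <= target,
    select=max)                    # largest coin first
print(result)                      # → [10, 1, 1, 1, 1, 1]
```

Note that it reproduces the non-optimal six-coin answer: the greedy choice never reconsiders the 10.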
Characteristics of Greedy Algorithm:
(Dec 2024, Marks 3)

To find the optimal solution, the greedy algorithm maintains two sets: one set contains the chosen (selected) items, and the second set contains the unselected (rejected) items.
A greedy algorithm basically consists of four functions, as follows:
(i) A function that checks whether the selected set of items provides a solution or not.
(ii) A function that checks whether a set is feasible or not.
(iii) A selection function, which chooses the most promising candidate among all candidates.
(iv) An objective function, which does not appear explicitly but gives the value of a solution.
Activity-Selection Problem

• A conference room is shared among different activities.
– S = {a1, a2, ..., an} is the set of proposed activities.
– Activity ai has a start time si and a finish time fi.
– Activities ai and aj are compatible if either fi ≤ sj or fj ≤ si.
• Problem: find the largest set of mutually compatible activities.
• Is there a greedy solution for this problem?
Activity Selection Problem - Algorithm

GREEDY-ACTIVITY-SELECTOR (s, f)
(assumes the activities are sorted in nondecreasing order of finish time)

1. n ← length[s]
2. A ← {a1}
3. i ← 1
4. for m ← 2 to n
5.    do if sm ≥ fi
6.       then A ← A ∪ {am}
7.            i ← m
8. return A
Example

• S = (A1, A2, A3, A4, A5, A6, A7, A8, A9, A10)
• si = (1, 2, 3, 4, 7, 8, 9, 9, 11, 12)
• fi = (3, 5, 4, 7, 10, 9, 11, 13, 12, 14)
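The pseudocode can be turned into a short runnable sketch. Since the example's finish times are not pre-sorted, the activities are sorted by finish time first; on the instance above it selects activities 1, 3, 4, 6, 7, 9 and 10:

```python
def activity_selection(start, finish):
    """GREEDY-ACTIVITY-SELECTOR: sort activities by finish time,
    then repeatedly pick the next activity whose start time is
    not earlier than the finish time of the last one selected."""
    order = sorted(range(len(start)), key=lambda i: finish[i])
    selected = [order[0]]
    last = order[0]
    for m in order[1:]:
        if start[m] >= finish[last]:   # compatible with last chosen
            selected.append(m)
            last = m
    return [i + 1 for i in selected]   # 1-based activity numbers

s = [1, 2, 3, 4, 7, 8, 9, 9, 11, 12]
f = [3, 5, 4, 7, 10, 9, 11, 13, 12, 14]
print(activity_selection(s, f))        # → [1, 3, 4, 6, 7, 9, 10]
```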
A scheduling problem
• You have to run nine jobs, with running times of 3, 5, 6, 10, 11, 14, 15, 18, and 20 minutes.
• You have three processors on which you can run these jobs.
• You decide to do the longest-running jobs first, on whatever processor is available:

P1: 20, 10, 3
P2: 18, 11, 6
P3: 15, 14, 5

• Time to completion: 18 + 11 + 6 = 35 minutes.
• This solution isn't bad, but we might be able to do better.
Another approach
• What would be the result if you ran the shortest job first?
• Again, the running times are 3, 5, 6, 10, 11, 14, 15, 18, and 20 minutes:

P1: 3, 10, 15
P2: 5, 11, 18
P3: 6, 14, 20

• That wasn't such a good idea; time to completion is now 6 + 14 + 20 = 40 minutes.
• Note, however, that the greedy algorithm itself is fast.
– All we had to do at each stage was pick the minimum or maximum.
An optimum solution
• Better solutions do exist:

P1: 20, 14
P2: 18, 11, 5
P3: 15, 10, 6, 3

• This solution is clearly optimal (why?)
– Each processor finishes in exactly 34 minutes, and the total work is 102 minutes, so no assignment can finish sooner than 102/3 = 34.
• Clearly, there are other optimal solutions (why?)
• How do we find such a solution?
– One way: try all possible assignments of jobs to processors.
– Unfortunately, this approach can take exponential time.
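The longest-job-first and shortest-job-first strategies above can both be expressed with the same greedy loop: always give the next job to the processor that frees up first. A minimal sketch using a heap of processor loads:

```python
import heapq

def schedule(jobs, processors, longest_first=True):
    """Greedy multiprocessor scheduling: assign each job to the
    processor that becomes free earliest. The time to completion
    is the largest total load on any processor (the makespan)."""
    order = sorted(jobs, reverse=longest_first)
    loads = [0] * processors            # min-heap of processor loads
    heapq.heapify(loads)
    for job in order:
        heapq.heappush(loads, heapq.heappop(loads) + job)
    return max(loads)

jobs = [3, 5, 6, 10, 11, 14, 15, 18, 20]
print(schedule(jobs, 3, longest_first=True))    # → 35 (longest first)
print(schedule(jobs, 3, longest_first=False))   # → 40 (shortest first)
```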
Job Scheduling Problem
• Consider n jobs that are to be executed.
• At any time t = 1, 2, ..., exactly one job is executed.
• Associated with job i is an integer deadline di ≥ 0 and a profit Pi > 0.
• For any job i, the profit Pi is earned if and only if the job is completed by its deadline.
Job Scheduling Problem
• To complete a job, one has to process the job on a machine for one unit of time.
• Only one machine is available for processing jobs.
• A feasible solution for this problem is a subset J of jobs such that each job in this subset can be completed by its deadline.
• The value of a feasible solution J is the sum of the profits of the jobs in J, i.e., Σi∈J Pi.
• An optimal solution is a feasible solution with maximum value.
• Assumptions:
– uniprocessor
– no preemption
– each job takes 1 unit of time
Algorithm
1. Arrange all jobs in decreasing order of profit.
2. For each job (m jobs in total), do a linear search to find a free slot in an array of size n,
where n = maximum deadline and
m = total number of jobs.
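A minimal sketch of the two-step algorithm above; the job instance (names, deadlines, profits) is a made-up example, not one from the slides:

```python
def job_sequencing(jobs):
    """Greedy job sequencing: sort jobs by decreasing profit and
    place each one in the latest free slot at or before its
    deadline. jobs: list of (name, deadline, profit) triples."""
    max_deadline = max(d for _, d, _ in jobs)
    slots = [None] * max_deadline            # slots[t] = job run at time t+1
    total = 0
    for name, deadline, profit in sorted(jobs, key=lambda j: -j[2]):
        # Linear search backwards from the deadline for a free slot.
        for t in range(min(deadline, max_deadline) - 1, -1, -1):
            if slots[t] is None:
                slots[t] = name
                total += profit
                break
    return slots, total

# Hypothetical instance: (job, deadline, profit).
jobs = [("J1", 2, 100), ("J2", 1, 19), ("J3", 2, 27),
        ("J4", 1, 25), ("J5", 3, 15)]
print(job_sequencing(jobs))   # → (['J3', 'J1', 'J5'], 142)
```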
Knapsack Problem
• There are n different items in a store
• Item i :
– weighs wi pounds
– worth $vi
• A thief breaks in
• Can carry up to W pounds in his knapsack
• What should he take to maximize the value of
his haul?

30
0-1 vs. Fractional Knapsack
• 0-1 Knapsack Problem:
– the items cannot be divided
– thief must take entire item or leave it behind
• Fractional Knapsack Problem:
– thief can take partial items
– for instance, items are liquids or powders
– solvable with a greedy algorithm…

31
32
Example (Method 1), knapsack capacity m = 100

Object:    1    2    3    4    5
Weight(W): 10   20   30   40   50
Profit(P): 20   30   66   40   60
Pi/Wi:     2    1.5  2.2  1    1.2

Method 1: Select the object with maximum profit (Pi).

Object  Profit(Pi)  Weight     Remaining capacity
-       -           -          100
3       66          30         100-30=70
5       60          50         70-50=20
4       40/2=20     40/2=20    20-20=0
Profit = 146

Solution: Xi = (0, 0, 1, 0.5, 1)
Example (Method 2)

Object:    1    2    3    4    5
Weight(W): 10   20   30   40   50
Profit(P): 20   30   66   40   60
Pi/Wi:     2    1.5  2.2  1    1.2

Method 2: Select the object with minimum weight (Wi).

Object  Profit(Pi)  Weight  Remaining capacity
-       -           -       100
1       20          10      100-10=90
2       30          20      90-20=70
3       66          30      70-30=40
4       40          40      40-40=0
Profit = 156

Solution: Xi = (1, 1, 1, 1, 0)
Example (Method 3)

Object:    1    2    3    4    5
Weight(W): 10   20   30   40   50
Profit(P): 20   30   66   40   60
Pi/Wi:     2    1.5  2.2  1    1.2

Method 3: Select the object with maximum ratio Pi/Wi.

Object  Profit(Pi)   Weight       Remaining capacity
-       -            -            100
3       66           30           100-30=70
1       20           10           70-10=60
2       30           20           60-20=40
5       60*0.8=48    50*0.8=40    40-40=0
Profit = 164

(Object 5 is taken fractionally: only 40 of its 50 weight units fit, so x5 = 40/50 = 0.8.)

Solution: Xi = (1, 1, 1, 0, 0.8)
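Method 3 (maximum Pi/Wi) is the strategy that is optimal for the fractional knapsack problem. A minimal sketch, run on the example above:

```python
def fractional_knapsack(weights, profits, capacity):
    """Greedy fractional knapsack: take items in decreasing
    profit/weight order, splitting the last item if needed.
    Returns (total profit, fraction x_i taken of each item)."""
    n = len(weights)
    x = [0.0] * n
    total = 0.0
    # Indices sorted by profit per unit weight, highest first.
    for i in sorted(range(n),
                    key=lambda i: profits[i] / weights[i],
                    reverse=True):
        if capacity <= 0:
            break
        take = min(weights[i], capacity)   # whole item, or what fits
        x[i] = take / weights[i]
        total += profits[i] * x[i]
        capacity -= take
    return total, x

# Example from the slides: m = 100.
w = [10, 20, 30, 40, 50]
p = [20, 30, 66, 40, 60]
print(fractional_knapsack(w, p, 100))
```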
May 2024, Marks 7
• Solve the following Knapsack Problem using the greedy method.
Number of items = 5, knapsack capacity W = 100,
weight vector = {50, 40, 30, 20, 10} and
profit vector = {1, 2, 3, 4, 5}.
Solve the following Knapsack Problem using
greedy method.(Oct-2020)

Number of items = 7,
Knapsack capacity W = 15,
weight vector = {2, 3, 5, 7, 1, 4, 1}
and profit vector = {10, 5, 15, 7, 6, 18, 3}

37
MINIMUM SPANNING TREE

38
Tree
• A tree is an undirected graph that is connected and acyclic.
• Much of what makes trees so useful is the simplicity of their structure. For instance:
1. Property 1: Removing a cycle edge cannot disconnect a graph.
2. Property 2: A tree on n nodes has n - 1 edges.
3. Property 3: Any connected, undirected graph G = (V, E) with |E| = |V| - 1 is a tree.
4. Property 4: An undirected graph is a tree if and only if there is a unique path between any pair of nodes.
MST
• A minimum spanning tree (MST) or minimum
weight spanning tree is a subset of the edges
of a connected, edge-weighted undirected
graph that connects all the vertices together,
without any cycles and with the minimum
possible total edge weight.
• That is, it is a spanning tree whose sum of
edge weights is as small as possible

40
Applications:
• Minimum spanning trees have direct
applications in the design of networks,
including:
- Computer networks
- Telecommunications networks
- Transportation networks
- Water supply networks
- Electrical grids
41
Algorithms
• There are two classic greedy algorithms for finding an MST:
1. Prim's Algorithm
2. Kruskal's Algorithm
Prim’s Algorithm

43
Prim’s Example

44
MST

Total Cost=4+8+1+2+4+2+7+9=37
45
May 2025 Marks -7

46
Prim’s Algorithm – Example

47
Walk-Through
Consider an undirected, weighted graph on vertices A–H. For each vertex v, the table keeps three fields: K (known, i.e., already selected), dv (the weight of the cheapest known edge connecting v to the tree) and pv (the other endpoint of that edge).

Initialize the table: every vertex unknown, dv = ∞, pv = –.

Start with any node, say D: mark D known, with dv = 0.

Update the distances of D's adjacent, unselected nodes:
C: dv = 3, pv = D; E: dv = 25, pv = D; F: dv = 18, pv = D; G: dv = 2, pv = D.

Select the node with minimum distance: G (dv = 2).
Update G's adjacent, unselected nodes: E improves to dv = 7, pv = G; H gets dv = 3, pv = G.

Select the node with minimum distance: C (dv = 3).
Update: B gets dv = 4, pv = C; F improves to dv = 3, pv = C.

Select the node with minimum distance: F (dv = 3).
Update: A gets dv = 10, pv = F; E improves to dv = 2, pv = F.

Select the node with minimum distance: E (dv = 2). Table entries unchanged.

Select the node with minimum distance: H (dv = 3).
Update: A improves to dv = 4, pv = H.

Select the node with minimum distance: A (dv = 4). Table entries unchanged.

Select the last remaining node: B (dv = 4). Done.

Final table:

Vertex  K  dv  pv
A       T  4   H
B       T  4   C
C       T  3   D
D       T  0   –
E       T  2   F
F       T  3   C
G       T  2   D
H       T  3   G

Cost of Minimum Spanning Tree = Σ dv = 21
Example: Prim's Algorithm
Graph on vertices A–G. The trace below records, for each vertex: known? (already selected), cost (cheapest edge into the tree so far) and prev (the other endpoint of that edge).

Start at A: mark A known (cost 0, prev –); its neighbours get B: cost 2 (prev A), C: cost 2 (A), D: cost 1 (A).

Select D (cost 1): C improves from 2 to 1 (prev D); E gets cost 1 (D), F gets cost 6 (D), G gets cost 5 (D).

Select C (cost 1): F improves from 6 to 2 (prev C).

Select E (cost 1): B improves from 2 to 1 (prev E); G improves from 5 to 3 (prev E).

Select B (cost 1), then F (cost 2), then G (cost 3).

Final table:

vertex  known?  cost  prev
A       Y       0     -
B       Y       1     E
C       Y       1     D
D       Y       1     A
E       Y       1     D
F       Y       2     C
G       Y       3     E

Output: (A, D) (C, F) (B, E) (D, E) (C, D) (E, G)
Total Cost: 9
Analysis: Prim's Algorithm
Correctness
• Intuitively similar to Dijkstra's algorithm

Run-time
• Same as Dijkstra's algorithm
• O(|E| log |V|) using a priority queue

74
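The walk-throughs above can be reproduced with a priority-queue sketch of Prim's algorithm. The edge list below is reconstructed from the first walk-through's table updates (a couple of weights that never affect the tree, such as A–B and B–H, are assumptions read off the figure):

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm with a priority queue, O(|E| log |V|):
    grow the tree one vertex at a time, always taking the
    cheapest edge that reaches a new vertex.
    graph: {vertex: [(weight, neighbour), ...]} adjacency list."""
    visited = {start}
    heap = list(graph[start])
    heapq.heapify(heap)
    mst, cost = [], 0
    while heap and len(visited) < len(graph):
        w, v = heapq.heappop(heap)
        if v in visited:
            continue          # stale entry: this edge would form a cycle
        visited.add(v)
        mst.append((w, v))
        cost += w
        for edge in graph[v]:
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return cost, mst

# Edges of the 8-vertex walk-through graph (undirected).
edges = [("C","D",3), ("D","E",25), ("D","F",18), ("D","G",2),
         ("E","G",7), ("G","H",3), ("B","C",4), ("C","F",3),
         ("A","F",10), ("E","F",2), ("A","H",4), ("A","B",8),
         ("B","H",9)]
graph = {}
for u, v, w in edges:
    graph.setdefault(u, []).append((w, v))
    graph.setdefault(v, []).append((w, u))
print(prim_mst(graph, "D")[0])   # → 21
```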
Kruskal's Algorithm

• Works with edges, rather than nodes.
• Two steps:
– Sort edges by increasing edge weight.
– Select the first |V| – 1 edges that do not generate a cycle.

75
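These two steps can be implemented directly, with a union-find structure supplying the cycle test. The edge list matches the walk-through that follows:

```python
def kruskal_mst(vertices, edges):
    """Kruskal's algorithm: sort edges by weight and accept each
    edge that joins two different components (union-find detects
    cycles); stop after |V| - 1 acceptances."""
    parent = {v: v for v in vertices}

    def find(v):                      # find with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst, cost = [], 0
    for u, v, w in sorted(edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:                  # no cycle: accept the edge
            parent[ru] = rv
            mst.append((u, v, w))
            cost += w
            if len(mst) == len(vertices) - 1:
                break
    return cost, mst

# The graph from the walk-through.
edges = [("D","E",1), ("D","G",2), ("E","G",3), ("C","D",3),
         ("G","H",3), ("C","F",3), ("B","C",4), ("B","E",4),
         ("B","F",4), ("B","H",4), ("A","H",5), ("D","F",6),
         ("A","B",8), ("A","F",10)]
print(kruskal_mst("ABCDEFGH", edges)[0])   # → 21
```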
Walk-Through
Consider an undirected, weighted graph on vertices A–H.

Sort the edges by increasing edge weight:

edge   dv | edge   dv
(D,E)  1  | (B,E)  4
(D,G)  2  | (B,F)  4
(E,G)  3  | (B,H)  4
(C,D)  3  | (A,H)  5
(G,H)  3  | (D,F)  6
(C,F)  3  | (A,B)  8
(B,C)  4  | (A,F)  10

Select the first |V| – 1 = 7 edges that do not generate a cycle, in sorted order:

(D,E) 1 — accepted
(D,G) 2 — accepted
(E,G) 3 — rejected: accepting edge (E,G) would create a cycle
(C,D) 3 — accepted
(G,H) 3 — accepted
(C,F) 3 — accepted
(B,C) 4 — accepted
(B,E) 4 — rejected (would create a cycle)
(B,F) 4 — rejected (would create a cycle)
(B,H) 4 — rejected (would create a cycle)
(A,H) 5 — accepted

Seven edges have now been accepted, so the remaining edges (D,F) 6, (A,B) 8 and (A,F) 10 are not considered.

Done
Total Cost = Σ dv = 21
Huffman Codes
• For compressing data (a sequence of characters)
• Widely used
• Very efficient (saving 20–90%)
• Uses a table of the frequencies of occurrence of the characters
• Outputs a binary string

For example, a text such as "Today's weather is nice" is transmitted as a string of bits such as "001 0110 0 0 100 1000 1110".
Huffman codes:

• In telecommunication, how do we represent a set of messages, each with an access frequency, by a sequence of 0's and 1's?
• To minimize the transmission and decoding costs, we may use short strings to represent more frequently used messages.
• Given a string X, efficiently encode X into a smaller string Y.
– Saves memory and/or bandwidth.
• A good approach: Huffman encoding.
– Compute the frequency f(c) of each character c.
– Encode high-frequency characters with short code words.
– No code word is a prefix of another code word.
– Use an optimal encoding tree to determine the code words.
Encoding Tree Example
• A code is a mapping of each character of an alphabet to a binary code-word.
• A prefix code is a binary code such that no code-word is the prefix of another code-word.
• An encoding tree represents a prefix code:
– Each external node stores a character.
– The code word of a character is given by the path from the root to the external node storing the character (0 for a left child and 1 for a right child).
• Example: for the alphabet {a, b, c, d, e}, one such tree yields the prefix code
a = 00, b = 010, c = 011, d = 10, e = 11
Find an optimal Huffman code for the following set of frequencies: A: 50, B: 20, C: 15, D: 30.
94
Huffman encoding
• The Huffman encoding algorithm is a greedy algorithm: at each step, you pick the two smallest frequencies and combine them.
• Example with frequencies A = 0.22, B = 0.12, C = 0.24, D = 0.09, E = 0.27, F = 0.06. The resulting code words are
A = 00, C = 01, E = 11, B = 100, D = 1010, F = 1011
• Average bits/char:
0.22×2 + 0.12×3 + 0.24×2 + 0.06×4 + 0.27×2 + 0.09×4 = 2.42
• The Huffman algorithm finds an optimal solution.
Huffman Codes
A file of 100,000 characters. The coding schemes can be represented by trees:

Character  Frequency (in thousands)  Fixed-length codeword  Variable-length codeword
'a'        45                        000                    0
'b'        13                        001                    101
'c'        12                        010                    100
'd'        16                        011                    111
'e'        9                         100                    1101
'f'        5                         101                    1100

The fixed-length code corresponds to a tree that is not a full binary tree; the variable-length code corresponds to a full binary tree, in which every nonleaf node has 2 children.
Example

• Symbols: A, B, C, D, E, F, G
  freq.:   2, 3, 5, 8, 13, 15, 18

• Huffman codes:
A: 10100  B: 10101  C: 1011
D: 100    E: 00     F: 01
G: 11

(These code words correspond to a Huffman code tree.)
Huffman Algorithm

98
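The merge-the-two-smallest procedure can be sketched as follows; the example reuses the 100,000-character table above (frequencies in thousands). A counter serves only as a tie-breaker so heap entries stay comparable:

```python
import heapq
from itertools import count

def huffman_codes(freq):
    """Huffman's greedy algorithm: repeatedly merge the two
    lowest-frequency trees until one tree remains, then read the
    code words off the root-to-leaf paths (0 = left, 1 = right)."""
    tie = count()
    heap = [(f, next(tie), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (left, right)))

    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):      # internal node
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                            # leaf: a symbol
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

freq = {"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}
codes = huffman_codes(freq)
print(codes)
print(sum(freq[s] * len(codes[s]) for s in freq))  # total bits (thousands)
```

On this input it reproduces the variable-length code from the table (a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100), for a total of 224,000 bits versus 300,000 with the fixed-length code.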
Single-Source Shortest Path Problem
• Problem: find the shortest paths from a vertex v to all other vertices in the graph.

• Dijkstra's algorithm: a solution to the single-source shortest path problem in graph theory.

• Works on both directed and undirected graphs. However, all edges must have nonnegative weights.

• Input: weighted graph G = (V, E) and source vertex v ∈ V, such that all edge weights are nonnegative.

• Output: lengths of the shortest paths (or the shortest paths themselves) from the given source vertex v ∈ V to all other vertices.
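A standard priority-queue sketch of Dijkstra's algorithm; the small digraph at the bottom is a hypothetical example, not one from the slides:

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's algorithm: repeatedly settle the unvisited
    vertex with the smallest tentative distance, then relax its
    outgoing edges. Requires nonnegative edge weights.
    graph: {u: [(v, weight), ...]} adjacency list."""
    dist = {source: 0}
    heap = [(0, source)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue                    # stale heap entry
        done.add(u)
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w         # relax edge (u, v)
                heapq.heappush(heap, (d + w, v))
    return dist

# Hypothetical weighted digraph.
graph = {"s": [("a", 2), ("b", 7)],
         "a": [("b", 3), ("c", 8)],
         "b": [("c", 1)],
         "c": []}
print(dijkstra(graph, "s"))   # → {'s': 0, 'a': 2, 'b': 5, 'c': 6}
```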