Chapter 5: Greedy Algorithms
Introduction
• Suppose you wish to travel from city A to city B.
• Options: walk, car, train, or flight.
Optimization problems
• An optimization problem is one in which you want to find not just a solution, but the best solution.
• A “greedy algorithm” sometimes works well for optimization problems.
• A greedy algorithm works in phases. At each phase:
– You take the best you can get right now, without regard for future consequences.
– You hope that by choosing a local optimum at each step, you will end up at a global optimum.
Example: Counting money
• Suppose you want to count out a certain amount of money, using the fewest possible bills and coins.
• A greedy algorithm for this problem: at each step, take the largest possible bill or coin that does not overshoot the target.
– Example: To make $6.39, you would choose:
• a $5 bill
• a $1 bill, to make $6
• a 25¢ coin, to make $6.25
• a 10¢ coin, to make $6.35
• four 1¢ coins, to make $6.39
• For US money, the greedy algorithm always gives the optimum solution.
A failure of the greedy algorithm
• Suppose rupees came in 1-rupee, 7-rupee, and 10-rupee coins.
• Using a greedy algorithm to count out 15 rupees, you would get:
– a 10-rupee coin
– five 1-rupee coins, for a total of 15 rupees
– this requires six coins
• A better solution would be to use two 7-rupee coins and one 1-rupee coin.
– This only requires three coins.
• The greedy algorithm results in a solution, but not in an optimal solution.
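Both examples can be reproduced with a small Python sketch of the make-change greedy rule (a minimal illustration; the function name is mine, and all amounts are in the smallest unit):

```python
def greedy_change(amount, denominations):
    """Repeatedly take the largest denomination that does not overshoot."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            amount -= d
            coins.append(d)
    return coins

# US denominations (in cents): greedy is optimal.
us = [10000, 5000, 2000, 1000, 500, 100, 25, 10, 5, 1]
print(greedy_change(639, us))        # [500, 100, 25, 10, 1, 1, 1, 1]

# 1-, 7-, 10-rupee coins: greedy needs six coins for 15,
# although two 7s and one 1 would need only three.
print(greedy_change(15, [1, 7, 10])) # [10, 1, 1, 1, 1, 1]
```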
• A Generic Greedy Algorithm:
(1) Initialize C to be the set of candidates.
(2) Initialize S = ∅ (S is the optimal solution we are constructing).
(3) While C ≠ ∅ and S is (still) not a solution do
(3.1) select x from C using a greedy strategy
(3.2) delete x from C
(3.3) if S ∪ {x} is a feasible solution, then
S = S ∪ {x} (i.e., add x to S)
(4) if S is a solution, then return S
(5) else return failure
To construct an optimal solution, the greedy algorithm maintains two sets: one of chosen (selected) items, and one of rejected (unselected) items.
A greedy algorithm relies on four functions:
(i) A solution function, which checks whether a selected set of items constitutes a solution.
(ii) A feasibility function, which checks whether a set is feasible (can still be extended to a solution).
(iii) A selection function, which chooses the most promising candidate among those remaining.
(iv) An objective function, which does not appear explicitly but gives the value of a solution.
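The generic algorithm above can be turned into a runnable sketch; the callback names (`select`, `is_feasible`, `is_solution`) are my own, chosen to mirror the functions just listed, and the instantiation is a toy example:

```python
def greedy(candidates, select, is_feasible, is_solution):
    """Generic greedy driver following steps (1)-(5) above."""
    C = set(candidates)                      # (1) candidate set
    S = set()                                # (2) solution under construction
    while C and not is_solution(S):          # (3)
        x = select(C)                        # (3.1) greedy choice
        C.remove(x)                          # (3.2)
        if is_feasible(S | {x}):             # (3.3) keep x only if feasible
            S = S | {x}
    return S if is_solution(S) else None     # (4) / (5)

# Toy instantiation: pick distinct numbers summing to exactly 11,
# always grabbing the largest candidate first.
target = 11
result = greedy(
    {1, 2, 3, 5, 8},
    select=max,
    is_feasible=lambda s: sum(s) <= target,
    is_solution=lambda s: sum(s) == target,
)
print(sorted(result))  # [3, 8]
```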
Activity-Selection Problem
• Activities: S = (A1, A2, A3, A4, A5, A6, A7, A8, A9, A10)
• Start times: si = (1, 2, 3, 4, 7, 8, 9, 9, 11, 12)
• Finish times: fi = (3, 5, 4, 7, 10, 9, 11, 13, 12, 14)
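The standard greedy solution (not spelled out on the slide) sorts activities by finish time and repeatedly takes the first activity compatible with the last one selected. Applied to the data above:

```python
def activity_selection(starts, finishes):
    """Greedy: sort by finish time; keep each activity that starts
    no earlier than the previously selected activity finished."""
    order = sorted(range(len(starts)), key=lambda i: finishes[i])
    chosen, last_finish = [], float("-inf")
    for i in order:
        if starts[i] >= last_finish:
            chosen.append(i + 1)        # 1-based activity number
            last_finish = finishes[i]
    return chosen

s = (1, 2, 3, 4, 7, 8, 9, 9, 11, 12)
f = (3, 5, 4, 7, 10, 9, 11, 13, 12, 14)
print(activity_selection(s, f))  # [1, 3, 4, 6, 7, 9, 10]
```

Seven of the ten activities (A1, A3, A4, A6, A7, A9, A10) are mutually compatible.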
A scheduling problem
• You have to run nine jobs, with running times of 3, 5, 6, 10, 11, 14, 15, 18, and 20 minutes.
• You have three processors on which you can run these jobs.
• You decide to do the longest-running jobs first, on whatever processor is available:
– P1: 20, 10, 3 (finishes at time 33)
– P2: 18, 11, 6 (finishes at time 35)
– P3: 15, 14, 5 (finishes at time 34)
• Doing the shortest-running jobs first gives a worse schedule:
– P1: 3, 10, 15 (finishes at 28)
– P2: 5, 11, 18 (finishes at 34)
– P3: 6, 14, 20 (finishes at 40)
• A better schedule finishes every processor by time 34:
– P1: 20, 14
– P2: 18, 11, 5
– P3: 15, 10, 6, 3
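The longest-job-first heuristic can be simulated with a min-heap of processor loads (a sketch, not code from the slides; it reproduces the first schedule above):

```python
import heapq

def longest_job_first(jobs, num_procs):
    """Assign jobs in decreasing length to the least-loaded processor;
    return the per-processor assignments and the makespan."""
    loads = [(0, p) for p in range(num_procs)]   # (current load, processor id)
    heapq.heapify(loads)
    schedule = [[] for _ in range(num_procs)]
    for job in sorted(jobs, reverse=True):
        load, p = heapq.heappop(loads)           # least-loaded processor
        schedule[p].append(job)
        heapq.heappush(loads, (load + job, p))
    return schedule, max(load for load, _ in loads)

jobs = [3, 5, 6, 10, 11, 14, 15, 18, 20]
schedule, makespan = longest_job_first(jobs, 3)
print(schedule, makespan)  # makespan 35, matching the greedy schedule above
```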
Job Scheduling Problem
• Consider n jobs that are to be executed.
• At any time t = 1, 2, …, exactly one job is executed.
• Associated with job i is an integer deadline di ≥ 0 and a profit Pi > 0.
• For any job i, the profit Pi is earned iff the job is completed by its deadline.
• To complete a job, one has to process it on a machine for one unit of time.
• Only one machine is available for processing jobs.
• A feasible solution is a subset J of jobs such that each job in J can be completed by its deadline.
• The value of a feasible solution J is the sum of the profits of the jobs in J, i.e., Σ_{i∈J} Pi.
• An optimal solution is a feasible solution with maximum value.
• Assumptions:
– uniprocessor
– no preemption
– each job takes 1 unit of time
Algorithm
1. Arrange all m jobs in decreasing order of profit.
2. For each job, do a linear search to find the latest free slot at or before its deadline, in an array of size n,
where n = maximum deadline and m = total number of jobs.
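The two steps above, sketched in Python. The worked example on the original slides was an image, so the sample jobs here are hypothetical:

```python
def job_sequencing(jobs):
    """jobs: list of (name, deadline, profit).
    Sort by decreasing profit; place each job in the latest
    free slot at or before its deadline (linear search)."""
    n = max(d for _, d, _ in jobs)       # array size = maximum deadline
    slots = [None] * (n + 1)             # slots[1..n]; slot t = time (t-1, t]
    total = 0
    for name, deadline, profit in sorted(jobs, key=lambda j: -j[2]):
        for t in range(min(deadline, n), 0, -1):
            if slots[t] is None:         # latest free slot before the deadline
                slots[t] = name
                total += profit
                break                    # job placed (or rejected if none free)
    return slots[1:], total

# Hypothetical instance: (name, deadline, profit)
jobs = [("a", 2, 100), ("b", 1, 19), ("c", 2, 27), ("d", 1, 25), ("e", 3, 15)]
print(job_sequencing(jobs))  # (['c', 'a', 'e'], 142)
```

Jobs d and b are rejected: by the time they are considered, every slot at or before their deadline is taken.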
Knapsack Problem
• There are n different items in a store.
• Item i:
– weighs wi pounds
– is worth $vi
• A thief breaks in and can carry up to W pounds in his knapsack.
• What should he take to maximize the value of his haul?
0-1 vs. Fractional Knapsack
• 0-1 Knapsack Problem:
– the items cannot be divided
– the thief must take an entire item or leave it behind
• Fractional Knapsack Problem:
– the thief can take partial items
– for instance, items are liquids or powders
– solvable with a greedy algorithm…
Example (capacity m = 100)

Object       1    2    3    4    5
Weight (wi)  10   20   30   40   50
Profit (Pi)  20   30   66   40   60
Pi/wi        2    1.5  2.2  1    1.2

• Method 1 (greatest profit first): take objects 3 and 5, then half of object 4.
Solution: x = (0, 0, 1, 0.5, 1), profit = 66 + 60 + 20 = 146.
• Method 2 (smallest weight first): take objects 1, 2, 3, and 4.
Solution: x = (1, 1, 1, 1, 0), profit = 20 + 30 + 66 + 40 = 156.
• Method 3 (greatest Pi/wi ratio first): take objects 3, 1, and 2 (weight 60); the remaining capacity is 100 − 60 = 40, so take the fraction 40/50 = 0.8 of object 5.
Solution: x = (1, 1, 1, 0, 0.8), profit = 20 + 30 + 66 + 0.8·60 = 164.
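Method 3 (greatest profit/weight ratio first) is the strategy that is optimal for the fractional knapsack; a sketch using the table above:

```python
def fractional_knapsack(weights, profits, capacity):
    """Greedy by decreasing profit/weight ratio; only the last
    object taken may be fractional."""
    n = len(weights)
    x = [0.0] * n
    total = 0.0
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    for i in order:
        if capacity <= 0:
            break
        take = min(weights[i], capacity)   # whole object, or what still fits
        x[i] = take / weights[i]
        total += profits[i] * x[i]
        capacity -= take
    return x, total

w = [10, 20, 30, 40, 50]
p = [20, 30, 66, 40, 60]
x, total = fractional_knapsack(w, p, 100)
print(x, total)  # [1.0, 1.0, 1.0, 0.0, 0.8] 164.0
```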
May 2024, Marks: 7
• Solve the following Knapsack Problem using the greedy method: number of items = 5, knapsack capacity W = 100.
Oct 2020: Solve the following Knapsack Problem using the greedy method.
Number of items = 7, knapsack capacity W = 15,
weight vector = {2, 3, 5, 7, 1, 4, 1},
profit vector = {10, 5, 15, 7, 6, 18, 3}.
MINIMUM SPANNING TREE
Tree
• A tree is an undirected graph that is connected and acyclic.
• Much of what makes trees so useful is the simplicity of their structure. For instance:
1. Property 1: Removing a cycle edge cannot disconnect a graph.
2. Property 2: A tree on n nodes has n − 1 edges.
3. Property 3: Any connected, undirected graph G = (V, E) with |E| = |V| − 1 is a tree.
4. Property 4: An undirected graph is a tree if and only if there is a unique path between any pair of nodes.
Applications:
• Minimum spanning trees have direct applications in the design of networks, including:
– computer networks
– telecommunications networks
– transportation networks
– water supply networks
– electrical grids
Algorithms
• Two classic greedy algorithms find an MST:
1. Prim’s Algorithm
2. Kruskal’s Algorithm
Prim’s Algorithm

Prim’s Example
[Graph figure not recoverable.] MST Total Cost = 4 + 8 + 1 + 2 + 4 + 2 + 7 + 9 = 37
May 2025, Marks: 7

Prim’s Algorithm – Example
Walk-Through

[The original slides trace Prim’s algorithm step by step on an 8-vertex graph A–H; the graph figure is not recoverable, but the distance/predecessor table is.]

Initialize the array: for every vertex v, known (K) = F, dv = ∞, pv = –.

1. Start with any node, say D: mark D known, dD = 0.
2. Update the distances of adjacent, unselected nodes: C = 3, E = 25, F = 18, G = 2 (all via D).
3. Select the unselected node with minimum distance: G (2). Update: E = 7, H = 3 (via G).
4. Select C (3, via D). Update: B = 4, F = 3 (via C).
5. Select F (3, via C). Update: A = 10, E = 2 (via F).
6. Select E (2, via F). Table entries unchanged.
7. Select H (3, via G). Update: A = 4 (via H).
8. Select A (4, via H). Table entries unchanged.
9. Select B (4, via C). Done.

Final table:

v   K   dv  pv
A   T   4   H
B   T   4   C
C   T   3   D
D   T   0   –
E   T   2   F
F   T   3   C
G   T   2   D
H   T   3   G

Cost of Minimum Spanning Tree = Σ dv = 4 + 4 + 3 + 0 + 2 + 3 + 2 + 3 = 21
Example: Prim's Algorithm

[The graph figure for this 7-vertex example is not recoverable; the edges that can be read from the trace are: (A,B) 2, (A,C) 2, (A,D) 1, (B,E) 1, (C,D) 1, (C,F) 2, (D,E) 1, (D,F) 6, (D,G) 5, (E,G) 3.]

Starting from A, the algorithm selects A (cost 0), then D (1, via A), C (1, via D), E (1, via D), B (1, via E), F (2, via C), and G (3, via E). Final table (“x → y” records an entry revised from x to y during the run):

vertex  known?  cost   prev
A       Y       0      –
B       Y       2 → 1  A → E
C       Y       2 → 1  A → D
D       Y       1      A
E       Y       1      D
F       Y       6 → 2  D → C
G       Y       5 → 3  D → E

Output: (A, D), (B, E), (C, D), (C, F), (D, E), (E, G)
Total Cost: 9
Analysis: Prim's Algorithm
Correctness
• Intuitively similar to Dijkstra's algorithm.
Run-time
• Same as Dijkstra's algorithm: O(|E| log |V|) using a priority queue.
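A heap-based sketch of Prim's algorithm. The edge list below is the one recoverable from the A–G trace above, so treat it as an assumption:

```python
import heapq

def prim(adj, start):
    """adj: {vertex: [(weight, neighbor), ...]} for an undirected graph.
    Returns (total MST cost, list of chosen edges)."""
    known = {start}
    total, mst = 0, []
    heap = [(w, start, v) for w, v in adj[start]]
    heapq.heapify(heap)
    while heap and len(known) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in known:                      # stale entry: v already selected
            continue
        known.add(v)
        total += w
        mst.append((u, v))
        for w2, nxt in adj[v]:              # relax edges out of the new vertex
            if nxt not in known:
                heapq.heappush(heap, (w2, v, nxt))
    return total, mst

# Edges as recoverable from the A-G walk-through (an assumption):
edges = [("A","B",2), ("A","C",2), ("A","D",1), ("B","E",1), ("C","D",1),
         ("C","F",2), ("D","E",1), ("D","F",6), ("D","G",5), ("E","G",3)]
adj = {}
for u, v, w in edges:
    adj.setdefault(u, []).append((w, v))
    adj.setdefault(v, []).append((w, u))

cost, tree = prim(adj, "A")
print(cost)  # 9, matching the walk-through
```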
Kruskal’s Algorithm
Walk-Through
Consider an undirected, weighted graph on vertices A–H. [Graph figure not recoverable; its 14 weighted edges are listed, sorted, below.]
Sort the edges by increasing edge weight:

edge    dv      edge    dv
(D,E)    1      (B,E)    4
(D,G)    2      (B,F)    4
(E,G)    3      (B,H)    4
(C,D)    3      (A,H)    5
(G,H)    3      (D,F)    6
(C,F)    3      (A,B)    8
(B,C)    4      (A,F)   10
Select the first |V| − 1 edges which do not generate a cycle, scanning the sorted list in order:

1. (D,E) 1 — accept
2. (D,G) 2 — accept
3. (E,G) 3 — reject (creates the cycle D–E–G)
4. (C,D) 3 — accept
5. (G,H) 3 — accept
6. (C,F) 3 — accept
7. (B,C) 4 — accept
8. (B,E) 4 — reject (cycle)
9. (B,F) 4 — reject (cycle)
10. (B,H) 4 — reject (cycle)
11. (A,H) 5 — accept; |V| − 1 = 7 edges are now selected, so the remaining edges (D,F), (A,B), (A,F) are not considered.

Done. Total Cost = Σ dv = 1 + 2 + 3 + 3 + 3 + 4 + 5 = 21
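The same walk-through, sketched with a union-find structure to detect cycles (a minimal illustration, not code from the slides):

```python
def kruskal(vertices, edges):
    """edges: list of (weight, u, v). Scan in increasing weight,
    accepting an edge iff its endpoints lie in different components."""
    parent = {v: v for v in vertices}

    def find(v):                      # path-halving find
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    total, mst = 0, []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                  # no cycle: accept and merge components
            parent[ru] = rv
            total += w
            mst.append((u, v))
            if len(mst) == len(vertices) - 1:
                break                 # |V|-1 edges selected; stop early
    return total, mst

edges = [(1,"D","E"), (2,"D","G"), (3,"E","G"), (3,"C","D"), (3,"G","H"),
         (3,"C","F"), (4,"B","C"), (4,"B","E"), (4,"B","F"), (4,"B","H"),
         (5,"A","H"), (6,"D","F"), (8,"A","B"), (10,"A","F")]
total, mst = kruskal("ABCDEFGH", edges)
print(total)  # 21, matching the walk-through
```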
Huffman Codes
• For compressing data (a sequence of characters).
• Widely used, and very efficient (saving 20–90%).
• Uses a table of the frequencies of occurrence of the characters.
• Outputs a binary string.
Encoding Tree Example
• A code is a mapping of each character of an alphabet to a binary code-word.
• A prefix code is a binary code such that no code-word is the prefix of another code-word.
• An encoding tree represents a prefix code:
– Each external node stores a character.
– The code word of a character is given by the path from the root to the external node storing the character (0 for a left child and 1 for a right child).
• Example: the code a = 00, b = 010, c = 011, d = 10, e = 11 is a prefix code; in its encoding tree, a, d, and e are external nodes at depth 2, while b and c are at depth 3.
Find an optimal Huffman code for the following set of frequencies: A: 50, B: 20, C: 15, D: 30.
Huffman encoding
• The Huffman encoding algorithm is a greedy algorithm: you always pick the two smallest frequencies to combine.
• Example with frequencies A = 22, B = 12, C = 24, D = 6, E = 27, F = 9 (percent). [The tree figure is garbled in the source; its internal node weights are 15, 27, 46, 54, with root 100. The slide lists A = 00, B = 100, C = 01, E = 11; its entries for D and F are garbled, but the codeword lengths force D and F to be the two 4-bit codewords, e.g. D = 1010, F = 1011.]
• Average bits/char:
0.22·2 + 0.12·3 + 0.24·2 + 0.06·4 + 0.27·2 + 0.09·4 = 2.42
• The Huffman algorithm finds an optimal solution.
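The combine-the-two-smallest rule can be sketched with a min-heap; applied to the frequencies above it reproduces the 2.42 average (the tie-breaking counter is an implementation detail of mine, needed so heap tuples always compare):

```python
import heapq
from itertools import count

def huffman(freqs):
    """freqs: {symbol: frequency}. Returns {symbol: codeword}."""
    tick = count()                      # tie-breaker for equal frequencies
    heap = [(f, next(tick), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                # combine the two smallest weights
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tick), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):     # internal node: recurse (0=left, 1=right)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                           # leaf: record the accumulated path
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

freqs = {"A": 22, "B": 12, "C": 24, "D": 6, "E": 27, "F": 9}
codes = huffman(freqs)
avg = sum(freqs[s] * len(codes[s]) for s in freqs) / sum(freqs.values())
print(avg)  # 2.42 bits/char
```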
Huffman Codes
• Consider a file of 100,000 characters. The coding schemes can be represented by trees, with the frequencies (in thousands) and either fixed-length or variable-length codewords at the leaves.
• Symbols: A, B, C, D, E, F, G
frequencies: 2, 3, 5, 8, 13, 15, 18
• Huffman codes:
A: 10100  B: 10101  C: 1011
D: 100    E: 00     F: 01
G: 11
Single-Source Shortest Path Problem
• Problem: find the shortest paths from a vertex v to all other vertices in the graph.