
Dynamic Programming

Credits: Many of these slides were originally authored by Jeff Edmonds, York University. Thanks Jeff!
Optimization Problems
• For most optimization problems, the best known algorithm runs in exponential time.
• Some have quick greedy or dynamic programming algorithms.
What is Dynamic Programming?

• Dynamic programming solves optimization problems by combining solutions to subproblems.
• “Programming” refers to a tabular method with a series of choices, not “coding”.
What is Dynamic Programming?

• A set of choices must be made to arrive at an optimal solution.
• As choices are made, subproblems of the same form arise frequently.
• The key is to store the solutions of subproblems so they can be reused in the future.
Example 1

• Fibonacci numbers are defined by:

F(0) = 0
F(1) = 1
F(i) = F(i−1) + F(i−2) for i ≥ 2
Fibonacci Example

Time? Exponential: the obvious recursive algorithm wastes time redoing the same work over and over.
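The slide’s code panel did not survive extraction; as a sketch (an assumption, not the original slide code), a direct MATLAB implementation of the recurrence looks like this:

function f = fib(i)
% Direct implementation of the recurrence. Each call spawns two more
% calls, so the same subinstances are recomputed again and again and
% the running time grows exponentially in i.
if i <= 1
    f = i;                           % F(0) = 0, F(1) = 1
else
    f = fib(i - 1) + fib(i - 2);
end
end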
Memoization
Definition: An algorithmic technique which saves (memoizes) a computed answer
for later reuse, rather than recomputing the answer.

• Memo functions were invented by Professor Donald Michie of Edinburgh University.
• The idea was further developed by Robin Popplestone in his Pop2 language.
• It was later integrated into LISP.
• The same principle is found at the hardware level in computer architectures that use a cache to store recently accessed memory locations.
• [“‘Memo’ functions and machine learning”, Donald Michie, Nature, 218, 19–22, 1968]
Memoization in Optimization

• Remember the solutions for the subinstances.
• If the same subinstance needs to be solved again, the same answer can be reused.
Memoization

Memoization reduces the complexity from exponential to linear!
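A minimal MATLAB sketch of a memoized version (the helper name fibmemo and the -1 sentinel are assumptions, not from the slides):

function [f, memo] = fibmemo(i, memo)
% memo(k+1) caches F(k); -1 marks "not yet computed".
if memo(i + 1) >= 0
    f = memo(i + 1);                 % reuse the stored answer
    return
end
[a, memo] = fibmemo(i - 1, memo);
[b, memo] = fibmemo(i - 2, memo);
f = a + b;
memo(i + 1) = f;                     % save the answer for later reuse
end

Seed the cache with the base cases before calling, e.g. memo = -ones(1, n + 1); memo(1) = 0; memo(2) = 1; [f, memo] = fibmemo(n, memo). Each F(k) is now computed only once, giving a linear number of additions.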

From Memoization to Dynamic Programming

• Determine the set of subinstances that need to be solved.
• Instead of recursing from top to bottom, solve each of the required subinstances in smallest-to-largest order, storing results along the way.
Dynamic Programming
First determine the complete set of subinstances:
{100, 99, 98, …, 0}

Then compute them in order, smallest to largest, so that no “friend” (recursive call) must wait.
Dynamic Programming

Fill out a table containing an optimal solution for each subinstance:

subinstance: 0, 1, 2, 3, 4, 5, …, 99, 100
solution:    0, 1, 1, 2, 3, 5, …, 2.19×10^20, 3.54×10^20
Dynamic Programming

Time Complexity?

Linear!
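A matching bottom-up sketch (assumed, in the same MATLAB style as the deck’s other code):

function f = fibdp(n)
% Fill the table from smallest to largest subinstance, so every answer
% is already stored when it is needed: O(n) additions.
F = zeros(1, n + 1);                 % F(k+1) holds F(k); F(0) = 0
if n >= 1
    F(2) = 1;                        % F(1) = 1
end
for k = 2:n
    F(k + 1) = F(k) + F(k - 1);      % reuse the two stored answers
end
f = F(n + 1);
end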

Dynamic Programming vs Divide-and-Conquer

• Recall the divide-and-conquer approach:
– Partition the problem into independent subproblems
– Solve the subproblems recursively
– Combine solutions of subproblems
– e.g., mergesort, quicksort
• This contrasts with the dynamic programming approach.
Dynamic Programming vs Divide-and-Conquer

• Dynamic programming is applicable when subproblems are not independent
– i.e., subproblems share subsubproblems
– Solve every subsubproblem only once and store the answer for use when it reappears
• A divide-and-conquer approach would do more work than necessary.
A Sequence of 3 Steps

• A dynamic programming approach consists of a sequence of 3 steps:
1. Characterize the structure of an optimal solution
2. Recursively define the value of an optimal solution
3. Compute the value of an optimal solution in a bottom-up fashion
Elements of Dynamic Programming

• For dynamic programming to be applicable, an optimization problem must have:
1. Optimal substructure
– An optimal solution to the problem contains within it optimal solutions to subproblems (but this may also mean a greedy strategy applies)
2. Overlapping subproblems
– The space of subproblems must be small; i.e., the same subproblems are encountered over and over
Elements of Dynamic Programming

• Dynamic programming uses optimal substructure from the bottom up:
– First find optimal solutions to subproblems
– Then choose which to use in an optimal solution to the problem.
Example 2. Making Change

• To find the minimum number of Canadian coins to make any amount, the greedy method always works:
– At each step, just choose the largest coin that does not overshoot the desired amount
• The greedy method would not work if we did not have 5¢ coins:
– For 31 cents, the greedy method gives seven coins (25+1+1+1+1+1+1), but we can do it with four (10+10+10+1)
• The greedy method also would not work if we had a 21¢ coin:
– For 63 cents, the greedy method gives six coins (25+25+10+1+1+1), but we can do it with three (21+21+21)
• How can we find the minimum number of coins for any given set of denominations?
Example

• We assume coins in the following denominations: 1¢, 5¢, 10¢, 21¢, 25¢
• We’ll use 63¢ as our goal.
A simple solution

• We always need a 1¢ coin; otherwise no solution exists for making one cent.
• To make K cents:
– If there is a K-cent coin, then that one coin is the minimum
– Otherwise, for each value i < K:
• Find the minimum number of coins needed to make i cents
• Find the minimum number of coins needed to make K − i cents
– Choose the i that minimizes this sum
• This algorithm can be viewed as divide-and-conquer, or as brute force:
– This solution is very recursive
– It requires exponential work
– It is infeasible to solve for 63¢
Another solution

• We can reduce the problem recursively by choosing the first coin, and solving for the amount that is left.
• For 63¢:
– One 1¢ coin plus the best solution for 62¢
– One 5¢ coin plus the best solution for 58¢
– One 10¢ coin plus the best solution for 53¢
– One 21¢ coin plus the best solution for 42¢
– One 25¢ coin plus the best solution for 38¢
• Choose the best solution from among the 5 given above.
• Instead of solving 62 recursive problems, we solve 5.
• This is still a very expensive algorithm (see the sketch below).
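A MATLAB sketch of this recursion (the helper name mincoinsrec is hypothetical, not from the slides):

function n = mincoinsrec(d, sum)
% Choose the first coin, then solve recursively for the amount left.
% Still exponential: the same amounts are re-solved many times.
if sum == 0
    n = 0;
    return
end
n = inf;
for j = 1:length(d)                  % try each denomination as the first coin
    if d(j) <= sum
        n = min(n, 1 + mincoinsrec(d, sum - d(j)));
    end
end
end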
End of Lecture 18

Nov 15, 2007


A dynamic programming solution

• Idea: Solve first for one cent, then two cents, then three cents, etc., up
to the desired amount
– Save each answer in an array!
• For each new amount N, compute all the possible pairs of previous
answers which sum to N
– For example, to find the solution for 13¢,
• First, solve for all of 1¢, 2¢, 3¢, ..., 12¢
• Next, choose the best solution among:
– Solution for 1¢ + solution for 12¢
– Solution for 2¢ + solution for 11¢
– Solution for 3¢ + solution for 10¢
– Solution for 4¢ + solution for 9¢
– Solution for 5¢ + solution for 8¢
– Solution for 6¢ + solution for 7¢
An even better dynamic programming solution

• In fact, we can do a bit better than this, since the coins come in only a small number of denominations (1¢, 5¢, 10¢, 21¢, 25¢).
• For each new amount N, compute the cost of a solution based on a smaller sum plus one additional coin.
– For example, to find the solution for 13¢:
• First, solve for all of 1¢, 2¢, 3¢, ..., 12¢
• Next, choose the best solution among:
– solution for 12¢ + 1¢ coin
– solution for 8¢ + 5¢ coin
– solution for 3¢ + 10¢ coin
Making Change: Recurrence Relation

Let sum value of change to return

Let d [1...n ] denominations available

Let mincoins(sum) minimum number of coins required to make change totalling sum.

Let onecoin(sum) one coin in optimal set of coins to make change totalling sum.

Then
mincoins(sum)  min(mincoins(sum - d )  1)
d sum

onecoin(sum) argmin(mincoins(sum - d ))
d sum

COSC 3101E 29
function coins = makechange(d, sum)
% Precondition: d = vector of denominations (must include a penny);
%               sum = amount of change to be made.
% Postcondition: coins = a minimal set of coins summing to sum.
% MATLAB arrays are 1-indexed, so amount k is stored at index k + 1.
mincoins = inf(1, sum + 1);       % mincoins(k+1) = min #coins for amount k
onecoin  = zeros(1, sum + 1);     % onecoin(k+1) = one coin in such a set
mincoins(1) = 0;                  % zero coins are needed to make 0 cents
for i = 1:sum
    % Loop invariant: mincoins(1...i) holds the min number of coins required
    % to make change of 0...i-1; onecoin(2...i) holds the value of one coin
    % in a minimal set of coins making the correct change.
    for j = 1:length(d)           % try each denomination as the last coin
        if d(j) <= i && mincoins(i - d(j) + 1) + 1 < mincoins(i + 1)
            mincoins(i + 1) = mincoins(i - d(j) + 1) + 1;   % best so far
            onecoin(i + 1) = d(j);
        end
    end
end
ncoins = mincoins(sum + 1);
coins = zeros(1, ncoins);
change = sum;
for i = 1:ncoins                  % recover the coins in an optimal set
    coins(i) = onecoin(change + 1);
    change = change - coins(i);
end
end
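For example, with the indexing convention above, makechange([1 5 10 21 25], 63) should return the three-coin solution [21 21 21].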
How good is the algorithm?

• The first algorithm is exponential, with a base proportional to sum (e.g., 63).
• The second algorithm is much better: exponential with a base proportional to the number of denominations (e.g., 5).
• The dynamic programming algorithm is O(sum × number of denominations).
Elements of Dynamic Programming

• For dynamic programming to be applicable, an optimization problem must have:
1. Optimal substructure
– An optimal solution to the problem contains within it optimal solutions to subproblems (but this may also mean a greedy strategy applies)
Elements of Dynamic Programming

• Dynamic programming uses optimal substructure from the bottom up:
– First find optimal solutions to subproblems
– Then choose which to use in an optimal solution to the problem.
Example Proof of Optimal Substructure

• Consider the problem of making N¢ with the fewest number of coins:
– Either there is an N¢ coin, or
– The set of coins n making up an optimal solution for N¢ can be divided into two nonempty subsets, n1 and n2, which make N1¢ and N2¢ in change respectively, where N1¢ + N2¢ = N¢.
– If either N1¢ or N2¢ could be made with fewer coins, then clearly N¢ could be made with fewer coins, so the solution was not optimal.
– Thus each subset n1 and n2 must itself be an optimal solution to the subproblem of making N1¢ or N2¢ in change, respectively.
Optimal Substructure
• Optimal substructure means that
– Every optimal solution to a problem contains...
– ...optimal solutions to subproblems
• Optimal substructure does not mean that
– If you have optimal solutions to all subproblems...
– ...then you can combine any of them to get an optimal solution to a larger
problem.
• Example: In Canadian coinage,
– The optimal solution to 7¢ is 5¢ + 1¢ + 1¢, and
– The optimal solution to 6¢ is 5¢ + 1¢, but
– The optimal solution to 13¢ is not 5¢ + 1¢ + 1¢ + 5¢ + 1¢
• But there is some way of dividing up 13¢ into subsets with optimal
solutions (say, 11¢ + 2¢) that will give an optimal solution for 13¢
– Hence, the making change problem exhibits optimal substructure.
Optimal Substructure

• Thus the step of choosing which subsolutions to combine is a key part of a dynamic programming algorithm.
Don’t all problems have this optimal
substructure property?
Longest simple path

• Consider the following graph: [figure: a small weighted graph on vertices A, B, C, and D]
• The longest simple path (a path not containing a cycle) from A to D is A→B→C→D.
• However, the subpath A→B is not the longest simple path from A to B (A→C→B is longer).
• The principle of optimality is not satisfied for this problem.
• Hence, the longest simple path problem cannot be solved by a dynamic programming approach.
Example 3. Knapsack Problem

Get as much value as you can into the knapsack.
The (General) 0-1 Knapsack Problem

0-1 knapsack problem:
• n items.
• Item i is worth $v_i and weighs w_i pounds.
• Find a most valuable subset of items with total weight ≤ W.
• v_i, w_i and W are all integers.
• We must either take an item or not take it; we can’t take part of it.

Is there a greedy solution to this problem?
What are good greedy local choices?

• Select the most valuable object?
• Select the smallest object?
• Select the object most valuable by weight?
Some example problem instances

Let W = capacity of knapsack = 10 kg

Problem Instance 1: v1 = $60, w1 = 6 kg;  v2 = $50, w2 = 5 kg;  v3 = $50, w3 = 5 kg
Problem Instance 2: v1 = $60, w1 = 10 kg; v2 = $50, w2 = 9 kg
Problem Instance 3: v1 = $60, w1 = 6 kg;  v2 = $40, w2 = 5 kg;  v3 = $40, w3 = 5 kg

• Select the most valuable object?
• Select the smallest object?
• Select the object most valuable by weight?
All fail!
Simplified 0-1 Knapsack Problem
• The general 0-1 knapsack problem cannot be solved
by a greedy algorithm.
• What if we make the problem simpler: suppose v_i = w_i.
• Can this simplified knapsack problem be solved by a greedy algorithm?
• No!
Some example problem instances

Let W = capacity of knapsack = 10 kg

Problem Instance 1: v1 = w1 = 6;  v2 = w2 = 5;  v3 = w3 = 5
Problem Instance 2: v1 = w1 = 10; v2 = w2 = 9

• Select the largest (most valuable) object?
• Select the smallest object?
Both fail!
Approximate Greedy Solution

• For the simplified knapsack problem, the greedy solution (taking the most valuable object first) isn’t that bad:

V̂ ≥ (1/2)·V, where
V̂ = total value of items selected by the greedy algorithm,
V = total value of items selected by the optimal algorithm.
End of Lecture 19

Nov 20, 2007


Approximate Greedy Solution

Claim: V̂ ≥ (1/2)·V.

Proof:
Let W = capacity of the knapsack.
Let s = value (= weight) of an object in the optimal solution but not selected by the greedy algorithm.
Suppose V̂ < (1/2)·V.
Then s ≤ V̂ < (1/2)·V (the greedy algorithm takes the most valuable object first, so its total V̂ is at least the value of any single object).
But then V̂ + s < V ≤ W, so the object would still fit, and the greedy algorithm would have selected it. Contradiction!

And the running time is O(n log n), where n = number of items.
Dynamic Programming Solution

• The general 0-1 knapsack problem can be solved by dynamic programming.

Let W = capacity of knapsack (kg).
Let (v_i, w_i) = value ($) and weight (kg) of item i ∈ [1…n].
Let c[i, w] = value of an optimal solution for a knapsack of capacity w and items drawn from [1…i].

Then

c[i, w] = 0                                          if i = 0 or w = 0
c[i, w] = c[i−1, w]                                  if w_i > w
c[i, w] = max( v_i + c[i−1, w−w_i], c[i−1, w] )      if i > 0 and w_i ≤ w
Correctness
Let W capacity of knapsack (kg)
Let (v i ,w i ) value ($) and weight (kg) of item i  [1...n]
Let c[i ,w ] value of optimal solution for knapsack of capacity w and items drawn from [1...i ]

0 if i 0 or w 0

Then c  i ,w  c  i  1, w  if w i  w
max v  c [i  1, w  w ], c [i  1, w ]  if i  0 and w w
 i i i

Idea: c[i  1,w ] value of optimal solution for capacity w and items drawn only from [1...i  1]
What happens when we are also allowed to consider item i ?

Case 1. Optimal solution does not include item i .


Total value is the same as before.
Case 2. Optimal solution does include item i. One of these must be true!
Total value is:
Value of item i
 Value of optimal solution for remaining capacity of knapsack and allowable items

COSC 3101E 50
Bottom-Up Computation
Let W capacity of knapsack (kg)
Let (v i ,w i ) value ($) and weight (kg) of item i  [1...n ]
Let c[i ,w ] value of optimal solution for knapsack of capacity w and items drawn from [1...i ]
0 if i 0 or w 0

Then c  i ,w  c  i  1, w  if w i  w
max v  c [i  1, w  w ], c [i  1, w ]  if i  0 and w w
 i i i

Need only ensure that c[ i  1, v ] has been computed v w

c[i,w] w
i Allowed Items 0 1 2 … w … W
0 {}
1 {1}
2 {1 2}
… …
i {1 2 … i} c[i,w]
… …
n {1 2 … n}

COSC 3101E 51
Integer Knapsack Solution

• Value = 0 if there are no items or no knapsack capacity.
• Fill in the table row-wise, applying the recurrence relation to each entry.
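The code panels for these slides were lost in extraction; the following MATLAB sketch (the function name knapsackdp is an assumption) fills the table row-wise directly from the recurrence:

function [best, c] = knapsackdp(v, w, W)
% c(i+1, ww+1) stores c[i, ww] from the recurrence; the +1 offsets are
% needed because MATLAB arrays are 1-indexed.
n = length(v);
c = zeros(n + 1, W + 1);                      % c[0, ww] = c[i, 0] = 0
for i = 1:n
    for ww = 1:W
        if w(i) > ww
            c(i + 1, ww + 1) = c(i, ww + 1);  % item i does not fit
        else                                  % better with or without item i?
            c(i + 1, ww + 1) = max(v(i) + c(i, ww - w(i) + 1), ...
                                   c(i, ww + 1));
        end
    end
end
best = c(n + 1, W + 1);
end

On the example below (v = [1 3 5 2 6 10], w = [2 3 1 5 3 5], W = 6), this fills the same table as the slide and returns best = 15.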
Example: capacity W = 6. Using the recurrence above:

i:   1  2  3  4  5   6
v_i: 1  3  5  2  6  10
w_i: 2  3  1  5  3   5

c[i, w]:
i | allowed items   | w = 0  1  2  3  4   5   6
0 | {}              |     0  0  0  0  0   0   0
1 | {1}             |     0  0  1  1  1   1   1
2 | {1 2}           |     0  0  1  3  3   4   4
3 | {1 2 3}         |     0  5  5  6  8   8   9
4 | {1 2 3 4}       |     0  5  5  6  8   8   9
5 | {1 2 3 4 5}     |     0  5  5  6  11  11  12
6 | {1 2 3 4 5 6}   |     0  5  5  6  11  11  15
Solving for the Items to Pack

Trace back through the table (item values and table as on the previous slide):

i ← n, w ← W, items ← {}
loop for i = n downto 1:
    if c[i, w] > c[i−1, w]      (item i was packed)
        items ← items ∪ {i}
        w ← w − w_i
Second Example: capacity W = 6

i:   1  2  3  4  5  6
v_i: 1  4  2  5  4  2
w_i: 2  3  1  4  3  3

c[i, w]:
i | allowed items   | w = 0  1  2  3  4  5  6
0 | {}              |     0  0  0  0  0  0  0
1 | {1}             |     0  0  1  1  1  1  1
2 | {1 2}           |     0  0  1  4  4  5  5
3 | {1 2 3}         |     0  2  2  4  6  6  7
4 | {1 2 3 4}       |     0  2  2  4  6  7  7
5 | {1 2 3 4 5}     |     0  2  2  4  6  7  8
6 | {1 2 3 4 5 6}   |     0  2  2  4  6  7  8
Knapsack Problem: Running Time

• Running time: Θ(nW) (cf. making change: Θ(d × sum)).
– This is not polynomial in the input size!
End of Lecture 20

Nov 22, 2007


Recall: Knapsack Problem (capacity W = 6)

(Recurrence, item values, and table c[i, w] as in the Second Example above.)
Observation from Last Day (Jonathon):
We could still implement this recurrence relation directly as a recursive
program.
0 if i 0 or w 0

Then c  i ,w  c  i  1, w  if w i  w
max v  c [i  1, w  w ], c [i  1, w ]  if i  0 and w w
 i i i

c[i,w] w
i Allowed Items 0 1 2 3 4 5 6
0 {} 0 0 0 0 0 0 0
1 {1} 0 0 1 1 1 1 1
2 {1 2} 0 0 1 4 4 5 5
3 {1 2 3 } 0 2 2 4 6 6 7
4 {1 2 3 4 } 0 2 2 4 6 7 7
5 {1 2 3 4 5 } 0 2 2 4 6 7 8
6 {1 2 3 4 5 6 } 0 2 2 4 6 7 8

COSC 3101E 60
Recall: Memoization in Optimization

• Remember the solutions for the subinstances.
• If the same subinstance needs to be solved again, the same answer can be reused.
Memoization

Memoization reduces the complexity from exponential to linear!

From Memoization to Dynamic Programming

• Determine the set of subinstances that need to be solved.
• Instead of recursing from top to bottom, solve each of the required subinstances in smallest-to-largest order, storing results along the way.
Dynamic Programming Examples

1. Fibonacci numbers
2. Making change
3. 0-1 Knapsack problem
4. Activity Scheduling with profits

Recall: The Activity (Job/Event) Selection Problem

Ingredients:
• Instances: events with starting and finishing times ⟨⟨s_1, f_1⟩, ⟨s_2, f_2⟩, …, ⟨s_n, f_n⟩⟩.
• Solutions: a set of events that do not overlap.
• Value of solution: the number of events scheduled.
• Goal: given a set of events, schedule as many as possible.
From Previous Lecture:
The problem can be solved by a greedy algorithm.

Greedy criterion: earliest finishing time. Works!

Motivation: schedule the event that will free up your room for someone else as soon as possible.
But what if activities have different values?

Activity Selection with Profits:

• Input: information (s_i, f_i, g_i) about n activities, where
  s_i = start time of activity i,
  f_i = finishing time of activity i,
  g_i = value (profit) of activity i.
• A feasible schedule is a set S ⊆ {1, 2, …, n} such that for all i, j ∈ S, activities i and j do not conflict.
• Output: a feasible schedule S with maximum profit P(S) = Σ_{i ∈ S} g_i.
Will a greedy algorithm based on finishing time still work?

No! [Figure: activity 2, with profit g_2 = 10, overlaps activities 1 and 3, which have profits g_1 = g_3 = 1; greedy by earliest finishing time schedules activities 1 and 3 for a profit of 2 instead of 10.]
Dynamic Programming Solution

Precomputation: Time?
1. Sort the activities by finishing time: f_1 ≤ f_2 ≤ … ≤ f_n.  (O(n log n))
2. For each i ∈ {1, …, n}, compute H(i) = max{ l ∈ {1, 2, …, i−1} | f_l ≤ s_i },
   i.e., H(i) is the last event that ends before event i starts.  (O(n log n))
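A possible MATLAB sketch of step 2 (the helper name computeH is an assumption); binary search over the sorted finishing times gives the stated O(n log n) bound:

function H = computeH(s, f)
% Assumes f(1) <= f(2) <= ... <= f(n) (activities sorted by finish time).
% H(i) = the last activity l < i with f(l) <= s(i), or 0 if none exists.
n = length(s);
H = zeros(1, n);
for i = 1:n
    lo = 1; hi = i - 1;                  % binary search among 1..i-1
    while lo <= hi
        mid = floor((lo + hi) / 2);
        if f(mid) <= s(i)
            H(i) = mid;                  % mid works; try for a later one
            lo = mid + 1;
        else
            hi = mid - 1;
        end
    end
end
end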
Step 1. Define an array of values to compute

For each i ∈ {0, …, n}, let A(i) = the largest profit attainable from the (feasible) scheduling of a subset of the activities {1, 2, …, i}.

Ultimately, we are interested in A(n).
Step 2. Provide a Recurrent Solution

(Precomputation as above: sort by finishing time and compute H(i).)

A(0) = 0
A(i) = max{ A(i−1), g_i + A(H(i)) } for i ∈ {1, …, n}

One of these must be true:
– A(i−1): we decide not to schedule activity i;
– g_i + A(H(i)): the profit from scheduling activity i, plus the optimal profit from scheduling activities that end before activity i begins.
Step 3. Provide an Algorithm

function A = actselwithp(g, H, n)
% Assumes activities are sorted by finishing time.
% MATLAB arrays are 1-indexed, so A(i+1) stores the value A(i);
% A(1) is the base case A(0) = 0.
A = zeros(1, n + 1);
for i = 1:n
    A(i + 1) = max(A(i), g(i) + A(H(i) + 1));
end
end

Running time? O(n)
Step 4. Compute Optimal Solution
Invoke with: printasp(A, H, n, 'Activities to Schedule: ')

function actstring = printasp(A, H, i, actstring)
% Recovers an optimal schedule by tracing back through A, using the
% same convention as above: A(i+1) stores A(i).
if i == 0
    return
end
if A(i + 1) > A(i)                   % activity i is in an optimal schedule
    actstring = printasp(A, H, H(i), actstring);
    actstring = [actstring, sprintf('%d ', i)];
else
    actstring = printasp(A, H, i - 1, actstring);
end
end

Running time? O(n)
Example

Activity i:  1   2   3   4
Start s_i:   0   2   3   2
Finish f_i:  3   6   6   10
Profit g_i:  20  30  20  30
H(i):        0   0   1   0

A(0) = 0
A(1) = max{ 0, 20 + A(H(1)) } = 20
A(2) = max{ 20, 30 + A(H(2)) } = 30
A(3) = max{ 30, 20 + A(H(3)) } = 40
A(4) = max{ 40, 30 + A(H(4)) } = 40
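With the hypothetical helpers sketched above, computeH([0 2 3 2], [3 6 6 10]) yields H = [0 0 1 0], and actselwithp([20 30 20 30], H, 4) returns A = [0 20 30 40 40], matching the computation here (recall that A(i+1) stores A(i)).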
Dynamic Programming Examples

1. Fibonacci numbers
2. Making change
3. 0-1 Knapsack problem
4. Activity scheduling with profits
5. Longest common subsequence

Longest Common Subsequence

• Input: 2 sequences, X = x_1, …, x_m and Y = y_1, …, y_n.
• Output: a subsequence common to both whose length is longest.
• Note: a subsequence doesn’t have to be consecutive, but it has to be in order.
Example 5. Longest Common Subsequence
Brute-force Algorithm

For every subsequence of X, check whether it’s a subsequence of Y.

Time: Θ(n·2^m)
– 2^m subsequences of X to check
– Each subsequence takes Θ(n) time to check: scan Y for the first letter, from there scan for the second, and so on.
Step 1. Define Data Structure
• Input: 2 sequences, X = x_1, …, x_m and Y = y_1, …, y_n.

Let c(i, j) = length of an LCS of the prefixes X_i = x_1, …, x_i and Y_j = y_1, …, y_j.

Ultimately, we are interested in c(m, n).
Step 2. Define Recurrence
Case 1. Either input prefix is empty:

c(i, j) = 0 if i = 0 or j = 0
Recurrence

Case 2. Last elements match (x_i = y_j): the matching element must be part of an LCS.

c(i, j) = c(i−1, j−1) + 1

[Figure: an LCS Z of X_i and Y_j ends with the matched element; the rest of Z is an LCS of X_{i−1} and Y_{j−1}.]
Recurrence

Case 3. Last elements don’t match (x_i ≠ y_j): at most one of them is part of an LCS. Choose!

c(i, j) = max( c(i−1, j), c(i, j−1) )

[Figure: either drop x_i and match X_{i−1} against Y_j, or drop y_j and match X_i against Y_{j−1}.]
Step 3. Provide an Algorithm

Running time? O(mn)

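The algorithm panel was lost in extraction; a MATLAB sketch of the table fill (the function name lcsdp is an assumption), written directly from the three cases:

function c = lcsdp(X, Y)
% c(i+1, j+1) stores c(i, j), the LCS length of X(1:i) and Y(1:j).
m = length(X); n = length(Y);
c = zeros(m + 1, n + 1);                 % Case 1: empty prefix gives 0
for i = 1:m
    for j = 1:n
        if X(i) == Y(j)                  % Case 2: last elements match
            c(i + 1, j + 1) = c(i, j) + 1;
        else                             % Case 3: drop x_i or y_j
            c(i + 1, j + 1) = max(c(i, j + 1), c(i + 1, j));
        end
    end
end
end

The two nested loops fill (m+1)(n+1) entries in constant time each, giving the O(mn) bound above.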
Step 4. Compute Optimal Solution

Running time? O(m+n)

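The traceback panel was also lost; a sketch under the same assumptions (helper name lcstrace), which walks back through the filled table:

function Z = lcstrace(c, X, Y, i, j)
% Recovers an LCS from the filled table; invoke with i = length(X),
% j = length(Y). Every step decreases i or j, so there are O(m + n) steps.
if i == 0 || j == 0
    Z = '';
elseif X(i) == Y(j)                      % the matching element is in the LCS
    Z = [lcstrace(c, X, Y, i - 1, j - 1), X(i)];
elseif c(i, j + 1) >= c(i + 1, j)        % i.e., c(i-1, j) >= c(i, j-1)
    Z = lcstrace(c, X, Y, i - 1, j);
else
    Z = lcstrace(c, X, Y, i, j - 1);
end
end

For example, with X = 'ABCBDAB' and Y = 'BDCABA', lcstrace(lcsdp(X, Y), X, Y, 7, 6) recovers the length-4 LCS 'BCBA'.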