Growth Functions
References:
Introduction to Algorithms (Second Edition), Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein.
Middle East Technical University, Department of Computer Engineering, CENG-213.
Algorithmic Performance
There are two aspects of algorithmic performance:
-Time
Instructions take time.
How fast does the algorithm perform?
What affects its runtime?
-Space
Data structures take space.
What kind of data structures can be used?
How does the choice of data structure affect the runtime?
-We will focus on time:
How to estimate the time required for an algorithm
How to reduce the time required
General rules for estimation (each rule is illustrated in the C sketch after this list):
Loops: The running time of a loop is at most the running time of
the statements inside the loop times the number of iterations.
Nested Loops: The running time of a statement in the innermost loop
of a nested loop is the running time of that statement
multiplied by the product of the sizes of all the loops.
Consecutive Statements: Just add the running times of the
consecutive statements.
If/Else: Never more than the running time of the test plus the
larger of the running times of the two branches S1 and S2.
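To make the rules concrete, here is a minimal C sketch (added here for illustration; not part of the original notes) that applies each rule in turn:

#include <stdio.h>

void rules_demo(int n) {
    int sum = 0;

    /* Loop rule: the body runs n times -> O(n) */
    for (int i = 0; i < n; i++)
        sum += i;

    /* Nested-loop rule: the innermost statement runs n * n times -> O(n^2) */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            sum += i * j;

    /* Consecutive-statements rule: O(n) + O(n^2) = O(n^2) so far */

    /* If/else rule: cost is the test plus the larger branch, here O(n) */
    if (sum > 0) {
        for (int i = 0; i < n; i++)   /* O(n) branch */
            sum--;
    } else {
        sum = 0;                      /* O(1) branch */
    }
    printf("%d\n", sum);
}

int main(void) {
    rules_demo(10);
    return 0;
}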
Analysis of Algorithm:
When we analyze algorithms, we should employ mathematical
techniques that analyze algorithms independently of specific
implementations, computers, or data.
To analyze algorithms:
First, we count the number of significant operations in a
particular solution to assess its efficiency.
Then, we express the efficiency of algorithms using growth
functions.
Measuring the performance of an algorithm in relation to its
input size is called the order of growth (growth rate).
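As a concrete illustration of operation counting (a sketch, not from the original notes), a loop can be instrumented with an explicit counter:

#include <stdio.h>

int main(void) {
    int n = 8;
    long count = 0;          /* counts significant operations */
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += i;            /* one addition per iteration */
        count++;
    }
    /* count == n, so the operation count grows linearly: O(n) */
    printf("n = %d, operations = %ld\n", n, count);
    return 0;
}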
Algorithmic Growth Rates:
We measure an algorithm’s time requirement as a function of the problem size.
Problem size depends on the application: e.g., the number of elements in a list for a sorting
algorithm, the number of disks for Towers of Hanoi.
So, for instance, we say that (if the problem size is n)
Algorithm A requires 5*n² time units to solve a problem of size n.
Algorithm B requires 7*n time units to solve a problem of size n.
The most important thing to learn is how quickly the algorithm’s time requirement
grows as a function of the problem size.
Algorithm A requires time proportional to n².
Algorithm B requires time proportional to n.
An algorithm’s proportional time requirement is known as its growth rate.
We can compare the efficiency of two algorithms by comparing their growth rates.
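For instance, at n = 100, Algorithm A requires 5*100² = 50,000 time units while Algorithm B requires only 7*100 = 700; at n = 1,000 the gap widens to 5,000,000 versus 7,000. With these constants, B is already faster for every n ≥ 2, and its lower growth rate makes the advantage grow with the problem size.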
Common Growth Rates:
Function    Growth Rate Name
c           Constant
log N       Logarithmic
log²N       Log-squared
N           Linear
N log N     Linearithmic
N²          Quadratic
N³          Cubic
2^N         Exponential
Comparison of Growth Rate Functions:
Logarithmic is the slowest-growing function, whereas exponential is the fastest-growing:
it yields huge values even for small inputs n and grows rapidly as the input size increases.
Comparison of Growth Rate Functions (using a graph):
[Figure: plot comparing the growth-rate functions above; only its title, “Growth-Rate Functions”, survived extraction.]
O(1) Time requirement is constant, independent of the problem’s size.
O(log₂n) Time requirement for a logarithmic algorithm increases slowly
as the problem size increases.
O(n) Time requirement for a linear algorithm increases directly with the size
of the problem.
O(n*log₂n) Time requirement for an n*log₂n algorithm increases more rapidly than
that of a linear algorithm.
O(n²) Time requirement for a quadratic algorithm increases rapidly with the
size of the problem.
O(n³) Time requirement for a cubic algorithm increases more rapidly with the
size of the problem than that of a quadratic algorithm.
O(2ⁿ) As the size of the problem increases, the time requirement for an
exponential algorithm increases too rapidly to be practical.
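The following C sketch (an illustration added here, not from the original notes) pairs a few of these growth rates with typical code shapes:

#include <stdio.h>

/* O(1): constant, independent of n */
int first(const int *a) { return a[0]; }

/* O(log2 n): binary search halves the range each step (array must be sorted) */
int bsearch_int(const int *a, int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return mid;
        if (a[mid] < key) lo = mid + 1; else hi = mid - 1;
    }
    return -1;
}

/* O(n): one pass over the input */
long sum(const int *a, int n) {
    long s = 0;
    for (int i = 0; i < n; i++) s += a[i];
    return s;
}

/* O(n^2): examines every pair of elements */
int count_equal_pairs(const int *a, int n) {
    int c = 0;
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (a[i] == a[j]) c++;
    return c;
}

int main(void) {
    int a[] = {1, 3, 5, 7, 9, 3};
    printf("%d %d %ld %d\n",
           first(a), bsearch_int(a, 5, 7), sum(a, 6), count_equal_pairs(a, 6));
    return 0;
}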
Example:
If an algorithm takes 1 second to run with problem size 8, what is the (approximate) time
requirement for that algorithm with problem size 16?
If its order is:
O(1)       T(n) = 1 second
O(log₂n)   T(n) = (1*log₂16) / log₂8 = 4/3 seconds
O(n)       T(n) = (1*16) / 8 = 2 seconds
O(n*log₂n) T(n) = (1*16*log₂16) / (8*log₂8) = 8/3 seconds
O(n²)      T(n) = (1*16²) / 8² = 4 seconds
O(n³)      T(n) = (1*16³) / 8³ = 8 seconds
O(2ⁿ)      T(n) = (1*2¹⁶) / 2⁸ = 2⁸ seconds = 256 seconds
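These ratios can be checked numerically; the following C sketch (an addition for illustration, not from the notes) computes T_new = T_old * f(16)/f(8) for each growth-rate function:

#include <stdio.h>
#include <math.h>

int main(void) {
    double n0 = 8.0, n1 = 16.0, t0 = 1.0;  /* 1 second at problem size 8 */

    printf("O(1):       %.3f s\n", t0);
    printf("O(log n):   %.3f s\n", t0 * log2(n1) / log2(n0));            /* 4/3 */
    printf("O(n):       %.3f s\n", t0 * n1 / n0);                        /* 2   */
    printf("O(n log n): %.3f s\n", t0 * (n1 * log2(n1)) / (n0 * log2(n0))); /* 8/3 */
    printf("O(n^2):     %.3f s\n", t0 * (n1 * n1) / (n0 * n0));          /* 4   */
    printf("O(n^3):     %.3f s\n", t0 * pow(n1, 3) / pow(n0, 3));        /* 8   */
    printf("O(2^n):     %.3f s\n", t0 * pow(2, n1) / pow(2, n0));        /* 256 */
    return 0;
}

Compile with the math library (e.g., -lm) so log2 and pow link correctly.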
Properties of Growth-Rate Functions:
1. We can ignore low-order terms in an algorithm’s growth-rate function.
If an algorithm is O(n³+4n²+3n), it is also O(n³).
We only use the highest-order term as the algorithm’s growth-rate function.
2. We can ignore a multiplicative constant in the highest-order term of an algorithm’s
growth-rate function.
If an algorithm is O(5n³), it is also O(n³).
3. O(f(n)) + O(g(n)) = O(f(n)+g(n))
We can combine growth-rate functions.
If an algorithm is O(n³) + O(4n²), it is also O(n³+4n²). So, it is O(n³).
Similar rules hold for multiplication.
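As a short worked illustration of these rules together: if an algorithm’s exact operation count is 5n³ + 4n² + 3n, rule 1 drops the low-order terms 4n² and 3n, and rule 2 drops the constant 5, leaving O(n³). By the multiplication rule, O(f(n)) * O(g(n)) = O(f(n)*g(n)); for example, O(n) * O(log₂n) = O(n*log₂n).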
Order of Magnitude Analysis:
If Algorithm A requires time proportional to f(n), Algorithm A is said to be of order f(n),
denoted O(f(n)).
The function f(n) is called the algorithm’s growth-rate function.
Since a capital O is used in the notation, it is called Big O notation.
If Algorithm A requires time proportional to n², it is O(n²).
If Algorithm A requires time proportional to n, it is O(n).
When we calculated the running time of insertion sort, we derived a polynomial
of the form an²+bn+c, in which the highest-order term (n²) was taken to
characterize the algorithm’s running time. But when we attach
asymptotic notations (O, Θ, Ω) and say the running time is O(n²), we need to
be clear which scenario we mean (best, average, or worst).
(Refer to the example shown in class:
the best case of insertion sort is n,
the average case is n².)
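For reference, a standard C implementation of insertion sort (a sketch added here; the notes derive the an²+bn+c count without showing code):

#include <stdio.h>

/* Insertion sort: worst case O(n^2) (reverse-sorted input),
   best case O(n) (already-sorted input, the inner while never runs). */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {  /* shift larger elements right */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}

int main(void) {
    int a[] = {5, 2, 4, 6, 1, 3};
    insertion_sort(a, 6);
    for (int i = 0; i < 6; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}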
Order of an Algorithm:
Definition:
Algorithm A is of order f(n) – denoted O(f(n)) –
if constants k and n₀ exist such that A requires
no more than k*f(n) time units to solve a problem
of size n ≥ n₀.
The requirement n ≥ n₀ in the definition of O(f(n)) formalizes the notion of sufficiently
large problems.
In general, many values of k and n₀ can satisfy this definition.
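Worked illustration of the definition: take T(n) = 5n² + 7n. Choosing k = 12 and n₀ = 1 works, since 7n ≤ 7n² for every n ≥ 1, so T(n) = 5n² + 7n ≤ 12n² = k*n². Hence T(n) is O(n²). Any larger k (or larger n₀) also satisfies the inequality, which is why many values of k and n₀ can witness the same bound.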