Algorithm Analysis

The document discusses the analysis of algorithms, focusing on time and space complexity, and methods to measure the efficiency of algorithms. It explains concepts such as worst-case, best-case, and average-case analysis, along with asymptotic notations like O, Ω, and Θ for characterizing algorithm performance. Various algorithms are analyzed for their time and space complexities, providing examples to illustrate these concepts.

Algorithm analysis

• The term "analysis of algorithms" was coined by Donald Knuth.
• Analysis of an algorithm is the process of analyzing the problem-solving capability of the algorithm in terms of the time and the size of memory required for implementation.
• Analysis of algorithms is the determination of the amount of time and space resources required by an algorithm.
Design and Analysis of Algorithms

• Measuring the efficiency of an algorithm:
• Time: how long the algorithm takes (running time)
• Space: memory requirement
Complexity

Time complexity:
– How much time it takes to compute
– Measured by a function T(N)

Space complexity:
– How much memory it takes to compute
– Measured by a function S(N)

Time taken by an algorithm?
• Performance measurement, or a posteriori analysis: implement the algorithm on a machine, then measure the time the system takes to execute the program successfully.
• Performance evaluation, or a priori analysis: analyze the algorithm before implementing it on any system.
• 1. How long the algorithm takes is represented as a function of the size of the input: f(n) → how long it takes if 'n' is the size of the input.
• 2. How fast the function that characterizes the running time grows with the input size: the "rate of growth of running time". The algorithm with the lower rate of growth of running time is considered better.
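As a minimal sketch of a posteriori measurement in C (the function name sum_array and the array size are illustrative assumptions, not from the slides; clock() is standard C):

#include <stdio.h>
#include <time.h>

/* Illustrative algorithm to be timed; any algorithm could stand here. */
long sum_array(const int *A, int n) {
    long s = 0;
    for (int i = 0; i < n; i++)
        s += A[i];
    return s;
}

int main(void) {
    enum { N = 10000000 };
    static int A[N];              /* static: too large for the stack */
    for (int i = 0; i < N; i++)
        A[i] = i;

    clock_t start = clock();      /* measure actual processor time used */
    long s = sum_array(A, N);
    clock_t end = clock();

    printf("sum = %ld, elapsed = %.3f s\n",
           s, (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}

Such a measurement depends on the machine, compiler, and load, which is exactly why a priori (asymptotic) analysis is preferred for comparing algorithms.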
Time complexity
• N = Size of the input
• T(N) = Time complexity function
• Order of magnitude:
– How rapidly T(N) grows when N grows
– For example: O(N), O(log N), O(N²), O(2^N)
Defining “problem size”
• Typically, it is straightforward to identify the size of a
problem, e.g.:
• size of array

• size of stack, queue, list etc.

• vertices and edges in a graph


Analyzing the Algorithm - 1
Algorithm swap(a,b)
{
temp = a;
a = b;
b = temp;
}
Analysis: Time T(n) & Space S(n)
Analysis of Algorithm -1
• Assume each simple statement takes 1 unit of time.
• Time Complexity

Algorithm swap(a, b)
# operations
temp = a 1
a=b 1
b = temp 1
Total 3

• T(n) = O(1)
Analysis of Algorithm -1
• Space Complexity
• Assume each identifier occupies 1 word.
Algorithm swap(a, b)

temp = a
a=b
b = temp

• Variables used : a, b, temp = 3


• S(n) = O(1)
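A minimal runnable C version of the swap pseudocode (a sketch; C passes arguments by value, so pointers are needed for the swap to be visible to the caller):

/* Three assignments regardless of input: T(n) = O(1), S(n) = O(1). */
void swap(int *a, int *b) {
    int temp = *a;   /* 1 operation */
    *a = *b;         /* 1 operation */
    *b = temp;       /* 1 operation */
}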
Algorithm – 2 (Adding the Elements
of an Array)

Algorithm sum (A, n)


s=0
for(i=0;i<n;i++)
s = s + A[i]
return s
Algorithm – 2 (Adding the elements
of an array)
Algorithm sum (A, n)      # of operations
s = 0                     1
for(i=0;i<n;i++)          n+1
s = s + A[i]              n
return s                  1
Total: T(n) = 2n + 3 => O(n)
Algorithm – 2 (Adding the elements
of an array)
Algorithm sum (A, n)
s=0
for(i=0;i<n;i++)
s = s + A[i]
return s
Space Complexity

Variables: A (n words), n (1), s (1), i (1)

Total: S(n) = n + 3 => O(n)
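A minimal C rendering of Algorithm 2 (a sketch; using long for the running sum is an assumption not in the pseudocode, to reduce overflow risk):

/* The loop body executes n times: T(n) = O(n). Besides the array
   itself, only s, i, and n are stored: S(n) = n + 3 => O(n). */
long sum(const int A[], int n) {
    long s = 0;
    for (int i = 0; i < n; i++)
        s += A[i];
    return s;
}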
Algorithm – 3 (Sum of Two Matrices)

Algorithm Sum(A,B,n)
for(i=0;i<n;i++)
for(j=0;j<n;j++)
C[i,j] = A[i,j] + B[i,j]
Algorithm – 3 (Sum of Two Matrices)
Algorithm Sum(A,B,n)              # of operations
for(i=0;i<n;i++)                  n+1
for(j=0;j<n;j++)                  n*(n+1)
C[i,j] = A[i,j] + B[i,j]          n*n
Total: T(n) = 2n² + 2n + 1 => O(n²)
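A C sketch of Algorithm 3, assuming flat row-major n×n arrays (the pseudocode's A[i,j] indexing has no direct C equivalent):

/* The assignment runs n*n times, so T(n) = O(n²). */
void matrix_sum(const int *A, const int *B, int *C, int n) {
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            C[i*n + j] = A[i*n + j] + B[i*n + j];
}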
Algorithm – 4 (Multiplication of Two
Matrices)
Algorithm multiply(A,B,n)
for(i=0;i<n;i++)
for(j=0;j<n;j++)
C[i,j] = 0
for(k=0;k<n;k++)
C[i,j] = C[i,j] + A[i,k]*B[k,j]
Algorithm – 4 (Multiplication of Two
Matrices)
Algorithm multiply(A,B,n)             # of operations
for(i=0;i<n;i++)                      n+1
for(j=0;j<n;j++)                      n*(n+1)
C[i,j] = 0                            n*n
for(k=0;k<n;k++)                      n*n*(n+1)
C[i,j] = C[i,j] + A[i,k]*B[k,j]       n*n*n

Total: T(n) = 2n³ + 3n² + 2n + 1 => O(n³)


Algorithm – 4 (Multiplication of Two
Matrices)
Algorithm multiply(A,B,n)
for(i=0;i<n;i++)
for(j=0;j<n;j++)
C[i,j] = 0
for(k=0;k<n;k++)
C[i,j] = C[i,j] + A[i,k]*B[k,j]
Space Complexity:
A (n*n), B (n*n), C (n*n), n (1), i (1), j (1), k (1)
Total: S(n) = 3n² + 4 => O(n²)
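With the same row-major convention as above, a C sketch of Algorithm 4:

/* The innermost statement runs n*n*n times, so T(n) = O(n³);
   the three n×n matrices dominate space, so S(n) = O(n²). */
void multiply(const int *A, const int *B, int *C, int n) {
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            C[i*n + j] = 0;
            for (int k = 0; k < n; k++)
                C[i*n + j] += A[i*n + k] * B[k*n + j];
        }
}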
Algorithm 5 (while loop)
# of Operations
i = 0;            1
while (i < n)     n+1
stmt;             n
i++;              n
Total: 3n + 2 => O(n)
Algorithm 6 (if-else)
Algorithm test(n)              # of operations
{
  if (n < 5)
  {
    printf("%d", n);           1
  }
  else
  {
    for (i = 0; i < n; i++)
    {
      printf("%d", i);         n
    }
  }
}
Total: max(1, n) = n => O(n)
To be remembered

1 < log n < n < n log n < n² < n³ < … < 2ⁿ < 3ⁿ < … < nⁿ
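For example, at n = 10: log n ≈ 3.3, n = 10, n log n ≈ 33, n² = 100, n³ = 1000, 2ⁿ = 1024; at n = 100, 2ⁿ already exceeds 10³⁰.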
Time Analysis

• Provides upper and lower bounds of running time.

Lower Bound ≤ Running Time ≤ Upper Bound

• Different types of analysis:


- Worst case
- Best case
- Average case
Worst Case

• Provides an upper bound on running time.


• An absolute guarantee that the algorithm would not
run longer, no matter what the inputs are.

Lower Bound ≤ Running Time ≤ Upper Bound


Best Case

• Provides a lower bound on running time.


• Input is the one for which the algorithm runs the
fastest.

Lower Bound ≤ Running Time ≤ Upper Bound


Average Case

• Provides an estimate of “average” running time.


• Assumes that the input is random.
• Useful when best/worst cases do not happen very
often (i.e., few input cases lead to best/worst cases).

Lower Bound ≤ Running Time ≤ Upper Bound


Comparing algorithms
• Given two algorithms having running times f(n) and g(n),
how do we decide which one is faster?

• Compare “rates of growth” of f(n) and g(n)


Understanding Rate of
Growth
• Consider the example of buying elephants and fish:

Cost: (cost_of_elephants) + (cost_of_fish)

Approximation:

Cost ~ cost_of_elephants
Understanding Rate of Growth
(cont’d)

• The low-order terms of a function are relatively insignificant for large n

n⁴ + 100n² + 10n + 50

Approximation:

n⁴

• Highest order term determines rate of growth!


Example
• Suppose you are designing a website to process
user data (e.g., financial records).
• Suppose program A takes fA(n) = 30n + 8
microseconds to process any n records, while
program B takes fB(n) = n² + 1 microseconds to
process the n records.
• Which program would you choose, knowing you’ll
want to support millions of users?

Compare rates of growth:


30n + 8 ~ n and n² + 1 ~ n²
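Concretely, at n = 10⁶ records, program A needs about 3 × 10⁷ μs ≈ 30 seconds, while program B needs about 10¹² μs ≈ 11.6 days.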
Growth Function
• Order of growth of functions provides a simple
characterization of efficiency
• Allows for comparison of relative performance between
alternative algorithms
• Concerned with asymptotic efficiency of algorithms
• The algorithm with the best asymptotic efficiency is usually the best choice, except for small inputs
• Several standard methods to simplify asymptotic
analysis of algorithms
Asymptotic Analysis

• Asymptotic analysis is the study of how the performance of an algorithm changes as the order of the input size changes.
Asymptotic Notations

• Mathematical notations used to describe the running time of an algorithm as the input tends towards a particular or limiting value.
Asymptotic Notation
• O notation: asymptotic “less than”: upper bound

f(n) = O(g(n)) implies: f(n) “≤” g(n)

• Ω notation: asymptotic “greater than”: lower bound

f(n) = Ω(g(n)) implies: f(n) “≥” g(n)

• Θ notation: asymptotic “equality”: tight bound
The O-Notation (Upper Bound)
O(g(n)) = { f(n) : ∃c > 0, n₀ > 0 s.t. ∀n ≥ n₀: f(n) ≤ c·g(n) }

[Figure: for n ≥ n₀, f(n) stays below c·g(n)]
Asymptotic O Notation

• O notation: asymptotic “less than”:

f(n)=O(g(n)) implies: f(n) “≤” c g(n) in the limit*

c is a constant

(used in worst-case analysis)

* formal definition in CS477/677
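As a worked check of the definition, using the earlier example: to show fA(n) = 30n + 8 is O(n), choose c = 31 and n₀ = 8; then for every n ≥ 8, 30n + 8 ≤ 30n + n = 31n = c·n.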
The Ω-Notation (Lower Bound)
Ω(g(n)) = { f(n) : ∃c > 0, n₀ > 0 s.t. ∀n ≥ n₀: f(n) ≥ c·g(n) }

[Figure: for n ≥ n₀, f(n) stays above c·g(n)]
Asymptotic Ω Notation

• Ω notation: asymptotic “greater than”:

f(n) = Ω(g(n)) implies: f(n) “≥” c g(n) in the limit*

c is a constant

(used in best-case analysis)

* formal definition in CS477/677
The Θ-Notation (Tight Bound)
Θ(g(n)) = { f(n) : ∃c₁, c₂ > 0, n₀ > 0 s.t. ∀n ≥ n₀:
c₁·g(n) ≤ f(n) ≤ c₂·g(n) }

[Figure: for n ≥ n₀, f(n) lies between c₁·g(n) and c₂·g(n)]
Asymptotic Θ Notation (Tight Bound)

• Θ notation: asymptotic “equality”:

f(n) = Θ(g(n)) implies: f(n) “=” c g(n) in the limit*

c is a constant

(provides a tight bound on running time)

(best and worst cases are the same)

* formal definition in CS477/677
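As a worked example of the definition: to show (1/2)n² − 3n = Θ(n²), choose c₁ = 1/4, c₂ = 1/2, and n₀ = 12; then for every n ≥ 12, (1/4)n² ≤ (1/2)n² − 3n ≤ (1/2)n².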
Big-O Notation - Examples

fA(n) = 30n + 8

fB(n) = n² + 1

10n³ + 2n²

n³ − n²

1273
Big-O Notation - Examples

fA(n) = 30n + 8 is O(n)

fB(n) = n² + 1 is O(n²)

10n³ + 2n² is O(n³)

n³ − n² is O(n³)

1273 is O(1)
Things to Remember
• Asymptotic analysis studies how the values of
functions compare as their arguments grow
without bounds.
• Ignores constants and the behavior of the function
for small arguments.
• Acceptable because all algorithms are fast for small
inputs and growth of running time is more important
than constant factors.
Things to Remember
• Ignoring the usually unimportant details, we obtain a
representation that succinctly describes the growth of
a function as its argument grows and thus allows us to
make comparisons between algorithms in terms of
their efficiency.
Analyzing Linear Search
Algorithm
Operation to count: how many times Num is compared to a member of the array.
Best case: the number we are looking for is at the first position in the array
(1 comparison) O(1)
Average case: the number is found, on average, half-way down the array (sometimes
further, sometimes nearer)
(N/2 + 1 comparisons) O(N)
Worst case: Num must be compared to every element of the array
(N comparisons) O(N)
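A minimal C sketch of the linear search being analyzed (the names A, N, and Num follow the binary search code below):

/* Returns the index of Num in A[0..N-1], or -1 if absent.
   Best case: 1 comparison; worst case: N comparisons. */
int linear_search(const int A[], int N, int Num) {
    for (int i = 0; i < N; i++)
        if (A[i] == Num)
            return i;
    return -1;
}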
Search Algorithm : Binary Search
int search(int A[], int N, int Num) {
    int first = 0;
    int last = N - 1;
    int mid = first + (last - first) / 2;           /* avoids overflow of first + last */
    while ((first <= last) && (A[mid] != Num)) {    /* check bounds before reading A[mid] */
        if (A[mid] > Num)
            last = mid - 1;                         /* discard the upper half */
        else
            first = mid + 1;                        /* discard the lower half */
        mid = first + (last - first) / 2;
    }
    if ((first <= last) && (A[mid] == Num))
        return mid;                                 /* found */
    else
        return -1;                                  /* not found */
}
Analyzing Binary Search
One comparison after loop
First time through the loop, toss half of the array
Second time, half the remainder (1/4 of the original)
Third time, half the remainder (1/8 of the original)

Loop Iteration    Remaining Elements
1                 N/2
2                 N/4
3                 N/8
4                 N/16
??                1

How long to get to 1? After k iterations N/2^k elements remain, so the count reaches 1 when k ≈ log₂N.
Analyzing Binary Search (Contd.)
• The amount of work done before and after the loop is a constant, independent of n
• The amount of work done during a single execution of the loop is constant
• Time complexity will therefore be proportional to the number of times the loop is executed, so that's what we'll analyze
Analyzing Binary Search (Contd.)
• Worst case: the key is not found in the array
• Each time through the loop, at least half of the remaining locations are rejected:
• After the first time through, ≤ n/2 remain
• After the second time through, ≤ n/4 remain
• After the third time through, ≤ n/8 remain
• After the kth time through, ≤ n/2^k remain
Analyzing Binary Search (Contd.)
• In the worst case, the loop runs k times, where 2^k > n ≥ 2^(k-1)
• Take base-2 logarithms to get:
• k > log₂(n) ≥ k − 1
• which is equivalent to:
• log₂(n) < k ≤ log₂(n) + 1
• Thus, the binary search algorithm is O(log₂(n)) in terms of the number of array locations examined
