7 Asymptotic II


Analyzing Algorithms

• Predict the amount of resources required:
  – memory: how much space is needed?
  – computational time: how fast does the algorithm run?
• FACT: running time grows with the size of the input
• Input size (number of elements in the input)
  – size of an array, degree of a polynomial, # of elements in a matrix, # of bits in
    the binary representation of the input, # of vertices and edges in a graph

Def: Running time = the number of primitive operations (steps) executed
before termination
– Arithmetic operations (+, -, *), data movement, control, decision making
  (if, while), comparison
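• For instance, the single statement sum = sum + A[i]; takes a constant number of primitive steps: one data movement to read A[i], one addition, and one data movement to store the result into sum.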
Algorithm Complexity
• Worst Case Complexity:
– the function defined by the maximum number of steps
taken on any instance of size n
• Best Case Complexity:
– the function defined by the minimum number of steps
taken on any instance of size n
• Average Case Complexity:
– the function defined by the average number of steps
taken on any instance of size n
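• As a concrete illustration (a sketch of ours, not taken from the slides), linear search makes the three cases tangible:

/* Linear search in C: returns the index of key in a[0..n-1], or -1 if absent. */
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;   /* best case: key sits at a[0], only 1 comparison */
    return -1;          /* worst case: key not present, n comparisons     */
}

– Best case: 1 comparison (key in the first slot); worst case: n comparisons (key absent);
  average case (key equally likely in every slot): about n/2 comparisons.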
Motivation for Asymptotic Analysis
• An exact computation of worst-case running time
can be difficult
– Function may have many terms:
• 4n² – 3n log n + 17.5n – 43n^(2/3) + 75
• An exact computation of worst-case running time
is unnecessary
– Remember that we are already approximating the running
time by using the RAM model
Classifying functions by their
Asymptotic Growth Rates (1/2)
• asymptotic growth rate, asymptotic order, or
order of functions
– Comparing and classifying functions in a way that ignores
• constant factors and
• small inputs.
• The sets big-oh O(g), big-theta Θ(g), and big-omega Ω(g)
Classifying functions by their
Asymptotic Growth Rates (2/2)
• O(g(n)), Big-Oh of g of n: the asymptotic upper bound;
• Θ(g(n)), Theta of g of n: the asymptotic tight bound; and
• Ω(g(n)), Omega of g of n: the asymptotic lower bound.
Visualization of O(g(n))
[Figure: for all n ≥ n0, f(n) lies at or below c·g(n)]
Big-O
Example 1
Let's say f(n) = 3n + 2 and g(n) = n.
We have to show that there exist constants c and n0 such that:
– f(n) ≤ c·g(n)
– 3n + 2 ≤ c·n for all n ≥ n0

So we can say: for c = 4 and all n ≥ 2,
f(n) ≤ c·g(n)

c   n   f(n) = 3n + 2   c·g(n) = c·n
4   1        5               4
4   2        8               8
4   3       11              12
4   5       17              20

(For n = 1 the inequality fails, which is why n0 = 2.)
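• The chosen constants can also be checked mechanically (a minimal sketch, not part of the original slides; any range of n works equally well):

#include <assert.h>

/* Check f(n) <= c*g(n), i.e. 3n + 2 <= 4n, for n = 2 .. 1000. */
int main(void) {
    for (int n = 2; n <= 1000; n++)
        assert(3 * n + 2 <= 4 * n);
    return 0;
}

Of course a finite check is only an illustration; the real argument is that 3n + 2 ≤ 4n is equivalent to 2 ≤ n, which holds for every n ≥ n0 = 2.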
Big-O
• 2n² = O(n²)
• 1,000,000n² + 150,000 = O(n²)
• 5n² + 7n + 20 = O(n²)
• 2n³ + 2 ≠ O(n²)
More Big-O
• Prove that: 20n² + 2n + 5 = O(n²)
• Let c = 21 and n0 = 4
• 21n² > 20n² + 2n + 5 for all n > 4
  ⇔ n² > 2n + 5 for all n > 4
  TRUE
Big Omega (Ω) Notation
• Ω() – a lower bound
  f(n) = Ω(g(n)): there exist positive constants c and n0 such that
  0 ≤ c·g(n) ≤ f(n) for all n ≥ n0

– n² = Ω(n)
– Let c = 1, n0 = 2
– For all n ≥ 2, n² > 1·n
Visualization of Ω(g(n))
[Figure: for all n ≥ n0, f(n) lies at or above c·g(n)]
Visualization of Ω(g(n))
Example 1
Let's say f(n) = 3n + 2 and g(n) = n.
We have to show that there exist constants c and n0 such that:
– f(n) ≥ c·g(n)
– 3n + 2 ≥ c·n for all n ≥ n0

So we can say: for c = 1 and all n ≥ 1,
f(n) ≥ c·g(n)

c   n   f(n) = 3n + 2   c·g(n) = c·n
1   1        5               1
1   2        8               2
1   3       11               3
1   4       14               4
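• Note that any constant 0 < c ≤ 3 would also work, since 3n + 2 ≥ 3n ≥ c·n for all n ≥ 1; exhibiting one valid pair (c, n0) is all the definition requires.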
Visualization of Ω(g(n))
Example 2
Let's say f(n) = 3n + 2 and g(n) = n².
We would have to show:
– f(n) ≥ c·g(n)
– 3n + 2 ≥ c·n² for all n ≥ n0

We cannot find such a constant c that satisfies this
condition for all n ≥ 1.
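• Reason: 3n + 2 ≥ c·n² would force c ≤ (3n + 2)/n², and this ratio tends to 0 as n grows, so no fixed positive c works for all n ≥ n0. Hence 3n + 2 ≠ Ω(n²).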
Θ-notation
• Θ() provides a tight bound
  f(n) = Θ(g(n)): there exist positive constants c1, c2, and n0 such that
  0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
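• Equivalently, f(n) = Θ(g(n)) exactly when f(n) = O(g(n)) and f(n) = Ω(g(n)); that is why the next example exhibits one constant c2 for the upper bound and one constant c1 for the lower bound.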
Visualization of Θ(g(n))
[Figure: for all n ≥ n0, f(n) lies between c1·g(n) and c2·g(n)]
Visualization of Θ(g(n)) – Example
Let's say f(n) = 3n + 2 and g(n) = n.

Firstly, the upper bound: we have to show f(n) ≤ c2·g(n), i.e. 3n + 2 ≤ c2·n.

c2   n   f(n) = 3n + 2   c2·g(n) = c2·n
4    1        5                4
4    2        8                8
4    3       11               12
4    5       17               20

Secondly, the lower bound: we have to show f(n) ≥ c1·g(n), i.e. 3n + 2 ≥ c1·n.

c1   n   f(n) = 3n + 2   c1·g(n) = c1·n
1    1        5                1
1    2        8                2
1    3       11                3
1    4       14                4

So with c1 = 1, c2 = 4, and n0 = 2 we have c1·n ≤ 3n + 2 ≤ c2·n, i.e. 3n + 2 = Θ(n).
Example 2
• Prove that: 20n³ + 7n + 1000 = Θ(n³)
• Let c2 = 21 and n0 = 10
• 21n³ > 20n³ + 7n + 1000 for all n > 10
  ⇔ n³ > 7n + 1000 for all n > 10
  TRUE, but we also need…
• Let c1 = 20 and n0 = 1
• 20n³ < 20n³ + 7n + 1000 for all n ≥ 1
  TRUE
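• Combining the two bounds with c1 = 20, c2 = 21, and n0 = 10 gives 20n³ ≤ 20n³ + 7n + 1000 ≤ 21n³ for all n > 10, so 20n³ + 7n + 1000 = Θ(n³).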

O(1) constant
• Algorithms whose running time is independent of
the size of the problem's input are said to have
constant time complexity
• Constant time complexity is denoted as O(1)
Some common rules

Example
• Code:
• a = b;

• Complexity: O(1) (a single assignment takes constant time)

Example
• Code:
• sum = 0;
• for (i=1; i <=n; i++)
• sum += n;

• Complexity: O(n) (the loop body executes n times)

Example
• Code:
• sum = 0;
• for (j=1; j<=n; j++)
• for (i=1; i<=j; i++)
• sum++;
• for (k=0; k<n; k++)
• A[k] = k;
• Complexity: O(n²)
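• Why: the nested pair of loops increments sum 1 + 2 + … + n = n(n+1)/2 times, and the final loop performs another n assignments, for a total of n(n+1)/2 + n = O(n²) operations.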

Example
• Code:
• sum1 = 0;
• for (i=1; i<=n; i++)
• for (j=1; j<=n; j++)
• sum1++;
• Complexity: O(n²) (the inner statement executes n·n = n² times)

Example
• Code:
• sum2 = 0;
• for (i=1; i<=n; i++)
• for (j=1; j<=i; j++)
• sum2++;
• Complexity: O(n²)
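• Why: the inner loop runs i times for i = 1, …, n, so sum2 again ends at 1 + 2 + … + n = n(n+1)/2 ≈ n²/2; the count is only a constant factor smaller than in the previous example, so it is still O(n²).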

Example
• Code:
• sum1 = 0;
• for (k=1; k<=n; k*=2)
• for (j=1; j<=n; j++)
• sum1++;
• Complexity: O(n log n)
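• Why: k takes the values 1, 2, 4, …, so the outer loop executes about log₂ n + 1 times, and the inner loop does n increments on each pass, giving roughly n·log₂ n = O(n log n) operations.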

Example
• Code:
• sum2 = 0;
• for (k=1; k<=n; k*=2)
• for (j=1; j<=k; j++)
• sum2++;
• Complexity: O(n)
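• Why: the inner loop runs k times for k = 1, 2, 4, …, up to n, so the total number of increments is 1 + 2 + 4 + … + 2^⌊log₂ n⌋ = 2·2^⌊log₂ n⌋ − 1 < 2n, which is O(n), not O(n log n) as one might guess.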

