Big-Oh - Lec 3

The document discusses the importance of time and space complexity in algorithm comparison, emphasizing that growth rates matter most for large inputs. It introduces big-O notation as a means of describing an upper bound on a function's growth and provides examples of algorithms of varying complexity. It concludes with rules for big-O and examples illustrating different time complexities.

Complexity

In general, we are not so much interested in the time and space complexity for small inputs.

For example, while the difference in time complexity between linear and binary search is negligible for a sequence with n = 10, it is gigantic for n = 2^30.

Complexity

For example, let us assume two algorithms A and B that solve the same class of problems.

The time complexity of A is 5,000n; that of B is 1.1^n for an input with n elements.

For n = 10, A requires 50,000 steps, but B only 3, so B seems to be superior to A.

For n = 1,000, however, A requires 5,000,000 steps, while B requires about 2.5 × 10^41 steps.

Complexity
Comparison: time complexity of algorithms A and B

Input size n    Algorithm A: 5,000n    Algorithm B: 1.1^n
10              50,000                 3
100             500,000                13,781
1,000           5,000,000              2.5 × 10^41
1,000,000       5 × 10^9               4.8 × 10^41392
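These figures can be reproduced directly; below is a minimal Python sketch (the names steps_a and log10_steps_b are illustrative, not from the slides). For n = 1,000,000 the value 1.1^n overflows a float, so the sketch reports B through its base-10 logarithm:

import math

def steps_a(n):
    return 5_000 * n              # algorithm A: 5,000n steps

def log10_steps_b(n):
    return n * math.log10(1.1)    # algorithm B: log10(1.1^n)

for n in (10, 100, 1_000, 1_000_000):
    print(f"n = {n:>9,}   A = {steps_a(n):>13,}   B ~ 10^{log10_steps_b(n):,.1f}")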

Complexity

This means that algorithm B cannot be used for large inputs, while running algorithm A is still feasible.

So what is important is the growth of the complexity functions.

The growth of time and space complexity with increasing input size n is a suitable measure for the comparison of algorithms.

The Growth of Functions
The growth of functions is usually described
using the big-O notation.

Definition: Let f and g be functions from the integers or the real numbers to the real numbers. We say that f(x) is O(g(x)) if there are constants C and k such that

|f(x)| ≤ C|g(x)|

whenever x > k.

The Growth of Functions

When we analyze the growth of complexity functions, f(x) and g(x) are always positive. Therefore, we can simplify the big-O requirement to

f(x) ≤ Cg(x) whenever x > k.

If we want to show that f(x) is O(g(x)), we only need to find one pair (C, k) (which is never unique).

The Growth of Functions
The idea behind the big-O notation is to establish an upper bound on the growth of a function f(x) for large x.

This bound is specified by a function g(x) that is usually much simpler than f(x).

We accept the constant C in the requirement

f(x) ≤ Cg(x) whenever x > k,

because C does not grow with x.

We are only interested in large x, so it is OK if f(x) > Cg(x) for x ≤ k.

The Growth of Functions
Example:
Show that f(x) = x^2 + 2x + 1 is O(x^2).

For x > 1 we have:

x^2 + 2x + 1 ≤ x^2 + 2x^2 + x^2
⇒ x^2 + 2x + 1 ≤ 4x^2

Therefore, for C = 4 and k = 1:
f(x) ≤ Cx^2 whenever x > k.

⇒ f(x) is O(x^2).
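This witness pair can also be checked numerically; a minimal Python sketch (the test limit 10,000 is an arbitrary sample range, not part of the proof):

def f(x):
    return x**2 + 2*x + 1

C, k = 4, 1
# Verify f(x) <= C * x^2 for a sample of x > k.
assert all(f(x) <= C * x**2 for x in range(k + 1, 10_000))
print("f(x) <= 4x^2 holds for all tested x > 1")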

The Growth of Functions

Question: If f(x) is O(x^2), is it also O(x^3)?

Yes. x^3 grows faster than x^2, so x^3 also grows faster than f(x).

Therefore, in practice, we find the smallest simple function g(x) for which f(x) is O(g(x)).

The Growth of Functions
“Popular” functions g(n) are
n log n, 1, 2^n, n^2, n!, n, n^3, log n

Listed from slowest to fastest growth (a comparison at sample values follows the list):

• 1
• log n
• n
• n log n
• n^2
• n^3
• 2^n
• n!
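To see this ordering concretely, one can tabulate each function at a few input sizes; a minimal Python sketch:

import math

growth = [
    ("1",       lambda n: 1),
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n**2),
    ("n^3",     lambda n: n**3),
    ("2^n",     lambda n: 2**n),
    ("n!",      lambda n: math.factorial(n)),
]

for name, g in growth:
    print(f"{name:>8}:  g(10) = {g(10):,.0f}   g(20) = {g(20):,.0f}")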

The Growth of Functions

A problem that can be solved with polynomial worst-case complexity is called tractable.

Problems of higher complexity are called intractable.

Problems that no algorithm can solve are called unsolvable.

You will find out more about this next semester.

Useful Rules for Big-O
For any polynomial f(x) = a_n x^n + a_(n-1) x^(n-1) + … + a_0, where a_0, a_1, …, a_n are real numbers, f(x) is O(x^n).

If f1(x) is O(g1(x)) and f2(x) is O(g2(x)), then (f1 + f2)(x) is O(max(g1(x), g2(x))).

If f1(x) is O(g(x)) and f2(x) is O(g(x)), then (f1 + f2)(x) is O(g(x)).

If f1(x) is O(g1(x)) and f2(x) is O(g2(x)), then (f1 f2)(x) is O(g1(x) g2(x)).
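For example, these rules combine directly: f(x) = 3x^2 + 7x log x is O(x^2), since 3x^2 is O(x^2), 7x log x is O(x log x), and max(x^2, x log x) = x^2 for large x. By the product rule, (x log x) · x^2 is O(x^3 log x).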
Complexity Examples
What does the following algorithm compute?
procedure who_knows(a1, a2, …, an: integers)
    m := 0
    for i := 1 to n-1
        for j := i + 1 to n
            if |ai - aj| > m then m := |ai - aj|
{m is the maximum difference between any two numbers in the input sequence}

Comparisons: (n-1) + (n-2) + (n-3) + … + 1 = (n-1)n/2 = 0.5n^2 - 0.5n

Time complexity is O(n^2).
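A direct Python transcription of this pseudocode (a minimal sketch; the name who_knows is kept from the slide):

def who_knows(a):
    # Maximum difference between any two numbers, via an O(n^2) all-pairs scan.
    m = 0
    n = len(a)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if abs(a[i] - a[j]) > m:
                m = abs(a[i] - a[j])
    return m

print(who_knows([3, 9, 1, 7]))  # prints 8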

Complexity Examples
Another algorithm solving the same problem:
procedure max_diff(a1, a2, …, an: integers)
    min := a1
    max := a1
    for i := 2 to n
        if ai < min then min := ai
        else if ai > max then max := ai
    m := max - min

Comparisons: 2n - 2

Time complexity is O(n).
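The same single-pass algorithm in Python (a minimal sketch; lo and hi stand in for the pseudocode's min and max to avoid shadowing Python built-ins):

def max_diff(a):
    # Track the minimum and maximum in one O(n) pass; at most 2n - 2 comparisons.
    lo = hi = a[0]
    for x in a[1:]:
        if x < lo:
            lo = x
        elif x > hi:
            hi = x
    return hi - lo

print(max_diff([3, 9, 1, 7]))  # prints 8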

