Asymptotic Notation,
Review of Functions &
Summations
Asymptotic Complexity
⬥ Running time of an algorithm as a function of
input size n for large n.
⬥ Expressed using only the highest-order term in
the expression for the exact running time.
⬧ Instead of exact running time, say Θ(n²).
⬥ Describes behavior of function in the limit.
⬥ Written using Asymptotic Notation.
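As a concrete illustration (the running time T(n) = 10n² − 3n below is a hypothetical example, not from the slides), the ratio of the exact running time to its highest-order term settles toward a constant, which is why only the leading term matters for large n:

```python
# Hypothetical exact running time; only the leading term matters for large n.
def T(n):
    return 10 * n**2 - 3 * n

# The ratio T(n) / n^2 approaches the leading coefficient (10),
# which is why we summarize T(n) as simply Theta(n^2).
for n in [10, 100, 1000, 10000]:
    print(n, T(n) / n**2)
```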
asymp - 1 Comp 122
Asymptotic Notation
⬥ Θ, O, Ω, o, ω
⬥ Defined for functions over the natural numbers.
⬧ Ex: f(n) = Θ(n²).
⬧ Describes how f(n) grows in comparison to n².
⬥ Each notation defines a set of functions; in practice,
they are used to compare the growth of two functions.
⬥ The notations describe different rate-of-growth
relations between the defining function and the
defined set of functions.
Θ-notation
For function g(n), we define Θ(g(n)),
big-Theta of g(n), as the set:
Θ(g(n)) = {f(n) :
∃ positive constants c1, c2, and
n0, such that ∀n ≥ n0,
we have 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n)
}
Intuitively: Set of all functions that
have the same rate of growth as g(n).
g(n) is an asymptotically tight bound for f(n).
Technically, f(n) ∈ Θ(g(n)).
Older usage, f(n) = Θ(g(n)).
I’ll accept either…
f(n) and g(n) are assumed to be nonnegative for large n (asymptotically nonnegative).
Example
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0,
such that ∀n ≥ n0, 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n)}
⬥ 10n² − 3n = Θ(n²)
⬥ What constants for n0, c1, and c2 will work?
⬥ Make c1 a little smaller than the leading
coefficient, and c2 a little bigger.
⬥ To compare orders of growth, look at the
leading term.
⬥ Exercise: Prove that n²/2 − 3n = Θ(n²)
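The advice above can be checked numerically. A minimal sketch, assuming the constants c1 = 9, c2 = 10, n0 = 3 (chosen per the slide's rule of thumb: c1 a little below the leading coefficient 10, c2 at it):

```python
# Sketch: check the Theta definition for f(n) = 10n^2 - 3n against
# g(n) = n^2 over a sample of n values.
def f(n):
    return 10 * n**2 - 3 * n

def g(n):
    return n**2

c1, c2, n0 = 9, 10, 3
ok = all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
print(ok)  # True for this sample
```

A finite sample is not a proof, but it is a quick sanity check that the chosen constants are plausible; the algebra (9n² ≤ 10n² − 3n holds exactly when n ≥ 3) completes the argument.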
O-notation
For function g(n), we define O(g(n)),
big-O of g(n), as the set:
O(g(n)) = {f(n) :
∃ positive constants c and n0,
such that ∀n ≥ n0,
we have 0 ≤ f(n) ≤ cg(n) }
Intuitively: Set of all functions
whose rate of growth is the same as
or lower than that of g(n).
g(n) is an asymptotic upper bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).
Θ(g(n)) ⊂ O(g(n)).
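To see that the containment is proper, a sketch (the function 3n + 5 and the constants c = 1, n0 = 5 are illustrative choices, not from the slides): 3n + 5 is O(n²) but not Θ(n²), since no positive c1 keeps c1·n² ≤ 3n + 5 for all large n.

```python
# Sketch: 3n + 5 is O(n^2) with c = 1, n0 = 5, i.e. the upper bound
# 0 <= 3n + 5 <= c * n^2 holds for all sampled n >= n0.
c, n0 = 1, 5
upper_ok = all(0 <= 3 * n + 5 <= c * n**2 for n in range(n0, 1000))
print(upper_ok)
```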
Ω-notation
For function g(n), we define Ω(g(n)),
big-Omega of g(n), as the set:
Ω(g(n)) = {f(n) :
∃ positive constants c and n0,
such that ∀n ≥ n0,
we have 0 ≤ cg(n) ≤ f(n)}
Intuitively: Set of all functions
whose rate of growth is the same
as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).
Θ(g(n)) ⊂ Ω(g(n)).
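Again the containment is proper. A sketch (n³ and the constants c = 1, n0 = 1 are illustrative choices): n³ is Ω(n²), but it is not O(n²), hence not Θ(n²).

```python
# Sketch: n^3 is Omega(n^2) with c = 1, n0 = 1, i.e. the lower bound
# 0 <= c * n^2 <= n^3 holds for all sampled n >= n0.
c, n0 = 1, 1
lower_ok = all(0 <= c * n**2 <= n**3 for n in range(n0, 1000))
print(lower_ok)
```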
Relations Between Θ, O, Ω
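Only the title of this slide survived extraction; presumably its figure illustrated the standard theorem relating the three notations, which for any two functions f(n) and g(n) states:

```latex
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))
```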
Definition
An algorithm is any well-defined
computational procedure that
takes some value, or set of
values, as input and produces
some value, or set of values, as
output.
Definition
A sequence of computational steps
that transforms the input into
the output.
Algorithms
Properties of algorithms:
• Input: An algorithm has zero or more inputs.
• Output: An algorithm produces at least one output.
• Clear and unambiguous: Each step of the algorithm should be
clear and lead to only one meaning.
• Language independent: The algorithm must be expressible
independently of any particular programming language.
Algorithms
Properties of algorithms:
• Definiteness: Every step in the computation, and every
fundamental operation in each instruction, must be defined
without ambiguity.
• Correctness: The output must be correct for every possible input.
• Finiteness: An algorithm must terminate after a finite number
of steps in all cases; every instruction must complete in a finite
amount of time.
• Effectiveness: An algorithm must be built from basic, simple,
and feasible operations.
The Growth of Functions
“Popular” functions g(n) are
n log n, 1, 2ⁿ, n², n!, n, n³, log n
Increasing rates of growth:
• 1
• log n
• n
• n log n
• n²
• n³
• 2ⁿ
• n!
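The ordering above can be sketched by evaluating each function at a moderate input size (n = 20 is an arbitrary illustrative choice):

```python
import math

# Evaluate each "popular" function at n = 20 to see the increasing
# rates of growth listed above.
n = 20
growth = [
    ("1", 1),
    ("log n", math.log2(n)),
    ("n", n),
    ("n log n", n * math.log2(n)),
    ("n^2", n**2),
    ("n^3", n**3),
    ("2^n", 2**n),
    ("n!", math.factorial(n)),
]
for name, value in growth:
    print(f"{name:8} {value:,.0f}")

# At n = 20 the values already appear in strictly increasing order.
values = [v for _, v in growth]
print(values == sorted(values))
```

Note that the ordering is asymptotic: at very small n some pairs swap (e.g. 2ⁿ < n² for n = 2), which is exactly why the definitions quantify over all n ≥ n0.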