Introduction to Algorithms
Algorithms
What is an Algorithm?
A method of solving a problem by a finite sequence of instructions.
Algorithms
Is there a problem for which there cannot be an Algorithm?
- Print all the integers
- Halting Problem
- Hilbert's tenth problem
Efficient Algorithms?
NP-Hard problems: Example TSP
Travelling Salesman Problem: if a salesman wants to visit all the cities, what is the shortest tour?
Efficient Algorithms
There are some problems whose complexity is not known:
- Factorisation: given an integer, find its factors.
  (Primality, deciding whether a given number is prime, can be solved efficiently.)
- Turnpike Reconstruction problem: there is an algorithm which works well most of the time, but with no guarantee!
Efficient Algorithms?
This course is about designing and analysing efficient algorithms.
The focus will be on developing skills and techniques.
Performance of Algorithms
What metric should be used to judge algorithms?
– How many steps does it take? (Running time)
– Memory required
• Running time is the dominant standard.
– Quantifiable and easy to compare
– Often the critical bottleneck
How long does this program take?
#include <stdio.h>

int main(void) {
    int i, j, k, n, s;
    n = 10000; s = 2;
    /* n^3 = 10^12 iterations: at roughly 10^9 simple operations per
       second, this is on the order of 10^3 seconds. */
    for (i = 0; i < n; ++i)
        for (j = 0; j < n; ++j)
            for (k = 0; k < n; ++k)
                s = 2*s + 1;   /* note: s overflows int almost immediately */
    printf("%d\n", s);
    return 0;
}
What is the output ?
#include #include
<stdio.h><stdio.h>
void main(){
void main(){int i,j,k,n,a[10000000];
n=1000;
int i,j,k,n,a[10000000];
n=1000; for (i=0;i<n;++i)
for (j=0;j<n;++j)
for (i=0;i<n;++i) for (k=0;k<n;++k)
a[i*n*n+j*n+k]=2*i*j*k+1;
}
for (j=0;j<n;++j)
for (k=0;k<n;++k)
a[i*n*n+j*n+k]=2*i*j*k+1;
}
Why Efficient Algorithms Matter?
• Suppose N = 10^8.
• A PC can process 10N records in 1 sec.
• But if some algorithm does N*N computations, then it takes 10^7 seconds ≈ 115 days!
• 100 City Travelling Salesman Problem:
– A supercomputer checking 100 billion tours/sec still requires 10^10 years!
Asymptotic Notations
• Big-O, "bounded above by": T(n) = O(f(n))
– For some c and N, T(n) ≤ c·f(n) whenever n > N.
– Example: T(n) = 2n² + n − 3
T(n) = 2n² + n − 3
     < 2n² + n     when n > 0
     ≤ 2n² + n²    when n ≥ 1
     = 3n²
So we can say T(n) = O(n²).
Asymptotic Notations
• Big-Omega, "bounded below by": T(n) = Ω(f(n))
– For some c > 0 and N, T(n) ≥ c·f(n) whenever n > N.
– Example: T(n) = 2n² + n − 3
T(n) = 2n² + n − 3
     > 2n²    when n > 3
So T(n) = Ω(n²).
Asymptotic Notations
• Big-Theta, "bounded above and below": T(n) = Θ(f(n))
– T(n) = O(f(n)) and also T(n) = Ω(f(n))
– Example: T(n) = 2n² + n − 3
2n² < 2n² + n − 3 < 3n²    when n > 3
So T(n) = Θ(n²).
[Figures: the three definitions pictured as curves against n, each beyond a threshold n0.
Θ: there exist positive constants c1, c2 and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0, i.e. f(n) = Θ(g(n)).
O: there exist a positive constant c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0, i.e. f(n) = O(g(n)).
Ω: there exist a positive constant c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0, i.e. f(n) = Ω(g(n)).]
Complexity and Tractability
T(n)
n      n        n log n   n²       n³       n⁴          n¹⁰           2ⁿ
10     .01μs    .03μs     .1μs     1μs      10μs        10s           1μs
20     .02μs    .09μs     .4μs     8μs      160μs       2.84h         1ms
30     .03μs    .15μs     .9μs     27μs     810μs       6.83d         1s
40     .04μs    .21μs     1.6μs    64μs     2.56ms      121d          18m
50     .05μs    .28μs     2.5μs    125μs    6.25ms      3.1y          13d
100    .1μs     .66μs     10μs     1ms      100ms       3171y         4×10¹³y
10³    1μs      9.96μs    1ms      1s       16.67m      3.17×10¹³y    3.2×10²⁸³y
10⁴    10μs     130μs     100ms    16.67m   115.7d      3.17×10²³y
10⁵    100μs    1.66ms    10s      11.57d   3171y       3.17×10³³y
10⁶    1ms      19.92ms   16.67m   31.71y   3.17×10⁷y   3.17×10⁴³y
Assume the computer does 1 billion ops per sec.
Big O Fact
• A polynomial of degree k is O(n^k)
• Proof:
– Suppose f(n) = b_k n^k + b_{k-1} n^{k-1} + … + b_1 n + b_0
• Let a_i = |b_i|
– f(n) ≤ a_k n^k + a_{k-1} n^{k-1} + … + a_1 n + a_0
       ≤ n^k (a_k + a_{k-1} + … + a_0)    when n ≥ 1
       = c·n^k, where c = Σ a_i
Simplifications
• Keep just one term!
– the fastest growing term (dominates the runtime)
• Asymptotic behavior (as n gets large) is determined entirely by the leading term.
Example: T(n) = 10n³ + n² + 40n + 800
• If n = 1,000, then T(n) = 10,001,040,800
• The error is only about 0.01% if we drop all but the 10n³ term.
Runtime Analysis
• Two important rules
– Rule of sums
• if you do a number of operations in sequence, the
runtime is dominated by the most expensive
operation
– Rule of products
• if you repeat an operation a number of times, the
total runtime is the runtime of the operation
multiplied by the iteration count
Review
Operations on Stacks and Queues: Θ(1)
Linked List:
Add/delete at the beginning is Θ(1)
Add/delete at the end is Θ(n)
Search/iterative reverse is Θ(n)
Review
Recursive reverse is Θ(n²).
T(n) = n + T(n-1)
     = n + (n-1) + (n-2) + … + 1
     = n(n+1)/2
     = Θ(n²)
Iterative Recursion
voidreverse (struct List **head){
struct List *p = NULL, *c = *head, *n;
while (c) {
n = c->next;
c->next = p;
p = c;
c = n; } p c n
*head = p;}
Floyd’s Cycle Finding Algorithm
#include <stdbool.h>

bool Cycle(struct List *node) {
    struct List *t1, *t2;
    t1 = t2 = node;                 /* slow (t1) and fast (t2) pointers */
    while (t2) {
        t2 = t2->next;              /* fast takes its first step... */
        if (t2 == t1) return true;  /* ...and is checked against slow */
        t1 = t1->next;              /* slow takes one step */
        if (t2) t2 = t2->next;      /* fast takes its second step */
    }
    return false;                   /* fast fell off the end: no cycle */
}