
CS 210a

Assignment 1 (Concept)
Solutions

1. (10 points) What is the largest problem size n that can be solved in 1 minute by an
algorithm whose running time, in microseconds (1 second = 1,000,000 microseconds), is
(a) log n (b) sqrt(n) (c) n (d) n^2 (e) 2^n

Solution: In each case, we need to solve for n an equation of the form f(n) = 6*10^7, since 1 minute = 60 seconds = 6*10^7 microseconds.


(a) log n = 6*10^7, so n = 2^(6*10^7).
(b) sqrt(n) = 6*10^7, so n = (6*10^7)^2 = 36*10^14.
(c) n = 6*10^7; nothing to solve here.
(d) n^2 = 6*10^7, so n = sqrt(6*10^7), which is approximately 7745.
(e) 2^n = 6*10^7, so n = log(6*10^7), which is approximately 25.
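These values can be double-checked numerically. A small Python sketch (the variable names are ours, not part of the assignment):

```python
import math

BUDGET = 60 * 10**6  # 1 minute in microseconds = 6*10^7

# (a) log n = 6*10^7  =>  n = 2^(6*10^7), astronomically large (not computed here)
# (b) sqrt(n) = 6*10^7  =>  n = (6*10^7)^2
n_b = BUDGET ** 2             # 36*10^14
# (c) n = 6*10^7
n_c = BUDGET
# (d) n^2 = 6*10^7  =>  n = floor(sqrt(6*10^7))
n_d = math.isqrt(BUDGET)      # approximately 7745
# (e) 2^n = 6*10^7  =>  n = floor(log2(6*10^7))
n_e = int(math.log2(BUDGET))  # approximately 25

print(n_b, n_c, n_d, n_e)
```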

2. (20 points) Use the definition of "big Oh" to prove that (n+100)^3 is O(n^3) and n^3 is
O((n+100)^3).
Solution: a) n^3 <= 1*(n+100)^3 for all n >= 1, so n^3 is O((n+100)^3) (with c = 1, n0 = 1).
b) (n+100)^3 = n^3 + 300n^2 + 30000n + 10^6 is a polynomial of degree 3 with 4 terms; the largest coefficient of any term is
10^6. Thus (n+100)^3 <= 4*10^6*n^3 for all n >= 1, and so (n+100)^3 is O(n^3) (with c = 4*10^6, n0 = 1).
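The two inequalities can be spot-checked numerically (a sanity check at a few points, not a substitute for the proof):

```python
# Spot-check n^3 <= (n+100)^3 and (n+100)^3 <= 4*10^6 * n^3 for several n >= 1
for n in [1, 10, 100, 1000, 10**6]:
    assert n**3 <= (n + 100)**3
    assert (n + 100)**3 <= 4 * 10**6 * n**3
print("both inequalities hold on the sample")
```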

3. (20 points) Use the definition of "big Oh" to prove that if d(n) is O(f(n)) and e(n) is O(g(n)),
then d(n)e(n) is O(f(n)g(n)).

Solution: Since d(n) is O(f(n)), there are constants c and n0 s.t. d(n) <= c*f(n) for all n >= n0. Also, since e(n) is O(g(n)), there are constants k and m0 s.t. e(n) <= k*g(n) for all n >= m0. Now d(n)*e(n) <= k*c*f(n)*g(n) for all n >= max(n0, m0), and so d(n)e(n) is O(f(n)g(n)).
Alternatively, you can say d(n)*e(n) <= k*c*f(n)*g(n) for all n >= n0*m0, since n0 and m0 are integers that are at least 1, and so n0*m0 >= max(n0, m0).
Wrong solution 1: Saying e(n)*d(n) <= max(k,c)*f(n)*g(n) for all n >= max(n0, m0)
does not work. For example, n <= 2*(0.5n) for all n, and 6n <= 3*(2n) for all n, but n*(6n) = 6n^2 is not <=
max(2,3)*(0.5n*2n) = 3n^2. When we say e(n)*d(n) <= k*c*f(n)*g(n), we use the
following rule: if a <= b and c <= d, then a*c <= b*d, provided a, b, c, d are positive. There is no
rule that lets you replace the product k*c by max(k,c).
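The counterexample above is easy to verify numerically:

```python
# d(n) = n <= 2*(0.5n)  and  e(n) = 6n <= 3*(2n)  both hold, yet
# d(n)*e(n) = 6n^2 is NOT <= max(2,3)*(0.5n * 2n) = 3n^2 for n >= 1
for n in [1, 5, 100]:
    assert n <= 2 * (0.5 * n)
    assert 6 * n <= 3 * (2 * n)
    assert not (n * (6 * n) <= max(2, 3) * (0.5 * n) * (2 * n))
print("counterexample confirmed")
```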
Wrong solution 2: Some people took a specific example for d(n), f(n), e(n), g(n) and proved the
statement for that example. There is no such technique as "proof by example":
if the statement works for one example, there is no guarantee that it will work for
another. You had to prove the statement in general, and you had an example
from the previous term's homework of how to do it. You can disprove a statement by
giving an example where it fails (a counterexample), but you cannot prove one by example.
4. (30 points) Order the following list of functions by the big-Oh notation. Group together (for
example, by circling) those functions that are big-Theta of one another.
sqrt(1000n), n + 2^n, n^2 log n, 5^n, 4n^(3/2), n^(3/2) log n,
2^(2 log n), log log n, 6 sqrt(n), 2^100, 2^10 n + 0.01n^2, n^(3/2) + n
Solution: I give here detailed explanations, which were not required from you for this problem.
The final order, from smallest to largest, with big-Theta groups in braces:
2^100, log log n, {sqrt(1000n), 6 sqrt(n)}, {4n^(3/2), n^(3/2) + n}, n^(3/2) log n,
{2^(2 log n), 2^10 n + 0.01n^2}, n^2 log n, n + 2^n, 5^n.

2^100
2^100 <= 2^100 * (log log n) for n >= 4, but trying to prove that log log n <= c*2^100
for n >= n0 leads to a contradiction, since log log n eventually grows larger than
any constant.

log log n
log log n <= sqrt(1000n) for n >= 2, but trying to prove that sqrt(1000n) <=
c*log log n for n >= n0 leads to a contradiction.

sqrt(1000n) and 6 sqrt(n) (big-Theta of each other)
6 sqrt(n) <= 1*sqrt(1000n) for n >= 1, since 6 <= sqrt(1000), and
sqrt(1000n) <= 6*(6 sqrt(n)) for n >= 1, since sqrt(1000) <= 36.

6 sqrt(n) <= 2*(4n^(3/2)) for all n >= 1, but trying to go the other way leads to a
contradiction: 4n^(3/2) <= c*(6 sqrt(n)) for all n >= n0 would imply 4n <= 6c for all n >= n0.

4n^(3/2) and n^(3/2) + n (big-Theta of each other)
4n^(3/2) <= 4*(n^(3/2) + n) for all n >= 1, and n^(3/2) + n <= 2n^(3/2) for all n >= 1.

n^(3/2) + n <= 2*(n^(3/2) log n) for all n >= 2, but trying to prove the other way leads to a
contradiction. Suppose that there are c, n0 s.t. n^(3/2) log n <= c*(n^(3/2) + n) for all n >=
n0. Then n^(3/2)*(log n - c) <= c*n for all n >= n0, which implies n^(1/2)*(log n - c) <= c for all n
>= n0; but n^(1/2)*(log n - c) goes to infinity.

n^(3/2) log n
2^(2 log n) = n^2, and n^(3/2) log n <= n^2 for all n >= 1, but trying to go the other way we get a
contradiction. Suppose that there are c, n0 s.t. n^2 <= c*(n^(3/2) log n) for all n >= n0.
Then, taking logarithms of both sides, 2 log n <= log c + (3/2)(log n) + (log log
n) for all n >= n0, which implies 0.5*(log n) - (log log n) <= log c for all n >= n0. This is
false, since log n increases much more rapidly than log log n, and therefore
0.5*(log n) - (log log n) goes to infinity.

2^(2 log n) and 2^10 n + 0.01n^2 (big-Theta of each other)
0.01n^2 + 2^10 n is O(n^2) by the polynomial rule. And n^2 <= 100*(0.01n^2 + 2^10 n) for all
n >= 1, so the other direction holds too.

0.01n^2 + 2^10 n <= 2^10*(n^2 log n) for all n >= 2. But the other way around would
imply that for some constants c and n0, n^2 log n <= c*(0.01n^2 + 2^10 n) for all n >= n0,
which would mean n^2*(log n - 0.01c) <= c*2^10 n for all n >= n0, which in turn
would imply n*(log n - 0.01c) <= c*2^10 for all n >= n0; we get a contradiction, since
the left-hand side goes to infinity.

n^2 log n
n^2 log n <= n + 2^n for all n >= 10. The other way around is false, since 2^n grows
much faster than n^2 log n.

n + 2^n
n + 2^n <= 2^n * 2^n for n >= 1, and 2^n * 2^n = 4^n <= 5^n for n >= 1; thus n + 2^n <=
5^n for n >= 1. Trying to go the other way, we get a contradiction. Suppose
there are constants c and n0 s.t. 5^n <= c*(n + 2^n) for all n >= n0. This implies 5^n
<= c*(2^n * 2^n) for all n >= n0. Then, taking logarithms of both sides, n*log 5 <=
log c + 2n for all n >= n0, which implies n*(log 5 - 2) <= log c for all n >= n0.
This is false because log 5 - 2 > 0, and so n*(log 5 - 2) goes to infinity.

5^n
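The witness inequalities above can be spot-checked at a single point, e.g. n = 16 (a sanity check at one value, not a proof; the constants are the ones used in the solution):

```python
import math

n = 16
lg = math.log2

assert 2**100 <= 2**100 * lg(lg(n))                      # 2^100 vs log log n
assert lg(lg(n)) <= math.sqrt(1000 * n)                  # log log n vs sqrt(1000n)
assert 6 * math.sqrt(n) <= math.sqrt(1000 * n)           # Theta group:
assert math.sqrt(1000 * n) <= 6 * (6 * math.sqrt(n))     #   sqrt(1000n) ~ 6 sqrt(n)
assert 6 * math.sqrt(n) <= 2 * (4 * n**1.5)              # vs 4 n^(3/2)
assert 4 * n**1.5 <= 4 * (n**1.5 + n)                    # Theta group:
assert n**1.5 + n <= 2 * n**1.5                          #   4 n^(3/2) ~ n^(3/2) + n
assert n**1.5 + n <= 2 * (n**1.5 * lg(n))                # vs n^(3/2) log n
assert n**1.5 * lg(n) <= n**2                            # vs 2^(2 log n) = n^2
assert n**2 <= 100 * (0.01 * n**2 + 2**10 * n)           # Theta: n^2 ~ 2^10 n + 0.01 n^2
assert 0.01 * n**2 + 2**10 * n <= 2**10 * (n**2 * lg(n)) # vs n^2 log n
assert n**2 * lg(n) <= n + 2**n                          # vs n + 2^n
assert n + 2**n <= 5**n                                  # vs 5^n
print("all inequalities hold at n =", n)
```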
5. (20 points) Give the asymptotic (big Oh) complexity of the following algorithm, show all the
work you do.

Algorithm Foo(A[], n)              # of operations

x <- 0                             1
for i <- 0 to n-1 do               n
    for j <- i to n-1 do           n + (n-1) + ... + 2 + 1 = 0.5*n*(n+1)
        x <- x + A[j]              n + (n-1) + ... + 2 + 1 = 0.5*n*(n+1)
    for k <- 1 to n^2 do           n^2 + n^2 + ... + n^2 (n times) = n^3
        x <- x + k*A[i]            n^2 + n^2 + ... + n^2 (n times) = n^3

Thus the running time is 1 + n + 2*(0.5*n*(n+1)) + 2*n^3 = 2n^3 + n^2 + 2n + 1, which is O(n^3).
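A direct Python transcription of Foo with a loop-body counter confirms the accounting (the counter and the function name are ours, added for illustration):

```python
def foo(A):
    """Transcription of Algorithm Foo; returns (x, number of loop-body steps)."""
    n = len(A)
    x = 0
    steps = 0
    for i in range(n):                # runs n times
        for j in range(i, n):         # n + (n-1) + ... + 1 = 0.5*n*(n+1) bodies total
            x = x + A[j]
            steps += 1
        for k in range(1, n**2 + 1):  # n^2 bodies per i, so n^3 bodies total
            x = x + k * A[i]
            steps += 1
    return x, steps

# The dominant term is the k-loop: n * n^2 = n^3 body executions.
_, s = foo([1] * 10)
print(s)  # 10*11/2 + 10^3 = 55 + 1000 = 1055
```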
