
CHAPTER I: INTRODUCTION

Introduction
What is an Algorithm?
An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for obtaining a
required output for any set of inputs in a finite amount of time.

Properties of algorithms:

An algorithm should possess the following properties:


• Input: An algorithm has zero or more inputs, taken from a specified set of objects.
• Output: An algorithm has one or more outputs, which have a specified relation to the inputs.
• Definiteness: Each step in the algorithm must be specific and unambiguous.
• Finiteness: The algorithm must always terminate after a finite number of steps.
• Effectiveness: All operations to be performed must be sufficiently basic that they can be
done exactly and in a finite length of time.

Design of algorithm

Various steps for algorithm design are:


1. Understanding the problem.
2. Making decisions about:
a. The capabilities of the computational device (a sequential or a parallel algorithm).
b. Whether to solve the problem exactly or approximately.
c. The data structures to use.
d. The algorithmic strategy:
• Brute force.
• Divide and Conquer.
• Greedy technique.
• Dynamic programming.
• Backtracking.
3. Specification of algorithm.
• Using natural language.
• Pseudocode.
• Flowchart.
4. Algorithmic verification.
5. Analysis of algorithm.
• Time efficiency.
• Space efficiency.
• Simplicity.
• Generality.
• Range of input.
6. Implementation or coding of the algorithm.
Testing of algorithm

There are two phases for testing a program:


• Debugging.
• Profiling.

Case study: Insertion sort


The sorting problem

Input: a sequence of n numbers ⟨a1, a2, ..., an⟩.
Output: a permutation (reordering) ⟨a'1, a'2, ..., a'n⟩ of the input sequence such that a'1 ≤ a'2 ≤ ... ≤ a'n.

The sequences are typically stored in arrays.

Expressing algorithms

We express algorithms in whatever way is the clearest and most concise. English is sometimes the
best way. When issues of control need to be made perfectly clear, we often use pseudocode.

• Pseudocode is similar to C, C++, Pascal, and Java. If you know any of these languages, you
should be able to understand pseudocode.

• Pseudocode is designed for expressing algorithms to humans. Software engineering issues of data abstraction, modularity, and error handling are often ignored.

• We sometimes embed English statements into pseudocode. Therefore, unlike "real" programming languages, we cannot create a compiler that translates pseudocode to machine code.

Insertion sort

A good algorithm for sorting a small number of elements. It works the way you might sort a hand of playing cards:

• Start with an empty left hand and the cards face down on the table.
• Then remove one card at a time from the table, and insert it into the correct position in the left hand.
• To find the correct position for a card, compare it with each of the cards already in the hand, from right to left.
• At all times, the cards held in the left hand are sorted, and these cards were originally the top cards of the pile on the table.
[Figure 2.1: Sorting a hand of cards using insertion sort.]
Pseudocode

We present our pseudocode for insertion sort as a procedure called INSERTION-SORT, which takes as a parameter an array A[1..n] containing a sequence of length n that is to be sorted. (In the code, the number n of elements in A is denoted by A.length.) The algorithm sorts the input numbers in place: it rearranges the numbers within the array A, with at most a constant number of them stored outside the array at any time. The input array A contains the sorted output sequence when the INSERTION-SORT procedure is finished.

INSERTION-SORT(A)                                            cost   times
1  for j = 2 to A.length                                     c1     n
2      key = A[j]                                            c2     n - 1
3      // Insert A[j] into the sorted sequence A[1..j-1].    0      n - 1
4      i = j - 1                                             c4     n - 1
5      while i > 0 and A[i] > key                            c5     Σ(j=2..n) t_j
6          A[i+1] = A[i]                                     c6     Σ(j=2..n) (t_j - 1)
7          i = i - 1                                         c7     Σ(j=2..n) (t_j - 1)
8      A[i+1] = key                                          c8     n - 1

Here t_j denotes the number of times the while-loop test in line 5 is executed for that value of j; the "cost" column gives the constant cost of executing each line once, and the "times" column gives how many times that line is executed.
Example

Figure 2.2 shows the operation of INSERTION-SORT on the array A = ⟨5, 2, 4, 6, 1, 3⟩. Array indices appear above the rectangles, and values stored in the array positions appear within the rectangles. Parts (a)-(e) are the iterations of the for loop of lines 1-8: in each iteration, the black rectangle holds the key taken from A[j], which is compared with the values in the shaded rectangles to its left in the test of line 5. Shaded arrows show array values moved one position to the right in line 6, and black arrows indicate where the key moves to in line 8. Part (f) is the final sorted array.

Read the figure row by row. Each part shows what happens for a particular iteration with the value of j indicated: j indexes the "current card" being inserted into the hand. Elements to the left of A[j] that are greater than A[j] move one position to the right, and A[j] moves into the evacuated position. The heavy vertical lines separate the part of the array in which an iteration works, A[1..j], from the part of the array that is unaffected by this iteration, A[j+1..n].

The successive array contents are:

(a) 5 2 4 6 1 3   (initial array, j = 2)
(b) 2 5 4 6 1 3   (after the j = 2 iteration)
(c) 2 4 5 6 1 3   (after the j = 3 iteration)
(d) 2 4 5 6 1 3   (after the j = 4 iteration)
(e) 1 2 4 5 6 3   (after the j = 5 iteration)
(f) 1 2 3 4 5 6   (after the j = 6 iteration: the final sorted array)
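For readers who prefer working code to pseudocode, here is a direct C translation of the procedure above (a sketch; C arrays are 0-based, so the indices are shifted by one relative to the pseudocode):

#include <stdio.h>

/* Insertion sort: a C translation of INSERTION-SORT(A).
   C arrays start at index 0, so j runs from 1 to n - 1. */
void insertion_sort(int a[], int n)
{
    for (int j = 1; j < n; j++) {
        int key = a[j];              /* the "card" being inserted */
        int i = j - 1;
        while (i >= 0 && a[i] > key) {
            a[i + 1] = a[i];         /* shift larger elements one position right */
            i = i - 1;
        }
        a[i + 1] = key;              /* drop the key into the evacuated position */
    }
}

int main(void)
{
    int a[] = { 5, 2, 4, 6, 1, 3 };
    insertion_sort(a, 6);
    for (int k = 0; k < 6; k++)
        printf("%d ", a[k]);         /* prints: 1 2 3 4 5 6 */
    printf("\n");
    return 0;
}
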
Correctness and loop invariants

We often use a loop invariant to help us understand why an algorithm gives the correct answer. Here is the loop invariant for INSERTION-SORT:

At the start of each iteration of the for loop of lines 1-8, the subarray A[1..j-1] consists of the elements originally in A[1..j-1], but in sorted order.

The invariant holds before the first iteration (A[1..1] is a single element, hence trivially sorted), each iteration preserves it by inserting A[j] into its correct place among A[1..j-1], and when the loop terminates with j = n + 1 the invariant tells us that A[1..n] is sorted.
Order of Growth

An abstraction to ease analysis and focus on the important features: look only at the leading term of the formula for the running time.

• Drop lower-order terms.
• Ignore the constant coefficient in the leading term.

Asymptotic notation is a way to describe the behavior of functions in the limit (we are studying asymptotic efficiency) and to describe the growth of functions; it focuses on what is important by abstracting away low-order terms and constant factors, and it is how we indicate the running times of algorithms.

Example: for insertion sort, we can easily show that in the worst case the running time is of the form an² + bn + c (a, b, and c are some constants).

• Drop the lower-order terms ⇒ an².
• Ignore the constant coefficient ⇒ n².

Asymptotic notation is also a way to compare "sizes" of functions:

O ≈ ≤
Ω ≈ ≥
Θ ≈ =
o ≈ <
ω ≈ >

However, we cannot say that the worst-case running time T(n) equals n². It grows like n², but it does not equal n². We say that the running time is Θ(n²) to capture the notion that the order of growth is n².

We usually consider one algorithm to be more efficient than another if its worst-case running time has a smaller order of growth.
Asymptotic notations

O-notation (O ≈ ≤)

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.

[Figure: f(n) lies on or below c·g(n) for all n ≥ n0.]

g(n) is an asymptotic upper bound for f(n).

If f(n) ∈ O(g(n)), we write f(n) = O(g(n)) (this will be explained precisely soon).
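As a quick worked example (not from the original notes, but a standard check of the definition): 3n² + 10n = O(n²), because for all n ≥ 1 we have 10n ≤ 10n², so 3n² + 10n ≤ 3n² + 10n² = 13n²; the definition is satisfied with c = 13 and n0 = 1.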
Ω-notation (Ω ≈ ≥)

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.

[Figure: f(n) lies on or above c·g(n) for all n ≥ n0.]

g(n) is an asymptotic lower bound for f(n).

Example: √n = Ω(lg n), with c = 1 and n0 = 16.
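A quick check of these constants (added here for clarity): at n0 = 16 both sides are equal, since √16 = 4 and lg 16 = 4; and doubling n multiplies √n by about 1.41 while adding only 1 to lg n, so √n stays at or above lg n for all n ≥ 16.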
Examples of functions in Ω(n²):

n²
n² + n
n² − n
1000n² + 1000n
1000n² − 1000n

Also: n³, n^2.00001, n² lg lg lg n, 2^(2^n).

Θ-notation (Θ ≈ =)

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.

[Figure: f(n) lies between c1·g(n) and c2·g(n) for all n ≥ n0.]

g(n) is an asymptotically tight bound for f(n).

Example: n²/2 − 2n = Θ(n²), with c1 = 1/4, c2 = 1/2, and n0 = 8.
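A brief verification of these constants (added for clarity): for n ≥ 8 we have 2n ≤ n²/4, so n²/2 − 2n ≥ n²/2 − n²/4 = n²/4 = c1·n²; and n²/2 − 2n ≤ n²/2 = c2·n² holds for all n. Hence c1 = 1/4, c2 = 1/2, and n0 = 8 satisfy the definition.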
Theorem: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Leading constants and low-order terms don't matter.

Asymptotic notation in equations

When asymptotic notation appears on the right-hand side of an equation, it stands for some anonymous function in that set. For example, O(n²) stands for some anonymous function in the set O(n²), and 2n² + 3n + 1 = 2n² + Θ(n) means 2n² + 3n + 1 = 2n² + f(n) for some f(n) ∈ Θ(n); in particular, f(n) = 3n + 1.

Comparison of different time complexities:

1. Constant time — O(1)

For example:

#include <stdio.h>

int main(void)
{
    printf("Hello World");   /* a fixed amount of work, independent of any input size */
    return 0;
}

2. Logarithmic time — O(log n)

For example (here N is the input size):

int a = 0, i = N;
while (i > 1) {
    a += i;
    i /= 2;       /* i is halved each time, so the loop runs about log2(N) iterations */
}

3. Linear time — O(n)

For example:

#include <stdio.h>

int main(void)
{
    int i, n = 8;
    for (i = 1; i <= n; i++) {       /* the loop body executes n times */
        printf("Hello World !!!");
    }
    return 0;
}

4. O(n log n)
For example:

int i, j, k = 0;
for (i = 1; i <= n; i++) {              /* outer loop: n iterations */
    for (j = 2; j <= n; j = j * 2) {    /* inner loop: about log2(n) iterations */
        k = k + n / 2;
    }
}

5. Quadratic time — O(n2)

For example:

int i, j, a = 0;
for (i = 0; i < N; i++) {          /* outer loop: N iterations */
    for (j = 0; j < N; j++) {      /* inner loop: N iterations for each i */
        a = a + i + j;
    }
}

6. Linear time — O(n + m)

For example:

int i, j, a = 0, b = 0;
for (i = 0; i < N; i++) {      /* first loop: N iterations */
    a = a + rand();            /* rand() is declared in <stdlib.h> */
}
for (j = 0; j < M; j++) {      /* second loop: M iterations, so the total is O(N + M) */
    b = b + rand();
}

More Examples:

Bubble Sort:

bool flag = true;
for (int i = 0; i < n - 1 && flag; i++) {
    flag = false;
    for (int j = 0; j < n - 1 - i; j++) {
        if (a[j] > a[j + 1]) {
            swap(a[j], a[j + 1]);
            flag = true;       /* a swap happened, so another pass is needed */
        }
    }
}

Ω-notation: n (best case: a single pass over an already-sorted array)
Θ-notation: roughly (n/2)·n comparisons on a typical input, i.e. Θ(n²)
O-notation: n²
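The worst-case figure can be made precise (a standard count, added here for clarity): on a reverse-sorted input the inner loop performs (n−1) + (n−2) + ⋯ + 1 = n(n−1)/2 comparisons, which is Θ(n²); on an already-sorted input the flag never becomes true again, so the algorithm stops after a single pass of n−1 comparisons, which gives the Ω(n) lower bound.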

Sequential Search:

bool Seq_Search(int a[], int n, int x)
{
    for (int i = 0; i < n; i++)
        if (a[i] == x) return true;    /* found: stop early */
    return false;                      /* scanned all n elements without finding x */
}

Ω-notation: 1 (best case: x is found at the first position)
Θ-notation: n
O-notation: n (worst case: x is at the last position or not present, so all n elements are examined)

Randomized and Recursive algorithms


Recursive algorithm:

Recursive algorithms are based on the idea that a solution to a problem can be found by finding
solutions to smaller subproblems of the same problem. A program is called recursive when an entity
calls itself.
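As a small illustration (a sketch added here, not part of the original notes), here is a recursive factorial function in C; the result for n is built from the solution to the smaller subproblem n − 1:

#include <stdio.h>

/* Recursive factorial: the solution for n is obtained from the
   solution to the smaller subproblem n - 1. */
unsigned long factorial(unsigned int n)
{
    if (n <= 1)                        /* base case stops the recursion */
        return 1;
    return n * factorial(n - 1);       /* the function calls itself */
}

int main(void)
{
    printf("5! = %lu\n", factorial(5));   /* prints: 5! = 120 */
    return 0;
}
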

Randomized algorithm:

An algorithm that uses random numbers to decide what to do next anywhere in its logic is called a
Randomized Algorithm. Randomized algorithms, in effect, flip coins during execution. Hence, either the order of execution or the result of the algorithm might differ from run to run on the same input.

However, deterministic algorithms produce the same results on a given input following the same
computation steps.
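As a sketch (the function name and array below are illustrative, not from the notes), here is a tiny randomized routine in C: it scans an array starting from a randomly chosen position, so the order of execution can change from run to run on the same input, even though the final answer is the same:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Randomized membership test: start the scan at a random index,
   wrapping around, so different runs probe the array in different orders. */
int contains(const int a[], int n, int x)
{
    int start = rand() % n;              /* the random "coin flip" */
    for (int k = 0; k < n; k++) {
        if (a[(start + k) % n] == x)
            return 1;
    }
    return 0;
}

int main(void)
{
    int a[] = { 5, 2, 4, 6, 1, 3 };
    srand((unsigned) time(NULL));        /* seed the random number generator */
    printf("%d\n", contains(a, 6, 4));   /* prints 1: 4 is in the array */
    return 0;
}
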
