
UNIT-5

Iterators and Recursion: Recursive Fibonacci, Tower of Hanoi :-


1. Iterators

An iterator is an object that allows you to traverse through a collection, such as lists, tuples,
or custom sequences, one element at a time.
Key features of iterators:

● Implement the __iter__() and __next__() methods.

● Used in loops like for to iterate over items in a collection.

● A StopIteration exception is raised when there are no more elements to traverse.

Example: Simple Iterator in Python

class MyIterator:
    def __init__(self, numbers):
        self.numbers = numbers
        self.index = 0

    def __iter__(self):
        return self

    def __next__(self):
        if self.index < len(self.numbers):
            value = self.numbers[self.index]
            self.index += 1
            return value
        else:
            raise StopIteration

nums = MyIterator([1, 2, 3])

for num in nums:
    print(num)
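
The same object can also be stepped through by hand with the built-in next() function. The short sketch below (purely illustrative) shows how next() drives the iterator and where StopIteration comes in:

it = MyIterator([1, 2, 3])
print(next(it))   # 1
print(next(it))   # 2
print(next(it))   # 3
# one more call to next(it) would raise StopIteration,
# which is exactly how a for loop knows when to stop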

2. Recursion

Recursion occurs when a function calls itself to solve a smaller instance of a problem. It
involves:

● Base case: The terminating condition to stop recursion.

● Recursive case: The logic to break the problem into smaller parts.

Recursive Fibonacci

The Fibonacci sequence is a series where each number is the sum of the two preceding ones.

Recursive Formula

F(n) = F(n-1) + F(n-2)

With base cases:

● F(0) = 0

● F(1) = 1

Implementation

def fibonacci(n):
    if n <= 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n-1) + fibonacci(n-2)

# Example
n = 6
print(f"Fibonacci({n}) =", fibonacci(n))


Drawbacks:

● Inefficient for large n due to repeated calculations (exponential time complexity).

● Memoization or an iterative approach can improve efficiency (see the sketch below).
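
A minimal memoized variant (the fibonacci_memo name is illustrative, not part of the notes) showing how caching removes the repeated calculations:

from functools import lru_cache

@lru_cache(maxsize=None)   # remembers results of previous calls
def fibonacci_memo(n):
    if n <= 0:
        return 0
    elif n == 1:
        return 1
    return fibonacci_memo(n - 1) + fibonacci_memo(n - 2)

print(fibonacci_memo(50))  # 12586269025, computed in linear time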

Tower of Hanoi

The Tower of Hanoi is a mathematical puzzle involving three rods and a number of disks of
different sizes. The objective is to move all the disks from one rod to another, following these
rules:

1. Only one disk can be moved at a time.

2. A disk can only be placed on top of a larger disk.

3. The third rod may be used as an auxiliary (temporary) rod while moving the disks.

Recursive Strategy

1. Move n-1 disks from the source rod to the auxiliary rod.

2. Move the largest disk to the destination rod.

3. Move the n-1 disks from the auxiliary rod to the destination rod.

Implementation

def tower_of_hanoi(n, source, destination, auxiliary):
    if n == 1:
        print(f"Move disk 1 from {source} to {destination}")
        return
    tower_of_hanoi(n-1, source, auxiliary, destination)
    print(f"Move disk {n} from {source} to {destination}")
    tower_of_hanoi(n-1, auxiliary, destination, source)

# Example
n = 3
tower_of_hanoi(n, 'A', 'C', 'B')
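
This recursive strategy performs 2^n - 1 moves in total. A quick illustrative check (move_count is a hypothetical helper, not part of the notes):

def move_count(n):
    # moving n disks = move n-1 disks, move the largest disk, move n-1 disks again
    return 1 if n == 1 else 2 * move_count(n - 1) + 1

for disks in range(1, 5):
    print(disks, move_count(disks))   # prints 1, 3, 7, 15 -> equals 2**n - 1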


Search:-

1. Simple Search
Definition

Simple Search (also called Linear Search) traverses the collection sequentially, comparing each element with the target until the element is found or the end of the collection is reached.

Algorithm

1. Start at the first element.

2. Compare the current element with the target.

3. If they match, return the index.

4. If not, move to the next element.

5. If the end is reached and no match is found, return "not found."

Time Complexity

● Best Case: O(1) (target is the first element)

● Worst Case: O(n) (target is the last element or not present)

● Average Case: O(n/2) ≈ O(n)

Implementation

def simple_search(arr, target):
    for index, value in enumerate(arr):
        if value == target:
            return index
    return -1

# Example
arr = [10, 20, 30, 40, 50]
target = 30
print(f"Element found at index: {simple_search(arr, target)}")

Estimating Search Time


The time taken depends on the number of elements n and the system's processing speed.

● For large datasets, the search time grows linearly with n (a rough timing sketch is shown below).
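
One rough, machine-dependent way to see this linear growth is to time the function on larger and larger inputs; the sketch below (illustrative only, the exact numbers will vary by system) uses time.perf_counter:

import time

def time_search(n):
    data = list(range(n))
    start = time.perf_counter()
    simple_search(data, n - 1)   # worst case: target is the last element
    return time.perf_counter() - start

for size in (10_000, 100_000, 1_000_000):
    print(size, f"{time_search(size):.6f} s")   # times grow roughly linearly with size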

2. Binary Search

Definition

Binary Search is an efficient search algorithm for sorted collections. It works by repeatedly
dividing the search interval in half.

Algorithm

1. Start with the entire collection.

2. Compare the middle element with the target.

o If they are equal, return the index.

o If the middle element is smaller than the target, discard the left half and search the right half.

o If it is larger, discard the right half and search the left half.

3. Repeat until the target is found or the interval is empty.

Time Complexity

● Best Case: O(1) (target is the middle element initially)

● Worst Case: O(log n) (logarithmic due to halving the search space)

● Average Case: O(log n)

Implementation

def binary_search(arr, target):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

# Example
arr = [10, 20, 30, 40, 50]
target = 30
print(f"Element found at index: {binary_search(arr, target)}")

Estimating Search Time

For a sorted collection of size n:

● Number of comparisons: at most ⌈log₂(n)⌉.

● For n = 1,000,000, binary search makes approximately 20 comparisons.

Example Time Estimation:

● If each comparison takes 1 μs, the total search time is about 20 μs.
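
The comparison count can be verified directly with math.log2; a small sketch:

import math

for size in (1_000, 1_000_000, 1_000_000_000):
    comparisons = math.ceil(math.log2(size))
    print(f"n = {size:,}: at most {comparisons} comparisons")
# for n = 1,000,000 this prints 20, matching the estimate above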

Comparison of Simple Search and Binary Search

Criteria            Simple Search                Binary Search
Use Case            Unsorted collections         Sorted collections
Time Complexity     O(n)                         O(log n)
Space Complexity    O(1)                         O(1) (iterative)
Advantages          Simple, no sorting required  Fast for large datasets
Disadvantages       Slow for large datasets      Requires sorted input


Sorting and Merging Techniques:-
Sorting and merging are fundamental operations in computer science used to organize and
combine datasets efficiently. Here, we'll explore Selection Sort, Merge List, Merge Sort,
and Higher-Order Sorts.

1. Selection Sort

Definition

Selection Sort is a simple comparison-based sorting algorithm. It repeatedly selects the smallest (or largest) element from the unsorted portion of the list and swaps it with the first unsorted element.

Algorithm

1. Divide the list into sorted and unsorted parts.

2. Find the minimum element in the unsorted part.

3. Swap it with the first element in the unsorted part.

4. Repeat until the list is sorted.

Time Complexity

● Best Case: O(n²)

● Worst Case: O(n²)

● Space Complexity: O(1) (in-place sorting)

Implementation

def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        min_index = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_index]:
                min_index = j
        arr[i], arr[min_index] = arr[min_index], arr[i]
    return arr

# Example
arr = [64, 34, 25, 12, 22, 11, 90]
print("Sorted array:", selection_sort(arr))

2. Merge List

Definition

Merging two sorted lists involves combining them into a single sorted list.

Algorithm

1. Start with two sorted lists and an empty result list.

2. Compare the smallest elements from both lists.

3. Append the smaller element to the result list.

4. Repeat until all elements from both lists are merged.

Time Complexity

● O(n + m), where n and m are the lengths of the two lists.

Implementation


def merge_lists(list1, list2):
    merged_list = []
    i, j = 0, 0
    while i < len(list1) and j < len(list2):
        if list1[i] < list2[j]:
            merged_list.append(list1[i])
            i += 1
        else:
            merged_list.append(list2[j])
            j += 1
    # Append remaining elements
    merged_list.extend(list1[i:])
    merged_list.extend(list2[j:])
    return merged_list

# Example
list1 = [1, 3, 5]
list2 = [2, 4, 6]
print("Merged list:", merge_lists(list1, list2))

3. Merge Sort

Definition

Merge Sort is a divide-and-conquer sorting algorithm. It divides the list into halves,
recursively sorts each half, and then merges them back together.

Algorithm

1. Divide the list into two halves.

2. Recursively apply Merge Sort to both halves.

3. Merge the sorted halves using the Merge List process.

Time Complexity

● Best Case: O(n log n)

● Worst Case: O(n log n)

● Space Complexity: O(n)

Implementation

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left_half = merge_sort(arr[:mid])
    right_half = merge_sort(arr[mid:])
    return merge_lists(left_half, right_half)

# Example
arr = [38, 27, 43, 3, 9, 82, 10]
print("Sorted array:", merge_sort(arr))

4. Higher-Order Sorts

Higher-order sorting algorithms are designed for large datasets or specific applications. These
include algorithms like Quick Sort, Heap Sort, and Radix Sort.

Quick Sort

● Divide-and-conquer: select a pivot, partition the array into two parts around it, and recursively sort each part (a minimal sketch follows this list).

● Time Complexity:

o Best/Average: O(n log n)

o Worst: O(n²) (if pivot selection is poor)
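
A minimal Quick Sort sketch (not in-place, and the last-element pivot choice is only illustrative; practical versions pick the pivot more carefully to avoid the O(n²) worst case):

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[-1]                                 # illustrative pivot choice
    smaller = [x for x in arr[:-1] if x <= pivot]   # elements not greater than the pivot
    larger = [x for x in arr[:-1] if x > pivot]     # elements greater than the pivot
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

print("Quick sorted:", quick_sort([38, 27, 43, 3, 9, 82, 10]))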

Heap Sort

● Heap-based sorting: uses a binary heap structure to repeatedly extract the maximum (or minimum) element (see the sketch after this list).

● Time Complexity: O(n log n)

● Space Complexity: O(1)
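
A short sketch using Python's heapq module (a min-heap); note that this convenience version works on a copy of the list, so it does not achieve the O(1) space of an in-place array implementation:

import heapq

def heap_sort(arr):
    heap = list(arr)
    heapq.heapify(heap)          # build a min-heap in O(n)
    return [heapq.heappop(heap) for _ in range(len(heap))]   # each pop costs O(log n)

print("Heap sorted:", heap_sort([64, 34, 25, 12, 22, 11, 90]))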

Radix Sort

● Non-comparison-based: works by sorting digits or characters over multiple passes (a sketch follows this list).

● Time Complexity: O(nk), where k is the number of digits.

● Best suited for integers and strings.
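
A compact least-significant-digit (LSD) radix sort sketch for non-negative integers; the bucket-per-digit layout is the standard approach, and the helper name is illustrative:

def radix_sort(arr):
    if not arr:
        return arr
    max_val = max(arr)
    exp = 1                                   # current digit place: 1, 10, 100, ...
    while max_val // exp > 0:
        buckets = [[] for _ in range(10)]     # one bucket per digit 0-9
        for num in arr:
            buckets[(num // exp) % 10].append(num)
        arr = [num for bucket in buckets for num in bucket]   # stable re-collection
        exp *= 10
    return arr

print("Radix sorted:", radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))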

Comparison of Sorting Algorithms

Algorithm        Time Complexity (Best)   Time Complexity (Worst)   Space Complexity   Stability
Selection Sort   O(n²)                    O(n²)                     O(1)               No
Merge Sort       O(n log n)               O(n log n)                O(n)               Yes
Quick Sort       O(n log n)               O(n²)                     O(log n)           No
Heap Sort        O(n log n)               O(n log n)                O(1)               No
Radix Sort       O(nk)                    O(nk)                     O(n + k)           Yes
