
Design and Analysis of Algorithms (CSN4001)

Session: 2024-2025-2

Assignment – 2

Roll No: 23103092

Name: Kanav Goyal

Divide and Conquer

1. Write a program to select the kth largest element of an unsorted array using
the partitioning method of Quick Sort. Derive the recurrence relation and solve
it using the iterative method to find the worst case time complexity.
#include <iostream>
#include <vector>

using namespace std;

// Partition function (as in QuickSort, but in descending order so that
// the k-th largest element ends up at index k-1)
int partition(vector<int>& arr, int low, int high) {
    int pivot = arr[high]; // choose last element as pivot
    int i = low - 1;
    for (int j = low; j < high; j++) {
        if (arr[j] >= pivot) { // descending order for k-th largest
            i++;
            swap(arr[i], arr[j]);
        }
    }
    swap(arr[i + 1], arr[high]);
    return i + 1;
}

// QuickSelect function to find the k-th largest element
int quickSelect(vector<int>& arr, int low, int high, int k) {
    if (low <= high) {
        int pivotIndex = partition(arr, low, high);
        if (pivotIndex == k - 1)        // found the k-th largest element
            return arr[pivotIndex];
        else if (pivotIndex > k - 1)    // search in left partition
            return quickSelect(arr, low, pivotIndex - 1, k);
        else                            // search in right partition
            return quickSelect(arr, pivotIndex + 1, high, k);
    }
    return -1; // not reached for valid 1 <= k <= n
}

int main() {
    vector<int> arr = {10, 4, 5, 8, 6, 11, 26};
    int k = 3; // find the 3rd largest element
    int n = arr.size();
    int result = quickSelect(arr, 0, n - 1, k);
    cout << "The " << k << "-th largest element is: " << result << endl;
    return 0;
}
The partitioning step divides the array into two subproblems. In the worst case, the pivot is the
smallest or largest element, so one subproblem has size n-1 and the other is empty:

T(n) = T(n-1) + cn   (where cn is the cost of partitioning)

Solving the recurrence relation iteratively:

T(n) = T(n-1) + cn
     = T(n-2) + c(n-1) + cn
     = T(n-3) + c(n-2) + c(n-1) + cn
     ...
     = T(1) + c(2 + 3 + ... + n)
     = T(1) + c(n(n+1)/2 - 1)
     = O(n^2)

Worst-Case Time Complexity: O(n^2)

2. Write a program to search for an element in a 2-dimensional sorted matrix.
Each row and each column of the matrix is sorted and consists of unique elements.
Derive the recurrence relation and solve it using the iterative method to find the
worst case time complexity.
#include <iostream>
#include <vector>

using namespace std;

// Staircase search: start at the top-right corner and eliminate one row
// or one column with every comparison
bool searchMatrix(const vector<vector<int>>& matrix, int target) {
    int rows = matrix.size();
    if (rows == 0) return false;
    int cols = matrix[0].size();
    int row = 0;
    int col = cols - 1; // start from the top-right corner
    while (row < rows && col >= 0) {
        if (matrix[row][col] == target) {
            return true;
        } else if (matrix[row][col] < target) {
            row++; // move down (towards larger elements)
        } else {
            col--; // move left (towards smaller elements)
        }
    }
    return false; // element not found
}

int main() {
    vector<vector<int>> matrix = {
        {1, 4, 7, 11, 15},
        {2, 5, 8, 12, 19},
        {3, 6, 9, 16, 22},
        {10, 13, 14, 17, 24},
        {18, 21, 23, 26, 30}
    };
    int target = 5;
    if (searchMatrix(matrix, target)) {
        cout << target << " found in the matrix." << endl;
    } else {
        cout << target << " not found in the matrix." << endl;
    }
    target = 20;
    if (searchMatrix(matrix, target)) {
        cout << target << " found in the matrix." << endl;
    } else {
        cout << target << " not found in the matrix." << endl;
    }
    return 0;
}
Time Complexity Analysis

Worst-Case Scenario:

The worst-case occurs when the target element is either the smallest or the largest element in the
matrix, or if it's not present at all. In this case, we might have to traverse either an entire row or
an entire column.

Recurrence Relation (Not strictly applicable here, but helps to understand):

While a strict recurrence relation isn't the best way to describe this, we can think of it
conceptually. In the worst case, in each comparison, we eliminate either a row or a column. If
the matrix is m x n:

T(m, n) = T(m-1, n) OR T(m, n) = T(m, n-1) (We either reduce the number of rows or columns
by 1).

Iterative Solution and Worst-Case Time Complexity:

The iterative solution (provided in the code) starts from the top-right corner and either moves
down (if the current element is less than the target) or moves left (if the current element is greater
than the target). In the worst case, we might traverse all the elements in the first row and then the
remaining elements in the last column (or vice-versa).

Therefore, the maximum number of steps we take is m + n - 1 (where m is the number of rows
and n is the number of columns).

Worst-Case Time Complexity: O(m + n)

This is because in the worst-case scenario, we traverse at most m rows and n columns. This is a
linear time complexity with respect to the dimensions of the matrix.
3. Write a program to find the subarray with the largest sum of elements.
Derive the recurrence relation and solve it using the iterative method to find
the worst case time complexity.

E.g. A = [1, -2, 5, 7, -3, 9, -11, 5, 3, -4, 2]

Sub-array with largest sum is [5, 7, -3, 9]


#include <iostream>
#include <vector>
#include <algorithm> // for std::max

using namespace std;

// Kadane's algorithm. Initializing with the first element (rather than 0)
// makes the result correct for all-negative arrays as well.
int maxSubarraySum(const vector<int>& arr) {
    if (arr.empty()) return 0; // define the empty-array sum as 0
    int max_so_far = arr[0];
    int current_max = arr[0];
    for (size_t i = 1; i < arr.size(); i++) {
        current_max = max(arr[i], current_max + arr[i]);
        max_so_far = max(max_so_far, current_max);
    }
    return max_so_far;
}

int main() {
    vector<int> A = {1, -2, 5, 7, -3, 9, -11, 5, 3, -4, 2};
    int maxSum = maxSubarraySum(A);
    cout << "Maximum subarray sum: " << maxSum << endl; // Output: 18

    A = {-2, -1, -4, -1, -2, -1, -5, -4}; // all negative numbers
    maxSum = maxSubarraySum(A);
    cout << "Maximum subarray sum: " << maxSum << endl; // Output: -1

    A = {}; // empty array
    maxSum = maxSubarraySum(A);
    cout << "Maximum subarray sum: " << maxSum << endl; // Output: 0

    return 0;
}
Time Complexity (Kadane's Algorithm): O(n), a single pass through the array.

Space Complexity: O(1), constant extra space.

Greedy

4. Given a set of characters {A, B, C, D, E, F}, construct the Huffman tree and
generate Huffman codes based on the following table. Write a program to
generate the Huffman tree and find the time complexity.

Character Frequency
A 5
B 9
C 12
D 13
E 16
F 45

#include <iostream>
#include <queue>
#include <vector>
#include <map>

using namespace std;


// Huffman Tree Node
struct Node {
char ch;
int freq;
Node *left, *right;

    Node(char c, int f) : ch(c), freq(f), left(NULL), right(NULL) {}
};
// Comparator for priority queue (Min-Heap)
struct Compare {
bool operator()(Node* a, Node* b) {
return a->freq > b->freq;
}
};
// Function to generate Huffman Codes
void generateHuffmanCodes(Node* root, string code, map<char, string>& huffmanCodes) {
if (!root) return;
    if (root->ch != '$') // leaf node (internal nodes are marked '$')
huffmanCodes[root->ch] = code;
generateHuffmanCodes(root->left, code + "0", huffmanCodes);
generateHuffmanCodes(root->right, code + "1", huffmanCodes);
}
// Function to build Huffman Tree and generate Huffman Codes
void huffmanEncoding(vector<pair<char, int>> charFreq) {
priority_queue<Node*, vector<Node*>, Compare> pq;
// Step 1: Create nodes and push into priority queue
for (auto it : charFreq)
pq.push(new Node(it.first, it.second));
// Step 2: Build Huffman Tree
while (pq.size() > 1) {
Node* left = pq.top(); pq.pop();
Node* right = pq.top(); pq.pop();
// Create new internal node with combined frequency
Node* newNode = new Node('$', left->freq + right->freq);
newNode->left = left;
newNode->right = right;
pq.push(newNode);
}
// Step 3: Generate Huffman Codes
Node* root = pq.top();
map<char, string> huffmanCodes;
generateHuffmanCodes(root, "", huffmanCodes);
// Output Huffman Codes
cout << "Huffman Codes:\n";
for (auto it : huffmanCodes)
cout << it.first << " : " << it.second << endl;
}
int main() {
vector<pair<char, int>> charFreq = {
{'A', 5}, {'B', 9}, {'C', 12}, {'D', 13}, {'E', 16}, {'F', 45}
};
huffmanEncoding(charFreq);
return 0;
}

5. Write a program to find the number of minimum spanning trees using Prim's
algorithm and return all the trees. Use an adjacency list and a heap as the
priority queue. Calculate the time complexity of the code.
#include <iostream>
#include <vector>
#include <queue>
#include <algorithm> // for std::sort, std::min, std::max

using namespace std;

// Candidate edge stored in the priority queue
struct EdgeData {
    int to;
    int parent;
    int weight;
};

// Comparator for the priority queue (min-heap on weight)
struct CompareEdge {
    bool operator()(const EdgeData& a, const EdgeData& b) {
        return a.weight > b.weight;
    }
};

// Edge in the adjacency list
struct Edge {
    int to;
    int weight;
};

vector<vector<pair<int, int>>> findMSTs(const vector<vector<Edge>>& adj) {
    int n = adj.size();
    vector<vector<pair<int, int>>> allMSTs;

    // Run Prim's algorithm once from every possible start vertex
    for (int start_node = 0; start_node < n; ++start_node) {
        vector<pair<int, int>> currentMST;
        priority_queue<EdgeData, vector<EdgeData>, CompareEdge> pq;
        vector<bool> visited(n, false);

        pq.push({start_node, -1, 0}); // dummy weight 0 for the start vertex

        while (!pq.empty()) {
            EdgeData current_edge = pq.top();
            pq.pop();

            int u = current_edge.to;
            int parent = current_edge.parent;

            if (visited[u]) continue;
            visited[u] = true;

            if (parent != -1) {
                // Store edges with endpoints normalized so that the same
                // tree found from different start vertices compares equal
                currentMST.push_back({min(parent, u), max(parent, u)});
            }

            for (const Edge& edge : adj[u]) {
                if (!visited[edge.to])
                    pq.push({edge.to, u, edge.weight});
            }
        }

        sort(currentMST.begin(), currentMST.end());

        // Keep the tree only if it has not been seen before
        bool isDuplicate = false;
        for (const auto& existingMST : allMSTs) {
            if (currentMST == existingMST) {
                isDuplicate = true;
                break;
            }
        }
        if (!isDuplicate)
            allMSTs.push_back(currentMST);
    }
    return allMSTs;
}

int main() {
    int n = 4; // example graph (you can change this)
    vector<vector<Edge>> adj(n);

    // Undirected graph: add each edge in both directions
    adj[0].push_back({1, 10}); adj[1].push_back({0, 10});
    adj[0].push_back({2, 6});  adj[2].push_back({0, 6});
    adj[0].push_back({3, 5});  adj[3].push_back({0, 5});
    adj[1].push_back({2, 15}); adj[2].push_back({1, 15});
    adj[1].push_back({3, 4});  adj[3].push_back({1, 4});
    adj[2].push_back({3, 8});  adj[3].push_back({2, 8});

    vector<vector<pair<int, int>>> allMSTs = findMSTs(adj);

    cout << "Number of Minimum Spanning Trees: " << allMSTs.size() << endl;
    for (size_t i = 0; i < allMSTs.size(); ++i) {
        cout << "MST " << i + 1 << ": ";
        for (const auto& edge : allMSTs[i])
            cout << "(" << edge.first << ", " << edge.second << ") ";
        cout << endl;
    }
    return 0;
}

Time Complexity Analysis:

1. Prim's algorithm (single execution): O(E log V), where E is the number of edges and V is the
number of vertices; with a binary heap as the priority queue, the log V factor comes from the
heap push and pop operations.

2. Iterating through all starting nodes: We do Prim's algorithm V times (once for each possible
starting node). This gives us a factor of V.

3. Checking for duplicates: In the worst case, we might compare each newly generated MST with
all the existing MSTs. If we have 'm' MSTs, this could add a factor of 'm' (number of MSTs)
multiplied by the cost of comparison which is O(V) in the worst case, giving O(mV).

Overall Time Complexity: O(V * (E log V + mV))

In a dense graph (E ≈ V^2) this works out to roughly O(V^3 log V), but that is a loose upper
bound. Since the algorithm is run once per start vertex, the number of distinct trees found
satisfies m ≤ V, and in practice m is often much smaller, so the running time is usually closer
to O(VE log V). Note that varying the start vertex yields at most V candidate trees, so this
approach may miss minimum spanning trees that arise only from different tie-breaking among
equal-weight edges inside the heap; when all edge weights are distinct, however, the MST is
unique and the method reports exactly one tree.
