Backtracking
Backtracking is a general algorithmic technique for solving problems incrementally by trying partial
solutions and then abandoning (“backtracking”) them if they fail to satisfy the constraints.
🧠 Core Idea
Backtracking builds a candidate solution one piece at a time and abandons a partial solution as soon as it fails to satisfy the constraints, rather than continuing to extend it.
🧩 Typical Use Cases
Combinatorial problems:
o N-Queens
o Sudoku solver
o Graph coloring
o Subset sum
o Permutations / Combinations
o Word search
✅ General Backtracking Steps
1. Choose a starting point.
2. Choose an option.
3. Check if it leads to a solution:
o ✅ If yes, move forward.
o ❌ If no, backtrack and try another option.
4. Repeat until all possibilities are explored or the solution is found.
🧬 Template (Pseudocode)
# is_complete, process, choices, and is_valid are problem-specific hooks.
def backtrack(solution):
    if is_complete(solution):
        process(solution)              # record or print the finished solution
        return
    for option in choices(solution):
        if is_valid(option, solution): # prune invalid branches early
            solution.append(option)    # choose
            backtrack(solution)        # explore
            solution.pop()             # un-choose (backtrack)
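To see the template in action, here is a minimal sketch that instantiates it for one of the use cases listed above, generating all permutations of a list of distinct elements; the function and variable names are illustrative:

def all_permutations(nums):
    result = []

    def backtrack(current):
        if len(current) == len(nums):   # complete: every element placed
            result.append(current[:])   # record a copy
            return
        for num in nums:                # enumerate options
            if num not in current:      # valid: element not used yet (assumes distinct elements)
                current.append(num)     # choose
                backtrack(current)      # explore
                current.pop()           # un-choose (backtrack)

    backtrack([])
    return result

print(all_permutations([1, 2, 3]))  # 6 permutations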
🛠 Python Example: N-Queens (4x4)
def solve_n_queens(n):
    def is_safe(row, col, diagonals, anti_diagonals, cols):
        return (col not in cols
                and (row - col) not in diagonals
                and (row + col) not in anti_diagonals)

    def backtrack(row, diagonals, anti_diagonals, cols, state):
        if row == n:
            result.append(["".join(r) for r in state])
            return
        for col in range(n):
            if is_safe(row, col, diagonals, anti_diagonals, cols):
                state[row][col] = "Q"
                diagonals.add(row - col)
                anti_diagonals.add(row + col)
                cols.add(col)
                backtrack(row + 1, diagonals, anti_diagonals, cols, state)
                # backtrack
                state[row][col] = "."
                diagonals.remove(row - col)
                anti_diagonals.remove(row + col)
                cols.remove(col)

    result = []
    empty_board = [["." for _ in range(n)] for _ in range(n)]
    backtrack(0, set(), set(), set(), empty_board)
    return result

# Example: Print all solutions for 4-Queens
for board in solve_n_queens(4):
    for row in board:
        print(row)
    print()
⏱ Time Complexity
The worst case is usually exponential, but pruning invalid branches early typically makes backtracking much faster than plain brute force in practice.
o N-Queens: ~O(N!) (with optimizations)
o Sudoku: ~O(9^(N*N)), but pruned significantly in practice
✅ Advantages
Simple, elegant recursive structure
Solves complex constraint-based problems
Easily extendable with optimization (like branch and bound)
❌ Limitations
Can be slow for large input without optimization
Requires good pruning conditions to be efficient
Exhaustive Search
Exhaustive Search (also called Brute Force Search) is a straightforward algorithmic technique that
systematically explores all possible solutions to a problem and selects the one(s) that satisfy the
given conditions.
🔍 Core Idea
Try every possible combination or path, regardless of efficiency, and pick the correct or optimal
solution.
🧠 When to Use
When the problem space is small or moderate
When correctness is critical
As a baseline to test optimized algorithms
✅ Characteristics
Guaranteed to find a solution (if one exists)
Simple to implement
Usually very slow for large inputs due to exponential growth
🕒 Time Complexity
Typically O(kⁿ) where:
o k = number of choices at each step
o n = number of steps
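To make the kⁿ growth concrete, here is a minimal sketch that enumerates every length-n sequence over k options using itertools.product; the variable names are illustrative:

from itertools import product

options = [0, 1]   # k = 2 choices at each step
n = 3              # n = 3 steps

# Exhaustively enumerate all k**n = 8 candidate solutions
for candidate in product(options, repeat=n):
    print(candidate)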
🛠 Example 1: Subset Sum (Exhaustive Search)
def subset_sum(nums, target):
    def dfs(index, total):
        if index == len(nums):
            return total == target
        # Include or exclude current number
        return dfs(index + 1, total + nums[index]) or dfs(index + 1, total)
    return dfs(0, 0)

# Example
print(subset_sum([3, 2, 7, 1], 6))  # True (3 + 2 + 1)
🛠 Example 2: Travelling Salesman Problem (TSP - Brute Force)
from itertools import permutations

def tsp_brute_force(graph):
    n = len(graph)
    min_path = float('inf')
    for perm in permutations(range(1, n)):  # every tour starting and ending at city 0
        current_cost = graph[0][perm[0]]
        for i in range(len(perm) - 1):
            current_cost += graph[perm[i]][perm[i + 1]]
        current_cost += graph[perm[-1]][0]  # return to start
        min_path = min(min_path, current_cost)
    return min_path

# Example Graph (symmetric distance matrix)
graph = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]
print(tsp_brute_force(graph))  # Output: 80
✅ Pros
Easy to write and debug
Always finds the correct result
Can be used to test more efficient algorithms
❌ Cons
Highly inefficient for large input sizes
Doesn't scale well
Often impractical in real-world scenarios
💡 Optimizations
To improve exhaustive search:
Pruning (like in Backtracking)
Memoization to avoid recomputation (sketched after this list)
Heuristics to guide the search (informed search)
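As a sketch of the memoization idea, the exhaustive subset-sum search from Example 1 can cache its (index, total) states with functools.lru_cache; the branching is unchanged, but repeated subproblems are answered from the cache:

from functools import lru_cache

def subset_sum_memo(nums, target):
    @lru_cache(maxsize=None)
    def dfs(index, total):
        if index == len(nums):
            return total == target
        # Same include/exclude branching as before, now cached per (index, total)
        return dfs(index + 1, total + nums[index]) or dfs(index + 1, total)
    return dfs(0, 0)

print(subset_sum_memo([3, 2, 7, 1], 6))  # True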
DFS (Depth-First Search)
Depth-First Search (DFS) is a graph/tree traversal algorithm that explores as far as possible along
each branch before backtracking.
✅ Key Concepts
DFS uses a stack data structure (implicitly via recursion or explicitly via a stack).
It goes deep into the graph/tree before exploring siblings.
Useful for:
o Finding connected components
o Topological sorting
o Detecting cycles
o Solving puzzles (mazes, Sudoku)
o Backtracking problems (e.g. N-Queens)
🧠 DFS Variants
Recursive DFS – elegant and concise
Iterative DFS – uses a stack to simulate recursion
🔁 Recursive DFS (Graph)
def dfs_recursive(graph, node, visited):
    if node in visited:
        return
    print(node, end=' ')
    visited.add(node)
    for neighbor in graph[node]:
        dfs_recursive(graph, neighbor, visited)

# Example usage:
graph = {
    'A': ['B', 'C'],
    'B': ['D', 'E'],
    'C': ['F'],
    'D': [],
    'E': ['F'],
    'F': [],
}
visited = set()
dfs_recursive(graph, 'A', visited)
➡️Output: A B D E F C
📦 Iterative DFS (Using Stack)
def dfs_iterative(graph, start):
    visited = set()
    stack = [start]
    while stack:
        node = stack.pop()
        if node not in visited:
            print(node, end=' ')
            visited.add(node)
            stack.extend(reversed(graph[node]))  # reverse so neighbors pop in original order

# Example usage (same graph as above):
dfs_iterative(graph, 'A')  # Output: A B D E F C
🕒 Time and Space Complexity
Graph Type          Time Complexity    Space Complexity
Adjacency List      O(V + E)           O(V)
Adjacency Matrix    O(V²)              O(V)
Where V = number of vertices and E = number of edges.
🌐 DFS Applications
1. Cycle Detection in Directed/Undirected Graph
2. Topological Sorting (DAG), sketched after this list
3. Solving Mazes / Game Maps
4. Connected Components
5. Strongly Connected Components (Kosaraju’s Algorithm)
6. Pathfinding (less optimal than BFS in unweighted graphs)
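For instance, topological sorting (application 2 above) falls out of DFS directly: this minimal sketch appends each vertex after all of its descendants are explored, then reverses the postorder (it assumes the input graph is a DAG):

def topological_sort(graph):
    visited, order = set(), []

    def dfs(node):
        visited.add(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                dfs(neighbor)
        order.append(node)  # postorder: node is appended after its descendants

    for node in graph:
        if node not in visited:
            dfs(node)
    return order[::-1]  # reverse postorder = topological order

dag = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
print(topological_sort(dag))  # e.g. ['A', 'C', 'B', 'D']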
⚠️DFS vs BFS
Feature               DFS                          BFS
Data Structure        Stack (recursion)            Queue
Memory Usage          Less in wide graphs          Less in deep graphs
Finds Shortest Path   ❌ Not guaranteed            ✅ Yes (in unweighted graphs)
Use Case Example      Topo sort, cycle detection   Shortest path, level order
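For contrast with the DFS code above, here is a minimal BFS sketch using collections.deque; the queue processes vertices level by level, which is why BFS finds shortest paths in unweighted graphs:

from collections import deque

def bfs(graph, start):
    visited = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()      # FIFO: finish each level before the next
        print(node, end=' ')
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)

# Same example graph as the DFS sections above:
graph = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F'], 'D': [], 'E': ['F'], 'F': []}
bfs(graph, 'A')  # Output: A B C D E F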
What Are Constraints in Algorithmic Problems?
Constraints are rules or limits that define:
1. The range of inputs allowed
2. Conditions that any solution must satisfy
They are essential in algorithm design to choose the right strategy (brute-force, greedy, DP, etc.) and
to avoid Time Limit Exceeded (TLE) or Memory Limit Exceeded (MLE).
🧠 Types of Constraints
📌 1. Input Size Constraints
Guide the choice of algorithm based on time complexity.
Input Size   Max Operations Allowed   Suggested Time Complexity
≤ 10         ~10! ≈ 3.6 million       Brute force / backtracking
≤ 100        ~10⁶                     O(N³) acceptable
≤ 1,000      ~10⁶                     O(N²) or better
≤ 10⁵        ~10⁸                     O(N log N) or better
≤ 10⁶        ~10⁸                     O(N) or better
📌 2. Value Constraints
Help optimize memory usage and indexing strategies.
Example: 1 ≤ A[i] ≤ 1000 → can use arrays for frequency counting.
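A minimal sketch of that idea: because the values are bounded by 1000, a fixed-size list can count occurrences in O(n) with no hash map (the function name and bound are illustrative):

def frequency_count(a, max_value=1000):
    # Values are guaranteed to satisfy 1 <= a[i] <= max_value,
    # so a fixed-size list replaces a hash map.
    freq = [0] * (max_value + 1)
    for x in a:
        freq[x] += 1
    return freq

freq = frequency_count([3, 7, 3, 1000, 3])
print(freq[3], freq[7], freq[1000])  # 3 1 1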
📌 3. Time Constraints
Usually set to 1–2 seconds.
On most platforms: 1 second ≈ 10⁸ operations
📌 4. Memory Constraints
Typical limits: 256MB to 1GB
Use these rough per-item sizes (for C-style types; Python objects carry extra overhead) to estimate space complexity:
o int = 4 bytes
o bool = 1 byte
o array of 10⁶ 4-byte ints ≈ 4 MB
✅ How Constraints Guide Solution Strategy
Constraint               Strategy
N ≤ 10                   Brute force, permutations
N ≤ 20                   Bit masking, backtracking
N ≤ 100                  DP with O(N²)
N ≤ 10⁵                  Greedy, sorting, binary search
N ≤ 10⁶                  Hashing, prefix sums, linear scan
Values small (≤ 1000)    Frequency arrays, counting sort
🛠 Example
Problem:
Given n integers (1 ≤ n ≤ 10⁵), find a pair with sum k.
Brute-force O(n²) → TLE ❌
HashSet O(n) → Accepted ✅
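A minimal sketch of the accepted O(n) approach, assuming any one valid pair suffices: scan once, and for each element check whether its complement has already been seen.

def has_pair_with_sum(nums, k):
    seen = set()
    for x in nums:
        if k - x in seen:   # complement already encountered earlier in the scan
            return True
        seen.add(x)
    return False

print(has_pair_with_sum([8, 3, 5, 2], 10))  # True (8 + 2)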
📦 Checklist for Using Constraints Effectively
✅ Read input size & limits carefully
✅ Estimate feasible time/space complexity
✅ Choose algorithm that fits the scale
✅ Watch for hidden constraints (sorted, unique, etc.)