Module I

Introduction to Analysis of Algorithms

Learn the fundamental concepts of algorithm analysis including time and space complexity, asymptotic notations, and techniques to solve recurrence relations.

1. Fundamentals of Algorithm Analysis

What is an Algorithm?
An algorithm is a finite sequence of well-defined instructions to solve a specific problem or perform a computation. It takes input, processes it through a series of steps, and produces output.

Why Analyze Algorithms?

Algorithm analysis helps us understand the efficiency of an algorithm in terms of the resources it consumes. The two primary resources we measure are:

  • Time: How long does the algorithm take to complete?
  • Space: How much memory does the algorithm need?
💡
Key Insight: We analyze algorithms to compare different solutions to the same problem and choose the most efficient one for our specific use case.

Types of Analysis

A Priori Analysis (Theoretical)

  • Done before implementation
  • Independent of hardware/language
  • Uses asymptotic notations
  • Gives approximate complexity

A Posteriori Analysis (Empirical)

  • Done after implementation
  • Depends on hardware/language
  • Measures actual running time
  • Gives exact time in seconds
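
As a small illustration of the difference, the sketch below performs a posteriori analysis in Python with the standard timeit module, timing an O(n) loop against an O(1) closed-form sum. The measured times depend on your hardware and interpreter, which is exactly why a priori analysis is also needed; the function names and test sizes are illustrative choices, not part of the syllabus.

Python
import timeit

def sum_loop(n):
    # A priori view: the loop body runs n times, so this is O(n)
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    # A priori view: Gauss's closed form does the same few operations for any n, O(1)
    return n * (n + 1) // 2

# A posteriori view: measure actual wall-clock time on this machine
for n in (1_000, 10_000, 100_000):
    t_loop = timeit.timeit(lambda: sum_loop(n), number=100)
    t_formula = timeit.timeit(lambda: sum_formula(n), number=100)
    print(f"n={n}: loop {t_loop:.5f}s, formula {t_formula:.6f}s")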

2. Time Complexity

Time Complexity
Time complexity represents the amount of time an algorithm takes to complete as a function of the input size (n). It measures the number of basic operations performed.

Three Cases of Time Complexity

1
Best Case (Ω - Omega)
The minimum time required to execute the algorithm. Represents the most favorable input.
Example: In Linear Search, best case is O(1) when the element is at the first position.
2
Average Case (Θ - Theta)
The expected time for a random input. Calculated by averaging over all possible inputs.
Example: In Linear Search, the average case takes about n/2 comparisons, which is O(n).
3
Worst Case (O - Big-Oh)
The maximum time required to execute the algorithm. Represents the most unfavorable input.
Example: In Linear Search, the worst case is O(n): the element is at the last position or is not present at all.
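
To tie the three cases to one algorithm, here is a minimal Python linear search; the comments mark which case each call exercises. The sample data is an illustrative choice.

Python
def linear_search(arr, target):
    # Scans left to right; comparisons used = position where target is found
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1  # target absent: all n elements were examined

data = [7, 3, 9, 1, 5]
print(linear_search(data, 7))   # best case: match at index 0 -> 1 comparison, O(1)
print(linear_search(data, 9))   # middling input: ~n/2 comparisons on average, O(n)
print(linear_search(data, 4))   # worst case: not present -> n comparisons, O(n)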

Common Time Complexities (Increasing Order)

Complexity  | Name         | Example                           | n = 1000
O(1)        | Constant     | Array access, hash table lookup   | 1
O(log n)    | Logarithmic  | Binary Search                     | ~10
O(n)        | Linear       | Linear Search, traversal          | 1,000
O(n log n)  | Linearithmic | Merge Sort, Quick Sort (average)  | ~10,000
O(n²)       | Quadratic    | Bubble Sort, Selection Sort       | 1,000,000
O(n³)       | Cubic        | Matrix multiplication (naive)     | 1,000,000,000
O(2ⁿ)       | Exponential  | Tower of Hanoi, Subset Sum        | astronomical
O(n!)       | Factorial    | Travelling Salesman (brute force) | beyond computation
📝 Example: Analyzing a Simple Loop
Pseudocode
sum = 0                    // 1 operation
for i = 1 to n:            // Loop runs n times
    sum = sum + i          // 1 operation per iteration

// Total: 1 + n = O(n)

Analysis: The loop executes n times, each iteration doing constant work. Therefore, T(n) = O(n).

📝 Example: Nested Loops
Pseudocode
for i = 1 to n:            // Outer loop: n times
    for j = 1 to n:        // Inner loop: n times
        print(i, j)        // 1 operation

// Total: n × n = O(n²)

Analysis: For each iteration of outer loop (n times), inner loop runs n times. Total = n × n = O(n²).

3. Space Complexity

Space Complexity
Space complexity represents the total amount of memory an algorithm needs to run, as a function of the input size. It includes both a fixed part (constants, simple variables) and a variable part (dynamic allocation, recursion stack).
S(P) = C + Sₚ(I)
Where C = fixed space (constants, simple variables)
Sₚ(I) = variable space (depends on the instance I of the input)

Components of Space Complexity

  • Input Space: Space needed to store the input
  • Auxiliary Space: Extra space used by the algorithm (excluding input)
  • Recursion Stack: Space used by recursive function calls
📝 Example: Space Analysis
Pseudocode
// Algorithm 1: O(1) auxiliary space
function sum(arr[], n):
    total = 0              // 1 variable
    for i = 0 to n-1:
        total += arr[i]
    return total
// Only uses 'total' and 'i' - constant extra space

// Algorithm 2: O(n) auxiliary space
function copyArray(arr[], n):
    newArr = new array[n]  // n elements
    for i = 0 to n-1:
        newArr[i] = arr[i]
    return newArr
// Creates a new array of size n
🎯
Exam Tip: When asked about space complexity, always mention whether you're considering auxiliary space or total space (including input). For recursive algorithms, don't forget to count the recursion stack space!
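
As a sketch of that last point, the two Python functions below compute the same sum, but the recursive one keeps one pending stack frame per element (O(n) recursion-stack space) while the iterative one needs only O(1) auxiliary space.

Python
import sys

def sum_recursive(arr, i=0):
    # One pending stack frame per element -> O(n) recursion-stack space
    if i == len(arr):
        return 0
    return arr[i] + sum_recursive(arr, i + 1)

def sum_iterative(arr):
    # A single accumulator -> O(1) auxiliary space
    total = 0
    for x in arr:
        total += x
    return total

data = list(range(500))
print(sum_recursive(data), sum_iterative(data))  # both print 124750
# The recursive version fails on deep inputs: CPython caps the stack
# at about 1000 frames by default.
print(sys.getrecursionlimit())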

4. Asymptotic Notations

Asymptotic notations are mathematical tools used to describe the running time of algorithms when the input tends toward infinity. They provide bounds on the growth rate of functions.

4.1 Big-O Notation (Upper Bound)

Big-O Notation: O(g(n))
f(n) = O(g(n)) if there exist positive constants c and n₀ such that:

0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀

Big-O provides an asymptotic upper bound on the growth rate of f(n). It is most commonly used for worst-case analysis.
Big-O Visualization
    │
    │         c·g(n)
    │        ╱
    │       ╱     f(n) ≤ c·g(n)
    │      ╱   ___________
    │     ╱  ╱
    │    ╱  ╱  f(n)
    │   ╱ ╱
    │──●──────────────────
    │  n₀
    └────────────────────── n

    For n ≥ n₀, f(n) is bounded above by c·g(n)
📝 Example: Prove that f(n) = 3n + 2 is O(n)

Proof:

  1. We need to find c and n₀ such that 3n + 2 ≤ c·n for all n ≥ n₀
  2. For n ≥ 1: 3n + 2 ≤ 3n + 2n = 5n
  3. So, with c = 5 and n₀ = 1: 3n + 2 ≤ 5n for all n ≥ 1
  4. Therefore, 3n + 2 = O(n) ✓

4.2 Big-Omega Notation (Lower Bound)

Big-Omega Notation: Ω(g(n))
f(n) = Ω(g(n)) if there exist positive constants c and n₀ such that:

0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀

Big-Omega provides an asymptotic lower bound. It is most commonly used for best-case analysis.
Big-Omega Visualization
    │
    │              f(n)
    │            ╱
    │          ╱
    │        ╱   f(n) ≥ c·g(n)
    │      ╱  ___________
    │    ╱  ╱
    │   ╱ ╱    c·g(n)
    │──●──────────────────
    │  n₀
    └────────────────────── n

    For n ≥ n₀, f(n) is bounded below by c·g(n)
📝 Example: Prove that f(n) = 3n + 2 is Ω(n)

Proof:

  1. We need to find c and n₀ such that c·n ≤ 3n + 2 for all n ≥ n₀
  2. For n ≥ 1: 3n ≤ 3n + 2
  3. So, with c = 3 and n₀ = 1: 3n ≤ 3n + 2 for all n ≥ 1
  4. Therefore, 3n + 2 = Ω(n) ✓

4.3 Big-Theta Notation (Tight Bound)

Big-Theta Notation: Θ(g(n))
f(n) = Θ(g(n)) if there exist positive constants c₁, c₂, and n₀ such that:

0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀

Big-Theta provides a tight bound: f(n) grows at exactly the same rate as g(n). It is the strongest claim of the three; average-case running times are often stated with Θ.
Big-Theta Visualization
    │
    │               c₂·g(n)
    │              ╱
    │            ╱    f(n) is "sandwiched"
    │   f(n)   ╱   ___________
    │     \  ╱   ╱
    │      ╲╱  ╱   c₁·g(n)
    │      ╱╲╱
    │──●──────────────────
    │  n₀
    └────────────────────── n

    f(n) is bounded both above and below
📌 Key Relationship

f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) AND f(n) = Ω(g(n))
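
As a numeric sanity check of this relationship (not a substitute for the proofs above), the snippet below verifies that 3n + 2 stays sandwiched between 3n and 5n over a finite range of n ≥ 1, i.e. that it is both Ω(n) and O(n) and hence Θ(n), using the constants found in Sections 4.1 and 4.2.

Python
def f(n):
    return 3 * n + 2

# Constants from the Big-Omega and Big-O proofs: c1 = 3, c2 = 5, n0 = 1
c1, c2, n0 = 3, 5, 1

# Check 0 <= c1*n <= f(n) <= c2*n for every n in a finite test range
assert all(0 <= c1 * n <= f(n) <= c2 * n for n in range(n0, 10_000))
print("3n ≤ 3n + 2 ≤ 5n for all tested n ≥ 1, so 3n + 2 = Θ(n)")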

4.4 Comparison Summary

Notation | Type        | Meaning                              | Use Case
O(g(n))  | Upper bound | f(n) grows at most as fast as g(n)   | Worst-case analysis
Ω(g(n))  | Lower bound | f(n) grows at least as fast as g(n)  | Best-case analysis
Θ(g(n))  | Tight bound | f(n) grows exactly as fast as g(n)   | Average-case analysis
🎯
Exam Tip (83% Frequency): Questions like "Define Big Oh, Big Omega, and Theta notation" or "Explain Asymptotic Notations" appear in almost every paper. Make sure you can:
  • Write the formal mathematical definition
  • Draw the diagram showing bounds
  • Give a simple proof example

5. Recurrence Relations

Recurrence Relation
A recurrence relation is an equation that defines a sequence where each term is defined as a function of its preceding terms. In algorithm analysis, recurrences express the time complexity of recursive algorithms.

General Form

T(n) = aT(n/b) + f(n)
Where:
a = number of subproblems
n/b = size of each subproblem
f(n) = cost of dividing/combining

Common Recurrence Relations

Algorithm        | Recurrence              | Solution
Binary Search    | T(n) = T(n/2) + O(1)    | O(log n)
Merge Sort       | T(n) = 2T(n/2) + O(n)   | O(n log n)
Quick Sort (avg) | T(n) = 2T(n/2) + O(n)   | O(n log n)
Linear Search    | T(n) = T(n-1) + O(1)    | O(n)
Tower of Hanoi   | T(n) = 2T(n-1) + O(1)   | O(2ⁿ)
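
A good way to internalize this table is to unroll a recurrence numerically. The sketch below evaluates the Binary Search recurrence T(n) = T(n/2) + 1 with T(1) = 1 for powers of two and compares it against log₂n + 1; the test values are illustrative.

Python
import math

def T(n):
    # Binary Search recurrence: T(n) = T(n/2) + 1, with T(1) = 1
    if n == 1:
        return 1
    return T(n // 2) + 1

for m in (4, 8, 12, 16, 20):
    n = 2 ** m
    print(f"n = {n:>8}: T(n) = {T(n):>2}, log2(n) + 1 = {int(math.log2(n)) + 1}")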

Methods to Solve Recurrence Relations

  1. Substitution Method: Guess the solution and prove by induction
  2. Recursion Tree Method: Visualize as a tree and sum costs
  3. Master Theorem: Direct formula for specific forms

6. Substitution Method

The substitution method involves two steps: guess the form of the solution, then use mathematical induction to prove the guess is correct.

📋 Steps for Substitution Method
  1. Guess the form of the solution (e.g., T(n) = O(n log n))
  2. Assume the solution holds for smaller values (inductive hypothesis)
  3. Substitute the guess into the recurrence
  4. Prove that the solution holds for n
  5. Verify the base case
Solved Example (Exam Pattern) 10 Marks
Solve the following recurrence using Substitution Method:
T(n) = 2T(n/2) + cn, where T(1) = 1
Solution

Step 1: Guess the solution

Based on the form (similar to Merge Sort), we guess T(n) = O(n log n)

We assume T(n) ≤ d·n log n for some constant d > 0, to be chosen at the end (d must be at least as large as the constant c in the recurrence)

Step 2: Inductive Hypothesis

Assume T(k) ≤ d·k log k holds for all k < n

Step 3: Substitution

T(n) = 2T(n/2) + cn

Substituting our hypothesis for T(n/2):
T(n) ≤ 2[d(n/2)log(n/2)] + cn
     = dn·log(n/2) + cn
     = dn·(log n - log 2) + cn
     = dn·log n - dn + cn
     ≤ dn·log n        (since d ≥ c, the -dn + cn term is ≤ 0)

∴ T(n) ≤ dn log n

Step 4: Verify Base Case

For n = 2: T(2) = 2T(1) + 2c = 2(1) + 2c = 2 + 2c

We need: 2 + 2c ≤ d(2)log(2) = 2d

Choosing d = c + 1 satisfies this (2 + 2c ≤ 2c + 2), and d ≥ c keeps the inductive step valid, so the bound holds for all n ≥ 2.

Conclusion: T(n) = O(n log n) ✓
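
To see the corrected constant at work, this sketch computes T(n) = 2T(n/2) + cn exactly for powers of two (taking c = 1 and T(1) = 1, as in the problem) and confirms that T(n) ≤ d·n·log₂n with d = c + 1, mirroring the proof above.

Python
import math
from functools import lru_cache

C = 1      # the constant c in the recurrence
D = C + 1  # hypothesis constant; d = c + 1 makes the base case n = 2 hold

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2T(n/2) + C*n with T(1) = 1, evaluated at powers of two
    if n == 1:
        return 1
    return 2 * T(n // 2) + C * n

for m in range(1, 21):
    n = 2 ** m
    assert T(n) <= D * n * math.log2(n), f"bound fails at n = {n}"
print("T(n) ≤ (c+1)·n·log2(n) verified for n = 2 .. 2**20")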

7. Recursion Tree Method

The recursion tree method visualizes the recurrence as a tree where each node represents the cost of a subproblem. The total cost is the sum of costs at all levels.

📋 Steps for Recursion Tree Method
  1. Draw the recursion tree from the recurrence
  2. Calculate the cost at each level
  3. Count the number of levels
  4. Sum up all the costs across all levels
  5. Express the total in asymptotic notation
Solved Example (Exam Pattern) 10 Marks
Solve T(n) = 2T(n/2) + n using Recursion Tree Method
Solution
Recursion Tree for T(n) = 2T(n/2) + n
Level 0:                    n                           Cost: n
                          /   \
Level 1:              n/2       n/2                     Cost: n/2 + n/2 = n
                     /   \     /   \
Level 2:           n/4  n/4  n/4  n/4                   Cost: 4(n/4) = n
                   / \  / \  / \  / \
Level 3:          ...  ...  ...  ...                    Cost: n
                   .    .    .    .
                   .    .    .    .
Level k:          1    1    1    1   ... (n leaves)     Cost: n

Total Levels: log₂n + 1, counting level 0 (since n/2ᵏ = 1 → k = log₂n)

Analysis:

  • Cost per level: n (constant at each level)
  • Number of levels: log₂n + 1
  • Total cost: n × (log₂n + 1) = n log n + n
T(n) = n + n + n + ... + n = n · log n
Sum of costs across all log n levels

Conclusion: T(n) = O(n log n) ✓
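
The per-level costs in the tree can also be tallied mechanically. Assuming n is a power of two, the sketch below walks T(n) = 2T(n/2) + n level by level: level i holds 2^i subproblems of size n/2^i, so every level contributes exactly n.

Python
import math

n = 64                           # any power of two
levels = int(math.log2(n)) + 1   # levels 0 .. log2(n)

for i in range(levels):
    nodes = 2 ** i        # subproblems at level i
    cost_each = n >> i    # each contributes n / 2^i
    print(f"level {i}: {nodes:>2} nodes × {cost_each:>2} = {nodes * cost_each}")

print(f"total = {n} per level × {levels} levels = {n * levels}  (≈ n log n)")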

📝 Another Example: T(n) = T(n/3) + T(2n/3) + n
Unbalanced Recursion Tree
Level 0:                    n                           Cost: n
                          /   \
Level 1:              n/3      2n/3                     Cost: n
                     /   \     /    \
Level 2:           n/9  2n/9 2n/9  4n/9                 Cost: n
                   ...   ...  ...   ...

Longest path: n → 2n/3 → 4n/9 → ... → 1
Path length: log_(3/2)(n) levels

Each level sums to at most n
Total levels ≤ log_(3/2)(n) ≈ 1.71 log₂n

Result: T(n) = O(n log n)

8. Master Theorem

The Master Theorem provides a direct way to solve recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1 are constants.

Master Theorem
For the recurrence T(n) = aT(n/b) + f(n), let the critical exponent be c_crit = log_b(a)

Compare f(n) with n^(log_b(a)) to determine the solution.

The Three Cases

1
Case 1: f(n) = O(n^c) where c < log_b(a)
f(n) is polynomially smaller than n^(log_b(a))

Solution: T(n) = Θ(n^(log_b(a)))

The cost is dominated by the leaves (recursive calls).
2
Case 2: f(n) = Θ(n^(log_b(a)))
f(n) grows at the same rate as n^(log_b(a))

Solution: T(n) = Θ(n^(log_b(a)) · log n)

The cost is evenly distributed across all levels.
3
Case 3: f(n) = Ω(n^c) where c > log_b(a)
f(n) is polynomially larger than n^(log_b(a)),
AND a·f(n/b) ≤ k·f(n) for some constant k < 1 (regularity condition)

Solution: T(n) = Θ(f(n))

The cost is dominated by the root (divide/combine step).
⚠️
Important: The Master Theorem only applies when:
  • a ≥ 1 and b > 1
  • f(n) is asymptotically positive
  • Subproblems have equal size (n/b)
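
For recurrences whose driving function is a plain polynomial f(n) = n^c, checking the three cases reduces to comparing c with log_b(a). The sketch below automates that comparison; treat it as a study aid under that assumption, not a full Master Theorem (it handles neither log factors in f(n) nor the general regularity condition, though for f(n) = n^c regularity holds automatically in Case 3).

Python
import math

def master_theorem(a, b, c):
    """Classify T(n) = a*T(n/b) + n^c for constants a >= 1, b > 1, c >= 0."""
    crit = math.log(a) / math.log(b)  # critical exponent log_b(a)
    eps = 1e-9                        # tolerance for floating-point equality
    if c < crit - eps:
        return f"Case 1: T(n) = Θ(n^{crit:g})"
    if abs(c - crit) <= eps:
        return f"Case 2: T(n) = Θ(n^{crit:g} · log n)"
    # c > crit: for f(n) = n^c, regularity a*(n/b)^c = (a / b^c) * n^c
    # holds with constant a / b^c < 1, so Case 3 applies
    return f"Case 3: T(n) = Θ(n^{c:g})"

print(master_theorem(4, 2, 2))   # 4T(n/2) + n²  -> Case 2: Θ(n² log n)
print(master_theorem(16, 4, 1))  # 16T(n/4) + n  -> Case 1: Θ(n²)
print(master_theorem(2, 2, 1))   # 2T(n/2) + n   -> Case 2: Θ(n log n)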
Solved Example 1 (Exam Pattern) 5 Marks
Solve T(n) = 4T(n/2) + n² using Master's Theorem
Solution

Given: T(n) = 4T(n/2) + n²

Here, a = 4, b = 2, f(n) = n²

Step 1: Calculate log_b(a) = log₂4 = 2

Step 2: Compare f(n) = n² with n^(log₂4) = n²

Step 3: Since f(n) = n² = Θ(n²) = Θ(n^(log₂4))

This matches Case 2 of Master Theorem

T(n) = Θ(n² log n)
Solved Example 2 (Exam Pattern) 5 Marks
Solve T(n) = 16T(n/4) + n using Master's Theorem
Solution

Given: T(n) = 16T(n/4) + n

Here, a = 16, b = 4, f(n) = n

Step 1: Calculate log_b(a) = log₄16 = 2

Step 2: Compare f(n) = n with n^(log₄16) = n²

Step 3: f(n) = n = O(n^(2-ε)) for ε = 1

f(n) is polynomially smaller than n²

This matches Case 1 of Master Theorem

T(n) = Θ(n²)
Solved Example 3 (Merge Sort) 5 Marks
Solve T(n) = 2T(n/2) + n using Master's Theorem
Solution

Given: T(n) = 2T(n/2) + n (Merge Sort recurrence)

Here, a = 2, b = 2, f(n) = n

Step 1: Calculate log_b(a) = log₂2 = 1

Step 2: Compare f(n) = n with n^(log₂2) = n¹ = n

Step 3: f(n) = n = Θ(n) = Θ(n^(log₂2))

This matches Case 2 of Master Theorem

T(n) = Θ(n log n)

Quick Reference Table

Recurrence          | a, b, f(n) | log_b(a) | Case   | Solution
T(n) = 2T(n/2) + 1  | 2, 2, 1    | 1        | Case 1 | Θ(n)
T(n) = 2T(n/2) + n  | 2, 2, n    | 1        | Case 2 | Θ(n log n)
T(n) = 2T(n/2) + n² | 2, 2, n²   | 1        | Case 3 | Θ(n²)
T(n) = T(n/2) + 1   | 1, 2, 1    | 0        | Case 2 | Θ(log n)
T(n) = 4T(n/2) + n  | 4, 2, n    | 2        | Case 1 | Θ(n²)
T(n) = 8T(n/2) + n² | 8, 2, n²   | 3        | Case 1 | Θ(n³)
🎯
Exam Tip (50% Frequency): Master's Theorem questions are common. Memorize the three cases and practice identifying which case applies. Always show: (1) identify a, b, f(n); (2) calculate log_b(a); (3) compare and state the case; (4) write the solution.

9. Practice Questions (Exam Style)

Question 1 (Definition) 5 Marks
Explain Best Case, Average Case, and Worst Case time complexity with examples.
Answer

Best Case (Ω): Minimum time required. Example: Linear search when element is at first position - O(1).

Average Case (Θ): Expected time for random input. Example: Linear search on average - O(n/2) = O(n).

Worst Case (O): Maximum time required. Example: Linear search when element is at last position or not present - O(n).

Question 2 (Definition) 5 Marks
Define Big Oh, Big Omega, and Theta notation with mathematical definitions.
Answer

Big-O: f(n) = O(g(n)) if ∃ c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ (Upper bound)

Big-Ω: f(n) = Ω(g(n)) if ∃ c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ (Lower bound)

Big-Θ: f(n) = Θ(g(n)) if ∃ c₁, c₂ > 0, n₀ > 0 such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ (Tight bound)

Question 3 (Application) 5 Marks
Solve the following recurrence relations using Master's method:
i. T(n) = 4T(n/2) + n²
ii. T(n) = 16T(n/4) + n
Answer

i. T(n) = 4T(n/2) + n²

a=4, b=2, f(n)=n². log₂4 = 2. f(n) = n² = Θ(n²). Case 2. T(n) = Θ(n² log n)

ii. T(n) = 16T(n/4) + n

a=16, b=4, f(n)=n. log₄16 = 2. f(n) = n = O(n^(2-1)). Case 1. T(n) = Θ(n²)

Question 4 (Application) 10 Marks
Solve the following Recurrence using Substitution Method:
T(n) = 1 if n=1, T(n) = 2T(n/2) + Cn if n>1
Answer

Guess: T(n) = O(n log n), assume T(n) ≤ cn log n

Substitution:

T(n) = 2T(n/2) + cn

≤ 2[c(n/2)log(n/2)] + cn

= cn log(n/2) + cn

= cn(log n - 1) + cn

= cn log n - cn + cn

= cn log n

Result: T(n) = O(n log n) ✓ (for full rigor, take the hypothesis constant to be c + 1 so that the base case n = 2 also holds, as in the worked example in Section 6)

Question 5 (Theory) 5 Marks
Explain recurrences and various methods to solve recurrences.
Answer

Recurrence: An equation that defines a function in terms of its value on smaller inputs. Used to express time complexity of recursive algorithms.

Methods to solve:

  1. Substitution: Guess solution, prove by induction
  2. Recursion Tree: Visualize as tree, sum costs at all levels
  3. Master Theorem: Direct formula for T(n) = aT(n/b) + f(n)

Key Points Summary

📌 Remember These for the Exam
  • Time Complexity measures operations, Space Complexity measures memory
  • Big-O (upper bound), Big-Ω (lower bound), Big-Θ (tight bound)
  • O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)
  • Master Theorem: T(n) = aT(n/b) + f(n), compare f(n) with n^(log_b(a))
  • Merge Sort: T(n) = 2T(n/2) + n = O(n log n)
  • Binary Search: T(n) = T(n/2) + 1 = O(log n)
  • Always show step-by-step working in exam answers