Quick Sort:
(a) Best Case, Average Case:
O(n log n)
(b) Worst Case:
O(n^2)
(c) Recurrence Relation (Worst Case):
T(n) = T(0) + T(n-1) + ϴ(n)
which is equivalent to
T(n) = T(n-1) + ϴ(n)
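A minimal Python sketch of the scheme (an out-of-place variant for readability; the function name is mine, and real implementations partition in place). Picking the last element as pivot makes already-sorted input hit the O(n^2) worst case:

```python
def quicksort(a):
    # Last element as pivot; on sorted input every split is
    # T(0) + T(n-1), matching the worst-case recurrence above.
    if len(a) <= 1:
        return a
    pivot = a[-1]
    left = [x for x in a[:-1] if x <= pivot]   # partition step: ϴ(n) work
    right = [x for x in a[:-1] if x > pivot]
    return quicksort(left) + [pivot] + quicksort(right)
```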
======================================================
Merge Sort:
(a) Best Case, Average Case & Worst Case:
O(n log n)
(b) Recurrence Relation:
T(n) = 2T(n/2) + ϴ(n)

Applications of Merge Sort
1) Merge Sort is useful for sorting linked lists in O(n log n) time. Other O(n log n) algorithms such as Heap Sort and Quick Sort (average case O(n log n)) cannot be applied efficiently to linked lists, since they rely on random access.
2) Inversion Count Problem
3) Used in External Sorting
Worst Case Comparisons (merging 2 sorted lists of lengths ‘m’ and ’n’):
m + n - 1
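A sketch of the merge step that also counts comparisons; with fully interleaved inputs of lengths m and n it performs exactly m + n - 1 comparisons (function name is mine):

```python
def merge_count(a, b):
    # Merge two sorted lists, counting element comparisons.
    # Worst case (elements interleave) uses m + n - 1 comparisons.
    i = j = comparisons = 0
    out = []
    while i < len(a) and j < len(b):
        comparisons += 1
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])  # at most one of these tails is non-empty
    out.extend(b[j:])
    return out, comparisons
```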
======================================================
Heap Sort:
(a) Best Case, Average Case & Worst Case:
- Time Complexity : O(n log n)
Heapify Operation :
- Time Complexity : O( Log n )
Height of the heap:
- O( Log n ) [a heap is an almost complete binary tree]
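The pipeline above (build the heap in O(n), then extract the minimum n times at O(log n) each) can be sketched with Python's heapq; the function name is mine:

```python
import heapq

def heap_sort(a):
    # Build a min-heap in O(n), then pop the minimum n times,
    # O(log n) per pop, for O(n log n) overall.
    h = list(a)
    heapq.heapify(h)                                  # O(n)
    return [heapq.heappop(h) for _ in range(len(h))]  # n x O(log n)
```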
======================================================
Graph Representation :
(a) Adjacency Matrix:
- Searching an edge : O(1)
- Removing an edge : O(1)
- Adding a vertex : O(V^2)
- Space Complexity : O(V^2)
(b) Adjacency List:
- Searching an edge : O(V)
- Adding a vertex : O(1)
- Space Complexity (Best Case) : O(V+E)
- Space Complexity (Worst Case) : O(V^2)
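A small illustration of the two representations and their space trade-off, on a toy 4-vertex graph (all names are mine):

```python
# Undirected graph with 4 vertices and edges (0,1), (1,2), (2,3).
V = 4
edges = [(0, 1), (1, 2), (2, 3)]

# Adjacency matrix: O(V^2) space, O(1) edge lookup.
matrix = [[0] * V for _ in range(V)]
for u, v in edges:
    matrix[u][v] = matrix[v][u] = 1

# Adjacency list: O(V + E) space, O(degree) edge lookup.
adj = [[] for _ in range(V)]
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)
```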
=======================================================
Graph Traversal :
(a) BFS :
- Queue is used for traversal.
- Space Complexity : O(V)
- Time Complexity : O(|V| + |E|)
Applications of BFS:
- Finding shortest paths & minimum spanning tree for unweighted graphs
- Peer to peer networks
- Crawlers in Search Engines
- Social Network WebSites
- Broadcasting in Networks
- Detecting cycle in undirected graphs
- Testing bipartite graphs
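The queue-based traversal can be sketched as follows; on an unweighted graph the recorded levels are exactly the shortest path lengths (function name is mine):

```python
from collections import deque

def bfs_distances(adj, src):
    # BFS from src; O(V + E) time, O(V) queue space.
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1   # one level deeper
                q.append(v)
    return dist
```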
(b) DFS :
- Stack is used for traversal.
- Space Complexity : O(V)
- Time Complexity : O(|V| + |E|)
Applications of DFS:
- Finding minimum spanning tree (for unweighted graphs)
- Finding all pairs shortest path tree (for unweighted graphs)
- Detecting cycle in a graph
- Testing bipartite graphs
- For Topological Sorting
- Finding strongly connected components
- Finding path between 2 given vertices
- Solving maze problems.
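A sketch of stack-based DFS used for one of the applications above, finding a path between 2 given vertices (function name is mine):

```python
def dfs_path(adj, src, dst):
    # Iterative DFS with an explicit stack; returns one path from
    # src to dst, or None. O(V + E) time, O(V) stack space.
    stack = [(src, [src])]
    seen = {src}
    while stack:
        u, path = stack.pop()
        if u == dst:
            return path
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append((v, path + [v]))
    return None
```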
=======================================================
Finding Minimum Cost Spanning Tree:
Prim’s Algorithm:
(a) If graph is represented using Adjacency Matrix:
- Time Complexity : O(V^2)
(b) If graph is represented using Adjacency List & with help of a binary heap (Min Heap):
- Time Complexity : O(E Log V) [Worst Case: E = O(V^2), giving O(V^2 Log V)]
Kruskal’s Algorithm:
- Time Complexity : O(E Log V)
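A sketch of Prim's algorithm with an adjacency list and a binary Min Heap, giving the O(E Log V) bound above (the function name and the (weight, vertex) adjacency format are mine):

```python
import heapq

def prim_mst_cost(adj, src=0):
    # Prim's algorithm: each edge is pushed/popped at most once,
    # at O(log V) per heap operation, so O(E log V) total.
    # adj maps u -> list of (weight, v) pairs.
    seen = set()
    heap = [(0, src)]
    total = 0
    while heap:
        w, u = heapq.heappop(heap)
        if u in seen:
            continue        # stale entry for an already-included vertex
        seen.add(u)
        total += w
        for wv, v in adj[u]:
            if v not in seen:
                heapq.heappush(heap, (wv, v))
    return total
```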
=======================================================
Q) The minimum number of comparisons required to find the minimum and the maximum of 100 numbers is _________________.
Answer: 148. The minimum number of comparisons required is ceil(3n/2) - 2 for n numbers: pair up the elements (n/2 comparisons), then find the min among the smaller halves and the max among the larger halves (n/2 - 1 comparisons each).
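The bound comes from the pairwise (tournament) method: compare elements in pairs, then compare the smaller of each pair against the running min and the larger against the running max. A counting sketch (names are mine):

```python
def min_max(a):
    # Pairwise method: 3 comparisons per pair after the first,
    # ceil(3n/2) - 2 comparisons in total.
    comparisons = 0
    if len(a) % 2 == 0:
        comparisons += 1
        lo, hi = (a[0], a[1]) if a[0] < a[1] else (a[1], a[0])
        start = 2
    else:
        lo = hi = a[0]
        start = 1
    for i in range(start, len(a), 2):
        x, y = a[i], a[i + 1]
        comparisons += 1
        if x > y:
            x, y = y, x       # x is the pair's smaller element
        comparisons += 1
        if x < lo:
            lo = x
        comparisons += 1
        if y > hi:
            hi = y
    return lo, hi, comparisons
```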
=======================================================
Greedy (Fractional) Knapsack Problem:
- Time Complexity : O(n Log n) [dominated by sorting items by value/weight ratio]
=======================================================
Creating Min Heap:
- Time Complexity : O(n)
Extracting Min/Max element from Heap:
- Time Complexity : O(Log n)
Inserting an element into / deleting the Min/Max element from a Heap:
- Time Complexity : O(Log n)
Huffman’s Coding:
- Time Complexity : O(n Log n)
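Huffman's coding does n - 1 merges on a Min Heap, each O(Log n), hence O(n Log n). A sketch that returns the total weighted code length (function name is mine):

```python
import heapq

def huffman_cost(freqs):
    # Repeatedly merge the two smallest frequencies; the sum of all
    # merge costs equals the total weighted code length.
    h = list(freqs)
    heapq.heapify(h)
    cost = 0
    while len(h) > 1:
        a = heapq.heappop(h)
        b = heapq.heappop(h)
        cost += a + b
        heapq.heappush(h, a + b)
    return cost
```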
=======================================================
Dijkstra’s Algorithm:
- Time Complexity = O(E Log V)
Bellman-Ford Algorithm:
Time complexity of the Bellman-Ford algorithm is ϴ(VE), where V is the number of vertices and E is the number of edges.
If the graph is complete, E = ϴ(V^2), so the overall time complexity becomes ϴ(V^3).
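Bellman-Ford relaxes every edge V - 1 times, which is where the ϴ(VE) bound comes from. A minimal sketch over an edge list (names are mine):

```python
def bellman_ford(edges, V, src):
    # edges is a list of (u, v, w) triples; returns shortest
    # distances from src (float('inf') = unreachable).
    INF = float('inf')
    dist = [INF] * V
    dist[src] = 0
    for _ in range(V - 1):          # V - 1 rounds ...
        for u, v, w in edges:       # ... of E relaxations each
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist
```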
======================================================
Topological Sorting of a DAG: (similar to DFS)
- Time Complexity = O(V + E)
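Besides the DFS formulation, Kahn's in-degree method reaches the same O(V + E) bound; a sketch (names are mine):

```python
from collections import deque

def topological_order(adj, V):
    # Kahn's algorithm: repeatedly remove vertices of in-degree 0.
    indeg = [0] * V
    for u in range(V):
        for v in adj[u]:
            indeg[v] += 1
    q = deque(u for u in range(V) if indeg[u] == 0)
    order = []
    while q:
        u = q.popleft()
        order.append(u)
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    return order  # shorter than V => the graph has a cycle
```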
======================================================
Fibonacci Series (Naive Recursion) :
- Time Complexity = O(2^n)
Fibonacci Series (Using Dynamic Programming) :
- Time Complexity = O(n)
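The DP version needs only the last two values, so it runs in O(n) time with O(1) extra space (function name is mine):

```python
def fib(n):
    # Bottom-up DP: each value computed once, vs. the exponential
    # recomputation of the naive recursion.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```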
======================================================
Matrix Chain Multiplication:
No. of ways of splitting or parenthesising while multiplying ’n’ matrices (the Catalan number):
= (2n)! / ( (n+1)! x n! ) , where ’n’ = No. of matrices - 1
Example : ABCD
=> (A|BCD) , (AB|CD), (ABC|D)
=> (A|(B|CD)) , (A|(BC|D)), (AB|CD), ((A|BC)|D), ((AB|C)|D)
= 5 ways of splitting
=> (2 x 3)! / ( (3 + 1)! x 3! )
= 5 ways of splitting
No. of sub-problems to solve while multiplying a chain of matrices A[1..n] :
= n ( n + 1 ) / 2
=> O (n^2)
No. of Splits:
= n-1
Time Complexity :
= O(n x n^2)
= O(n ^ 3)
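The O(n^3) DP can be sketched as follows, with p[i-1] x p[i] the dimensions of matrix i (so n matrices need len(p) = n + 1; names are mine):

```python
def matrix_chain_cost(p):
    # O(n^2) sub-problems (i, j), each tried over O(n) splits k:
    # O(n^3) total. m[i][j] = min scalar multiplications for A[i..j].
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = min(m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                          for k in range(i, j))
    return m[1][n]
```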
Binary Search Trees:
No. of trees : (2n)! / ( (n+1)! x n! )
======================================================
Multistage Graph:
Time Complexity : O(n^2)
= O(V^2)
= O(E) [each edge is examined exactly once; E = O(V^2) in a dense graph]
======================================================
Longest Common Subsequence:
(a) Using recursion :
O(2^(m+n))
(b) Using Dynamic Programming:
O(mn)
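The O(mn) table can be sketched as follows (function name is mine):

```python
def lcs_length(x, y):
    # dp[i][j] = LCS length of x[:i] and y[:j]; each cell is O(1).
    m, n = len(x), len(y)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```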
======================================================
0/1 Knapsack / Subset Sum :
a) Using Recursive function (Brute Force method or Enumeration):
Depth of the recursion tree = O(n)
Total no. of nodes : O(2^n) ---> completely full binary tree
= O((2^n) / 2) ---> almost complete or half-full binary tree
Therefore no. of function calls : O(2^n)
Time Complexity : O(2^n)
b) Using Dynamic Programming:
Time Complexity : O(nW) , where n = no. of objects
W = capacity of the knapsack (or the target sum, for Subset Sum)
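The O(nW) DP over capacities, here with a 1-D table scanned right-to-left so each object is used at most once (names and example values are mine):

```python
def knapsack(values, weights, W):
    # dp[c] = best value achievable with capacity c.
    # n items x W capacities => O(nW) pseudo-polynomial time.
    dp = [0] * (W + 1)
    for v, w in zip(values, weights):
        for c in range(W, w - 1, -1):   # right-to-left: 0/1, not unbounded
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[W]
```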
Example (Subset Sum):
S = { 6, 3, 2, 1 }
W = 5 (achievable: 3 + 2 = 5)
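Running the DP idea on this example: from S = {6, 3, 2, 1}, the sum 5 is reachable as 3 + 2 (function name is mine):

```python
def subset_sum(s, target):
    # Track the set of reachable sums; O(n * target) in the worst case.
    reachable = {0}
    for x in s:
        reachable |= {r + x for r in reachable if r + x <= target}
    return target in reachable
```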
======================================================
Travelling Salesman Problem:
a) Without Dynamic Programming / Using Brute Force method or Enumeration:
Time Complexity : O(n!)
b) With Dynamic Programming (Held-Karp; TSP is NP-Hard):
Time Complexity : O(n 2^n) x O(n)
= O(n^2 x 2^n)
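The Held-Karp DP behind these bounds: O(n 2^n) states (visited subset, last vertex), O(n) transitions each. A sketch over a distance matrix (names are mine):

```python
def tsp_cost(dist):
    # dp[(mask, i)] = cheapest path starting at vertex 0, visiting
    # exactly the vertex set `mask`, and ending at i.
    n = len(dist)
    dp = {(1, 0): 0}                     # only vertex 0 visited
    for mask in range(1, 1 << n):
        if not mask & 1:
            continue                     # tours must include vertex 0
        for last in range(n):
            if (mask, last) not in dp:
                continue
            for nxt in range(n):
                if mask & (1 << nxt):
                    continue             # already visited
                key = (mask | (1 << nxt), nxt)
                cand = dp[(mask, last)] + dist[last][nxt]
                if cand < dp.get(key, float('inf')):
                    dp[key] = cand
    full = (1 << n) - 1
    return min(dp[(full, i)] + dist[i][0] for i in range(1, n))
```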
======================================================
All Pairs Shortest Paths:
a) Using Dijkstra’s Algorithm:
Time Complexity of Dijkstra’s Algorithm : O(E Log V)
If Dijkstra’s Algorithm is applied to all vertices
= O(V) x O(E Log V)
= O(VE Log V) ——> Sparse Graphs
= O(V. V^2 Log V)
= O(V^3 Log V) ——-> Dense Graphs
b) Using Bellman Ford’s Algorithm:
Time Complexity of Bellman Ford’s Algorithm : O(VE)
If Bellman Ford’s Algorithm is applied to all vertices
= O(V) x O(VE)
= O(V^2 . E) ——> Sparse Graphs
= O(V^2 . V^2)
= O(V^4) ——-> Dense Graphs
c) Using Dynamic Programming (Floyd Warshall Algorithm):
There are n^2 sub-problems (one per pair of vertices) in the dense matrix for ’n’ vertices, each relaxed through ’n’ intermediate vertices.
Time Complexity : n . n^2
= O(n^3)
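The O(n^3) triple loop (Floyd Warshall) as a sketch; dist is an n x n matrix with float('inf') for missing edges (function name is mine):

```python
def floyd_warshall(dist):
    # For each intermediate vertex k, try to improve every pair (i, j):
    # n^2 pairs x n intermediates = O(n^3). Modifies dist in place.
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```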
======================================================