Linear Search:
- Linear Search is the simplest method to solve the searching problem.
- Linear search has no assumptions about the order of the list.
- It finds an item by scanning the collection from the beginning of the array to the end, and it returns the first position (index) at which the item is found.

- Best case:
- The best-case occurs when the search term is in the first slot in the array.
- Number of comparisons in the best case = 1.
- Time complexity = O(1).
- Worst case:
- The worst case occurs when the search term is in the last slot in the array or is not in the array.
- The number of comparisons in worst case = size of the array = n.
- Time complexity = O(n)
- Average case:
- On average, the search term will be somewhere in the middle of the array.
- The average number of comparisons = n/2
- Time complexity = O(n)
Binary Search
- Binary search assumes the list is already in order.
- In each step, binary search compares the search term with the middle element of the current range:
- If the search term equals the middle element, the search is done.
- If the search term is less than the middle element, only the elements to the left of the middle are considered.
- If the search term is greater than the middle element, only the elements to the right of the middle are considered.

- Best case:
- The best-case occurs when the search term is in the middle of the array.
- Number of comparisons in the best case = 1.
- Time complexity in the best case = O(1).
- Worst case:
- The worst case for binary search occurs in the following cases:
- when the search term is not in the list, or
- when the search term is found only on the last comparison (for example, when it is the first or last item in the list).
- The maximum number of comparisons in the worst case = ⌊log₂ n⌋ + 1
- Time complexity in the worst case = O(log₂ n)
- Average case:
- The average case occurs when the search term is anywhere else in the list.
- Number of comparisons = O(log₂ n)
- Time complexity in the average case = O(log₂ n)
- The number of comparisons performed by binary search on a sorted array of size n is at most ⌊log₂ n⌋ + 1.
- Binary search can also be adapted:
- To find the first instance of an item (element).
- To find the last instance of an item (element).
- To find the number of instances of an item.
- Given a sorted array containing only zeros and ones, to find the first occurrence of 1 in the array.
Linear Search Vs Binary Search
- Linear search is sequential, but binary search uses a divide-and-conquer approach.
- Linear search begins the search at the first position of the array, whereas binary search begins at the middle position.
- Linear search can be applied to any array, but binary search can be applied only to a sorted array.
- Linear search worst-case time complexity is O(n); binary search worst-case time complexity is O(log₂ n).
- After the k-th comparison, the number of elements left to search is (n − k) for linear search and approximately n/2^k for binary search.
Sorting:
Sorting is ordering a list of elements.
Types of sorting: Many sorting algorithms exist; they can be classified based on the following criteria:
- Based on Complexity
- Based on Memory usage (Internal & External Sorting)
- Based on recursive/non-recursive implementation
- Based on stability
- Based on comparison/non-comparison, etc.
Internal Sorting: If the number of elements is small enough to fit into main memory, the sorting is called internal sorting. Bucket sort, bubble sort, insertion sort, selection sort, heap sort, and merge sort are internal sorting algorithms.
External Sorting: If the number of elements is so large that some of them must reside on external storage during the sort, it is called external sorting. External merge sort is an external sorting algorithm.
In-place Sorting: The In-place sorting algorithm does not use extra storage to sort the elements in the given list.
Stable Sorting: A stable sorting algorithm maintains the relative order of records with equal keys while sorting the elements.
Comparison based sorting: A comparison sort determines which of two compared elements should occur first in the final sorted list. The main comparison-based strategies are:
- Exchanging: used by Bubble Sort
- Selection: used by Selection Sort and Heapsort
- Insertion: used by Insertion Sort and Shell Sort
- Merging: used by Merge Sort
- Partitioning: used by Quick Sort
Non-Comparison based sorting: Bucket sort, radix sort, and counting sort are non-comparison based sorting algorithms.
Some of the sorting algorithms are explained below.
Bubble Sort
Working Principle of Bubble Sort:
- It works by repeatedly stepping through the list to be sorted, comparing each pair of adjacent items and swapping them if they are in the wrong order. The pass through the list is repeated until no swaps are needed, which indicates that the list is sorted.
- It is a comparison-based sorting algorithm (it uses only comparisons to sort the elements).
// Exchange two ints through pointers
void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

// A function to implement bubble sort
void bubbleSort(int arr[], int n)
{
    int i, j;
    for (i = 0; i < n - 1; i++)
        // Last i elements are already in place
        for (j = 0; j < n - i - 1; j++)
            if (arr[j] > arr[j+1])
                swap(&arr[j], &arr[j+1]);
}
Analysis of Bubble Sort: The time complexity of bubble sort in all three cases is given below
- Best case Complexity: O(n²)
- Average case Complexity: O(n²)
- Worst case Complexity: O(n²)
Insertion Sorting
Working Principle of Insertion Sort:
- Every iteration of insertion sort removes one element from the input, inserts it into the correct position in the already-sorted prefix, and repeats until no input elements remain. After i iterations, the first i+1 entries of the array are sorted.
- Insertion sort makes n − 1 passes over the array, but each pass does very little work when the input is nearly sorted. It is fast and efficient for small arrays, and it is useful mainly for small files or very nearly sorted files.
- It is a comparison-based, in-place, and stable algorithm.

Analysis of Insertion Sort
- Best case complexity: Array is already sorted.
- T(n) = O(n)
- Average case complexity: All permutations are equally likely.
- T(n) = O(n²)
- Worst case complexity: Input is in reverse sorted order.
- T(n) = O(n²)
Selection Sorting
Working Principle of Selection Sort:
- Find the minimum element of the n-element array and swap it with the value in the first position. Next, find the minimum of the remaining n − 1 elements and swap it with the value in the second position. Continue this way until the second-largest element has been placed, at which point the whole array is sorted.
- Selection sort is an in-place comparison sorting algorithm.
Analysis of Selection Sort:
- Worst case time complexity: O(n²).
- Best case time complexity: O(n²).
- Average case time complexity: O(n²).
Heap Sort
Heapsort is simple to implement and is a comparison-based sort. It is an in-place but not a stable sorting algorithm.
Max heap: A heap in which each parent has a larger key than its children is known as a max heap.
Min heap: A heap in which each parent has a smaller key than its children is known as a min heap.
BUILD_HEAP (A)
- heap-size (A) ← length [A]
- For i ← floor(length[A]/2) down to 1 do
- Heapify (A, i)
Pseudo Code of Heap Sort:
Heapify (A, i)
- l ← left [i]
- r ← right [i]
- if l ≤ heap-size [A] and A[l] > A[i]
- then largest ← l
- else largest ← i
- if r ≤ heap-size [A] and A[r] > A[largest]
- then largest ← r
- if largest ≠ i
- then exchange A[i] ↔ A[largest]
- Heapify (A, largest)
Heapsort(A)
- BUILD_HEAP (A)
- for i ← length (A) down to 2 do
- exchange A[1] ↔ A[i]
- heap-size [A] ← heap-size [A] - 1
- Heapify (A, 1)
Analysis of Heap Sort: The total time for heap sort is O(n log n) in all three cases (best, worst and average).
- Heapify: runs in O(log n) time.
- Build-Heap: runs in linear time, O(n).
- Heap Sort: runs in O(n log n) time.
- Extract-Max: runs in O(log n) time.
Merge Sort
- Merge sort is a comparison-based sorting algorithm that uses the divide and conquer approach.
- Merge sort is a stable sort.
- Working Principle:
- Divide the unsorted list into n sublists, each containing 1 element (a list of 1 element is considered sorted).
- Repeatedly merge sublists to produce new sorted sublists until there is only 1 sublist remaining.
- Divide: Divide the n-element sequence to be sorted into two subsequences of n/2 elements each
- Conquer: Sort the two subsequences recursively using merge sort.
- Combine: Merge the two sorted subsequences to produce the sorted answer.
- Input: An array A[1...m] of elements and three indices p, q and r, with 1 ≤ p ≤ q < r ≤ m, such that both the sub-arrays A[p...q] and A[q+1...r] are sorted individually in non-decreasing order
- Output: A[p...r] contains the result of merging the two subarrays A[p...q] and A[q+1...r].
Pseudo Code for Merge Sort:
void merge(int a[], int low, int mid, int high)
{
    int b[100];  /* temporary buffer; this version assumes at most 100 elements */
    int i = low, j = mid + 1, k = 0;
    while (i <= mid && j <= high) {
        if (a[i] <= a[j])
            b[k++] = a[i++];
        else
            b[k++] = a[j++];
    }
    while (i <= mid)
        b[k++] = a[i++];
    while (j <= high)
        b[k++] = a[j++];
    k--;
    while (k >= 0) {
        a[low + k] = b[k];
        k--;
    }
}

void mergesort(int a[], int low, int high)
{
    if (low < high) {
        int m = (high + low) / 2;
        mergesort(a, low, m);
        mergesort(a, m + 1, high);
        merge(a, low, m, high);
    }
}
Analysis of Merge Sort: Let T(n) be the worst-case number of comparisons and data movements for an array (of size n).
- Recursive call on the two halves of A: 2 * T(n/2)
- Time to Merge two halves of array A: O(n)
Recurrence relation for merge sort is: T(n) = 2T(n/2)+ O(n)
- Best Case Time Complexity: O(n log n)
- Average case Complexity: O(n log n)
- Worst Case Complexity: O(n log n)
Quick Sort
- It is in-place sorting.
- It is also known as partition exchange sort.
- The elements A[low…high] to be sorted are rearranged using partition.
- The pivot element always ends up in its correct position, say A[w].
- All elements that are less than or equal to pivot occupy the positions A[low...w−1].
- All elements that are greater than A[w] occupy the positions A[w + 1…high].
- The subarrays A[low…w−1] and A[w+1…high] are then recursively sorted to produce the entire sorted array.
Working principle of Quick Sort:
- Select pivot element from the given list of elements.
- Reorder the list so that all elements which are less than the pivot come before the pivot and so that all elements greater than the pivot come after it (equal values can go either way).
- After this partitioning, the pivot is in its final position. This is called the partition operation.
- Recursively sort the sub-list of lesser elements and the sub-list of greater elements. The base cases of the recursion are lists of size zero or one, which are always sorted.
- Quick sort is comparison based, and in-place based sorting algorithm.
Pseudo Code for Quick Sort:
void quicksort(int *array, int start, int stop)
{
    int left = start,
        right = stop,
        center = array[(start + stop) / 2];
    while (left < right)
    {
        while (array[left] < center) left++;
        while (array[right] > center) right--;
        if (left <= right)
        {
            swap(&array[left], &array[right]);
            left++;
            right--;
        }
    }
    if (right > start) quicksort(array, start, right);
    if (left < stop) quicksort(array, left, stop);
}

void quicksort_start(int *array, int num)
{
    quicksort(array, 0, num - 1);
}
Recursive Implementation of Quick sort:
Partition(A, p, q) {
    x = A[p];
    i = p;
    for (j = p+1 to q)
    {
        if (A[j] <= x) {
            i = i+1;
            swap(A[i], A[j]);
        }
    }
    swap(A[p], A[i]);
    return i;
}
Quicksort(A, p, r) // sorts list A[p..r]
{
    if (p < r) {
        q = Partition(A, p, r);
        Quicksort(A, p, q-1);
        Quicksort(A, q+1, r);
    }
}
Analysis of Quick Sort:
- Best case time complexity: O(n log n).
- Average case time complexity: O(n log n).
- Worst-case time complexity: O(n²).
Note:
- Merge sort is a fast sorting algorithm whose best, worst, and average case complexities are all in O(n log n), but unfortunately it uses O(n) extra space to do its work.
- Quicksort has best and average case complexity in O(n log n), but unfortunately its worst case complexity is in O(n²).
Randomized Quick Sort:
To minimize the chance of hitting Quicksort's worst case, we use randomized Quicksort. There is only one change: instead of choosing the first element as the pivot, a random element is chosen as the pivot.
By doing this, the chances of a biased partitioning are drastically decreased.
So, we select a random element as the pivot, exchange it with the first element, and then proceed exactly as in the normal Quicksort algorithm.
Recursive Implementation of Randomized Quick sort:
Partition(A, p, q) {
    x = A[p];
    i = p;
    for (j = p+1 to q)
    {
        if (A[j] <= x) {
            i = i+1;
            swap(A[i], A[j]);
        }
    }
    swap(A[p], A[i]);
    return i;
}
RandomQuicksort(A, p, r) // sorts list A[p..r]
{
    if (p < r) {
        m = random_generator(A, p, r); // picks a random index in [p, r]
        swap(A[p], A[m]);              // move the randomly chosen pivot to the front
        q = Partition(A, p, r);
        RandomQuicksort(A, p, q-1);
        RandomQuicksort(A, q+1, r);
    }
}
Analysis of Randomized Quick Sort:
- Best case time complexity: O(n log n).
- Average case time complexity: O(n log n).
- Worst-case time complexity: O(n²) (but the chance of the worst case occurring is very small).