Insertion Sort

In this assignment, we will explore a different sorting algorithm called Insertion Sort and compare it with the other sorting algorithms we have already discussed.

Algorithm

In Insertion Sort, we gradually build a sorted portion of the array from left to right. The steps involved are as follows, assuming that we are sorting in ascending order:

  1. Start with the second element in the array (index 1).
  2. Temporarily store that element, creating a conceptual gap.
  3. Compare the stored value to the elements on its left.
  4. If an element on the left is greater than the stored value, conceptually shift it to the right. In the code, this means overwriting the current index with the value from the left.
  5. Continue shifting elements to the right until a value less than or equal to the stored value is encountered or the left end of the array is reached. In the code, this means repeatedly overwriting the index to the right with the value at the current index until the correct position is found.
  6. Insert the stored value into the current conceptual gap: In the code, this means finding the correct position where the stored value should go (after all necessary shifts) and overwriting that index with the stored value.
  7. Move to the next element in the array and repeat steps 2 to 6 until you reach the end of the array.
  8. The sorted portion of the array gradually grows from left to right with each iteration until the array is fully sorted.
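The inner insertion step (steps 2 to 6 above) can be sketched on its own as a helper that inserts the element at index i into the already-sorted prefix arr[0..i-1]. The name insertIntoSortedPrefix is chosen here purely for illustration:

```javascript
// Sketch of a single insertion step, assuming the
// prefix arr[0..i-1] is already sorted.
function insertIntoSortedPrefix(arr, i) {
  const current = arr[i]; // step 2: store the element, creating a 'gap'
  let j = i - 1;
  // steps 3-5: shift larger elements one slot to the right
  while (j >= 0 && arr[j] > current) {
    arr[j + 1] = arr[j];
    j--;
  }
  arr[j + 1] = current; // step 6: insert the stored value into the gap
  return arr;
}

console.log(insertIntoSortedPrefix([2, 4, 1, 3], 2)); // → [1, 2, 4, 3]
```

Insertion Sort is simply this helper applied for every i from 1 to the end of the array, which is exactly what the full implementation later in this assignment does.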

Walkthrough

Let's do a detailed walkthrough on the example array [4, 2, 1, 3]:

Pass 1:

  • Step 1: We start with the second element (at index 1) and temporarily store it in the variable current, conceptually creating a gap.

  • Step 2: Now we start the shifting phase, where we compare the stored element 2 to the elements on its left. If an element is greater than the stored value, we shift that value to the right. This is a 'conceptual' shift: in the code, we simply overwrite the value to its right.

We continue shifting until we find a value that is smaller than the stored element or we reach the left end of the array.

  • Step 3: We have reached the left end of the array, so we assign the stored element 2 to index 0.

Let's continue with the second pass through the array.

Pass 2:

  • Step 1: We start with the third element (at index 2) and temporarily store it in the variable current, conceptually creating a gap.

  • Step 2: We compare the stored element 1 with the value to its left, which is 4. Since 4 is greater than 1, we shift 4 to the right.

  • Step 3: Next, we compare the stored element 1 with the next element on the left, 2. Since 2 is greater than 1, we shift it to the right as well.

  • Step 4: We have reached the left end of the array, so we assign the stored element 1 to index 0.

Proceeding to the third and final pass through the array.

Pass 3:

  • Step 1: In the final pass, we start with the fourth element (at index 3) and temporarily store it in the variable current, conceptually creating a gap.

  • Step 2: We compare the stored element 3 with the value to its left, which is 4. Since 4 is greater than 3, we shift 4 to the right.

  • Step 3: Next, we compare the stored element 3 with the next element on the left, 2. Since 2 is smaller than 3, we are done with shifting, so we assign the stored element 3 to index 2, where we currently have our 'gap'.

After completing all the passes, the array is fully sorted in ascending order: [1, 2, 3, 4].
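The walkthrough above can be reproduced by instrumenting the sort to log the array after each pass. This is a sketch; the logging exists only for illustration:

```javascript
function insertionSortTrace(arr) {
  for (let i = 1; i < arr.length; i++) {
    const current = arr[i]; // store the element, creating a 'gap'
    let j = i - 1;
    // Shift larger elements to the right
    while (j >= 0 && arr[j] > current) {
      arr[j + 1] = arr[j];
      j--;
    }
    arr[j + 1] = current; // insert the stored value into the gap
    console.log(`After pass ${i}: [${arr.join(', ')}]`);
  }
  return arr;
}

insertionSortTrace([4, 2, 1, 3]);
// After pass 1: [2, 4, 1, 3]
// After pass 2: [1, 2, 4, 3]
// After pass 3: [1, 2, 3, 4]
```

Each logged line matches the end state of the corresponding pass in the walkthrough.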


Conceptual vs Implementation

Now that we understand insertion sort conceptually, let's consider the implementation. In the walkthrough above, we mentioned that the 'gap' we create is conceptual. This can lead to confusion when we go from this mental model to our implementation. Instead of making an actual gap in our array, we're just keeping a reference to current, the value we're 'moving,' and overwriting values as a way to 'shift' elements.

Let's look at 'Pass 1' from the walkthrough above, but with a more literal diagram that better reflects the implementation we'll use for this problem.

  • Step 1: We start with the second element (at index 1) and temporarily store it in the variable current, conceptually creating a gap. Notice that index 1 still references the value 2, but we're free to overwrite this now that we have a reference to 2 saved with current.

  • Step 2: Now we start the shifting phase, where we compare the stored element 2 to the elements on its left. If an element is greater than the stored value, we shift that value to the right. In this case, 4 is greater than 2, so we overwrite our 'gap' at index 1 with the value 4. The 'gap' has now moved to index 0, which still references 4, but we're free to overwrite it because 4 has already been shifted to the right.

  • Step 3: We have reached the left end of the array, so we assign the stored element 2 to index 0, overwriting the 4 that was referenced by our gap.
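Pass 1, written out as the literal assignments the implementation performs, looks like this:

```javascript
const arr = [4, 2, 1, 3];

const current = arr[1]; // store 2; index 1 still holds 2, but it's now our 'gap'
arr[1] = arr[0];        // 4 > 2, so shift 4 right: arr is now [4, 4, 1, 3]
arr[0] = current;       // left end reached; fill the gap: arr is now [2, 4, 1, 3]

console.log(arr); // → [2, 4, 1, 3]
```

Note that at no point does the array contain an actual hole; the 'gap' is just the index we know we are free to overwrite.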


Code Implementation

Here's the code implementation of the Insertion Sort algorithm in JavaScript:

function insertionSort(arr) {
  const len = arr.length;

  // Iterate over the array starting from the second element
  for (let i = 1; i < len; i++) {
    // Store the current element, creating a 'gap' at index `i`
    let current = arr[i];
    // Set the index to compare against to the left of our 'gap'
    let j = i - 1;

    // Keep shifting as long as we haven't run past the
    // left end of the array and the element we're
    // comparing against is greater than the stored element
    while (j >= 0 && arr[j] > current) {
      // Shift the element we're comparing to the right
      arr[j + 1] = arr[j];
      j--;
    }

    // Insert the stored element into the gap
    arr[j + 1] = current;
  }

  return arr;
}

const array = [4, 2, 1, 3];
console.log(insertionSort(array)); // Output: [1, 2, 3, 4]

In the worst-case scenario, where the input array is in reverse order, Insertion Sort demonstrates a time complexity of O(N^2). This means that the number of operations required to sort the array grows quadratically with the input size.

When comparing Insertion Sort with Selection Sort and Bubble Sort in terms of worst-case time complexity, all three algorithms have the same O(N^2) time complexity. However, in practice, Insertion Sort often outperforms these algorithms for certain scenarios.

Insertion Sort excels when working with partially or nearly sorted arrays. In such scenarios, the algorithm requires far fewer comparisons and shifts, resulting in significantly better performance. For arrays that are already sorted or nearly sorted, Insertion Sort can even outperform more complex algorithms like Merge Sort or Quick Sort. In this best-case scenario, Insertion Sort exhibits a linear time complexity of O(N): each element needs only a single comparison with its left neighbor to confirm it is already in its correct position, so the inner loop never shifts anything.
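One way to see the gap between the best and worst cases is to count comparisons directly. The sketch below adds a counter to the algorithm; the counter exists only for illustration:

```javascript
function insertionSortCountingComparisons(arr) {
  let comparisons = 0;
  for (let i = 1; i < arr.length; i++) {
    const current = arr[i];
    let j = i - 1;
    while (j >= 0) {
      comparisons++; // count each comparison against an element on the left
      if (arr[j] <= current) break; // found the insertion point
      arr[j + 1] = arr[j]; // shift the larger element to the right
      j--;
    }
    arr[j + 1] = current;
  }
  return comparisons;
}

console.log(insertionSortCountingComparisons([1, 2, 3, 4, 5])); // sorted: 4 comparisons
console.log(insertionSortCountingComparisons([5, 4, 3, 2, 1])); // reversed: 10 comparisons
```

For a sorted array of length N the count is N - 1, while for a reversed array it is N(N - 1)/2, which matches the O(N) best case and O(N^2) worst case described above.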

However, we have seen that Bubble Sort also has linear time complexity in the best-case scenario. So how do we choose between them? Although Insertion Sort, like Bubble Sort, moves many elements around, it shifts values with single overwrites rather than full swaps, and it typically makes fewer comparisons overall. Therefore, in cases where comparisons are costly, Insertion Sort is the preferred option.

Overall, Insertion Sort may not be the most efficient algorithm in every case, especially for large, unsorted collections, where algorithms like Merge Sort or Quick Sort perform much better. However, its performance can be remarkable in scenarios where the input is partially or nearly sorted and moving elements is not costly.

Understanding the characteristics and behavior of different algorithms allows us to choose the most appropriate sorting method based on the specific requirements and characteristics of the data we are working with.

We will postpone the discussion of more complex sorting algorithms, such as Quick Sort, to the Advanced DSA book. Quick Sort involves a deeper understanding of recursive techniques, and we want to ensure a solid foundation before exploring its intricacies.