In this assignment, we will explore a different sorting algorithm called Insertion Sort and compare it with the other sorting algorithms we have already discussed.
In Insertion Sort, we gradually build a sorted portion of the array from left to right. The steps involved are as follows, assuming that we are sorting in ascending order:
1. Take the first unsorted element (starting from index 1) and temporarily store it in a variable, conceptually creating a gap at its position.
2. Compare the stored element to the elements on its left. If an element is greater than the stored element, shift it one position to the right.
3. Continue shifting until we find an element that is smaller than the stored element, or we reach the left end of the array, then assign the stored element to the gap.
4. Repeat these steps for each remaining element until the entire array is sorted.
Let's do a detailed walkthrough on the example array [4, 2, 1, 3]:
First, we take the second element (at index 1) and temporarily store it in the variable current, conceptually creating a gap.
Next, we compare the stored value 2 to the elements on the left. If an element is greater than the stored value, we shift that value to the right. This is a "conceptual" shift; in the code, we simply overwrite the value to the right. We continue shifting until we find a value that is smaller than the stored element or we reach the left end of the array.
After shifting 4 to the right, we reach the left end of the array, so we assign the stored element 2 to index 0.
Let's continue with the second pass through the array.
We take the element at index 2 (the value 1) and temporarily store it in the variable current, conceptually creating a gap.
We compare the stored element 1 with the value to its left, which is 4. Since 4 is greater than 1, we shift 4 to the right.
We then compare 1 with the next element on the left, 2. Since 2 is greater than 1, we shift it to the right as well.
We have now reached the left end of the array, so we assign the stored element 1 to index 0.
Proceeding to the third and final pass through the array, we take the element at index 3 (the value 3) and temporarily store it in the variable current, conceptually creating a gap.
We compare the stored element 3 with the value to its left, which is 4. Since 4 is greater than 3, we shift 4 to the right.
Next, we compare 3 with the next element on the left, 2. Since 2 is smaller than 3, we are done with shifting, so we assign the stored element 3 to index 2, where we currently have our 'gap'.
After completing all the passes, the array is fully sorted in ascending order: [1, 2, 3, 4].
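The three passes above can be traced in code. The sketch below uses a hypothetical insertionSortWithTrace helper (an illustrative variant, not the assignment code) that records a snapshot of the array after each pass:

```javascript
// Hypothetical tracing variant of insertion sort: records a snapshot
// of the array after each pass of the outer loop.
function insertionSortWithTrace(arr) {
  const passes = [];
  for (let i = 1; i < arr.length; i++) {
    let current = arr[i]; // store the element, creating the 'gap'
    let j = i - 1;
    while (j >= 0 && arr[j] > current) {
      arr[j + 1] = arr[j]; // shift greater elements to the right
      j--;
    }
    arr[j + 1] = current;  // insert the stored element into the gap
    passes.push([...arr]); // snapshot after this pass
  }
  return passes;
}

insertionSortWithTrace([4, 2, 1, 3]).forEach((state, p) =>
  console.log(`After pass ${p + 1}: [${state}]`)
);
// After pass 1: [2,4,1,3]
// After pass 2: [1,2,4,3]
// After pass 3: [1,2,3,4]
```

Each snapshot matches the corresponding pass in the walkthrough: the sorted portion on the left grows by one element per pass.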
Now that we understand insertion sort conceptually, let's consider the implementation. In the walkthrough above, we mentioned that the 'gap' we create is conceptual. This can lead to confusion when we go from this mental model to our implementation. Instead of making an actual gap in our array, we're just keeping a reference to current, the value we're 'moving,' and overwriting values as a way to 'shift' elements.
Let's look at 'Pass 1' from the walkthrough above, but with a more literal diagram that better reflects the implementation we'll use for this problem.
We take the element at index 1 (the value 2) and temporarily store it in the variable current, conceptually creating a gap. Notice that index 1 still references the value 2, but we're free to overwrite this now that we have a reference to 2 saved with current.
Next, we compare the stored value 2 to the elements on the left. If an element is greater than the stored value, we shift that value to the right. In this case, 4 is greater than 2, so we'll overwrite our 'gap' with the value 4. Our 'gap' (now at index 0) still references 4, but we're free to overwrite it because we've shifted 4 to the right.
We have reached the left end of the array, so we assign the stored element 2 to index 0, overwriting the 4 that was referenced by our gap.
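This literal view of Pass 1 can be sketched in a few lines of code (the snippet below is illustrative, not the full algorithm), showing that the 'gap' is really just a temporarily duplicated value:

```javascript
// A literal sketch of Pass 1 on [4, 2, 1, 3]
const arr = [4, 2, 1, 3];
const current = arr[1]; // save 2; index 1 is now the conceptual 'gap'
// 4 is greater than 2, so shift 4 right by overwriting the 'gap'
arr[1] = arr[0];        // arr is now [4, 4, 1, 3]: no real gap, just a duplicate
// We've reached the left end, so insert the stored value at index 0
arr[0] = current;       // overwrites the leftover 4
console.log(arr);       // [ 2, 4, 1, 3 ]
```

Notice the array briefly contains 4 twice; that duplicate is exactly what the 'gap' looks like in memory before we fill it with current.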
Here's the code implementation of the Insertion Sort algorithm in JavaScript:
function insertionSort(arr) {
  const len = arr.length;
  // Iterate over the array starting from the second element
  for (let i = 1; i < len; i++) {
    // Store the current element, creating a 'gap' at index `i`
    let current = arr[i];
    // Set the index to compare against to the left of our 'gap'
    let j = i - 1;
    // Keep shifting while the element we are comparing to is
    // greater than the stored element and we haven't run past
    // the left end of the array
    while (j >= 0 && arr[j] > current) {
      // Shift the element we're comparing to the right
      arr[j + 1] = arr[j];
      j--;
    }
    // Insert the stored element into the gap
    arr[j + 1] = current;
  }
  return arr;
}
const array = [4, 2, 1, 3];
console.log(insertionSort(array)); // Output: [1, 2, 3, 4]
In the worst-case scenario, where the input array is in reverse order, Insertion Sort demonstrates a time complexity of O(N^2). This means that the number of operations required to sort the array grows quadratically with the input size.
When comparing Insertion Sort with Selection Sort and Bubble Sort in terms of worst-case time complexity, all three algorithms have the same O(N^2) time complexity. However, in practice, Insertion Sort often outperforms these algorithms for certain scenarios.
Insertion Sort excels when working with partially sorted or nearly sorted arrays. In such scenarios, the algorithm benefits from significantly reducing the number of comparisons and shifts required, resulting in improved performance. For arrays that are already sorted or nearly sorted, Insertion Sort can even outperform more complex algorithms like Merge Sort or Quick Sort. In this "best case scenario," Insertion Sort exhibits a linear time complexity of O(N). The reason is that when an element is already in (or near) its correct position, only one or two comparisons with the elements on its left are needed before we can stop shifting, so each pass does a constant amount of work and the whole sort performs roughly N - 1 comparisons.
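We can check these counts concretely with a hypothetical instrumented variant of the algorithm, countComparisons (an illustrative helper, not part of the assignment code), that counts how many comparisons are performed:

```javascript
// Hypothetical instrumented variant of insertion sort that
// returns the number of comparisons instead of the sorted array.
function countComparisons(arr) {
  let comparisons = 0;
  for (let i = 1; i < arr.length; i++) {
    let current = arr[i];
    let j = i - 1;
    while (j >= 0) {
      comparisons++;                // one comparison of arr[j] vs current
      if (arr[j] <= current) break; // found the insertion point
      arr[j + 1] = arr[j];          // shift right
      j--;
    }
    arr[j + 1] = current;
  }
  return comparisons;
}

// Worst case: reverse-ordered input needs N * (N - 1) / 2 comparisons
console.log(countComparisons([8, 7, 6, 5, 4, 3, 2, 1])); // 28
// Best case: already-sorted input needs only N - 1 comparisons
console.log(countComparisons([1, 2, 3, 4, 5, 6, 7, 8])); // 7
```

For N = 8, the gap between 28 and 7 comparisons already illustrates the difference between the quadratic worst case and the linear best case.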
However, we have seen that Bubble Sort also has linear time complexity in the best-case scenario. So how do we choose between them? While Insertion Sort, like Bubble Sort, performs many element moves, it makes fewer comparisons overall. Therefore, in cases where comparisons are costly, Insertion Sort is the preferred option.
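The difference in comparison counts can be demonstrated with two hypothetical instrumented counters (illustrative helpers, not part of the assignment code), one per algorithm, run on the same nearly sorted input:

```javascript
// Hypothetical comparison counter for insertion sort
function insertionComparisons(arr) {
  let count = 0;
  for (let i = 1; i < arr.length; i++) {
    let current = arr[i];
    let j = i - 1;
    while (j >= 0) {
      count++;                      // comparison of arr[j] vs current
      if (arr[j] <= current) break;
      arr[j + 1] = arr[j];
      j--;
    }
    arr[j + 1] = current;
  }
  return count;
}

// Hypothetical comparison counter for bubble sort (with early exit)
function bubbleComparisons(arr) {
  let count = 0;
  for (let end = arr.length - 1; end > 0; end--) {
    let swapped = false;
    for (let i = 0; i < end; i++) {
      count++;                      // comparison of adjacent pair
      if (arr[i] > arr[i + 1]) {
        [arr[i], arr[i + 1]] = [arr[i + 1], arr[i]];
        swapped = true;
      }
    }
    if (!swapped) break;            // early exit when no swaps occurred
  }
  return count;
}

// A nearly sorted array: only the last two elements are out of order
console.log(insertionComparisons([1, 2, 3, 4, 5, 6, 7, 9, 8])); // 9
console.log(bubbleComparisons([1, 2, 3, 4, 5, 6, 7, 9, 8]));    // 15
```

Even on this small input, Insertion Sort makes noticeably fewer comparisons, because it stops scanning leftward as soon as it finds the insertion point, while Bubble Sort must complete a full verification pass.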
Overall, Insertion Sort may not be the most efficient algorithm in every case; for large, unsorted collections, algorithms like Merge Sort or Quick Sort perform much better. Its performance can be remarkable, however, when the input is partially or nearly sorted and element moves are not costly.
Understanding the characteristics and behavior of different algorithms allows us to choose the most appropriate sorting method based on the specific requirements and characteristics of the data we are working with.
We will postpone the discussion of more complex sorting algorithms, such as Quick Sort, to the Advanced DSA book. Quick Sort involves a deeper understanding of recursive techniques, and we want to ensure a solid foundation before exploring its intricacies.