Sort an almost sorted, k-sorted, or nearly sorted array

In this article, we will learn in detail how to sort an almost sorted array.

What is meant by sorting an almost sorted array?

An array can be sorted in this sense either by:

- moving each misplaced element no more than k positions away from its correct sorted position, or
- swapping a few elements that appear in reverse order.
Then it is considered an almost-sorted, k-sorted, or nearly-sorted array. In such arrays, some of the elements are misplaced by at most k positions from their sorted position, or a few elements appear in reverse order.

The simple idea would be to sort the array with a general-purpose sorting algorithm, such as merge sort, in O(n log n) time. But there are more efficient ways to sort an almost sorted array. The best approaches are discussed below.

Approach 1: Insertion sort

By using insertion sort, the array can be sorted in T(n) = O(n * k) time, where n is the size of the array: the outer for-loop runs up to n times, and the inner while-loop runs at most k times, because each element is at most k positions away from its final place. Space complexity is O(1), as no extra space is required.

The programs to sort the almost sorted array in different programming languages are given below:

C++ Code
Sample Output: Sorted Array = {3, 4, 5, 6, 7, 8, 9, 10, 13}

C Code
Sample Output: Sorted Array = {21, 22, 23, 24, 32, 33, 34, 35, 36}

Python Code
Sample Output: Sorted array = [5, 6, 7, 9, 10, 11, 12, 13, 15]

Approach 2: Min-heap

The problem can also be solved with the help of a min-heap:

In the first step, we create a min-heap of size k+1 and insert the first k+1 elements of the array into it. We use size k+1 because an element can be at most k positions away from its position in the sorted array. This step takes O(k) time.

In the second step, we repeatedly remove the smallest element from the heap and write it back to the array. After each removal, we insert the next element of the array into the heap, and we repeat this process for the remaining n-k elements. Each insertion and removal takes O(log k) time.

The overall time complexity therefore becomes T(n) = O(k) + O(n-k) * O(log k) = O(m * log k), where m = n-k.
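To make the insertion-sort idea of Approach 1 concrete, here is a minimal Python sketch (the function name insertion_sort_nearly_sorted is ours, not from the article's listings). On a k-sorted input the inner while-loop shifts at most k elements, which is where the O(n * k) bound comes from:

```python
def insertion_sort_nearly_sorted(arr):
    """Sort a nearly sorted array in place with insertion sort.

    When every element is at most k positions from its sorted
    position, the inner loop performs at most k shifts, so the
    total running time is O(n * k).
    """
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements one slot to the right (at most k shifts)
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
```

For example, with a 1-sorted input such as [4, 3, 5, 6, 8, 7, 10, 9, 13], the inner loop never shifts more than one element per iteration.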
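The min-heap idea of Approach 2 can be sketched in Python using the standard heapq module (the function name sort_k_sorted is ours; it assumes every element sits at most k positions from its sorted spot):

```python
import heapq

def sort_k_sorted(arr, k):
    """Return a sorted copy of arr, where each element of arr is at
    most k positions away from its position in the sorted array."""
    # Step 1: heapify the first k+1 elements -- O(k)
    heap = arr[:k + 1]
    heapq.heapify(heap)
    result = []
    # Step 2: for each remaining element, pop the window minimum
    # (guaranteed to be the next value in sorted order) and push
    # the new element -- O(log k) per element
    for x in arr[k + 1:]:
        result.append(heapq.heappushpop(heap, x))
    # Finally, drain the last k+1 elements from the heap
    while heap:
        result.append(heapq.heappop(heap))
    return result

# Using the array from the article's Approach 2 sample (k = 3):
# sort_k_sorted([5, 7, 4, 6, 12, 8, 9, 10], 3)
# -> [4, 5, 6, 7, 8, 9, 10, 12]
```

The heap never holds more than k+1 elements, which gives the O(k) extra space and the O(log k) cost per push/pop.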
Sometimes this is simplified to T(n) ≈ O(n * log k).

C++ Code
Sample Output:
Original array = 5 7 4 6 12 8 9 10
Sorted array = 4 5 6 7 8 9 10 12

Time Complexity: T(n) = O(k) + O(n-k) * O(log k) = O(m * log k), where m = n-k
Space Complexity: O(k)

The same problem can be solved with the same time complexity using an AVL tree (a self-balancing tree), since insertion and deletion in it can also be performed in O(log k) time, for an overall cost of O(n * log k). However, the min-heap is preferred over the AVL tree because it requires no extra space for the left and right child pointers and is less complex to implement.

Approach 3: Quick sort

The problem can also be solved using the quick sort algorithm to achieve a better time complexity than the above approaches. If you are not familiar with how quick sort works, please go through the given link to learn the technicalities of the algorithm: <https://www.javatpoint.com/quicksort>

In this approach, the key idea is:
Selecting the middle element as the pivot to divide the array into two roughly equal halves, which keeps the recursion depth logarithmic on a nearly sorted input. The implementation of this approach is given below:

C++ Code
Sample Output:
Given array = 2 5 2 4 4 6 7 7 12 12 10 9 8
Sorted array = 2 2 4 4 5 6 7 7 8 9 10 12 12

C Code
Sample Output:
Given array = 6 5 5 3 1 3 1 11 14 12 12 13
Sorted array = 1 1 3 3 5 5 6 11 12 12 13 14

Time Complexity: T(n) = O(k * log n)
Space Complexity: O(log k)

We can observe from the time and space complexities that this approach is better than the previous two approaches.
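As a hedged sketch of the middle-pivot idea in Approach 3, the following Python function (quick_sort_middle_pivot is our own name) combines a middle-element pivot with a Hoare-style two-pointer partition; this is one common way to realize the idea, not necessarily the article's exact listing:

```python
def quick_sort_middle_pivot(arr, low, high):
    """Sort arr[low..high] in place, always picking the middle
    element as the pivot so that a nearly sorted input splits
    into two roughly equal halves."""
    if low >= high:
        return
    pivot = arr[(low + high) // 2]  # middle element as pivot
    i, j = low, high
    # Two-pointer (Hoare-style) partition around the pivot value
    while i <= j:
        while arr[i] < pivot:
            i += 1
        while arr[j] > pivot:
            j -= 1
        if i <= j:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
            j -= 1
    quick_sort_middle_pivot(arr, low, j)
    quick_sort_middle_pivot(arr, i, high)
```

Called as quick_sort_middle_pivot(a, 0, len(a) - 1), this sorts the list in place; on a nearly sorted array the middle element is close to the median, so the two recursive halves stay balanced.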
