# Sorting: Algorithms, Implementations & Analysis

Sorting is a perennial topic in computer science. While today we usually reach for `Array.sort` for the sake of simplicity, it's important to understand the algorithms and mechanisms behind it. In this post, I'll cover three classic sorting algorithms, with implementations in Ruby and time-complexity analysis.
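For reference, here is the built-in sorting this post looks behind:

```ruby
# Ruby's built-in sorting: Array#sort and Array#sort_by.
p [5, 3, 8, 1].sort                  # => [1, 3, 5, 8]
p %w[pear apple fig].sort_by(&:size) # => ["fig", "pear", "apple"]
```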

## Selection sort

Selection sort is the most basic and intuitive sorting algorithm. Starting from the first element of the array, we find the minimum element in the rest of the array and swap it with the current element. The minimum of the remaining elements is thus always moved to the front, so after we walk through the whole array, the elements are in ascending order.

Implementation:

```ruby
def selection_sort(array)
  n = array.size - 1
  n.times do |i|
    min = i
    # find the min in the remaining array
    for j in (i + 1)..n do
      min = j if array[j] < array[min]
    end
    # swap current with min
    array[i], array[min] = array[min], array[i] if min != i
  end
  array
end
```

Time complexity:

Selection sort runs in `O(n²)` time: for each of the n elements, it scans the remainder of the array to find the minimum, which costs `O(n)` per element and `O(n²)` in total.
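To see the quadratic growth concretely, here is a small sketch (the counting helper is an illustrative addition, not part of the algorithm) that counts the comparisons selection sort makes; the count is exactly n(n−1)/2:

```ruby
# Selection sort instrumented to count comparisons on a random array of size n.
def selection_sort_comparisons(n)
  array = (1..n).to_a.shuffle
  comparisons = 0
  last = array.size - 1
  last.times do |i|
    min = i
    for j in (i + 1)..last do
      comparisons += 1
      min = j if array[j] < array[min]
    end
    array[i], array[min] = array[min], array[i] if min != i
  end
  comparisons
end

[10, 20, 40].each do |n|
  puts "n=#{n}: #{selection_sort_comparisons(n)} comparisons" # n*(n-1)/2
end
```

Doubling n roughly quadruples the comparison count, which is the `O(n²)` behavior in action.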

## Merge sort

Merge sort performs better than selection sort. It works recursively, in three steps:

• Sort the first half of the array
• Sort the second half of the array
• Combine two sorted halves into one sorted sequence

Don't worry if this sounds abstract; the recursion is short in code.

Implementation:

```ruby
def merge_sort(array)
  # base case
  return array if array.size < 2

  mid = array.size / 2
  left = merge_sort(array[0..mid - 1])
  right = merge_sort(array[mid..-1])
  merge(left, right)
end

def merge(left, right)
  return left if right.empty?
  return right if left.empty?

  if left[0] < right[0]
    [left[0]] + merge(left[1..-1], right)
  else
    [right[0]] + merge(left, right[1..-1])
  end
end
```
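The `merge` step is where the actual work happens. As a sanity check, here is a self-contained iterative variant of merging two already-sorted arrays (a sketch alongside the recursive version; note that `shift` consumes the input arrays):

```ruby
# Merge two sorted arrays into one sorted array (iterative sketch).
# Note: shift mutates the input arrays.
def merge_iterative(left, right)
  result = []
  until left.empty? || right.empty?
    result << (left.first <= right.first ? left.shift : right.shift)
  end
  result + left + right
end

p merge_iterative([1, 3, 5], [2, 4, 6]) # => [1, 2, 3, 4, 5, 6]
```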

Time complexity:

Since merge sort is recursive, its running time is less obvious. Break the work into levels: at each level the array is divided into chunks of equal size, and the recursion stops when the chunks reach size 1. At every level, the total work across all chunks is proportional to n, since each element is touched once during merging. So the question becomes: how many levels does it take to reach chunks of size 1? Halving n repeatedly gives log2(n) levels, so the overall time complexity is `O(n log(n))`.
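The level count can be checked directly with a small sketch (the `levels` helper is illustrative):

```ruby
# Repeatedly halving until chunk size reaches 1 takes ceil(log2(n)) levels.
def levels(n)
  count = 0
  while n > 1
    n = (n + 1) / 2 # ceiling division, to handle odd chunk sizes
    count += 1
  end
  count
end

p levels(8)           # => 3
p levels(100)         # => 7
p Math.log2(100).ceil # => 7
```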

## Quicksort

Quicksort is my personal favorite sorting algorithm. It's easy to understand and to implement. Its average time complexity is `O(n log(n))`, the same as merge sort. Like merge sort, it is a divide-and-conquer recursive algorithm. It sorts an array in three steps:

• Select an element of the array as the 'pivot', then move the pivot to its correct position, so that all elements to its left are smaller than it and all elements to its right are greater than it
• Quicksort the elements to the left of the pivot
• Quicksort the elements to the right of the pivot

Implementation:

```ruby
def quick_sort(arr, low, high)
  if low < high
    position = partition(arr, low, high)
    quick_sort(arr, low, position - 1)
    quick_sort(arr, position + 1, high)
  end
  arr
end

def partition(arr, low, high)
  pivot = arr[high]
  i = low - 1
  for j in low..high - 1 do
    if arr[j] <= pivot
      i += 1
      arr[i], arr[j] = arr[j], arr[i]
    end
  end
  arr[i + 1], arr[high] = arr[high], arr[i + 1]
  i + 1
end
```
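To see what a single partition step produces, here is the same Lomuto partition scheme run standalone on one small array (the sample array is my own example):

```ruby
# Lomuto partition: places the last element (the pivot) at its final
# position and returns that index.
def partition(arr, low, high)
  pivot = arr[high]
  i = low - 1
  (low..high - 1).each do |j|
    if arr[j] <= pivot
      i += 1
      arr[i], arr[j] = arr[j], arr[i]
    end
  end
  arr[i + 1], arr[high] = arr[high], arr[i + 1]
  i + 1
end

arr = [7, 2, 9, 4, 5]                 # pivot is 5, the last element
idx = partition(arr, 0, arr.size - 1)
p idx # => 2  (the pivot's final index)
p arr # => [2, 4, 5, 7, 9]  (smaller elements left of the pivot, larger right)
```

Note that only the pivot is guaranteed to be in its final sorted position; the two sides are partitioned, not yet sorted.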

Time complexity:

The running time of quicksort depends on the input array. In the average and best cases, when the pivots split the array into roughly equal halves, the complexity is `O(n log(n))`. However, with this last-element pivot choice, an already-sorted array (ascending or descending) is the worst case: every partition is maximally unbalanced, and the running time degrades to `O(n²)`.
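The contrast can be measured with a self-contained sketch that combines the quicksort above with a comparison counter (the `counter` argument is an illustrative device, not part of the algorithm):

```ruby
# Quicksort (Lomuto partition, last-element pivot) instrumented to
# count comparisons; returns the total count and sorts arr in place.
def quick_sort_count(arr, low = 0, high = arr.size - 1, counter = [0])
  if low < high
    pivot = arr[high]
    i = low - 1
    (low..high - 1).each do |j|
      counter[0] += 1
      if arr[j] <= pivot
        i += 1
        arr[i], arr[j] = arr[j], arr[i]
      end
    end
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    pos = i + 1
    quick_sort_count(arr, low, pos - 1, counter)
    quick_sort_count(arr, pos + 1, high, counter)
  end
  counter[0]
end

puts "sorted input:   #{quick_sort_count((1..50).to_a)} comparisons"         # 50*49/2 = 1225
puts "shuffled input: #{quick_sort_count((1..50).to_a.shuffle)} comparisons" # far fewer on average
```

On sorted input every pivot lands at the end of its range, so the counts collapse to n(n−1)/2, while shuffled input typically stays close to n log n.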

## Conclusion

There is no single best or worst sorting algorithm; when choosing one, we always need to consider which fits the scenario. For example, if an array is nearly sorted, insertion sort runs in close to linear time, while quicksort with a naive pivot choice degrades badly on the same input. Many references tabulate detailed time and space complexity for all common sorting algorithms. Hope this post can help!
