Sorting is the process of placing elements from a
collection in some kind of order, for example placing the
numbers in a list in ascending or descending order.
Various sorting algorithms have been
developed, although which algorithm is the most efficient is
still up for debate.
One sorting algorithm is the bubble sort. It
compares adjacent items and exchanges those that are out of
order.
Because it makes repeated passes through the list, the worst case run time is O(n^2); this happens when the elements start in the reverse of the order you want them in.
The best case run time is O(n); this happens when every item in the list is already in the order you want, so a single pass of comparisons (with no swaps) is the only work needed.
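The passes and swaps described above can be sketched in Python as follows (the function name and the `swapped` early-exit flag are my own; the early exit is what gives the O(n) best case):

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i items are already in their final places.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            # No exchanges on this pass: the list is sorted (best case O(n)).
            break
    return items
```

For example, `bubble_sort([5, 1, 4, 2, 8])` returns `[1, 2, 4, 5, 8]`, and a list that is already sorted stops after one pass.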
Selection sort
Selection sort is another sorting algorithm, which improves on bubble sort in simplicity yet has a worse best case run time.
The algorithm divides the input list into two parts: a sorted section at the front and an unsorted section behind it.
On each pass it scans the unsorted section, finds the next wanted value (the smallest remaining, for ascending order), and places it at the end of the sorted section.
Because of this, once an item is placed, it is in its final position.
The run time of selection sort is O(n^2) in both the best and the worst case, since every pass must scan the whole unsorted section.
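A minimal sketch of this in Python (the function name is my own; here "next wanted value" means the smallest remaining item, giving ascending order):

```python
def selection_sort(items):
    """Sort a list in place by repeatedly selecting the smallest remaining value."""
    n = len(items)
    for i in range(n - 1):
        # Scan the unsorted section items[i:] for its smallest value.
        min_index = i
        for j in range(i + 1, n):
            if items[j] < items[min_index]:
                min_index = j
        # Place it at the end of the sorted section; it is now in its final spot.
        items[i], items[min_index] = items[min_index], items[i]
    return items
```

Note that the inner scan always runs to the end of the list regardless of the input's initial order, which is why the best and worst cases are both O(n^2).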
We also learned a couple of new sorting algorithms, such as merge sort.
Merge sort is different from the other algorithms because it splits the list into two separate lists and then recursively calls merge sort on the sublists. Once they hit the base case, which is a list of length one, the two sorted sublists are 'merged', placing the items in order.
Merge sort is one of the most efficient algorithms known so far, running at a guaranteed O(n log n) for both the best and worst cases.
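The split-recurse-merge steps can be sketched in Python like this (function name is my own; the merge step walks both sorted halves and always takes the smaller front item):

```python
def merge_sort(items):
    """Return a new sorted list using recursive splitting and merging."""
    if len(items) <= 1:
        # Base case: a list of length one (or empty) is already sorted.
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves by repeatedly taking the smaller front item.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One of the halves may still have leftover (already sorted) items.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Each level of recursion does O(n) merging work, and halving the list gives O(log n) levels, which is where the O(n log n) bound comes from.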