Saturday, November 30, 2013

Sorting (Python)

Some sorting algorithms are slow and some are fast, and we use Big O notation to describe the difference. Big O notation basically says how the time needed scales as you add more elements. So with an O(n^2) algorithm, sorting a list of 10 elements would need - very roughly - 10^2 = 100 units of time. Big O notation gives us the growth rate of the algorithm's running time, usually quoted for the average or worst case.
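A toy illustration of that quadratic scaling (numbers here are just the rough "units of time" from the paragraph above, not real measurements): doubling the input size roughly quadruples the work.

```python
# Rough O(n^2) scaling: doubling n roughly quadruples the "units of time".
for n in (10, 20, 40):
    print(n, "elements ->", n ** 2, "units of time")
```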

O(n log n). Fast
  • Quicksort: Typically the fastest in practice, simple to code and very processor friendly. It also sorts in place, needing no extra array (just the recursion stack). O(n log n) is the average case; the worst case for quicksort is O(n^2). With a naive pivot (say, the first element), the worst case occurs when the list is already ordered, since every partition puts all remaining items on one side. If the list (or part of it) is ordered, we can avoid this by choosing the pivot at random.
  • Mergesort: Not as common as quicksort but occasionally useful (sorting stuff on disk, for instance). It needs extra memory space compared to quicksort. Mergesort's running time is the same O(n log n) in both the best and worst case.
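A minimal sketch of both O(n log n) algorithms in Python (my own illustrative versions, not from any particular library). Note that this quicksort uses the random pivot mentioned above but builds new lists rather than partitioning in place, so unlike a classic in-place quicksort it does use extra memory.

```python
import random


def quicksort(items):
    """Quicksort with a random pivot, avoiding the O(n^2) case on sorted input."""
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)


def mergesort(items):
    """Mergesort: split in half, sort each half, then merge. O(n log n) always."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = mergesort(items[:mid])
    right = mergesort(items[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

For example, both `quicksort([3, 1, 4, 1, 5])` and `mergesort([3, 1, 4, 1, 5])` return `[1, 1, 3, 4, 5]`.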
O(n^2). "Slow" can be deceptive: if the data is small, or nearly sorted, these may be faster than the "fast" algorithms.
  • Selection Sort: Easy to understand and commonly taught, but rarely used (insertion sort is better in pretty much every way). Selection sort's running time is the same for every list, since it scans all the remaining items on every pass regardless of the input order.
  • Insertion Sort: Simple algorithm that works well for small arrays. It also runs almost linearly on nearly sorted lists, so it's pretty common to use it instead of heavy-duty sorts for simple jobs. The best case for insertion sort is when the list is already ordered; then it runs in O(n), because each item is compared once with its predecessor and never needs to be shifted.
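The two O(n^2) algorithms above can be sketched as follows (again my own illustrative versions; both return a new sorted list so the originals are left untouched):

```python
def selection_sort(items):
    """Selection sort: repeatedly pick the smallest remaining item.

    Does the same number of comparisons on every input, ordered or not.
    """
    items = list(items)
    for i in range(len(items)):
        smallest = i
        for j in range(i + 1, len(items)):
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items


def insertion_sort(items):
    """Insertion sort: grow a sorted prefix by inserting each new item.

    On an already-sorted list the inner loop never runs, giving O(n).
    """
    items = list(items)
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift larger items right until the slot for `key` is found.
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items
```

On a nearly sorted input, `insertion_sort` does only a handful of shifts, while `selection_sort` still scans every remaining item on every pass.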
