Like It Called on Me (QuickSort)

("Like It Called on Me (QuickSort)" is set to the music of "Wavin' Flag", original music and lyrics by K'naan from his album Troubadour. Alternate lyrics are by Steve Wolfman, Susanne Bradley, Rachel Pottinger, Naomi Wolfman, and the CPSC 320 2016W2 TA staff.)

You can subject yourself to video of me and Susanne singing this one.

```Pivot and smaller,
You will be sorted.
I'll call on quicksort,
Just like it called on me.

Larger than pivot,
You will be sorted.
I'll call on quicksort,
Just like it called on me.
And then I go back,
and then I go back,
And then I go back.

Sir Antony, built it for speed,
Tuned by Bentley, writing in C
But poor choice of p, adversary

And it's unstable, unless it is able
To use index labels, uniquely.

If worst case lies, or you randomize
For any large size, then quicksort takes prize

So we shuffling, till n = none,
and we wondering when we'll be done.
But we patiently wait, for that trivial case,
It's log n away, so for now we say:

Pivot and smaller,
You will be sorted.
I'll call on quicksort,
Just like it called on me.
And then I go back,
and then I go back,
And then I go back.

Memory space, sorted in place
But in the worst cases, O(n) stack frames
Reorder calls, so memory use falls
Shorter half first; then tail recurse!

But what are they good for, sorts that run slower?
QuickSort the big wars, then call an n-squared.
Heapsort falls far short for cache-conscious users
And Bogo or Brute Force, they take n factorial

But we shuffling, till n = none
And we wondering when we'll be done
But we patiently wait, for that trivial case,
It's log n away, so for now we say:

Pivot and smaller,
You will be sorted.
I'll call on quicksort,
Just like it called on me.
And then I go right,
and then I go right,
and then I go right,
and then I go..

Larger than pivot,
You will be sorted.
I'll call on quicksort,
Just like it called on me.
And then I go back,
and then I go back,
and then I go back.

And common cases will be n log n.
And you and I will be n log n.
And log n bang will be n
log n.

Pivot and smaller,
You will be sorted.
I'll call on quicksort,
Just like it called on me.
And then I go right,
and then I go right,
and then I go right,
and then I go..

Larger than pivot,
You will be sorted
I'll call on quicksort,
Just like it called on me.
And then I go back,
and then I go back,
And then I go back.

Pivot and smaller,
Larger than pivot.
I'll call on quicksort,
Just like it called on me.
Just like it called on me.
Just like it called on me.
Call, call, just like it called on me.
```

Footnotes

1. The chorus is the recursive case of QuickSort. For the clearest description, please see the first version of the chorus that includes "and then I go right". Note that "pivot and smaller" and "larger than pivot" are probably not what you really want to use in practice. Rather, we should have "smaller than pivot", "larger than pivot", and handling for the pivot itself and any element equal to it. For more on how to handle equal-valued elements, including those equal to the pivot, please see Bentley and McIlroy's paper Engineering a Sort Function, particularly the section entitled "The Opportunity of Equal Elements" and its loving reference to Dijkstra and the Dutch National Flag.
2. Sir Charles Antony Richard (Tony) Hoare designed QuickSort in the early 1960s while working on Russian-English machine translation. Hoare's 1962 paper on QuickSort is a fascinating read. Among all the brilliance there (mentioned in several places in this song/document), one fun piece is the evocatively-named "nest" data structure (a term for a stack that presumably never caught on).
3. As mentioned above, Jon Bentley of Programming Pearls renown worked with M. Douglas McIlroy (of UNIX renown) on Engineering a Sort Function (quicksort, that is) for use in the C library.
4. p here represents the pivot, a common choice of variable name and perhaps a not unreasonable poetic license. Quicksort suffers from a well-known quadratic worst-case at the hands of an adversary armed with knowledge of its pivot-selection strategy. McIlroy (discussed above) designed A Killer Adversary for Quicksort that haunts all our quicksort-y dreams.
5. To be fair, many stable versions of quicksort exist, but some of the fastest and best known are not stable. However, quicksort, like any unstable sort, can be made stable by tagging each entry with its index and breaking ties in keys by index (in which case no two elements can compare equal, i.e., all elements are unique).
6. Here we see the most egregious poetic license in the song. If we really partition into "pivot and smaller" and "larger than pivot", then we cannot use a base case of 0, as the left partition will never reach size zero. But the obvious poetic resolution "till n = one" fails as well, since one or the other of the partitions can easily be empty, even if we ignore the possibility that the initial array is empty, which we would have if footnotes had page limits. However, the whole genesis of this parody was the lovely match in scansion between "when I get older" and "pivot and smaller"; so, we're stuck with it. Fortunately, this song (and probably the footnotes, too, depending on future Supreme Court rulings targeting these footnotes, the most important in computing since Ada Lovelace went to some Italian guy's talk) are Creative Commons Licensed. So change it to "n = one" or even (heaven forfend) "n ≤ one", however you're going to get that to scan. (Oh, and don't whine about the scansion on factorial.)
7. Make the recursive call on the shorter "half" of the list first, and you're guaranteed a maximum call stack depth that is logarithmic in the original input size, since the second call is a tail call, i.e., one that adds nothing to the current continuation, or in call stack terms, one where the current stack frame can be discarded before allocating one for the recursive call. This point came out early in quicksort's history. Tony Hoare pointed out in his 1962 paper: "...to ensure that the number of segments postponed at any one time never exceeds the logarithm (base 2) of the number of items to be sorted, it is sufficient to adopt the rule of always postponing the processing of the larger of the two segments"! In comparison, the naive approach of making the left-hand call first advocated by this song's chorus can result in linear stack depth.
8. In Implementing Quicksort Programs, Sedgewick discusses a clever way to switch to an O(n²) sort at small subarray sizes: simply ignore partitions below a certain size and then perform a single pass of insertion sort of the entire array, although as with many other variants and questions about quicksort, Hoare's 1962 paper touches on this idea as well. For a twist on this idea, Musser's introsort (one of his two Introspective Sorting and Selection Algorithms) escapes from McIlroy's adversary by limiting the recursion depth (carefully counting recursions that don't increase the stack size) to a logarithmic factor, after which heapsort is used as a "stopper" in Musser's terms.
9. Does heapsort actually have worse cache performance than quicksort? According to LaMarca and Ladner (shout-out to Tony and Richard!), yes. Indeed, QuickSort, which seems so dominant in so many ways, once again displays amazing performance when considering caching performance. Someone should write a song about it. (Amazingly, Hoare's 1962 paper also touches on this point. Cache performance, not the song.)
10. Some would say that bogosort encompasses the likely brute force solution (i.e., generating all permutations deterministically until one is found that is in sorted order). Those people would deny us our slant rhyme on "force" and are therefore wrong. If you're not sure what bogosort is, just take a deck of cards around with you to various people and ask each one if they can teach you how to play "fifty-two card pickup". One of them will have had a younger or elder sibling and can teach you, filled with nostalgic or vengeful schadenfreude, respectively. If you truly wish to learn more of this area, perhaps start with Sorting the Slow Way by Gruber, Holzer, and Ruepp or Pessimal Algorithms and Simplexity Analysis by Broder and Stolfi.
11. There's a beautiful decision-tree-based argument for why Ω(log (n!)) is a lower bound on the worst-case runtime of any algorithm that sorts using comparisons, and we could probably find a reference to the paper that establishes it, but why bother when Hoare took care of that for the non-asymptotic average case in 1962. Assuming you get there before the heat death of the universe, check out his snappy little entropy-based argument. Oh, and Ω(log (n!)) = Ω(n log n). Ask your algorithms instructor why. (Yes, your algorithms instructor, not stackexchange.)
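And since footnote 11 tells you to ask your instructor, here is the one-line answer to "why is Ω(log (n!)) = Ω(n log n)?", sketched without Stirling: bound the sum of logarithms from both sides.

```latex
\log(n!) \;=\; \sum_{k=1}^{n} \log k \;\le\; n \log n,
\qquad
\log(n!) \;\ge\; \sum_{k=\lceil n/2 \rceil}^{n} \log k \;\ge\; \frac{n}{2}\log\frac{n}{2} \;=\; \Omega(n \log n).
```

The lower bound keeps only the larger half of the terms, each at least log(n/2), so Θ(log (n!)) = Θ(n log n).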
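Footnote 1's point about equal elements can be sketched in code. This is a minimal three-way ("Dutch National Flag" style) quicksort in the spirit of the Bentley–McIlroy discussion, not their actual implementation; the random pivot choice is an assumption of this sketch:

```python
import random

def quicksort3(a, lo=0, hi=None):
    """In-place quicksort with a three-way partition: elements smaller than,
    equal to, and larger than the pivot. Elements equal to the pivot land in
    their final positions and are never recursed on again."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[random.randint(lo, hi)]  # random pivot: an assumption here
    lt, i, gt = lo, lo, hi  # invariant: a[lo:lt] < pivot, a[lt:i] == pivot, a[gt+1:hi+1] > pivot
    while i <= gt:
        if a[i] < pivot:
            a[lt], a[i] = a[i], a[lt]
            lt += 1
            i += 1
        elif a[i] > pivot:
            a[i], a[gt] = a[gt], a[i]
            gt -= 1
        else:
            i += 1
    quicksort3(a, lo, lt - 1)   # smaller than pivot
    quicksort3(a, gt + 1, hi)   # larger than pivot
```

Note that this also dodges footnote 6's base-case problem: with the pivot excluded from both recursive calls, every call shrinks, and n = 0 is a perfectly good trivial case.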
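Footnote 5's index-tagging trick can be shown in a few lines: tag each element with its original position so that no two tagged items compare equal, and any sort (stable or not) produces the stable order. The helper name here is hypothetical, and Python's built-in `sort` stands in for an arbitrary unstable comparison sort:

```python
def stable_sort_by_tagging(a, key=lambda x: x):
    """Return a stably sorted copy of a: ties in key break by original index,
    so all tagged items are unique and any comparison sort gives a stable result."""
    tagged = [(key(x), i, x) for i, x in enumerate(a)]
    tagged.sort()  # stand-in for any (possibly unstable) comparison sort
    return [x for _, _, x in tagged]
```

Because indices are unique, the comparison never reaches the third tuple element, so the payload need not be comparable at all.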
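Footnote 8's Sedgewick trick also fits in a short sketch: quicksort ignores small partitions, then a single insertion-sort pass finishes the nearly sorted array. The cutoff value 16 and the Lomuto/random-pivot partition are assumptions of this sketch, not Sedgewick's tuned constants:

```python
import random

CUTOFF = 16  # subarrays below this size are left for the final pass (assumed value)

def _partition(a, lo, hi):
    # Lomuto partition with a random pivot (an assumption for this sketch).
    k = random.randint(lo, hi)
    a[k], a[hi] = a[hi], a[k]
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def _quicksort_coarse(a, lo, hi):
    if hi - lo + 1 < CUTOFF:
        return  # ignore small partitions entirely
    p = _partition(a, lo, hi)
    _quicksort_coarse(a, lo, p - 1)
    _quicksort_coarse(a, p + 1, hi)

def sedgewick_sort(a):
    """Coarse quicksort, then one insertion-sort pass over the whole array.
    After the coarse phase every element is within a small, bounded distance
    of its final position, so the O(n^2) pass runs in roughly O(n * CUTOFF)."""
    _quicksort_coarse(a, 0, len(a) - 1)
    for i in range(1, len(a)):  # standard insertion sort
        x = a[i]
        j = i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
```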