("Like It Called on Me (QuickSort)" is set to the music of "Wavin' Flag", original music and lyrics by K'naan from his album Troubadour. Alternate lyrics are by Steve Wolfman, Susanne Bradley, Rachel Pottinger, Naomi Wolfman, and the CPSC 320 2016W2 TA staff.)

You can subject yourself to video of me and Susanne singing this one.

If you'd like to play this on ukulele (and maybe guitar), try the
simple chord rotation `C F Am G` over and over. There's
lots of fun strumming patterns, but just one strum per beat (i.e.,
strum down C on "Pivot" and then "Smaller", strum down F on "You" and
"sorted", etc.) works just fine.

Pivot and smaller, You will be sorted.
I'll call on quicksort, Just like it called on me.
Larger than pivot, You will be sorted.
I'll call on quicksort, Just like it called on me.
And then I go back, and then I go back,
And then I go back.

Sir Antony, built it for speed,
Tuned by Bentley, writing in C
But poor choice of *p*, adversary
can scale up speed, quadratically.
And it's unstable, unless it is able
To use index labels, uniquely.
If worst case lies, or you randomize
For any large size, then quicksort takes prize

So we're shuffling, till *n* = none,
and we're wondering when we'll be done.
But we patiently wait, for that trivial case,
It's log *n* away, so for now we say:

Pivot and smaller, You will be sorted.
I'll call on quicksort, Just like it called on me.
And then I go back, and then I go back,
And then I go back.

Memory space, sorted in place
But in the worst cases, O(*n*) stack frames
Reorder calls, so memory use falls
Shorter half first; then tail recurse!
But what are they good for, sorts that run slower?
QuickSort the big wars, then call an n-squared.
Heapsort falls far short for cache-conscious users
And Bogo or Brute Force, they take n factorial

But we're shuffling, till *n* = none
And we're wondering when we'll be done
But we patiently wait, for that trivial case,
It's log *n* away, so for now we say:

Pivot and smaller, You will be sorted.
I'll call on quicksort, Just like it called on me.
And then I go right, and then I go right,
and then I go right, and then I go...

Larger than pivot, You will be sorted.
I'll call on quicksort, Just like it called on me.
And then I go back, and then I go back,
and then I go back.

And common cases will be *n* log *n*.
And you and I will be *n* log *n*.
And log *n* bang will be *n* log *n*.

Pivot and smaller, You will be sorted.
I'll call on quicksort, Just like it called on me.
And then I go right, and then I go right,
and then I go right, and then I go...

Larger than pivot, You will be sorted
I'll call on quicksort, Just like it called on me.
And then I go back, and then I go back,
And then I go back.

Pivot and smaller, Larger than pivot.
I'll call on quicksort, Just like it called on me.
Just like it called on me. Just like it called on me.
Call, call, just like it called on me.

- The chorus is the recursive case of QuickSort. For the clearest description, please see the first version of the chorus that includes "and then I go right". However, **at minimum**, you should also remove the pivot from the recursive calls and place it between the smaller and larger elements for a correct and efficient sorting algorithm. More generally, we probably want "smaller than pivot", "larger than pivot", and handling for the pivot itself and any elements equal to it. The most naïve quicksort—that takes the pivot as the first element, abandons being "in-place", and splits the list into two smaller/larger sublists—will want "smaller than pivot" and "pivot and larger" to ensure stability, although more efficient in-place versions are going to be unstable regardless. For more on equal-valued elements, including those equal to the pivot, please see Bentley and McIlroy's paper Engineering a Sort Function, particularly the section entitled "The Opportunity of Equal Elements" and its loving reference to Dijkstra and the Dutch National Flag. We have an excellent rewrite of the song that clearly and succinctly handles the pivot, but it does not fit in these footnotes due to page limits. Besides, the whole genesis of this parody was the lovely match in scansion between "when I get older" and "pivot and smaller"; so, we're stuck with it. Fortunately, this song (and probably the footnotes, too, depending on future Supreme Court rulings targeting these footnotes, the most important in computing since Ada Lovelace went to some Italian guy's talk) are Creative Commons licensed. So do feel free to change it yourself.
- Sir Charles Antony Richard (Tony) Hoare designed QuickSort in the early 1960s while working on Russian-English machine translation. Hoare's 1962 paper on QuickSort is a fascinating read. Among all the brilliance there mentioned in several places in this song/document, one fun piece is the evocatively-named "nest" data structure (a term for a stack that presumably never caught on).
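That most naïve quicksort can be sketched in a few lines of Python. This is a sketch, not Bentley and McIlroy's engineered version: the pivot is pulled out of the recursive calls, strictly-smaller elements go left, and elements equal to the pivot go right with the larger ones, which is what keeps the sort stable.

```python
def quicksort(lst):
    """Naive, stable, out-of-place quicksort with the first element as pivot.

    "Smaller than pivot" goes left; "pivot and larger" (the equal
    elements) go right, and the pivot itself sits between the two
    recursive calls -- so equal elements keep their relative order.
    """
    if not lst:  # "till n = none": the trivial base case
        return []
    pivot, rest = lst[0], lst[1:]
    smaller = [x for x in rest if x < pivot]   # strictly smaller than pivot
    larger = [x for x in rest if x >= pivot]   # larger than (or equal to) pivot
    return quicksort(smaller) + [pivot] + quicksort(larger)
```

Note that because both list comprehensions preserve input order and the pivot precedes every element equal to it, stability follows by induction.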
- As mentioned above, Jon Bentley of Programming Pearls renown worked with M. Douglas McIlroy (of UNIX renown) on Engineering a Sort Function (quicksort, that is) for use in the C library.
- *p* here represents the pivot, a common choice of variable name and perhaps a not unreasonable poetic license. Quicksort suffers from a well-known quadratic worst case at the hands of an adversary armed with knowledge of its pivot-selection strategy. McIlroy (discussed above) designed A Killer Adversary for Quicksort that haunts all our quicksort-y dreams.
- To be fair, many stable versions of quicksort exist, but the best-known, in-place versions are not stable. However, quicksort, like any unstable sort, can be made stable by tagging each entry with its index and breaking ties in keys by index (in which case no two elements can compare equal, i.e., all elements are unique), or by ensuring that any two elements that compare equal to each other are actually, deeply equal.
- It's tempting to have a base case of "till *n* = one". That alone is insufficient, however, since either partition ("half") of the list can easily be empty at any stage, depending on the choice of pivot. Fortunately, as long as you remove the pivot from the recursive calls, both partitions are guaranteed to be smaller than the original list, and a "till *n* = none" base case works nicely.
- Make the recursive call on the shorter "half" of the list first, and you're guaranteed a maximum call stack depth that is logarithmic in the original input size, since the second call is a tail call, i.e., one that adds nothing to the current continuation, or in call stack terms, one where the current stack frame can be discarded before allocating one for the recursive call. This point came out early in quicksort's history. Tony Hoare pointed out in his 1962 paper: "...to ensure that the number of segments postponed at any one time never exceeds the logarithm (base 2) of the number of items to be sorted, it is sufficient to adopt the rule of always postponing the processing of the larger of the two segments"! In comparison, the naive approach of making the left-hand call first, advocated by this song's chorus, can result in linear stack depth.
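The shorter-half-first rule can be sketched like so, with the tail call on the longer half hand-rolled into a loop (a sketch under our own choices, not Hoare's original code; the Lomuto partition is used here purely for brevity):

```python
def quicksort_inplace(a, lo=0, hi=None):
    """In-place quicksort that recurses on the shorter partition first.

    The longer partition is handled by the while loop (a hand-rolled
    tail call), so at most O(log n) stack frames are ever live --
    Hoare's "postpone the larger segment" rule.
    """
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        p = partition(a, lo, hi)
        if p - lo < hi - p:
            quicksort_inplace(a, lo, p - 1)  # shorter side: real recursion
            lo = p + 1                       # longer side: loop ("tail call")
        else:
            quicksort_inplace(a, p + 1, hi)
            hi = p - 1

def partition(a, lo, hi):
    """Lomuto partition around a[hi]; returns the pivot's final index."""
    pivot, i = a[hi], lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i
```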
- In Implementing Quicksort Programs, Sedgewick discusses a clever way to switch to an O(*n*^2) sort at small subarray sizes: simply ignore partitions below a certain size and then perform a single pass of insertion sort over the entire array, although as with many other variants and questions about quicksort, Hoare's 1962 paper touches on this idea as well. For a twist on this idea, Musser's introsort (one of his two Introspective Sorting and Selection Algorithms) escapes from McIlroy's adversary by limiting the recursion depth (carefully counting recursions that don't increase the stack size) to a logarithmic factor, after which heapsort is used as a "stopper" in Musser's terms.
- Does heapsort actually have worse cache performance
than quicksort? According to LaMarca and Ladner (shout-out to Tony and
Richard!), yes. Indeed, QuickSort, which seems so dominant in so many
ways, once again displays amazing performance when considering caching
performance. Someone should write a song about it. (Amazingly, Hoare's 1962
paper
*also* touches on this point. Cache performance, not the song.)
- Some would say that bogosort encompasses the likely brute-force solution (i.e., generating all permutations deterministically until one is found that is in sorted order). Those people would deny us our slant rhyme on "force" and are therefore wrong. If you're not sure what bogosort is, just take a deck of cards around with you to various people and ask each one if they can teach you how to play "fifty-two card pickup". One of them will have had a younger or elder sibling and can teach you, filled with nostalgic or vengeful schadenfreude, respectively. If you truly wish to learn more of this area, perhaps start with Sorting the Slow Way by Gruber, Holzer, and Ruepp or Pessimal Algorithms and Simplexity Analysis by Broder and Stolfi.
- There's a beautiful decision-tree-based argument for why Ω(log(*n*!)) is a lower bound on the worst-case runtime of any algorithm that sorts using comparisons, and we could probably find a reference to the paper that establishes it, but why bother when Hoare took care of that for the non-asymptotic average case in 1962. Assuming you get there before the heat-death of the universe, check out his snappy little entropy-based argument. Oh, and Ω(log(*n*!)) = Ω(*n* log *n*). Ask your algorithms instructor why. (Yes, your algorithms teacher, not stackexchange.)
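Not the argument your instructor will give you, but a quick empirical nudge in the meantime: computing log₂(*n*!) directly as a sum of logs shows it creeping toward *n* log₂ *n* as *n* grows (an illustrative snippet, nothing more).

```python
import math

def log2_factorial(n):
    # log2(n!) = log2(2) + log2(3) + ... + log2(n), summed exactly.
    return sum(math.log2(i) for i in range(2, n + 1))

# The ratio log2(n!) / (n * log2 n) climbs toward 1 as n grows,
# hinting that log(n!) and n log n share an asymptotic order.
for n in (10, 100, 1000, 10000):
    print(n, round(log2_factorial(n) / (n * math.log2(n)), 3))
```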

Return to Steve's Parody Lyrics Page

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 License.