A Sort of Permutation
Practical lessons in the impractical permutation sort
We’ve previously explored esoteric sorting algorithms that either purport to be linear in execution time but are hugely inefficient or actually are linear but are lossy. It’s time to find ourselves at the other end of the scale: lossless and profoundly non-linear. Permutation sort takes us from O(n) to O(n!) — that’s right, factorial time. O(MG)!
In case your only recollection of factorial is as a rather uncompelling demonstration of recursion, what you need to know is that the numbers get very large very fast.
The recursive implementation of factorial arises from having 0! = 1 as a base case and expressing other cases as n! = n×(n-1)!. In Haskell this can be expressed directly as
factorial 0 = 1
factorial n = n * factorial (n - 1)
Or, less declaratively and in a conditional form that translates more readily to most languages, as
factorial n = if n == 0 then 1 else n * factorial (n - 1)
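To illustrate how readily the conditional form carries across, a Python translation might look like this (a sketch for illustration, not from the original):

```python
def factorial(n):
    # Direct translation of the conditional definition:
    # if n == 0 then 1 else n * factorial (n - 1)
    return 1 if n == 0 else n * factorial(n - 1)

print(factorial(5))  # → 120
```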
Haskell is structurally and syntactically well suited to recursive definitions — especially using the first, pattern-based form — but this is not the case across all languages. Sometimes a recursive definition seems the most powerful and elegant way to express something, but sometimes, as Edsger Dijkstra notes, not so much:
Although correct, it hurts me, for I don’t like to crack an egg with a sledgehammer, no matter how effective the sledgehammer is for doing so.
There is a simpler definition of factorial that defines it as a sequence product, i.e., n! = ∏ᵢ₌₁ⁿ i. No fuss, sledgehammer or recursion required:
factorial n = product [1..n]
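For comparison, Python offers a similarly direct, recursion-free definition via math.prod, available from Python 3.8 (a sketch, not part of the original):

```python
from math import prod

def factorial(n):
    # The sequence product 1 × 2 × … × n; an empty product is 1, so 0! = 1.
    return prod(range(1, n + 1))

print(factorial(5))  # → 120
```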
I’m not saying the reason many programmers are hesitant about recursion is because factorial is their introductory example… but I’m sure it doesn’t help. As Michael Jackson (no, not that one, the other one) remarks:
Software development should not be a trade of constructing difficulty from simplicity. Quite the contrary.
Now that we have a way of calculating factorial, we can return to appreciating O(n!) performance complexity. As I was saying, the numbers get very large very fast: 1! is 1; 5! is 120; 10! is 3628800; 12! is the largest integral factorial that can be expressed in a 32-bit integer; the 64-bit budget gets blown after 20!; to go higher, you’re gonna need a bigger int. You get the idea. Very large. Very fast.
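Python's arbitrary-precision integers make these limits easy to check directly (a quick sketch, leaning on math.factorial to do the counting):

```python
from math import factorial

# 12! is the largest factorial that fits in a signed 32-bit integer...
assert factorial(12) < 2**31 <= factorial(13)
# ...and 20! the largest that fits in a signed 64-bit integer.
assert factorial(20) < 2**63 <= factorial(21)

print(factorial(20))  # → 2432902008176640000
```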
But very fast is one thing you won’t get if you have an algorithm with this kind of time consumption. Factorials crop up in determining permutations and combinations of things, from the way a deck of cards might be shuffled to how proteins can be folded. And that’s what permutation sort does: in essence, it is an unoptimised search through the permutations of the input values until it finds the one arrangement that is sorted. Permutation sort is to sorting as blind trial and error is to learning.
Searching permutation space
Python’s itertools.permutations function returns an iterable that can be used to iterate through successive rearrangements of the values given as its input:
list(permutations([1, 2, 3]))
generates a list with 3! = 6 permutations of the input, including the input:
[(1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1)]
If we apply this to a list with 4 items:
list(permutations([1, 2, 3, 4]))
The resulting list contains 4! = 24 permutations:
[(1, 2, 3, 4), (1, 2, 4, 3), (1, 3, 2, 4), (1, 3, 4, 2),
(1, 4, 2, 3), (1, 4, 3, 2), (2, 1, 3, 4), (2, 1, 4, 3),
(2, 3, 1, 4), (2, 3, 4, 1), (2, 4, 1, 3), (2, 4, 3, 1),
(3, 1, 2, 4), (3, 1, 4, 2), (3, 2, 1, 4), (3, 2, 4, 1),
(3, 4, 1, 2), (3, 4, 2, 1), (4, 1, 2, 3), (4, 1, 3, 2),
(4, 2, 1, 3), (4, 2, 3, 1), (4, 3, 1, 2), (4, 3, 2, 1)]
And so on for higher n.
Permutation sort is, therefore, simply a search through this result set. It follows a generate-and-test approach:
Possible solutions are generated systematically and evaluated until a suitable solution is found. […] The greatest weakness of generate-and-test procedures is combinatorial explosion.
For the specific examples just shown, a linear search will be successful on the first comparison because the lists [1, 2, 3] and [1, 2, 3, 4] were already sorted. If we start instead with the inputs reverse sorted, it takes n! steps to find the sorted result.
In the general case, we do not need to generate all the permutations of the input, only up to the one we need to find and then we can terminate. Forcing the result into a list causes generation of all the permutations up front; leaving the result of permutations as an iterable means the result is lazily rather than eagerly evaluated, so permutations are deferred and generated on demand.
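Both behaviours can be seen concretely in a small sketch (not from the original) that counts how many permutations are pulled from the lazy iterable before the sorted one appears, starting from the reverse-sorted worst case:

```python
from itertools import permutations

# permutations is lazy: each step of the loop generates one arrangement.
steps = 0
for candidate in permutations([3, 2, 1]):
    steps += 1
    if list(candidate) == sorted(candidate):
        break

# For reverse-sorted input the sorted arrangement is generated last,
# so the search takes all 3! = 6 steps.
print(steps)  # → 6
```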
We can express the search in terms of a filter:
filter(is_sorted, permutations(values))
This returns an iterable that can be used to walk through the result set. As there is always a sorted permutation to find — exactly one when the values are distinct — we can use the iterator helper, next, to advance to it:
next(filter(is_sorted, permutations(values)))
The returned result is a tuple. Following the expectation set by Python’s sorted function, the following converts the result to a list:
return list(next(filter(is_sorted, permutations(values))))
This can also be expressed in terms of comprehension syntax for a generator, which some readers might find more explicit:
return list(next(each for each in permutations(values) if is_sorted(each)))
The comprehension syntax also makes explicit a more familiar framing of search: select the result from the permutations of the given values where the result is sorted, i.e., select…from…where….
The only thing missing is the definition of is_sorted, which we can write as
def is_sorted(values):
    pairs = zip(values, islice(values, 1, None))
    return all(i <= j for i, j in pairs)
A sequence of values is sorted if all of its adjacent pairs are sorted.
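Assembling the fragments into a self-contained sketch of the whole out-of-place implementation:

```python
from itertools import islice, permutations

def is_sorted(values):
    # A sequence is sorted if every adjacent pair is in order.
    pairs = zip(values, islice(values, 1, None))
    return all(i <= j for i, j in pairs)

def permutation_sort(values):
    # Lazily search the permutations for the first sorted arrangement.
    return list(next(filter(is_sorted, permutations(values))))

print(permutation_sort([3, 1, 2]))  # → [1, 2, 3]
```

As with sorted, the input is left untouched and a new list is returned.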
Minimum viable permute
The Python out-of-place implementation of permutation sort is quite compact, but the in-place C++ version is even more minimal:
void permutation_sort(auto begin, auto end)
{
    while (std::next_permutation(begin, end))
        ;
}
The standard library next_permutation function rearranges the existing values in place, returning false once the range has cycled round to its first — i.e., sorted — arrangement and true otherwise, thereby combining into a single construct the lazy evaluation of Python’s permutations, the effect of the next operation and the logic to check for is_sorted.
Let’s talk performance
Apart from being eye-opening, instructive and fun — isn’t that enough? — of what use is permutation sort? One use is for having — even provoking — that awkward conversation about performance.
the degree to which a system or component accomplishes its designated functions within given constraints, such as speed, accuracy, or memory usage
When someone says that performance doesn’t matter, that statement is false, even if they don’t yet recognise it as such. Permutation sort is an example of how to make clear that performance is always a consideration, even if it is not yet understood or not yet critical. If performance doesn’t matter, then it would be fine to include permutation sort and its kin in your code. If you were to do this, you would find that — all of a sudden — performance does matter. Thus, somewhere between “I don’t care about performance” and permutation sort lies acceptable performance. The conversation has begun.
This is a general approach you can use. Although the example is pathological, the approach is not: it’s part of a process of enquiry. Sometimes we don’t know an answer for something and need a strategy, like iterative root finding, to determine suitable answers. The same is true for other qualities — everything from security to maintainability — that may have been dismissed, or may not have been considered, or for which the criteria for correctness or acceptability seemed not to have moved much beyond handwavium. Many software characteristics that are confidently framed as ‘requirements’ are often little more than vague desires or velleities.
To qualify as a requirement, we have to know how to determine whether the software has what is required or not — if we don’t know how to do this, we cannot truly claim to have specified a requirement. Of course, this doesn’t mean articulating, testing or even knowing a requirement is an easy task, but that’s precisely why it helps to have thought experiments and other thinking tools available to probe and explore what we do and do not (yet) know.