*A review of John MacCormick's*

__9 Algorithms that Changed the Future__

Recently I read a book, by a person who loves what they do, about a few brilliant ideas in computation from which we benefit constantly. Over nine algorithms (and a little extra), a number of critical methods common to modern machines are made much more tangible through judiciously detailed explanations.

The book delivers a number of deep and critical consequences of often quite complicated mathematical results, but it begins where a fairly lay audience can follow. If you remember your arithmetic, and perhaps a couple of extra functions (like exponentiation), the presentation takes it from there.

The main structure of the book is to introduce us to mathematical and conceptual 'tricks': roughly two to four concepts per chapter that, combined, form a simplified but highly explanatory version of the algorithm in question. The (not too) technical detail is carefully embedded in analogies that, for the most part, hold up quite well, alongside a picture of how these technologies are actually applied. The demonstration of how we live with these ideas is thorough: from the way Google now flashes searches so quickly that you get results between pausing to type the next word, to how your phone is so easily secured against a quite determined remote attack on banking information, to how errors... happen, though at least we don't face the garbage with which Richard Hamming dealt, thanks at least in part to him.
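The error-correcting idea Hamming pioneered can itself be sketched in a few lines. The following is my own toy illustration, not an example from the book: a Hamming(7,4) code, which protects four data bits with three parity bits so that any single flipped bit can be located and repaired.

```python
def hamming_encode(d):
    """Encode 4 data bits as a 7-bit Hamming(7,4) codeword.
    Positions 1, 2, 4 (1-based) hold parity bits; 3, 5, 6, 7 hold data."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Locate and fix a single flipped bit, then return the 4 data bits."""
    c = list(c)
    # Each check covers the 1-based positions whose index has that bit set.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based error position, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = hamming_encode(word)
code[5] ^= 1  # simulate a one-bit transmission error
assert hamming_decode(code) == word
```

The neat trick is that the three parity checks are arranged so their failures, read as a binary number, spell out exactly which bit went wrong.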

MacCormick also discusses people and undecidability. This being a popular work rather than a textbook, it has the freedom, and reaps the benefit, of discussing the lives and work of the people behind the theory and practice. This book taught me not only some more things about Claude Shannon's work in compression, but also that he could ride a unicycle and juggle at the same time (and would, at work).

I'm also quite glad undecidability is brought up at the end of the book. Although it's a few steps removed from our immediate sense of computation compared to everything else covered, I'm of the opinion that it's a fascinating, important concept to understand when considering the future marvels of computer science and what we want from them. Those of you who are familiar with it may or may not share my sentiments; for those of you who aren't, there's no time for investigation like the present.