In this digital age of ours, we have grown accustomed to representing all forms of information as numbers. Text, drawings, photographs, sound, music, movies — everything goes into the digitization mill and gets stored on our computers and other devices in ever more complex arrangements of 0s and 1s.
In the 1930s, however, only numbers were numbers, and if somebody was turning text into numbers, it was for purposes of deception and intrigue.
In the fall of 1937, Alan Turing began his second year at Princeton amidst heightened fears that England and Germany would soon be at war. He was working on his doctoral thesis, of course, but he had also developed an interest in cryptology — the science and mathematics of creating secret codes or ciphers (cryptography) and breaking codes invented by others (cryptanalysis).1 Turing believed that wartime messages could best be encrypted by converting words to binary digits and then multiplying them by large numbers. Decrypting the messages without knowledge of that large number would then involve a difficult factoring problem. This idea of Turing's was rather prescient, for it is the way that most computer encryption works now.
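The idea can be sketched in a few lines of code. This is an illustration of the general principle, not a reconstruction of Turing's actual scheme: the key value and the byte-packing used to turn text into "binary digits" are hypothetical choices made for this example.

```python
# Illustrative sketch only: encode a message as one large integer,
# multiply it by a large secret number, and reverse by division.
# The key below is a hypothetical choice (the Mersenne prime 2^61 - 1).
SECRET_KEY = 2_305_843_009_213_693_951

def text_to_number(text: str) -> int:
    """Pack the message bytes into a single integer — the 'binary digits'."""
    return int.from_bytes(text.encode("utf-8"), "big")

def number_to_text(n: int) -> str:
    """Unpack the integer back into text."""
    return n.to_bytes((n.bit_length() + 7) // 8, "big").decode("utf-8")

def encrypt(text: str, key: int = SECRET_KEY) -> int:
    return text_to_number(text) * key

def decrypt(cipher: int, key: int = SECRET_KEY) -> str:
    # Knowing the key, decryption is exact integer division.
    return number_to_text(cipher // key)

cipher = encrypt("ATTACK AT DAWN")
assert decrypt(cipher) == "ATTACK AT DAWN"
# An eavesdropper who intercepts `cipher` but lacks the key faces a
# factoring problem: splitting the product back into key times message.
```

The security rests entirely on how hard it is to factor the transmitted product — the same kind of asymmetry (easy to multiply, hard to factor) that underlies modern schemes such as RSA, though those use modular arithmetic rather than plain multiplication.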
Unlike most mathematicians, Turing liked to get his hands dirty building things. To implement an automatic code machine he began building a binary multiplier using electromagnetic relays, which were the primary building blocks of computers before vacuum tubes were demonstrated to be sufficiently ...