Captain Kirk: Everything Harry tells you is a lie. Remember that. Everything Harry tells you is a lie.
Harcourt Fenton Mudd: Now listen to this carefully, Norman. I am... lying.
Norman: You say you are lying, but if everything you say is a lie, then you are telling the truth, but you cannot tell the truth because everything you say is a lie, but you lie... You tell the truth but you cannot for you lie... illogical! Illogical! Please explain! You are human. Only humans can explain their behavior! Please explain!
This conversation between Captain Kirk, Harry Mudd and the android Norman occurred in the original Star Trek episode “I, Mudd” (1967). It is a great illustration of the difference between theory and practice, and it echoes Gödel’s Incompleteness Theorem, which shows that any sufficiently powerful formal system contains statements that can be neither proved nor disproved within that system. It was also my first introduction to the theoretical concept of such undecidability, and an important reminder that theory is insufficient without practice.
Less than two decades after it aired, I took a course in theoretical computer science during my first degree (a Bachelor of Computer Science), where I learned more about this theorem in my favorite textbook from that degree: Douglas Hofstadter’s “Gödel, Escher, Bach: An Eternal Golden Braid” (see https://en.wikipedia.org/wiki/G%C3%B6del,_Escher,_Bach). This ground-breaking book built my appreciation for the music of Bach and the art of Escher, and introduced me to Gödel’s mind-bending concepts.
It also introduced me to Alan Turing: inventor of the conceptual Turing Machine, author of the prescient 1950 article, “Computing Machinery and Intelligence,” and a central figure in breaking the German Enigma cipher during the Second World War, as later dramatized in the Benedict Cumberbatch movie, “The Imitation Game.”
In retrospect, you could say that his thinking was a Turing point in computing history, when experience, theory and necessity came together to open the Pandora’s box of modern computing. After the war, his theorizing played a key role in realizing computing that went far beyond mere calculation. And, of course, once the technology began to develop in an iterative cycle between theory and practice, the sky was no limit, as highlighted by recent celebrations of the IBM System/360’s contributions to the moon landing.
Shoulders of Fortune
Isaac Newton is famous for having said, among other important insights, “If I have seen further, it is by standing on the shoulders of giants.” The same could be said of Turing, whose great insights were key steps in the long journey toward the merging of theory and practice in computing. Two key figures whose theory and practice contributed to that journey, which eventually culminated in the ultimate 360-degree platform, were Charles Babbage and Herman Hollerith.
I know: those names probably both ring a bell, and not just because both men were contemporaries of Alexander Graham Bell, that great British/Canadian/American originator of inventions and enterprises that were to become such important fellow travelers in the history of computing.
Babbage, a polymath and theorist whose inventions fascinated key people of his day, such as Lady Ada Lovelace, also didn’t stand alone. In fact, it was Lady Ada herself who first demonstrated how the devices Babbage was theorizing (which were never built in his lifetime) might be used to run programs to calculate such things as a sequence of Bernoulli numbers.
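Lovelace’s famous Note G laid out a step-by-step table of operations for producing those Bernoulli numbers. As a modern illustration (a sketch, not her actual table of operations), the same sequence can be generated from the classic recurrence over binomial coefficients:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n (using the B_1 = -1/2
    convention), via the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        # Solve the recurrence for B_m from the earlier values.
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(6))
# B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_3 = 0, B_4 = -1/30, B_5 = 0, B_6 = 1/42
```

Using exact fractions mirrors the spirit of the Analytical Engine’s design, which worked with exact numbers rather than approximations.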
He called his ultimate device the Analytical Engine, built on the shoulders of his earlier Difference Engine. While the concept never became a practical reality during his lifetime, by 1837 the theory was established, and the idea began to take root.
Ah, theory. “In theory, practice and theory are the same thing. In practice, they’re not.” So it took another seven bits’ worth of years (127 of them, from 1837 to the System/360’s debut in 1964) before practice and theory reached their great merger in computing.
Meanwhile, practice continued to nudge forward. If Babbage was standing on the shoulders of giants in his theorizing, one of them was Joseph Marie Jacquard, whose punch-card-programmable looms were the practical ancestors of programmable computers. So, while Babbage’s ideas remained theoretical, practical need continued to drive advances.
The 19th century was a time of both optimism and trepidation about the impact science and technology could have on humanity. From Mary Shelley’s 1818 novel, “Frankenstein; Or, The Modern Prometheus,” through the science fiction of Jules Verne and H. G. Wells, writers often expressed a cautious optimism about what might be possible, tempered with warnings about conceivable negative outcomes.
This context produced many different innovations, often driven more by practical necessity than theory. One major example was the need to handle the growing volume of data generated by the U.S. Census. So it was that mechanical engineer Herman Hollerith was given the opportunity in the 1880s to build forward on Jacquard’s invention and create a punched-card-fed Tabulator, turning the punched card into the basis of a true data processing device.
Now, it has been said that progress moves along in creeps and jerks, and anyone who works in computing will be quite familiar with the prevalence of both. Sometimes a good idea gets established and just creeps along slowly, barely changing over time because it works so well. Then suddenly a new need arises, such as the necessity of solving a wartime encryption mystery, or the subsequent opportunity to compute massive amounts of stored, dynamic data, and everything jerks forward as a precipitous change opens the doors to future history.
Thus, Hollerith’s practical idea established itself and led to the creation of one of the greatest companies in history, which continued to build related technologies to meet a wide variety of business needs. Once the Second World War ended and the world was ready to rediscover optimism, IBM, the company that had made electric business machines the norm, became the champion of building on all of these practical innovations and theories.
However, unlike Escher’s pictures of ever-ascending staircases that never reach a higher level, or Bach’s amazing music that can sound like it is rising and descending at the same time, practical necessity leapfrogged the danger of getting stuck in a logical dead end or theoretical loop. So, as all of these threads were woven into the invention of the System/360, the journey not only came full circle but spun up the emergence of modern world-class computing. And, among many great innovators, we have Babbage, Hollerith and Turing to thank for the fact that it just keeps turning out better and better successors in the journey of practical computing.