History of Computing for Government and Military Purposes

anonymous


Computer science has long been tied to mathematics through topics such as discrete math, number theory, and probability. But at what point did computer science diverge from mathematics into a field of its own? One could argue that modern computer science was born out of Charles Babbage’s Analytical Engine or Ada Lovelace’s algorithms, but I believe the true point of divergence was Alan Turing’s design of the Bombe machine to crack the Enigma cipher during WWII. This was one of the first events to demonstrate that computer science was a practical problem-solving tool on a worldwide scale. Through his invention, Turing showed that the consequences of computer science extended far beyond theory, providing crucial intelligence to the Allied military.

In addition to the Bombe machine, the Second World War also drove advances in number theory and cryptography. These breakthroughs in computing were largely motivated by their necessity and value to governments in wartime; in other words, innovation was driven by military need. Another advancement in computer science born of military motives is the internet. Its earliest incarnation was a network called ARPANET, a project funded by ARPA (later renamed DARPA), a research agency of the US Department of Defense. Though the claim is disputed, many say that ARPANET was built to keep lines of communication alive in the event of nuclear war. Up until this point, military applications had largely driven innovation in computer science.

Not long after the invention of ARPANET, however, personal computers entered production, and computer science shifted largely from military-focused research toward consumer and leisure products. While computers continued to play a critical role in military infrastructure, government-funded research into military problems seemed to decline quickly. This trend has continued from the 1970s into the present day. Indeed, the government now appears to lag significantly in research on AI and quantum computing, both of which could have massive impacts on defense strategy and governmental operations. Instead, most cutting-edge research today is conducted by corporations seeking to profit from these advances. Google, for example, leads research in both quantum computing and AI on an unmatched scale.

The rapid decline of government interest and investment in these problems raises the question: would AI and quantum research be more fruitful if the government pushed as hard as it did during WWII? Does it make any difference who is motivating the research?

Without seeing the future, we cannot answer this question. However, I believe that at a certain point the cryptographic uses of quantum computing and the data-sifting uses of AI will become so vital that the government will once again lead computer science research. Still, I am uncertain whether it will be able to innovate on a scale comparable to what it achieved during WWII. It could be that social necessity for these capabilities will spark inspiration and determination in researchers, leading to more fruitful findings. It could also be that high-paying positions in industry, combined with social resentment toward the government, will leave government research largely discounted and abandoned. It will be exciting to see which of these cases plays out once the information age reaches critical mass.
