The Government Did Too Invent the Internet

The very powerful and the very stupid have one thing in common. They don’t alter their views to fit the facts. They alter the facts to fit their views. Which can be uncomfortable if you happen to be one of the facts that needs altering.

Doctor Who (played by Tom Baker), from The Face of Evil (original airdates: 1 January – 22 January 1977)


Introduction

L. Gordon Crovitz of the Wall Street Journal recently published an op-ed, “Who Really Invented the Internet?”, arguing that, contrary to legend, the Internet was not invented by the federal government and had nothing to do with maintaining communications during a war. The piece generated widespread controversy and several detailed rebuttals, including “No credit for Uncle Sam in creating Net? Vint Cerf disagrees,” in which a legendary figure in the invention of the Internet weighed in on the debate over the U.S. government’s role during that heady era, and “So, who really did invent the Internet?” by Michael Hiltzik, among many others. Of course, Crovitz’s claims are factually inaccurate, extremely so.

In fact, the US Government, along with European governments through their support of the CERN laboratory, played a major, indeed dominant, role in the invention of the Internet. Since World War II, governments worldwide, not just in the United States, have become the major and in many cases sole source of funding for most basic and much applied research. This represents a radical change from the pre-World War II era, when private individuals and organizations performed a substantial fraction of basic and applied research. Government-funded research has had some spectacular practical successes: the Manhattan Project, which developed the first nuclear reactors and atomic bombs; the hydrogen bomb; aviation and rocketry through the early 1970s; and, of course, the Internet. There have been other, lesser successes as well.

However, as many observers including Peter Thiel, Tyler Cowen, and (yes!) Paul Krugman have noted, since about 1970 we have made remarkably poor progress in many areas outside of computers and electronics, notably in power and propulsion. This is one of the main reasons the term “technology” has become increasingly synonymous with “computer and electronic technology” in popular English usage. Where are the flying cars? Why has the expenditure of about $200 billion in inflation-adjusted dollars, roughly ten times the inflation-adjusted budget of the Manhattan Project, failed to produce any significant advances in the treatment, let alone the cure, of most cancers? Many other examples can be cited.
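
As a rough sanity check on that comparison, here is a minimal back-of-the-envelope sketch in Python. The Manhattan Project cost (roughly $2 billion in 1945 dollars) and the 1945-to-2012 inflation factor (about 13) are commonly cited approximations assumed here for illustration, not authoritative figures:

```python
# Back-of-the-envelope check of the "ten times the Manhattan Project" claim.
# Assumed figures: the Manhattan Project cost roughly $2 billion in 1945
# dollars, and consumer prices rose by a factor of about 13 from 1945 to 2012.

MANHATTAN_COST_1945 = 2e9        # dollars (commonly cited estimate)
CPI_RATIO_1945_TO_2012 = 13.0    # rough inflation factor (assumption)
WAR_ON_CANCER_SPENDING = 200e9   # inflation-adjusted dollars, from the text

manhattan_2012 = MANHATTAN_COST_1945 * CPI_RATIO_1945_TO_2012
print(f"Manhattan Project in 2012 dollars: ${manhattan_2012 / 1e9:.0f} billion")
print(f"Cancer spending / Manhattan Project: "
      f"{WAR_ON_CANCER_SPENDING / manhattan_2012:.1f}x")
```

Under these assumptions the ratio comes out near eight, consistent with the order-of-magnitude claim of “about ten times.”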

Despite the “free market,” anti-government rhetoric of much of the business community, epitomized by the Wall Street Journal editorial page and taken to ludicrous extremes in Crovitz’s article, modern “private” companies often depend heavily on the government to research and develop the core science and technology behind their products and services. This is especially true in the computer and Internet industry, which, ironically, is falsely held up as an ideal example of the “free market” at work. What in fact occurs is that when a government research program is on the right track, private industry has usually been able to exploit the results of the research, and progress occurs.

Where government research programs have stalled, private industry has been unable and, in most cases, unwilling to find an alternative approach that works. This is not simply a matter of willpower. Private individuals and organizations have lost much of the expertise in basic research that was more common before World War II. Consequently, when private individuals like former Intel CEO Andy Grove, frustrated with the poor results of government research on cancer and Parkinson’s disease, attempt to find alternatives to stagnant mainstream research programs, they often fail.

The basic research methods now favored in government-funded science no longer include some of the methods used prior to World War II; government research mega-projects like the ITER tokamak fusion reactor and the Large Hadron Collider (LHC) have displaced the inexpensive table-top methods of the past in many fields. It can, in some cases, even be difficult to determine what the inventors and discoverers of the past were doing. How significant was the frequent interest in, and self-reported use of, philosophy and sometimes even the occult among pre-World War II inventors and discoverers such as Johannes Kepler, Isaac Newton, Niels Bohr, Albert Einstein, Wolfgang Pauli, rocket pioneer Jack Parsons (one of the most extreme examples), and others? Thus, it is difficult for private individuals or organizations to recreate some types of expertise that may be needed once again to solve pressing problems like energy shortages or cancer.

There Can Be Only One

For a number of social and political reasons, government research programs frequently become fixated on one or a small number of narrowly defined theories and approaches, defend these theories and approaches stubbornly and effectively despite a repeated lack of practical results, and successfully market them to policy makers, the general public, and business leaders, again despite the lack of practical results. This has led, I strongly suspect, to stalled or severely slowed progress in many fields compared to the pre-1970 and especially the pre-World War II period. I will present a theory as to why the slowdown occurred in the 1970s, rather than during World War II, and has continued since then.

I use the term theory to refer to unifying fundamental concepts such as superstrings in theoretical physics, the oncogene theory of cancer in the War on Cancer, and the Big Bang theory in cosmology. I use the term approach to refer to specific technical approaches or methods such as the heavy reliance on Monte Carlo simulations in experimental particle physics, the use of immortal cell lines in cancer research, and tokamaks in fusion power research.
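
To make the “approach” category concrete, here is a minimal sketch of the kind of Monte Carlo simulation experimental particle physicists rely on: estimating, by random sampling, what fraction of hypothetical particles decay inside a detector. Every number in it is invented for illustration:

```python
import random

# Minimal Monte Carlo sketch: estimate the fraction of particles that
# decay inside a detector, given an exponentially distributed decay length.
# All parameters are invented for illustration.

MEAN_DECAY_LENGTH = 3.0   # meters (hypothetical particle)
DETECTOR_LENGTH = 5.0     # meters (hypothetical detector)
N_TRIALS = 1_000_000

random.seed(42)
decays_inside = sum(
    random.expovariate(1.0 / MEAN_DECAY_LENGTH) < DETECTOR_LENGTH
    for _ in range(N_TRIALS)
)
print(f"Estimated acceptance: {decays_inside / N_TRIALS:.3f}")
# Analytic answer for comparison: 1 - exp(-5/3), about 0.81
```

Real experiments run vastly more elaborate versions of this same idea, which is precisely how the method can come to dominate a field’s methodology.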

One of the better known and more widely criticized examples of this dominance by a single theory is the preeminent position of superstring theory in theoretical physics since about 1984 (which replaced the non-Abelian gauge Grand Unified Theory (GUT) fad that dominated from 1974 until the mid-1980s, when the failure to find proton decay presented a serious problem for the GUTs). From what I understand, this dominance has declined somewhat following a barrage of negative publicity, including Lee Smolin’s The Trouble with Physics, Peter Woit’s popular Not Even Wrong blog and book, Lawrence Krauss’s Hiding in the Mirror, various comments by Roger Penrose, and other criticisms such as Sheldon Glashow’s well-known antipathy to string theory.


Superstring theory is particularly striking because it had, and has, no basis in experimental data or evidence. It is simply an example of a “good idea” that somehow either squeezed out or subsumed its competitors. Supersymmetry, for example, was integrated into superstrings, along with a number of other theoretical ideas that predated the superstring synthesis. From a social and political point of view, one advantage of superstrings is that it provided a unified framework that could integrate the efforts of several previously competing groups of theoretical physicists into a single group, effectively led by Ed Witten and a few other senior theoretical physicists. One big group is usually able to lobby the government, the US Congress, and the public for support more effectively than several medium-sized groups, let alone a large number of competing individuals and small groups.

Superstring theory is not an isolated case. In the early 1970s, during the first “energy crisis,” tokamaks squeezed out a large number of competing approaches to developing nuclear fusion power. In this case, early promising results with tokamaks were heavily hyped and used both to lobby for greatly increased overall funding for fusion power research and to kill competing approaches. Forty years later, we are still waiting for the advent of fusion power, and the ITER fusion mega-project promises little in the way of practical, usable results.

So too, in the mid-1970s, the failing retroviral theory of cancer transformed into the oncogene theory of cancer and rapidly crushed all competing theories of the dread disease, quickly garnering a Nobel Prize and truly massive funding. The US National Cancer Institute alone has an annual budget of over $5 billion today, roughly the inflation-adjusted annual budget of the wartime Manhattan Project. This brief account probably understates the extent to which certain approaches, such as immortal cell lines, came to dominate both the theory and methodology of cancer research.

In the early 1970s, the US Defense Advanced Research Projects Agency (DARPA) conducted a contest between different approaches to speech recognition, which was won by a team from Carnegie Mellon University using a method now known as Hidden Markov Model (HMM) speech recognition. Essentially all speech recognition engines and research today descend from this one approach. Although performance has improved, it still leaves much to be desired, and progress has been slow at best.
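
For readers unfamiliar with the method, here is a toy sketch of the Viterbi algorithm, the core computation in HMM decoding: given a sequence of acoustic observations, find the most likely sequence of hidden states (phonemes, in a real recognizer). The two states and all probabilities below are invented for illustration:

```python
import math

# Toy sketch of the Viterbi algorithm at the heart of HMM speech
# recognition: find the most likely sequence of hidden states given a
# sequence of acoustic observations. The two "phoneme" states and all
# probabilities below are invented for illustration.

states = ["s", "t"]                     # hypothetical phoneme states
observations = ["low", "high", "high"]  # hypothetical acoustic features

start_p = {"s": 0.6, "t": 0.4}
trans_p = {"s": {"s": 0.7, "t": 0.3}, "t": {"s": 0.4, "t": 0.6}}
emit_p = {"s": {"low": 0.8, "high": 0.2}, "t": {"low": 0.3, "high": 0.7}}

def viterbi(obs):
    # V[i][state]: log-probability of the best path ending in state at step i
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
          for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            logp, prev = max(
                (V[-2][p] + math.log(trans_p[p][s]) + math.log(emit_p[s][o]), p)
                for p in states
            )
            V[-1][s] = logp
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

print(viterbi(observations))  # prints ['s', 't', 't'] for these numbers
```

A production recognizer differs mainly in scale (thousands of states, real acoustic features, probabilities estimated from training data), but the decoding step is this same dynamic program.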

Many, many other examples of this process exist. In most cases, in the 1970s or early 1980s, a single theory or approach that showed early promise on experimental or theoretical grounds, or both, was hailed as the answer: a theory of everything (TOE) in physics, cheap energy from fusion power, a cure for cancer, or talking computers was supposedly just around the corner. In some cases, like the War on Cancer and fusion power, actual funding increased substantially. In most cases, whether funding increased or not, the single theory captured the lion’s share of funding, pushing out all or most competitors. Government research programs became much more narrowly focused in the 1970s and 1980s and have remained so since. In most cases, the promised results never came at all or have fallen far short of initial expectations.


The Ph.D. Glut and the Decline of Scientific Productivity

The mid-twentieth century, and World War II in particular, saw a transformation of science and research and development. Funding both increased greatly in quantity and shifted almost exclusively to the government. Science became much more professionalized and institutionalized. The role of so-called “amateurs” declined dramatically. The importance of the Ph.D. and other credentials increased markedly. The success of the wartime Manhattan Project spawned numerous attempts to duplicate it; most (not all) of these attempts have so far failed.

These dramatic mid-century changes led to a period of exponential increase in funding for many kinds of scientific research, especially during the decade following Sputnik (October 4, 1957). This, in turn, led to an exponential increase in the number of Ph.D.s in many, many fields. For a time, especially after Sputnik, most of these freshly minted Ph.D.s could find jobs in their professions. The bubble burst in the late 1960s, producing a huge glut of Ph.D.s, would-be scientific researchers, that has persisted since about 1970.

Government, business, leading scientists, and (yes) the Wall Street Journal editorial page have lobbied long and hard, up to the present day, to maintain and, if possible, expand this ocean of Ph.D.s who cannot find jobs in their supposed professions (see the examples in Appendix II below). One official study after another claims there is a shortage or imminent shortage of scientists in the United States, strikingly at variance with a reality in which most Ph.D. recipients end up doing something other than science, with a lucky few leaving science to develop iPhone apps that sell dog food and similar gimmicks; anything but actually curing cancer, finding new and cheaper energy sources, or solving other pressing problems.

It is likely that the end of the exponential increase in government funding for research and development, coupled with the perpetual Ph.D. glut since the late 1960s, accounts for the rise of the knowledge monopolies in which a single theory and/or approach acquired an extremely dominant position, usually in the early 1970s. With dozens of Ph.D.s competing for each available position, those who organized into the largest and most regimented groups often won the political battle for resources within the government. Bigger may not be better, but it can lobby for funding much more effectively.
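
A toy steady-state calculation shows how easily such a glut arises. All parameters below are invented for illustration; the point is only that modest, plausible-looking numbers already imply many candidates chasing every opening:

```python
# Toy steady-state model of the Ph.D. glut. All parameters are invented
# for illustration, not drawn from actual workforce data.

POSITIONS = 10_000          # permanent research jobs (assumed fixed)
PHDS_PER_POSITION = 0.5     # new Ph.D.s trained per position per year
CAREER_LENGTH = 30          # years before a position reopens

openings_per_year = POSITIONS / CAREER_LENGTH
new_phds_per_year = POSITIONS * PHDS_PER_POSITION

print(f"Openings per year:      {openings_per_year:,.0f}")
print(f"New Ph.D.s per year:    {new_phds_per_year:,.0f}")
print(f"Candidates per opening: {new_phds_per_year / openings_per_year:.0f}")
# => 15 candidates per opening under these assumptions
```

And because unplaced candidates remain in the pool, the oversupply compounds year after year unless training rates fall or positions grow.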

With far more Ph.D.s than actual jobs, it is trivial to eliminate, consciously or not, anyone who dares to question the One Right Way. Conformity is highly rewarded, and true risk-taking is usually (though not always) career suicide. Ironically, scientific and technological progress would likely have been greater if there had been a true shortage of scientists, as has been repeatedly and falsely claimed over the last forty years. Then scientists would have enjoyed the creative freedom and independence so often necessary for major scientific and technological progress.

Conclusion

So we come full circle back to the Internet and computers. Yes, the government did invent the Internet. Yes, government funded research and development has had and may continue to have great successes. People like Gordon Crovitz who deny this are, at best, living in a free market fantasy world.

However, most government research programs have not been very successful despite vast funding. Most attempts to replicate the remarkable success of the Manhattan Project have failed. The rate of progress in many fields appears to have slowed since about 1970. Perhaps this is just the luck of the draw, as Paul Krugman has suggested. More likely, the slowdown has been caused by the drawbacks of centralized funding and control of scientific research, the dominance of many fields by one or a few narrowly focused theories and approaches, and the many negative effects of the Ph.D. glut.

What can we do to improve the rate of scientific progress and solve problems such as energy shortages, which probably contribute greatly to the current global economic slowdown and to seeming conflicts over oil and energy resources? One may wonder about the true nature of seeming conflicts over energy resources that result in less energy and higher energy prices, as almost certainly occurred with the wars in Iraq and Afghanistan, but that belongs in a different discussion.

The private sector can step up, as Andrew Grove and Peter Thiel have tried to do, to provide an alternative to the current government-funded research programs that have so far failed to produce much, if any, progress in many fields. It is important to remember, however, that the private sector has lost expertise in basic research. We no longer live in the era of colorful private inventors like Octave Chanute and the Wright Brothers. Rather, most Silicon Valley businesses have expertise in commercializing proven technologies originally developed in government research programs, which is not the same as the basic and applied research that leads to a proven technology.

In principle, the federal government could break up the huge centralized research institutions and replace them with a more decentralized research funding system, such as many science foundations instead of one National Science Foundation. I would not hold my breath for this one.

The National Academies of Sciences report Bridges to Independence: Fostering the Independence of New Investigators in Biomedical Research contains a number of other suggestions that seem worthy of consideration.

Ironically, creating an actual scientist shortage, by intentionally reducing the number of Ph.D.s produced annually at government expense below the number of senior scientists who die or retire each year, might yield a significant increase in the rate of scientific progress. 🙂

© 2012 John F. McGowan

About the Author

John F. McGowan, Ph.D. solves problems using mathematics and mathematical software, including developing video compression and speech recognition technologies. He has extensive experience developing software in C, C++, Visual Basic, Mathematica, MATLAB, and many other programming languages. He is probably best known for his AVI Overview, an Internet FAQ (Frequently Asked Questions) on the Microsoft AVI (Audio Video Interleave) file format. He has worked as a contractor at NASA Ames Research Center on the research and development of image and video processing algorithms and technology. He has published articles on the origin and evolution of life, the exploration of Mars (anticipating the discovery of methane on Mars), and cheap access to space. He has a Ph.D. in physics from the University of Illinois at Urbana-Champaign and a B.S. in physics from the California Institute of Technology (Caltech). He can be reached at [email protected].


Appendix I: Selected Internet Resources on the Ph.D. Glut

“U.S. pushes for more scientists, but the jobs aren’t there,” by Brian Vastag, Washington Post, July 7, 2012.

How and Why Government, Universities, and Industry Create Domestic Labor Shortages of Scientists and High-Tech Workers, by Eric Weinstein.

Toil, Trouble, and the Cold War Bubble: Physics and the Academy since World War II, a presentation by MIT science historian David Kaiser at the Perimeter Institute.

PhDs.org’s collection of links and resources on scientist and Ph.D. shortage claims.

Bridges to Independence: Fostering the Independence of New Investigators in Biomedical Research, National Research Council, 2005.

Women in Science, by Philip Greenspun.

Appendix II: Selected Internet Examples of the Never-Ending STEM Shortage Claims

“Our Ph.D. Deficit,” op-ed by Norman Augustine and Burton Richter, Wall Street Journal, May 4, 2005.

Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future, National Academies, 2005.

“Heading Off a Ph.D. Shortage,” by John Vaughn and Robert Rosenzweig, Issues in Science and Technology, vol. 7, no. 2, pp. 66–73, Winter 1990–91.

“Shortage of Ph.D.’s Imminent, Report Says,” New York Times, January 24, 1990.
