Chapter 9: Significance of Information Technologies


Information Technology Over the Past 50 Years

IT and the National Science Foundation (NSF) have come of age together. In this year that marks the 50th anniversary of NSF, few areas demonstrate as vividly as IT the progress that has been made in science and engineering in the past half-century.

In 1945, the same year that Vannevar Bush outlined his ideas for what became the National Science Foundation in Science, the Endless Frontier, he also wrote an article in the Atlantic Monthly that described his vision for capturing and accessing information. (See sidebar, "Excerpts from 'As We May Think'.") In the Atlantic article, Bush proposed the development of a kind of workstation, which he called a "memex," that would store and provide access to the equivalent of a million volumes of books. The memex would also employ a way of linking documents "whereby any item may be caused at will to select immediately and automatically another"—allowing the user to build a trail through multiple documents. Although Bush proposed photographic methods for storage and mechanical means for retrieval, and the exact technology he envisioned never came to pass, the function he proposed for the memex is remarkably similar to today's hypertext.




When Bush thought about the capabilities that would be dramatically useful to knowledge workers, he envisioned not calculators or word processors but the ability to store and access information that current technology is only now achieving, using quite different approaches. Much R&D and innovation have been necessary to reach these capabilities.

In the same year that Bush's Atlantic article appeared, developments were taking place that would provide a different path for achieving his vision. At the University of Pennsylvania, J. Presper Eckert and John W. Mauchly were completing, with Army funding, what is commonly recognized as the first successful high-speed digital computer—the ENIAC. Dedicated in January 1946 and built at a cost of $487,802 (Moye 1996), the ENIAC used 18,000 vacuum tubes, covered 1,800 square feet of floor space, and consumed 180,000 watts of electrical power. It was programmed by wiring cable connections and setting 3,000 switches. It could perform 5,000 operations per second (CSTB 1998).

Also in 1945, Hungarian-born Princeton mathematician John von Neumann developed the stored-program concept, which enabled computers to be programmed without rewiring. The von Neumann architecture—a computer with a central processing unit that executes instructions sequentially, a fast-access primary memory holding both instructions and data, and a slower secondary storage area—became the basis for most of the computers that followed. Since the middle of the 20th century, software development has emerged as a discipline with its own challenges and skill requirements, complementing the more visible advances in hardware and enabling great systems complexity.
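The essence of the stored-program idea can be illustrated with a short sketch. This is not a model of any historical machine, and the tiny LOAD/ADD/STORE/HALT instruction set is invented purely for illustration; what matters is that instructions and data share one memory, so a program can be changed by writing to memory rather than by rewiring.

```python
def run(memory):
    """Fetch-decode-execute loop over a single shared instruction/data memory."""
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, arg = memory[pc]     # fetch the next instruction from memory
        pc += 1                  # advance sequentially, as in the von Neumann model
        if op == "LOAD":         # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":        # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":      # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# The program occupies cells 0-3; its data lives in cells 4-6 of the same memory.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(memory)[6])  # the stored program computes 2 + 3 and prints 5
```

Because the program is just data in memory, loading a different program is simply a matter of writing different values into the instruction cells—the contrast with the ENIAC's cables and switches.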

Over the succeeding 50 years, a vast number of innovations and developments occurred. (See sidebar, "IT Timeline.")


Innovations in IT over this period came from a remarkable diversity of sources and institutional settings, as well as a remarkable interplay among industry, universities, and government. Transistors and integrated circuits were invented by industry. Early computers and advances such as core memory, time-sharing, artificial intelligence, and Internet browsers were developed in universities, primarily with government funding. The World Wide Web was developed at the European Laboratory for Particle Physics (CERN), a high-energy physics laboratory. The mouse and windows were developed at a nonprofit research institute, with government funding. High-performance computers were mostly developed in industry with federal funds and with the involvement of federal laboratories. The diversity of these institutions and the close interaction among them clearly contribute to the vitality of innovation in IT.

Innovation in IT has benefited from the support of a diverse set of federal agencies, including the Department of Defense (DOD)—notably the Defense Advanced Research Projects Agency (DARPA) and the military services; NSF; the National Aeronautics and Space Administration (NASA); the Department of Energy (DOE); and the National Institutes of Health (NIH). Federal support has been particularly important in long-range fundamental research in areas such as computer architecture, computer graphics, and artificial intelligence, as well as in the development or procurement of large systems that advanced the technology—such as ARPANET, the Internet (see sidebar, "Growth of the Internet"), and high-performance computers (CSTB 1998).




Often there has been complementary work supported by the Federal Government and industry. In many cases the Federal Government has supported the initial work in technologies that were later developed by the private sector. In other cases Federal research expanded on earlier industrial research. Higher-level computer languages were developed in industry and moved to universities. IBM pioneered relational databases and reduced-instruction-set computing, which were further developed with NSF support. Collaboration between industry and university researchers has facilitated the commercialization of computing research. (See figure 9-5.)[1] 

Most of the relentless cost-cutting that has been so important in the expansion of IT has been driven by the private sector in response to competitive pressures in commercial markets, although here too federal investment—such as in semiconductor manufacturing technologies—has played an important role in some areas.


Footnotes


[1] For a more complete description of industry and government roles in developing information technologies, see CSTB (1998).



