Top 20 Greatest Achievements of the 20th Century

The Greatest Engineering Achievements of the 20th Century is a collaborative project led by the National Academy of Engineering. The achievements, nominated by 29 professional engineering societies, were selected and ranked by a distinguished panel of the nation’s top engineers. This is the second of five articles highlighting the top 20 achievements.

Electronics

In 1955, an early high-speed commercial computer weighed 3 tons, consumed 50 kilowatts of power, and cost $200,000. Yet it could perform only 50 multiplications per second.
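
For a sense of what those 1955 figures mean per operation, the short calculation below uses only the numbers quoted above; it is an illustrative sketch, not data from any particular machine.

```python
# Illustrative arithmetic using only the 1955 figures quoted in the text above.
power_watts = 50_000        # 50 kilowatts of power
price_dollars = 200_000     # purchase price
mults_per_second = 50       # multiplications per second

energy_per_mult = power_watts / mults_per_second        # joules per multiplication
price_per_mult_rate = price_dollars / mults_per_second  # dollars per (multiplication/second)

print(f"About {energy_per_mult:,.0f} joules of electricity per multiplication")
print(f"About ${price_per_mult_rate:,.0f} of hardware per multiplication-per-second")
# Roughly 1,000 joules per multiplication and $4,000 per multiplication-per-second;
# today's chips improve on both by many orders of magnitude.
```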

Today you can buy $250 palm-sized organizers that link to computers, transmit data, and store thousands of addresses. The keys to this stunning revolution in personal computing power are the transistor and the integrated circuit, the centerpieces of the modern electronics systems that swept the world in the last half of the 20th century.

The ancestor of these miniature electronic devices is the vacuum tube. Vacuum tubes were crucial to the development of radio, television, and sound recording, and an essential component of early telephone equipment and computers. They were also fragile and bulky, and they generated considerable waste heat.

Then came the transistor, invented in 1947 by engineers and scientists at Bell Telephone Laboratories. Early applications included telephone oscillators and hearing aids, but by the mid-1950s the transistor had captured the world’s imagination in the transistor radio, the fastest-selling retail product of its time.

Bell Labs’ open sharing of its technology led to rapid advances in transistor manufacturing from 1952 to 1960 and to further developments at industry and university laboratories. These advances paved the way for Jack Kilby’s invention of the integrated circuit at Texas Instruments in 1958. Kilby came up with the idea of building numerous transistors and other electronic components, complete with their interconnections, on a single piece of semiconductor material.

The real breakthrough in transistor production was the use of oxide on silicon wafers, which allowed selective doping by diffusion of impurities through openings in the oxide. Photographic techniques were used to pattern the openings in the silicon oxide. With these techniques, hundreds of chips could be cut from a single slice of silicon and multiple transistors could be placed on each chip.

In the early 1950s, a single transistor cost between $5 and $45 to make. Now the transistors on a microchip cost less than a hundred-thousandth of a cent apiece.
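
To put that price collapse in perspective, the sketch below compares the two figures quoted above; it ignores inflation, which would make the real drop even larger.

```python
# Illustrative only: compare the per-transistor costs quoted in the text above.
early_cost_low, early_cost_high = 5.00, 45.00   # dollars per transistor, early 1950s
modern_cost = 0.00001 / 100                     # "a hundred-thousandth of a cent", in dollars

for early_cost in (early_cost_low, early_cost_high):
    factor = early_cost / modern_cost
    print(f"${early_cost:.2f} then vs. ${modern_cost:.9f} now: about a {factor:,.0f}-fold drop")
# Prints reduction factors on the order of 50 million to 450 million.
```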

Microchips took the transistor to an exciting new level. A single microchip can operate an automobile’s electrical system or launch an air force, and microchips have made thousands of new products possible. From microprocessors, engineers developed microcomputers: systems about the size of a lunch box or smaller, but with substantial computing power.

Where will microchips appear next? They might appear on the front of refrigerators to monitor food supplies and send grocery lists to the store, automatically charging credit cards or bank accounts. Or they could be implanted inside the human brain to cure blindness or other medical conditions. Only imagination will govern their potential.

Agricultural Mechanization

At the beginning of the 20th century it took a large team of farmers and field hands weeks to plant and harvest one crop, and it took four farmers to feed 10 people. Today, machinery allows the entire Midwestern corn crop to be planted in 10 days and harvested in 20. One U.S. farmer can produce enough food to feed 97 Americans and 32 people in other countries. Twentieth-century engineering has made the difference.
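
Those figures alone imply a striking gain in productivity; the short calculation below makes it explicit (a back-of-the-envelope sketch, not an official statistic).

```python
# Back-of-the-envelope arithmetic using only the figures quoted in the text above.
people_per_farmer_1900 = 10 / 4      # four farmers fed ten people
people_per_farmer_today = 97 + 32    # one U.S. farmer feeds 97 Americans and 32 people abroad

gain = people_per_farmer_today / people_per_farmer_1900
print(f"People fed per farmer: {people_per_farmer_1900:.1f} then, "
      f"{people_per_farmer_today} now, roughly a {gain:.0f}-fold increase")
# Roughly a 52-fold increase over the century.
```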

Mechanization did not advance rapidly until the 20th century and the advent of the internal combustion engine, which made the evolution of the tractor possible and led to sweeping changes in agriculture. In 1907, some 600 tractors were in use; by 1950, the figure had grown to almost 3,400,000.

An amazing array of innovations in farm machinery peppered the century, such as tractor-attachable cultivators and harvesters. Self-tying hay and straw balers arrived in 1940 along with a spindle cotton picker. Shielded corn-snapping rolls were developed in 1952, and rotary and tine separator combines were introduced in 1976, each reducing labor significantly.

A major need on many farms was a way to control soil erosion and reduce the time and energy required to prepare seedbeds. As early as the 1940s, sweep plows undercut wheat stubble to reduce wind and water erosion and conserve water; the chisel and disc tillage tools and no-till planters developed in the 1970s and 1980s addressed these problems more fully.

At the turn of the century there were about 16 million acres of irrigated land in the United States. Today there are over 62 million acres, made possible by various types of mechanized irrigation, such as gated-pipe, side-roll, big-gun, and center-pivot machines. These machines can automatically irrigate areas of 150 to 600 acres and can also apply some fertilizers and pesticides.

Over the century, the average amount of labor required per hectare to produce and harvest corn, hay, and cereal crops fell by more than 75 percent. In the process, a massive shift from rural to urban life took place, with lasting effects on the nature of work, the consumer economy, women’s roles in society, and even the size and nature of families.

Farm mechanization freed women from many of the time-consuming household chores required to support a large family and hired hands. Mechanization also meant empowerment for men: traditionally, farm ownership and responsibility passed from generation to generation, roles were set at birth, and choosing a career was not an option. Because mechanized farms needed fewer people to work them, many gained a different kind of personal freedom.

In combination with other improvements in crop techniques and food processing, mechanization has significantly altered food production and distribution throughout the world.

Radio & Television

Radio and television have opened the windows to other lives and to remote areas of the world. Each has engaged millions in the major historical events that have shaped the world.

By 1900, Nikola Tesla had developed the Tesla coil, James Clerk Maxwell had predicted mathematically that electromagnetic signals could travel between widely separated points, and Heinrich Hertz had demonstrated such transmission experimentally. Guglielmo Marconi’s “wireless” telegraph proved its great potential for worldwide communication in 1901. Radio technology was just around the corner.

The immediate engineering challenges were to improve the means of transmitting and receiving coded messages and to develop a device that could convert a high-frequency oscillating signal into an electric current capable of registering as sound. The first significant development was the “Edison effect,” the discovery that the carbon filament in an electric light bulb could radiate a stream of electrons to a nearby test electrode if the electrode carried a positive charge.

In 1904, Sir John Ambrose Fleming of Britain took this a step further by developing the diode, which allowed the current produced by a radio signal to be detected by a telephone receiver. Two years later, the American Lee De Forest developed the triode, introducing a third electrode (the grid) between the filament and the plate. The triode could amplify a signal enough to make live voice broadcasting possible, and it was quickly added to Marconi’s wireless telegraph to produce the radio.

By the mid-1930s almost every American household had a radio. The advent of the transistor in the 1950s completely transformed its size, style, and portability.

A study of how human vision works enabled engineers to develop television technology. John Baird in England and Charles Jenkins in the United States worked independently to combine modulated light and a scanning wheel to reconstruct a scene in line-by-line sweeps. In 1925, Baird succeeded in transmitting a recognizable image.

Philo T. Farnsworth, a 21-year-old inventor from Utah, patented a scanning cathode ray tube, and Vladimir Zworykin of RCA devised a superior television camera in 1930. Shortly after World War II, televisions began to appear on the market. The first pictures were faded and flickering, but more than a million sets were sold before the end of the decade. An average set cost $500 at a time when the average salary was less than $3,000 a year. In 1950, engineers perfected the rectangular cathode ray tube and prices dropped to $200 per set.

Today this technology is in transition again, moving toward discrete digital signals carried by fiber optics, a shift that holds the potential for making television interactive. Cathode ray tubes with power-hungry electron guns are giving way to liquid crystal display (LCD) panels. Movie-style wide screens and flat screens are readily available. High-definition television (HDTV) offers roughly double the usual number of scanning lines, giving a much sharper picture. Cable television and advances in fiber-optic technology will help lift the present bandwidth restrictions and increase image quality.

Computers

The computer is a defining symbol of 20th century technology. Less than 50 years ago, the idea of one machine that could be applied to many tasks was foreign to the scientific world. In 1956, Howard Aiken of Harvard University wrote: “If it should turn out that the basic logics of a machine designed for the numerical solution of differential equations coincide with the logics of a machine intended to make bills for a department store, I would regard this as the most amazing coincidence that I have ever encountered.” Indeed, this “coincidence” came to pass, and it has been amazing.

In 1945, Alan Turing laid out the principles for a machine that could store programs as well as data, and quickly switch to perform tasks as diverse as arithmetic, data processing, and chess playing.

The earliest digital machines, beginning with ENIAC and continuing into the late 1950s, were unreliable and difficult to program, consumed large amounts of power, and filled very large rooms. They could solve only one problem at a time.

Random access memory, developed at the Massachusetts Institute of Technology, made information retrieval quick and easy. In 1952, Grace Hopper introduced the concept of reusable code and laid out the general concepts of language translation and compilers. Two years later, John Backus of IBM proposed FORTRAN. Other early languages included COBOL and BASIC. In 1979, Dan Bricklin’s introduction of VisiCalc helped define the modern software industry.
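
Purely as an illustrative sketch (not modeled on A-0, FORTRAN, or any other historical system), the two ideas credited to Hopper above, reusable routines and the translation of human-readable notation into executable steps, can be shown in a few lines of Python:

```python
# Illustrative sketch of "reusable code" plus "compilation": a library of stored
# routines and a translator that turns a human-readable program into executable
# steps. Not modeled on any historical system.

# A library of named, reusable routines.
SUBROUTINES = {
    "SQUARE": lambda x: x * x,
    "DOUBLE": lambda x: x + x,
}

def translate(program):
    """Turn a list of (routine name, argument) pairs into executable steps."""
    return [(SUBROUTINES[name], arg) for name, arg in program]

def run(steps):
    """Execute the translated steps and collect their results."""
    return [routine(arg) for routine, arg in steps]

if __name__ == "__main__":
    source = [("SQUARE", 3), ("DOUBLE", 7)]   # a toy "source program"
    print(run(translate(source)))             # -> [9, 14]
```

The point is simply that routines written once can be stored, named, and reused, and that a translator can assemble them automatically; that leap is what made high-level languages such as FORTRAN, COBOL, and BASIC practical.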

The integrated circuit, introduced by Jack Kilby in 1958, was followed by ever-higher levels of integration through the 1970s and 1980s. The resulting microprocessor led to a proliferation of computers in various forms.

As systems evolved, each new generation of hardware needed a new operating system, rendering the user’s current (and expensive) software useless. In 1964, IBM changed this practice by introducing the System/360 “family” of computers. Efforts to develop software that would run on any hardware platform took the concept further; among the pioneers were Dennis Ritchie and Ken Thompson, whose UNIX operating system would soon become ubiquitous.

Commercially viable time-sharing was introduced by IBM in 1961. Time-sharing was the forerunner of technologies that permit higher degrees of interaction between the system and the end user. The first of these technologies, DEC’s PDP-8 minicomputer, was introduced in 1965. Now engineers had their “own” machine, one they could program and use for various purposes.

In 1981, IBM introduced the PC, a key event in the development of the consumer computer industry. It was based on an Intel microprocessor and the DOS operating system licensed from Microsoft. In 1984, Apple Computer introduced the Macintosh and started a revolution in making computers easy to use.