The development of the computer

The ideas and inventions of many engineers, mathematicians, and scientists led to the development of the computer. The ancient abacus was the earliest form of calculating device, but its use was limited by the need to move each counter individually (see ABACUS).

Early calculating devices. The first true calculating machines were developed in the 1600's. In 1642, the French mathematician, scientist, and philosopher Blaise Pascal invented the first automatic calculator. The device performed addition and subtraction by means of a set of wheels linked to each other by gears. The first wheel represented the digits 0 to 9, the second wheel represented 10's, the third represented 100's, and so on. When the first wheel was turned 10 notches, a gear moved the second wheel forward a single notch. The remaining wheels engaged one another in the same way.
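
This carry action can be pictured with a short modern sketch. The code below illustrates the principle only, not Pascal's actual mechanism: each wheel holds a digit from 0 to 9, and a wheel turning past 9 moves its neighbour forward one notch.

```python
def turn_wheel(wheels, position=0):
    """Advance the wheel at 'position' one notch, carrying as needed."""
    wheels[position] += 1
    if wheels[position] == 10:          # the wheel completes a full turn...
        wheels[position] = 0
        if position + 1 < len(wheels):  # ...and nudges the next wheel forward
            turn_wheel(wheels, position + 1)

def add(wheels, amount):
    """Add 'amount' by turning the units wheel one notch at a time."""
    for _ in range(amount):
        turn_wheel(wheels)

register = [0, 0, 0, 0]   # units wheel, 10's wheel, 100's wheel, 1,000's wheel
add(register, 275)
print(register)           # [5, 7, 2, 0], read from highest wheel down as 0275
```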

In the early 1670's, the German mathematician Gottfried Wilhelm von Leibniz extended the usefulness of Pascal's calculator. Leibniz's improvements included gear and wheel arrangements that made multiplication and division possible.

Leibniz also sought a counting system that would be easier for a machine to handle than the decimal system. He developed the binary system of mathematics in the late 1600's. Binary mathematics uses only the digits 0 and 1, arranging them to represent all numbers.
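
A brief worked example may help. Repeatedly dividing a number by 2 and keeping the remainders produces its binary form; the decimal number 13, for instance, becomes 1101 (one 8, one 4, no 2's, and one 1). The sketch below follows this procedure:

```python
def to_binary(n):
    """Return the binary digits of n, found by repeated division by 2."""
    if n == 0:
        return "0"
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits   # the remainder is the next binary digit
        n //= 2
    return digits

for n in [1, 2, 5, 13, 100]:
    print(n, "->", to_binary(n))
# prints: 1 -> 1, 2 -> 10, 5 -> 101, 13 -> 1101, 100 -> 1100100
```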

An important contribution to the development of binary mathematics was made in the mid-1800's by George Boole, an English logician and mathematician. Boole used the binary system to invent a new type of mathematics. Boolean algebra and Boolean logic perform complex mathematical and logical operations on the symbols 0 and 1. Thus, a machine working in binary would need to represent only those two digits. This advance had a major effect on the development of computer logic and computer languages.
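
The sketch below illustrates the three basic Boolean operations, AND, OR, and NOT, acting on the digits 0 and 1. These simple rules later became the building blocks of computer circuitry:

```python
def AND(a, b): return a & b    # 1 only when both inputs are 1
def OR(a, b):  return a | b    # 1 when either input is 1
def NOT(a):    return 1 - a    # turns 0 into 1 and 1 into 0

print(" a  b | AND  OR")
for a in (0, 1):
    for b in (0, 1):
        print(f" {a}  {b} |  {AND(a, b)}   {OR(a, b)}")

print(" a | NOT")
for a in (0, 1):
    print(f" {a} |  {NOT(a)}")
```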

Early punched-card computing devices. A French textile weaver named Joseph Marie Jacquard made the next great contribution to the development of the computer. In the weaving process, needles directed thread to produce patterns. In 1801, Jacquard invented the Jacquard loom, which used punched cards to automate this process for the first time. The cards had patterns of holes punched in them and were placed between the rising needles and the thread. The presence or absence of a hole could be compared to the two digits of the binary system. Where there were holes, the needles rose and met the thread. Where there were no holes, the needles were blocked. By changing cards and alternating the patterns of punched holes, it became possible to create complex woven patterns mechanically.
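
The hole-or-no-hole principle can be shown in a small sketch. The card pattern below is invented for illustration; an 'O' marks a hole, where the needle rises and lifts its thread, and a '.' marks solid card, where the needle is blocked:

```python
# Each row of the punched card controls one pass of the needles.
card_rows = [
    "O.O.O.",
    ".O.O.O",
    "O..O..",
]

for row in card_rows:
    # where there is a hole, the needle rises and lifts its thread ('X');
    # where there is none, the needle is blocked and leaves a gap
    print("".join("X" if spot == "O" else "." for spot in row))
```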

The punched cards of the Jacquard loom inspired the English mathematician Charles Babbage. During the 1830's, Babbage developed the idea of a mechanical computer that he called an analytical engine. He worked on the machine for almost 40 years. When performing complex computations or a series of calculations, the analytical engine would store completed sets of punched cards for use in later operations. Babbage's analytical engine contained all of the basic elements of an automatic computer: storage, a working memory, a system for moving data between the two, and an input device. But the technology of Babbage's time was not advanced enough to provide the precision parts he needed to construct the machine, and he lacked funding for the project. Babbage, like others of his time, also lacked an understanding of the nature and use of electricity.

The first successful computer. In 1888, American inventor and businessman Herman Hollerith devised a punched card system, including the punching equipment, for tabulating the results of the United States census (see CENSUS). Hollerith's machines used electrically charged nails that, when passed through a hole punched in a card, completed an electric circuit. The circuits registered on another part of the machine, where they were read and recorded. Hollerith's machines tabulated the results of the 1890 census in the United States, making it the fastest and most economical census to date. In a single day, 56 of these machines could tabulate census information about more than 6 million people.
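
The tabulating idea itself can be suggested in a few lines of modern code. The cards and categories below are invented for illustration; each punched hole stands for a completed circuit that steps a counter forward:

```python
from collections import Counter

# Each "card" lists the positions punched for one person (sample data).
cards = [
    {"male", "age 20-29", "urban"},
    {"female", "age 30-39", "rural"},
    {"male", "age 30-39", "urban"},
]

counters = Counter()
for card in cards:
    for hole in card:        # every hole completes a circuit...
        counters[hole] += 1  # ...which advances the matching counter

for category, total in sorted(counters.items()):
    print(category, total)
```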

Hollerith's tabulator enjoyed widespread success. Governments, institutions, and industries found uses for the machine. In 1896, Hollerith founded the Tabulating Machine Company. He continued to improve his machines during the following years. In 1911, he sold his share of the company. Its name was changed to the Computing-Tabulating-Recording Company (C-T-R). In 1924, the name was changed to International Business Machines Corporation (IBM).

The first analog computer. Vannevar Bush, an American electrical engineer, worked to develop a computer that would help scientists. In 1930, he built a device called a differential analyser to solve differential equations. This machine was the first reliable analog computer. It represented quantities by the continuous movements of its gears and shafts.

The first electronic computers. Some scientists and engineers saw greater computing potential in electronics. The first special-purpose electronic digital computer was constructed in 1939 by John V. Atanasoff, an American mathematician and physicist. In 1944, Howard Aiken, a professor at Harvard University, U.S.A., built another early form of digital computer, which he called the Mark I. The operations of this machine were controlled chiefly by electromechanical relays (switching devices).

In 1946, two engineers at the University of Pennsylvania, U.S.A., J. Presper Eckert, Jr., and John William Mauchly, built the first general-purpose electronic digital computer. They called it ENIAC (Electronic Numerical Integrator And Computer). ENIAC contained about 18,000 electronic valves, which replaced the relays that had controlled the operation of the Mark I. The machine weighed more than 27 metric tons, occupied more than 140 square metres of floor space, and consumed 150 kilowatts of electricity during operation. ENIAC operated about 1,000 times as fast as the Mark I. It could perform about 5,000 additions and 1,000 multiplications per second, and could store parts of its programming.

Although ENIAC performed its work rapidly, programming the huge machine took a great deal of time. Eckert and Mauchly next worked on developing a computer that could store even more of its programming. They worked with John von Neumann, a Hungarian-born American mathematician. Von Neumann helped assemble all available knowledge of how the logic of computers should operate. He also helped outline how stored-programming techniques would improve computer performance.

In 1951, a computer based on the work of the three men became operational. It was called EDVAC (Electronic Discrete Variable Automatic Computer). EDVAC strongly influenced the design of later computers.

Also in 1951, Eckert and Mauchly introduced a more advanced computer called UNIVAC I (UNIVersal Automatic Computer), which became the first commercially available computer. Unlike earlier computers, UNIVAC I handled both numbers and alphabetical characters equally well. It also was the first computer system in which the operations of the input and output equipment were separated from those of the computing unit. UNIVAC I used electronic valves to perform arithmetic and memory-switching functions.

The first UNIVAC I was installed at the U.S. Bureau of the Census in June 1951. The following year, another UNIVAC I was used to tabulate the results of the United States presidential election. Based on available data, UNIVAC I accurately predicted the election of President Dwight D. Eisenhower less than 45 minutes after the polls closed.

The miniaturization of computer components. The invention of the transistor in 1947 led to the production of faster and more reliable electronic computers. Transistors control the flow of electric current in electronic equipment. They soon replaced the bulkier, less reliable electronic valves. In 1958, Control Data Corporation introduced the first fully transistorized computer, designed by American engineer Seymour Cray. IBM introduced its first transistorized computers in 1959.

Miniaturization continued with the development of the integrated circuit in the early 1960's. An integrated circuit contains thousands of transistors and other tiny parts on a small silicon chip. This device enabled engineers to design both minicomputers and high-speed mainframes with tremendous memory capacities.

Despite the shrinking size of their components, most computers remained relatively large and expensive. But dependence on computers increased dramatically. By the late 1960's, many large businesses relied on computers. Many companies linked their computers together into networks, making it possible for different offices to share information.

During the 1960's, computer technology improved rapidly. Different kinds of circuits were placed on silicon chips. Some of the circuits contained the computer's logic. Other chips held memory. By the early 1970's, the entire workings of a computer could be placed on a handful of chips. As a result, smaller computers became possible. The central chip that controlled the computer became known as a microprocessor.

The personal computer. The first personal computer, the Altair, was introduced in 1975. Only electronics hobbyists bought these computers.

In 1977, two American students, Steven P. Jobs and Stephen G. Wozniak, founded the Apple Computer Company and introduced the Apple II personal computer. The Apple II was much less expensive than mainframes. As a result, computers became more widely available. Personal computers were purchased by businesses that could not afford mainframes or did not need the immense computing power that mainframes provided. Millions of individuals, families, and schools also bought them.

In 1981, IBM entered the personal computer market with its PC. It used an operating system licensed from a firm called Microsoft, based in Redmond, Washington, U.S.A. The operating system was called DOS (short for Disk Operating System). It used a command-line interface (the user communicated with the computer by typing commands on an otherwise largely blank screen). Since IBM had only licensed DOS, the company could not stop it from being used in other computers. By the mid-1980's, the flexibility of DOS had made it the most successful operating system available. Another operating system popular in the early 1980's was UNIX. Developed at Bell Laboratories in the United States around 1970, it also used a command-line interface but allowed several users to do different things on the same system at the same time (a process called multitasking). Like DOS, UNIX was flexible and became particularly popular on large university and business computers.

In 1984, Apple Computer introduced the Apple Macintosh, an easy-to-use computer with a graphical user interface (GUI). The Macintosh was the first commercially successful use of a GUI, in which the user was presented with a menu of choices and with pictures (icons) arranged in boxes (windows) representing programs and applications. With a movable pointing device called a mouse, the user could select an icon, press a button on the mouse twice, and start a chosen program. The key elements of the GUI (windows, icons, menus, and pointers) abbreviate to "WIMP", a popular name for this type of interface. Word processors on the Macintosh displayed typefaces graphically in black on a white background, just like words on paper. The Macintosh also allowed multitasking.

The Apple Macintosh helped launch the industries of desktop publishing and computer-aided design on affordable machines. In 1985, Microsoft launched its rival GUI, Windows, to run on IBM PC's and similar machines. For nonpublishing applications, Windows outsold Apple's Macintosh system, and in 1995 Microsoft launched a major new version of its GUI, Windows 95.

GUI's were made possible because computers in the 1980's and 1990's became faster and more powerful. In the late 1980's and the 1990's, RISC (Reduced Instruction Set Computing) technology increased computer speed and capacity. RISC computers use microprocessors that work at high speed because they carry the circuitry for performing fewer operations than the microprocessors of other computers.

Alongside the quest for speed and power, computers are continuing to get smaller. Hand-held electronic organizers, laptop computers, and palmtop computers are already available commercially. Even smaller computers will continue to be made from integrated circuits. But some experts foresee the production of biological computers, which will be grown rather than manufactured. Other experts believe that computer technology will develop ways of storing data on individual molecules.

Software research continues to focus on artificial intelligence, with the intention of helping computers make decisions. One type of artificial intelligence is the expert system, which seeks solutions to problems by narrowing the field of inquiry. Medical doctors, for example, use such systems in diagnosing illness. A computer asks a patient various questions about his or her symptoms. Each answer determines what the computer asks next, and its final response is based on the body of medical experience built into its program.
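
This question-and-answer process amounts to walking down a decision tree, as the sketch below suggests. Its questions and conclusions are invented for illustration and are not medical advice:

```python
# A dictionary asks a question; a plain string is a conclusion.
tree = {
    "question": "Do you have a fever?",
    "yes": {
        "question": "Do you also have a cough?",
        "yes": "Your symptoms resemble influenza; consult a doctor.",
        "no": "A fever without a cough; consult a doctor.",
    },
    "no": "Your symptoms do not match any rule in this small system.",
}

def consult(node):
    """Follow the patient's answers down the tree to a conclusion."""
    while isinstance(node, dict):
        answer = ""
        while answer not in ("yes", "no"):   # repeat until a valid reply
            answer = input(node["question"] + " (yes/no) ").strip().lower()
        node = node[answer]                  # the answer picks the next branch
    return node

print(consult(tree))
```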

The computer's ability to share data with other computers over a network linked by telephone lines is at the heart of a major telecommunications revolution. A global network of computer networks called the Internet has expanded enormously since the early 1990's. The Internet began in the 1960's as a U.S. defence network linking scientific and military computers. Now it is an international system for sending and receiving electronic mail, software, and electronic document and picture files all over the world. The Internet has already cut the cost of long-distance communications for many people. It also has the potential to change the way people work. With the Internet, increasing numbers of people can work from home.
