The ENIAC (Electronic Numerical Integrator and Computer) made its appearance in 1946, the first of many electronic calculators to be known as computers. (Before then, the term "computer" was applied to people who did mathematical calculations.) By the 1950s, these vacuum-tube-filled giants occupied entire rooms (and had far less memory than today's cell phones), had been dubbed "brains" by the press, and were expected to perform an amazing array of tasks. Among the uses anticipated by the end of the decade: the complete translation of books from one language to another, the "control" of the Missouri River, and, of course, weather prediction.
The first UNIVAC machine (1951) took up a 140-square-foot (13 m²) space, contained over 5,000 vacuum tubes, and performed 465 multiplications per second. IBM's first commercial machine, the 701 (1952), was about four times faster, at 2,000 multiplications per second. That may sound fast, but the average desktop computer today handles about 100 million instructions per second, and 2004's fastest supercomputer handled 70 trillion instructions per second, 35 billion times faster than the IBM 701, which was used to compute the Weather Bureau's first operational weather maps in 1955.
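The speed ratios quoted above can be checked with simple arithmetic. The sketch below uses only the figures stated in the text; the variable names are just labels for this illustration.

```python
# Speed figures quoted in the text (per second).
univac_mults = 465          # UNIVAC I: multiplications per second
ibm_701_mults = 2_000       # IBM 701: multiplications per second
super_2004 = 70e12          # 2004's fastest supercomputer: instructions per second

# The 701 was "about four times faster" than the UNIVAC I:
print(ibm_701_mults / univac_mults)   # about 4.3

# The 2004 supercomputer compared with the 701 ("35 billion times faster"):
print(super_2004 / ibm_701_mults)     # 35 billion (3.5e10)
```

Note that the modern figures count generic instructions rather than multiplications, so these ratios are rough order-of-magnitude comparisons, as in the article.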
These early computers, with their blinking lights and miles of wire, did perform computations faster and more accurately than adding machines and slide rules could. But their unreliability was a continuing problem. Vacuum tubes burned out quickly, and when one died, the "brain" stopped working until it was replaced. Hardware and software problems alike were difficult to detect and solve, and getting data in and out of the machines took a long time. When the early weather modelers ran their programs, it took three weeks to prepare and load the data into the computer and just three hours to compute the forecast. Considering that people could compile the data on paper and in their heads and make a forecast in less than 12 hours, the electronic "brain" was not yet an improvement over the human one.
Engineers working for the large computer companies continuously developed methods of increasing the memory, data handling, and computational capabilities of their machines to meet the needs of business, government, and academic customers. In the 21st century, computers and the small handheld electronic devices that everyone takes for granted will continue to become more powerful and will be used to solve increasingly complex problems. It remains to be seen whether they will ever become the "brains" anticipated in 1950s-era media accounts. After all, the human brain handles about 10 quadrillion instructions per second.