

Computer History 5 - Personal Computers




 

Then, with the arrival of the humble Altair in 1975, the scale suddenly plunged to a level never imagined by industry leaders. What made such a compact, affordable machine possible was the microprocessor, which concentrated all of a computer's arithmetical and logical functions on a single chip—a feat first achieved by an engineer named Ted Hoff at Intel Corporation in 1971. After the Intel 8080 microprocessor was chosen for the Altair, two young computer buffs from Seattle, Bill Gates and Paul Allen, won the job of writing software that would allow it to be programmed in BASIC. By the end of the century the company they formed for that project, Microsoft, had annual sales greater than many national economies.

Nowhere was interest in personal computing more intense than in the vicinity of Palo Alto, California, a place known as Silicon Valley because of the presence of many big semiconductor firms. Electronics hobbyists abounded there, and two of them—Steve Jobs and Steve Wozniak—turned their tinkering into a highly appealing consumer product: the Apple II, a plastic-encased computer with a keyboard, screen, and cassette tape for storage. It arrived on the market in 1977, described in its advertising copy as "the home computer that's ready to work, play, and grow with you." Few packaged programs were available at first, but they soon arrived from many quarters. Among them were three kinds of applications that made this desktop device a truly valuable tool for business—word processing, spreadsheets, and databases. The market for personal computers exploded, especially after IBM weighed in with a product in 1981. Its offering used an operating system from Microsoft, MS-DOS, which was quickly adopted by other manufacturers, allowing any given program to run on a wide variety of machines.

The next two decades saw computer technology rocketing ahead on every front. Chips doubled in density almost annually, while memory and storage expanded by leaps and bounds. Hardware like the mouse made the computer easier to control; operating systems allowed the screen to be divided into independently managed windows; application programs steadily widened the range of what computers could do; and processors were lashed together—thousands of them in some cases—in order to solve pieces of a problem in parallel. Meanwhile, new communications standards enabled computers to be joined in private networks or the incomprehensibly intricate global weave of the Internet.

Where it all will lead is unknowable, but the rate of advance is almost certain to be breathtaking. When the Mark I went to work calculating ballistics tables back in 1943, it was described as a "robot superbrain" because of its ability to multiply a pair of 23-digit numbers in three seconds. Today, some of its descendants need just one second to perform several hundred trillion mathematical operations—a performance that, in a few years, will no doubt seem slow.


 







Copyright © 2024 National Academy of Sciences on behalf of the National Academy of Engineering.
