In Search of the First Personal Computer

Editors’ Note: Do we need to label one or another individual as a “founding father” or, perhaps, “founding mother”? Why do partisans battle to do so? Susan B. Barnes contributes to Antenna’s ongoing discussion of “firsts,” detailing just how complex and contingent any such determination can be, in this case regarding the “first personal computer.”

Two Altair microcomputers: the 680b, 1976 (on top), and the famous 8800, which contained a new and powerful Intel chip. Micro Instrumentation and Telemetry Systems (MITS) in Albuquerque, New Mexico, owned by Ed Roberts, manufactured both of these as kits. Popular Electronics featured the 8800 on a cover in 1975, inspiring hundreds to purchase this kit from MITS to build and own their own computer. Legend has it that Paul Allen saw that magazine cover and convinced Bill Gates to join him in developing a programming language for the 8800—BASIC.

In August 2001, newspapers, magazines, and television news programs around the country ran stories to celebrate “the” 20th anniversary of the personal computer. The sound bites often failed to mention that it was the 20th anniversary of the IBM-PC that they marked. In contrast, the anniversaries of several earlier computers, such as the LINC, Dynabook, MICRAL, Altair, Apple I, and Notetaker, had come and gone without much fanfare. Still, amidst the IBM hoopla, the New York Times did remind its readers that Ed Roberts and Alan Kay had developed earlier personal computers. Clearly, many people and technologies contributed to personal computing, which raises the question: what machine should be historically recognized as the first personal computer?

When we look at the history of scientific and technological ideas, computers developed after a Kuhnian paradigm shift in which information emerged as the fundamental scientific and technological concept. Ideas about information were applied to all types of phenomena, from DNA to corporate management. According to M. S. Mahoney, “current awareness of the fundamental nature of information and of its determinative role in modern life makes it difficult to unravel the threads of its history” (1990, p. 537). Key information ideas included Norbert Wiener’s concept of cybernetics and Claude E. Shannon’s research on switching technology, which was applied to electromechanical relays and later to microscopic transistors etched on silicon chips. The work of Wiener and Shannon was used to explain all types of human and technological communication systems.

The information revolution encompassed improved telephone services and the invention of digital computers. Originally developed during World War II as calculating machines, computers encouraged new concepts about information and how it could be transformed and distributed over time and space. For instance, traditional forms of bookkeeping became “data processing.” Martin Campbell-Kelly and William Aspray’s Computer: A History of the Information Machine examines the history of computing from an information-processing perspective. Historically, the roots of personal computing can be traced back to the human computers who manually calculated mathematical tables. Adding machines, calculators, and office equipment contributed to the replacement of these human computers with machines.

When the first stored-program computers went into commercial operation in the 1950s, American corporations began to explore how computers could be sold as office products. At that time, computers operated in air-conditioned rooms isolated from people. IBM was a leader in the effort to make commercial machines available, and in 1964 its System/360 revolutionized office computing with a compatible family of machines. With the advent of the smaller, cheaper minicomputer, a new data-processing industry emerged. Computer hardware enabled companies to manage their accounting procedures with punch cards that could be processed by computers. However, batch-processing cards was highly impersonal, because people had to submit data and then wait for delivery of their results. In contrast, a new group of researchers wanted to operate computers interactively.

Ideas about how people could and should interact with computers played a central role in the history of personal computing. For instance, Paul Ceruzzi (1998) considers DEC’s PDP-10 to have been an ancestor of the personal computer because it created the feeling that it was one’s own personal machine. Time sharing, the light pen, and ideas about human-computer interaction contributed to transforming the computer from a data processor into a personal computer. Research on SAGE and Sketchpad combined with J.C.R. Licklider’s ideas about man-machine symbiosis and Douglas Engelbart’s Augment system to inspire the development of interactive computing.

In 1968, Engelbart demonstrated the mouse, teleconferencing, graphical display screens, and hypertext at a computer conference. Sitting in the audience was Alan Kay, a graduate student who soon began drawing sketches of personal machines called Dynabooks. Because of these early designs, journalists have called Kay the father of the personal computer. Kay, however, believes that the first personal computer was the LINC, a small computer developed by Wesley Clark. Built in 1962, the LINC (named for MIT’s Lincoln Laboratory) was a modular machine whose console fit on top of a large desk. That LINCs had to be assembled by the scientists and engineers who used them created a personal tie between users and their computers.

Building on Clark and Engelbart’s ideas, Kay brought the notion of personal computing to Xerox PARC. PARC researchers built their own minicomputers and had computer terminals on every desk. As a result, some credit PARC with inventing personal computing. But Kay wanted to develop small computers that people could carry around like a notebook. In the late 1970s, PARC researchers designed the Notetaker, a plump attaché case that looked like the first generation of “luggable” computers created six years later by IBM. Notetaker was difficult to carry, so they built a rolling cart for it, which allowed researchers to slide the computer under an airline seat. A PARC researcher became the first person to use a personal computer on an airplane when he turned on the battery-powered Notetaker during a flight to Rochester. It should be noted that Notetaker is absent from recent histories, such as the Encyclopedia of Computers and Computer History, which credits IBM with the first transportable computer (model 5100) and calls the Osborne I the first successful portable computer.

While Xerox was developing personal computing, many smaller computers were invented. For instance, in 1973 Thi T. Truong, a Vietnamese immigrant living in France, invented the first general-purpose computer based on a microprocessor. This MICRAL was built around the Intel 8008 microprocessor. It was a commercial machine, and customers wrote specialized software that would be burned into a ROM (read-only memory) chip to make the system function. By 1974, Intel had all of the components to build personal computers, but the company never realized it.

Some writers argue that early personal computers did not evolve from minicomputers but instead emerged from bottom-up advances in semiconductor electronics. Robert Cringely (1992) asserts that personal computers should really be considered large microchips. Viewing the microprocessor as the defining technology for personal computers takes their history back to microchips and calculator technologies. In 1974, Hewlett-Packard advertised its HP-65 calculator as a personal computer. Paul Ceruzzi (1998) believes that this may have been the first time the term personal computer appeared in the press.

In the meantime, during the early 1970s a hobbyist movement developed around calculators and electronics. However, a rhetorical schism soon occurred in which some hobbyists focused on the word “personal” and others on “computer.” As early as 1971, an advertisement appeared for a computer-like kit called the Kenbak-1. Three other kits became available in 1974, including the Scelbi-8H mini-computer, the Mark-8 personal computer, and the Altair.

Many historians and journalists consider the Altair to have been the first personal computer because it led to the founding of software giant Microsoft and the formation of today’s personal computer industry. Paul Allen and Bill Gates wrote the first software program for the Altair. Steve Wozniak, a member of the Homebrew Computer Club, built his first computer, called the Apple I, because he could not afford an Altair. From a social perspective, however, the Altair was the first computer that individuals owned, another reason many consider it the first personal computer. It started a social revolution that brought the use of computers out of the control of corporations and into the hands of individuals.

As a further complication, some give the title of father of the personal computer to both Ed Roberts and Alan Kay. However, there is a subtle difference. Roberts is the father of the first personal computer, the Altair. In contrast, Kay is the father of the personal computer because he envisioned the technology as it came to be used. Both men made significant contributions to personal computing. Likewise, there are other people, including Wesley Clark and Thi T. Truong, who are contenders for the title of inventor of the first personal computer.

Size, social accessibility, history of ideas, design concepts, and technological developments can all be used to identify the first personal computer. The perspectives we use in our historical explorations will determine what machines receive the title of first. So many different social, historical, and technological factors came together in the making of personal computers that it is difficult to name one machine or person.

Nonetheless, despite the complexity of real historical processes, IBM’s recent public relations campaign could mislead some people into believing that the IBM-PC was the first in some real sense. It was for reasons like this that Sigfried Giedion argued in Mechanization Takes Command that we need to examine technologies during their developmental processes to make sure that inventions and inventors are not forgotten. Otherwise, commercially successful products eventually come to dominate historical accounts unchallenged. As digitalization takes command, we should remember the myriad of people, ideas, and inventions that contributed to this new technological revolution.


• Campbell-Kelly, Martin & Aspray, William. Computer: A History of the Information Machine. New York: Basic Books, 1996.
• Ceruzzi, Paul E. A History of Modern Computing. Cambridge, MA: The MIT Press, 1998.
• Cringely, Robert X. Accidental Empires. Reading, MA: Addison-Wesley, 1992.
• Giedion, Sigfried. Mechanization Takes Command. New York: W.W. Norton & Company, 1948.
• Mahoney, Michael S. “Cybernetics and Information Theory.” In R.C. Olby, G.N. Cantor, J.R.R. Christie & M.J.S. Hodge (eds.), Companion to the History of Modern Science, pp. 537–556. New York: Routledge, 1990.
• Rojas, Raul. Encyclopedia of Computers and Computer History. Chicago: Fitzroy Dearborn Publishers, 2001.
• Waldrop, M. Mitchell. The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal. New York: Viking, 2001.

Susan B. Barnes is Associate Chair and Associate Professor in the Communication and Media Studies Department at Fordham University’s Rose Hill Campus. Her publications include Online Connections: Internet Interpersonal Relationships (Cresskill, NJ: Hampton Press, 2001).
