Not the First Word on “Firsts”

To the Editors:

I enjoyed reading the editorial on “firsts” in the recent issue of Antenna. It resonated very strongly with the activities of The Computer Museum History Center, where I am Curator & Manager of Historical Collections. We are very frequently asked: “what or who was first?” Some time ago, I instructed all History Center staff to never use the “f-word” (or, in fact, any superlatives) as a matter of Museum policy. The reasons for this particular institutional prohibition are several.

To begin, as Robert Merton noted in his great paper on “Singletons and Multiples in Science,” it is rare that anything is objectively invented first; when such claims are made, the unpacking of the relevant historical details in their support blurs into “matters of religion” or patent disputes, neither of which necessarily corresponds to priority beyond the narrow “cui bono” constructions that often underlie them. That is not to say the facts do not matter, but simply that the reasons a particular question is being asked will often tell you more than any putative answer. Are we thus mired in our own “subject-positions”? Do we always have to “follow the money”? No, but it is almost always more historically truthful to get questioners to change tack and ask the right question than to tell them what they want to hear.

Also, the concept of “firsts” doesn’t sit well with a sophisticated, critical understanding of the historical process of invention. With respect to the history of computing, I am constantly amazed at the invention of “firsts” by later aspirants to the throne. Innovations in microprocessor design, for example, almost invariably have antecedents in the “ancient” mainframe technologies of thirty or forty years ago. It is only the lack of disciplinary identity and historical consciousness in each generation of engineering graduates that fosters this “I did it first” perspective.

In complex systems, such as are typical of 20th century innovations, the adage “No man is an island” is particularly apt. As we know well, Edison, for example, did more than invent the light bulb. He devised an infrastructure for the bulb to live in. The same can be said for inventors of the telegraph, the telephone, jet aircraft, atomic weapons, radio and television, the laser, and in our case, computers. The “Search for Firsts,” as suggested above, is usually of legalistic and journalistic significance only. Lawyers and journalists are particularly reluctant to accept the Center’s refusal to make definitive claims on behalf of any one individual!

Claimants to firsts generally deploy, as their first line of attack, the concealment of context. This is particularly disturbing to any historian of technology, and it has as its substrate the “hero worship” alluded to in the recent Antenna editorial. I think this also nicely echoes your quoted source’s remark that “…everyone loves an argument over fact rather than theory.” Rather than addressing the subtleties of many people within a specific technical community working on related problems, the desire is simply to encapsulate, in a few short words, someone who can become iconic for a specific invention. If they are deceased, so much the better! There are no messy letters to the editor refuting the public claims made on their behalf. Newspapers and the media generally are the worst offenders. Nonetheless, with some explanation, reporters can be guided gently toward succinct descriptions of an invention that remain true to historical fact, viz. an explanation that will satisfy both layperson and sophisticate.

Because we at the Center view ourselves as one of the few keepers of the interpretive and historical conscience in the history of computing, and because we are considered authoritative within this specialty by a large and diverse audience, it is only ethical for us not only to explain the facts but to instruct seekers of knowledge in their deployment. Piaget noted that a child develops the ability to ask “where did this come from?” at about the age of twelve, and that the explanations children find satisfactory at this age are relatively simplified (if not simplistic). Not accidentally, the “intelligent twelve-year-old” is generally the level at which most media target their audience in vocabulary and, I think, in understanding! To break out of the heroic mold of history, we need to overcome this rather low bar with explanations that place individuals within their communities, moving from the paradigm “L’invention, c’est moi” to one in which the earthly inventions homo faber creates revolve around communities, not individuals. Even a towering figure like Seymour Cray, to cite someone from the history of computing, had his refrigeration engineer, without whom Cray’s machines would have melted within minutes.

I often meditate on the agony that Elisha Gray must have felt. He filed his patent application for the telephone only hours after Alexander Graham Bell, and as a result, he was written out of history. I think it is Gray’s resultant lifetime of disappointment that makes me so wary of, and reluctant to claim, anything as a “first.” There are rare exceptions, I’ll admit, but it seems the height of injustice to perpetuate narratives that turn the history of technology and the process of invention into such a zero-sum game.

I would enjoy hearing from others about this topic. In having these thoughts, I am certain of one thing: I am not the first.


Dag Spicer is an electrical engineer and historian of technology with advanced degrees from the University of Toronto and Stanford University. He is Curator and Manager of Historical Collections at The Computer Museum History Center, home to the world’s largest collection of computing artifacts, in Mountain View, California. See https://web.archive.org/web/20050206001545/http://gizmo.org/ds/resume/mother/index.html
