ISSUES IN THE HISTORY OF COMPUTING

Michael S. Mahoney
Princeton University

Paper prepared for the Forum on History of Computing at the ACM/SIGPLAN Second History of Programming Languages Conference
Cambridge, MA, 20-23 April 1993 - ©1993 by the Association for Computing Machinery

Compiling the Record

It should be easy to do the history of computing. After all, computing began less than fifty years ago, and we have crowds of eye-witnesses, mountains of documents, storerooms of original equipment, and the computer itself to process all the information those sources generate. What's so hard? What are we missing?

Well, the record is not quite as complete as it looks. As the Patent Office discovered during the '80s, much of the art of programming is undocumented, as is much of the software created during the '50s and since. The software itself is becoming inaccessible, as the machines on which it ran disappear through obsolescence. Critical decisions lie buried in corporate records -- either literally so or as trees are in a forest. Many eye-witnesses have testified, but few have been cross-examined. Much of the current record exists only online and even if archived will consist of needles in huge haystacks. But these are minor matters, to be discussed in the context of broader issues.

The major problem is that we have lots of answers but very few questions, lots of stories but no history, lots of things to do but no sense of how to do them or in what order. Simply put, we don't yet know what the history of computing is really about. A glance at the literature makes that clear. We still know more about the calculating devices that preceded the electronic digital computer --however tenuously related to it-- than we do about the machines that shaped the industry. We have barely begun to determine how the industry got started and how it developed. We still find it easier to talk about hardware than about software, despite the shared sense that much of the history of computing is somehow wrapped up in the famous inversion curve of hardware/software costs from 1960 to 1990. We still cast about for historical precedents and comparisons, unsure of where computing fits into the society that created it and has been shaped by it. That uncertainty is reflected in the literature and activities of the History of Science Society and the Society for the History of Technology, where the computer as the foremost scientific instrument, and computing as the salient technology, of the late twentieth century are all but invisible, not only in themselves but in the perspective they shed on earlier science and technology.

The presentations in the second session today are addressed to the various materials that constitute the primary sources for the history of computing: artifacts, archives and documents, first-person experience both written and oral. The business of HOPL-II itself is to enrich the sources for programming languages. As I noted at the outset, we seem to have no shortage of such materials. Indeed, as someone originally trained as a medievalist, I feel sometimes like a beggar at a banquet. I hardly know where to begin, and my appetite runs ahead of my digestion. It's a real problem, and not only for the consumer. To continue the metaphor, at some point the table can't hold all the dishes, and the pantry begins to overflow.

We have to pick and choose what we keep in store and what we set out. But by what criteria? Conflagration has done a lot of the selecting for medievalists. Historians of computing --indeed of modern science and technology in general-- have to establish principles of a less random character. Once collecting and preserving become selective (I mean consciously selective; they are always unconsciously selective), they anticipate the history they are supposed to generate and thus create that history. That is, they are historical activities, reflecting what we think is important about the present through what we preserve of it for the future. What we think is important about the present depends on who we are, where we stand --in the profound sense of the cliché, where we're coming from. Everyone at HOPL-II is doing history simply by being here and ratifying through our presence that the languages we are talking about and what we say about their origin and development are historically significant.

There's an important point here. Let me bring it out by means of something that seems to be of consuming interest to computer people, namely "firsts". Who was first to ... ? What was the first ... ? Now, that can be a tricky question because it can come down to a matter of meaning rather than of order in time. Nothing is entirely new, especially in matters of scientific technology. Innovation is incremental, and what already exists determines to a large extent what can be created. So collecting and recording "firsts" means deciding what makes them first, and that decision can often lead to retrospective judgments, where the first X had always been known as a Y. Let me give an example prompted by a distinguished computer scientist's use of history. I want to return to the topic at the end of this paper, so the example is more detailed than it need be for the present point.

In a review of business data processing in 1962, Grace Hopper, then at Remington Rand UNIVAC Division, sought to place the computer in the mainstream of American industrial development by emphasizing a characteristic sequence of venture -- enterprise -- mass production -- adventure. The model suggested that computing was entering a period of standardization before embarking on daring new directions. The first two stages are tentative and experimental, but

As the enterprise settles down and becomes practical mass production, the number of models diminishes while the number made of each model increases. This is the process, so familiar to all of us, by which we have developed our refrigerators, our automobiles, and our television sets. Sometimes we forget that in each case the practical production is followed by the new adventure. Sometimes we forget that the Model T Ford was followed by one with a gear shift.[Hopper 62]

That Model T, an icon of American industrialism, deserves a closer look. Shifting-gear transmissions antedated the Model T. But the softness of the steel then available and the difficulty of meshing gears meant the constant risk of stripping them. Concerned above all with reliability and durability, Ford consciously avoided shifting by going back to the planetary transmission, in which the gears remain enmeshed at all times and are brought to bear on the driveshaft in varying combinations by bands actuated by footpedals. Although the Model A indeed had a gear shift, made reliable by Ford's use of new alloys, it is perhaps more important historically that the planetary transmission had just begun its life. For later, with the footpedals replaced by a hydraulic torque converter, it served as the heart of the automatic transmissions that came to dominate automotive design in the 1950s. Now, how does one unravel the "firsts" in all this? Would we think to keep a Model T around for the history of automatic transmissions? What then of the first operating system, or the first database? Does the equivalent of a planetary transmission lurk in them, too?

Documenting Practice

In deciding what to keep, it may help to understand that historians look at sources not only for what is new and unusual but also for what is so common as to be taken for granted. In the terms of information theory, they are equally interested in the redundancy that assures the integrity of the message. For much of common practice is undocumented, and yet it shapes what is documented. The deep effect of innovation is to change what we take for granted. This is what Thomas S. Kuhn had in mind with the notion of "paradigm shift", the central concept of his immensely influential book, The Structure of Scientific Revolutions [Kuhn 62]. He later replaced "paradigm" with "disciplinary matrix", but the meaning remained the same: in learning to do science, we must learn much more than is in the textbook. In principle, knowing that F = ma is all you need to know to solve problems in classical mechanics. In practice, you need to know a lot more than that. There are tricks to applying F = ma to particular mechanical systems, and you learn how to do it by actually solving problems under the eye of someone who already has the skill. It is that skill, a body of techniques and the habits of thought that go with them, that constitutes effective knowledge of a subject.
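To make the point concrete, consider a minimal worked example of my own (an illustration only, not drawn from Kuhn): even so simple a system as a block sliding down a frictionless incline of angle θ yields nothing from F = ma until one applies the unwritten trick of resolving forces along and perpendicular to the surface.

\[
\begin{aligned}
\text{along the incline:}\quad & ma = mg\sin\theta \quad\Rightarrow\quad a = g\sin\theta,\\
\text{perpendicular to it:}\quad & N = mg\cos\theta \quad\text{(no motion normal to the surface).}
\end{aligned}
\]

Nothing in the formula itself dictates that choice of axes; it belongs to the tacit skill acquired by solving problems under a practitioner's eye.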

Because all practitioners of a subject share that skill, they do not talk about it. They take it for granted. Yet it informs their thinking at the most basic level, setting the terms in which they think about problems and shaping even the most innovative solutions. Thus it plays a major role in deciding about "firsts". Over time, the body of established practice changes, as new ideas and techniques become common knowledge and older skills become obsolete. Again, one doesn't talk much about it; "everyone knows that!" Yet, it is fragile. It is hard to keep, much harder than machines or programs.

To gain access to undocumented practice, historians are learning to do what engineers often take for granted, to read the products of practice in critical ways. In The Soul of a New Machine Tracy Kidder relates how Tom West of Data General bluffed his way into a company that was installing a new VAX and spent the morning pulling boards and examining them.

Looking into the VAX, West had imagined he saw a diagram of DEC's corporate organization. He felt that the VAX was too complicated. He did not like, for instance, the system by which various parts of the machine communicated with each other; for his taste, there was too much protocol involved. He decided that VAX embodied flaws in DEC's corporate organization. The machine expressed that phenomenally successful company's cautious, bureaucratic style. Was this true? West said it didn't matter, it was a useful theory.[Kidder 81, 26]

Historically, it is indeed a useful theory. Technology is not a literate enterprise; not because inventors and engineers are illiterate, but because they think in things rather than words.[Ferguson 79, Wallace 78] Henry Ford summed it up in his characteristically terse style:

There is an immense amount to be learned simply by tinkering with things. It is not possible to learn from books how everything is made --and a real mechanic ought to know how nearly everything is made. Machines are to a mechanic what books are to a writer. He gets ideas from them, and if he has any brains he will apply those ideas.[Ford 24, 23-24]

In short, the record of technology lies more in the artifacts than in the written records, and historians have to learn to read the artifacts as critically as they do the records. It is perhaps the best way of meeting Dick Hamming's challenge to "... know what they thought when they did it."[Hamming 80] For that, historians need, first and foremost, the artifacts. That is where museums play a central role in historical research. But historians also need help in learning how to read those artifacts. That means getting the people who designed and built them to talk about the knowledge and know-how that seldom gets into words.

As I use the word "artifact" here, I suspect that most of the audience has a machine in mind. But computing has another sort of artifact, one that seems unique until one thinks about it carefully. I mean a program: a high-level language compiler, an operating system, an application. It has been the common lament of management that programs are built by tinkering and that little of their design gets captured in written form, at least in a written form that would make it possible to determine how they work or why they work as they do rather than in other readily imaginable ways. Moreover, what programs do and what the documentation says they do are not always the same thing. Here, in a very real sense, the historian inherits the problems of software maintenance: the farther the program lies from its creators, the more difficult it is to discern its architecture and the design decisions that inform it.

Yet software can be read. In Alan Kay's talk, we'll hear a counterpart to Tom West's reading of the VAX boards:

Head whirling, I found my desk. On it was a pile of tapes and listings, and a note: "This is the ALGOL for the 1108. It doesn't work. Please make it work". The latest graduate student gets the latest dirty task.
The documentation was incomprehensible. Supposedly, this was the Case-Western Reserve 1107 Algol -- but it had been doctored to make a language called Simula; the documentation read like Norwegian transliterated into English, which in fact was what it was. There were uses of words like activity and process that didn't seem to coincide with normal English usage.
Finally, another graduate student and I unrolled the listing 80 feet down the hall and crawled over it yelling discoveries to each other. The weirdest part was the storage allocator, which did not obey a stack discipline as was usual for Algol. A few days later, that provided the clue. What Simula was allocating were structures very much like instances of Sketchpad.
[Kay 93, 71(5)]

In the draft version, Kay added as a gloss to his tale that such explorations of machine code were common among programmers: "[B]atch processing and debugging facilities were so bad back then that one would avoid running code at any cost. 'Desk-checking' listings was the way of life." Yet, it is not the sort of thing one finds in manuals or textbooks. Gerald M. Weinberg gives an example of how it is done for various versions of Fortran in Chapter 1 of The Psychology of Computer Programming [Weinberg 71], and Brian Kernighan and P.J. Plauger's The Elements of Programming Style [Kernighan 74] can be read as a guide to the art of reading programs. But the real trick is to capture the know-how, the tricks of the trade that everyone knew and no one wrote down when "desk-checking" was the norm. We need to think about how best to do that, since for historians without access to the machines on which programs ran or without the resources to have emulators written, desk-checking could again become a way of life. At the very least, it will demand fluency in the languages themselves.

The Importance of Context

An emphasis on practice is also an emphasis on context. In focusing on what was new about computing in general and about various aspects of it in particular, one can lose sight of other elements of tradition (in the literal sense of "handing over") and of training that determined what was taken for granted as the basis for innovation. From the outset, computing was what Derek J. de Solla Price, a historian and sociologist of science, termed "big science" [Price 63]. It relied on government funding, an expanding economy, an advanced industrial base, and a network of scientific and technical institutions. Over the past 40-odd years it has achieved autonomy as a scientific and technical enterprise, but it did so on the basis of older, established institutions, from which the first generations of computer people brought their craft practices, their habits of thought, their paradigms, and their precedents, all of which shaped the new industry and discipline they were creating.

As in many other things, Alan Perlis offered a productive metaphor for thinking about context. At the first HOPL he reflected on the fate of Algol 58 in competition with Fortran, noting that:

The acceptance of FORTRAN within SHARE and the accompanying incredible growth in its use, in competition with a better linguistic vehicle, illustrated how factors other than language determine the choice of programming languages of users. Programs are not abstractions. They are articles of commerce in a society of users and machines. The choice of programming language is but one parameter governing the vitality of the commerce.[Perlis 81, 83]

A comment by Kristen Nygaard during discussion led Perlis to expand the point:

I think that my use of the word 'commerce' has been, at least in that case, misinterpreted. My use of the word 'commerce' was not meant to imply that it was IBM's self-interest which determined that FORTRAN would grow and that ALGOL would wither in the United States. It certainly was the case that IBM, being the dominant computer peddler in the United States, determined to a great extent that the language FORTRAN would flourish, but IBM has peddled other languages which haven't flourished. FORTRAN flourished because it fitted well into the needs of the customers, and that defined the commerce. SHARE is the commerce. The publication of books on FORTRAN is the commerce. The fact that every computer manufacturer feels that if he makes a machine, FORTRAN must be on it is part of the commerce. In Europe, ALGOL is part of a commerce that is not supported by any single manufacturer, but there is an atmosphere in Europe of participation in ALGOL, dissemination of ALGOL, education in ALGOL, which makes it a commercial activity. That's what I mean by 'commerce' -- the interplay and communication of programs and ideas about them. That's what makes commerce, and that's what makes languages survive.[Perlis 81, 164-5]

One can extend Perlis' metaphor even further, shifting the focus from survival to design. Henry Ford insisted that the Model T embodied a theory of business. He had designed it with a market in mind. The same may be said of any product, including programs. An industrial artifact is designed with a consumer in mind and hence reflects the designers' view of the consumer. The market --another term for "commerce"-- is thus not a limiting condition, an external constraint on the product, but rather a defining condition built into the product. "What does this mean?" can often best be answered by determining for whom it was meant. For both hardware and software, that may not be an easy thing to do. Stated goals may not have coincided with unstated ones, the people involved may not have agreed, and the computing world has a talent for justifying itself in retrospect.

As Perlis suggested, "commerce" in a general sense extends beyond industry and business to encompass science, technology, and the institutions that support them. The notion emphasizes the role of institutions in directing the technical development of computing and, to some extent, conversely. In Creating the Computer, Kenneth Flamm has revealed the patterns of government support that gave the computer and computing their initial shape.[Flamm 88] The recent historical report by Arthur Norberg and Judy O'Neill on DARPA's Information Processing Techniques Office explores the interplay between defense needs and academic interests in the development of time-sharing, packet-switched networks, interactive computer graphics, and artificial intelligence.[Norberg 92] What is particularly striking is the mobility of personnel between government offices and university laboratories, making it difficult at times to discern just who is designing what for whom. The study of the NSF's program in computer science, now being completed by William Aspray and colleagues, opens similar insights into the reciprocal influences between programs of research and sources of funding. In IBM's Early Computers, Charles Bashe and his co-authors show how commercialization of a cutting-edge technology forced the corporation, until then used to self-reliance in research and development, to open new links with the larger technical community and to play an active role in it, as evidenced by the IBM Journal of Research and Development, first published in 1957.[Bashe 86] Closer to my own home, the development of computer science at Princeton has been characterized by the easy flow of researchers between the University and Bell Labs, and that same pattern has surely obtained elsewhere.

Taken broadly, then, the notion of "commerce" directs the historian's attention to the determinative role of the marketplace in modern scientific technologies such as computing. It is one thing to build number-crunchers one-by-one on contract to the government for its laboratories; it is another to develop a product line for a market that must be created and fostered. The explosive growth of computing since the early '50s depended on the ability of the industry to persuade corporations and, later, individuals that they needed computers, or at least could benefit from them. That meant not only designing machines but also, and increasingly, uses for them in the form of computer systems and applications. Henry Ford put Americans on wheels in part by showing them how they could use an automobile. Similarly, the computing industry has had to create uses for the computer. The development of computing as a technology has depended, at least in part, on its success in doing so, and hence understanding the directions the technology has taken, even in its most scientific form, means finding the market for which it was being designed, or the market it was trying to design.

"Market" here includes people and people's skills. Above, I said a program seems "unique until one thinks about it". One may think about a program as essentially the design of a machine through the organization of a system of production. Other people thought about the organization of production and the management of organizations, and in many cases creating a market for computing meant creating a market for the skills of organizing systems of production. In examining the contexts of computing, historians would do well to explore, for example, the close relations between computing and industrial engineering and management. The Harvard Business Review introduced its readers to the computer and to operations research in back-to-back issues in 1953 [HBR 53a, HBR 53b], and the two technologies have had a symbiotic relation ever since. Both had something to sell, both had to create a market for their product, and they needed each other to create it. Computers and computing have evolved in a variety of overlapping contexts, shaping those contexts while being shaped by them. The history of computing similarly lies in the intersections of the histories of many technologies, to which the historian of computing must remain attuned.

The notion of computing as commerce brings out the significance of techniques of reading artifacts, combined with the more usual techniques of textual criticism. To repeat: from the outset, computing has had to sell itself, whether to the government as big machines for scientific computing essential to national defense, to business and industry as systems vital to management, or to universities as scientific and technological disciplines deserving of academic standing and even departmental autonomy. The computing community very quickly learned the skills of advertising and became adept at marketing what it often could not yet produce. The result is that computing has had an air of wishful thinking about it. Much of its literature interweaves performance with promise, what is in practice with what can be in principle. It is a literature filled with announcements of revolutions subsequently (and quietly) canceled owing to unforeseen difficulties. In the case of confessed visionaries like Ted Nelson, the sources carry their own caveat. But in many instances computer marketers and management consultants, not to mention software engineers, were no less visionary, if perhaps less frank about it. What sources claimed or suggested could be done did not always correspond to what in fact could be done at the time. An industry trying to expand its market, engineers and scientists trying to establish new disciplines and attract research and development funding, new organizations seeking professional standing did not talk a lot about failures. The artifacts, both hard and soft, are the firmest basis for separating fact from fiction technically, provided one learns to read them critically. Doing so is essential to understanding the claims made for computers and programs, and why they were believed.

Doing History

Recognizing the elements of continuity that link computing to its own past, and to the past of the industries and institutions that have fostered it, brings out most clearly the relation of history and current practice. History is built into current practice; the more profound the influence, the less conscious we are of its presence. It is not a matter of learning the "lessons of history" or of exploring "what history can teach us", as if these were alternatives or complements to practice. We are products of our history or, rather, our histories; we do what we do for historical reasons. Bjarne Stroustrup notes in his history of C++,

We never have a clean slate. Whatever new we do we must also make it possible for people to make a transition from old tools and ideas to new.[Stroustrup 93, 294(47)]

History serves its purpose best, not when it suggests what we should be doing but when it makes clear what we are doing or at least clarifies what we think we are doing. On matters that count, we invoke precedents, which is to say we invoke history. We should be conscious of doing that, and we should be concerned both that we have the right history and that we have the history right.

Earlier, to make a point about "firsts", I cited Grace Hopper's evocation of the Model T. You may well have felt that my critique focused on a matter of detail that is irrelevant to her main point. Perhaps; but consider the historical basis of that main point. She was taking the automobile industry as a precedent for business data processing and by extension for computing as a whole. It was not the first time she had done so. The precedent she invoked in her famous "Education of a Computer" [Hopper 52] was also taken from the automotive production line, and it has persisted as a precedent down to the present: look at the cover of IEEE Software for June '87, where the Ford assembly line around 1940 serves as backdrop to Peter Wegner's four-part article on industrial-strength software [Wegner 84]; then look at the Ford assembly line of the '50s on the jacket of Greg Jones's Software Engineering in 1990. These are not isolated examples, nor are they mere window dressing. They reflect a way of thinking about software engineering, and there is history built into it, as there is in the notion of engineering itself; witness Mary Shaw's comparison of software engineering with other engineering disciplines [Shaw 90]. There were other ways to think about writing programs for computers. If people thought, and even continue to think, predominantly about automobiles or engineering, it is for historical reasons.

In commenting on a draft of this paper, Bob Rosin noted at this point that "automobiles and engineering are (outdated?) paradigms that are largely unfamiliar to younger computer people. Many kids, who grew up hacking on Atari's and PC's and Mac's, didn't hack automobiles and rarely studied engineering." The implied objection touches my argument only where I shift from history to criticism of current practice. Historically, at least through the '80s, software engineering has taken shape with reference to the assembly line and to industrial engineering as models. Those models are built into the current enterprise. One only has to read Doug McIlroy's "On mass-produced software components", presented to the first NATO Software Engineering Conference in 1968, to see where the conceptual roots of object-oriented programming lie [McIlroy 76]. Ignorance of the automobile and engineering, or at least of their role in the formation of software engineering, will not free a new generation of software developers from the continuing influence of the older models built into the practice they learn and the tools they use. Rather, it means that practitioners will lack critical understanding of the foundations on which they are building.

Moreover, it is not just software engineers who talk about the automobile. How often have we heard one personal computer or another described as the "Model T of computing"? What would it mean to take the claim seriously? What would a personal computer have to achieve to emulate the Model T as a technological achievement and a social and economic force? Would it be useful for historians of computing to take the Model T as a historical precedent? If not, what is a useful historical precedent? Are there perhaps several precedents?

An Agenda for History of Computing

So, maybe the history of computing is not so easy, but what's to be done? Let me conclude by setting out an agenda, for which I claim neither completeness nor objectivity. For one thing, it reflects my own bias toward software rather than hardware.

We need to know more than we do about how the computing industry began. What was the role of the government? How did both older companies and startups identify a potential market and how did the market determine their products? What place did research and development occupy in the companies' organization, and how did companies identify and recruit staffs with the requisite skills?

We need to know about the origins and development of programming as an occupation, a profession, a scientific and technological activity. A proper history here might help in separating reality from wishful thinking about the nature of programming and programmers. In particular, it will be necessary for historians of computing to embed their subject into the larger contexts of the history of technology and the history of the professions as a whole. For example, in The System of Professions Andrew Abbott argues that

It is the history of jurisdictional disputes that is the real, the determining history of the professions. Jurisdictional claims furnish the impetus and the pattern to organizational developments. ... Professions develop when jurisdictions become vacant, which may happen because they are newly created or because an earlier tenant has left them altogether or lost its firm grip on them.[Abbott 88, 2-3]

That is, the professions as a system represent a form of musical chairs among occupations, except that chairs and participants may be added as well as eliminated. Not all occupations are involved. Competition takes place among those that on the basis of abstract knowledge "can redefine [their] problems and tasks, defend them from interlopers, and seize new problems." Although craft occupations control techniques, they do not generally seek to extend their aegis.

One only has to page through the various computing journals of the late '50s and early '60s to see conflicts over jurisdiction reflecting uncertain standing as a profession. Whose machine was it? Soon after its founding, the ACM turned its focus from hardware to software, ceding computing machinery to the IRE. Not long thereafter, representatives of business computing complained about the ACM's bias toward scientific computing and computer science. Numerical analysts scoffed at "computerologists", inviting them to get back to the business at hand. One does not have to look hard to find complaints in other quarters that computing was being taken over by academics ignorant of the problems and methods of "real" programming. Similar tensions underlay discussions of a succession of ACM committees charged with setting the curriculum for computer science.

Following from that is the need for histories of the main communities of computing: numerical analysis, data processing, systems programming, computer science, artificial intelligence, graphics, and so on. When and how did the various specialties emerge, and how did each of them establish a separate identity as evidenced, say, by recognition as a SIG by the ACM or the IEEE or by a distinct place in the computing curriculum? How has the balance of professional power shifted among these communities, and how has the shift been reflected in the technology?

The software crisis proclaimed at the end of the '60s is still with us, despite the accelerated expansion of software engineering in the '70s and '80s. The origins and development of the problem and the response to it provide a rare opportunity to trace the history of a discipline shaping itself. It did not exist in 1969. Throughout the '70s and early '80s, keynote speakers and introductions to books repeatedly asked, "Are we there yet?", without making entirely clear where "there" was. Today, software engineering has a SIG, its own Transactions, its own IEEE publication, an ACM approved curriculum, and a growing presence in undergraduate and graduate programs. One only has to compare Barry Boehm's famous article of 1973 [Boehm 73] with reports about DoD software in any issue of Software Engineering Notes to doubt that the burgeoning of software engineering reflects its success in meeting the problems of producing reliable software on time, within budget, and to specifications. How, then, has the field grown so markedly in twenty years?

Software takes many forms, and we have begun the history of very few of them, most notably programming languages and artificial intelligence. Still awaiting the historian's attention are operating systems, networks, databases, graphics, and embedded systems, not to mention the wealth and variety of microcomputer programs. This is HOPL II; we still await HOS I (operating systems) or any of a host of HOXs. SIGGRAPH undertook a few years ago to determine the milestones of the field, but there has been little or no work since then. In each of these areas, history will have to look well beyond the software itself to the fields that stimulated its development, supplied the substantive methods, and in turn incorporated computing into their own patterns of thinking.

Finally, there remains the elusive character of the "Computer Revolution", first proclaimed by Edmund C. Berkeley, then editor of Computers and Automation, back in 1962 and subsequently heralded by a long line of writers, both in and out of computing.[Berkeley 62] Clearly something epochal has happened. Yet, as I pointed out several years ago in an article [Mahoney 88], it would be hard to demonstrate for the computer, in whatever form, as pervasive an impact on ordinary people's lives as Robert and Helen Lynd were able to document for the automobile in their famous study of Muncie, Indiana in 1924 [Lynd 29]. But here I am, back at the automobile again. Maybe I've been spending too much time with computer people.