Position paper on the impact of microprocessors (IFIP77)

During the opening ceremony of this IFIP Congress you have heard from the official platform that the congress will be a great success.

During the closing ceremony you are certain to hear from the official platform that the congress has been a very great success.

The reason for this praise is obvious: it is to compensate for the sore truth that the congress itself is mostly pretty miserable, and pretty confusing. But from which official platform will you hear that? Not from those who present their papers. It is here that panels have a special opportunity, a special obligation even.

So here comes, as an antidote to the euphoric mood that characterizes congresses in general and as a counterpoison to the sales literature, the motto for my contribution to this panel discussion. The motto is:

"Microcomputers are not great."
(One might even add that they are usually too small!)

*                *
*

The most common argument in favour of microcomputers that I hear is their low price. Now let me give you some general advice for "deconfusing" an otherwise confusing presentation. The advice is to remember that when anyone is talking in terms of money, it is highly improbable that he knows what he is talking about, for "money" is a very vague and elusive notion when you come to think about it.

For some people, their dollar or their pound, their guilder or their yen, is not just their currency unit, but has become their unit of thought. Love of perfection is then driven out by love of cheapness, and eventually they find themselves surrounded by junk.

Coins and currency notes are the mechanization of the counting process, invented for the balancing of our mutual rights and obligations; but the invention of money took place so long ago that in the meantime no one is any longer quite sure what it is supposed to count, so, like the King in his counting house, we count money instead.

To illustrate the confusing nature of money, let us return to computers. A quarter of a century ago we were told that computers were so expensive because they were so great; now we are told that microcomputers are great because they are so cheap!

Well, if you can accept both arguments without getting your brains into a most painful knot, you can do more than I can. To save my sanity, I decided long ago not to understand money; I find this decision most salutary and refreshing and can recommend it warmly to all of you.


*                *
*

The early computers were such feats of technology that no one complained that, when one tried to use them, the shoe always pinched: either the machine was too slow or the store was too small, or both. But, as I said, one did not complain, impressed as one was by the marvel that the machine existed at all. For many years the art of programming seemed to consist of squeezing as much as possible out of the limited equipment, and all dirty tricks were permitted, provided they were effective.

It was only later, when we got a little bit more latitude, a little bit more "headroom" so to speak, that we learned that the price we had paid for squeezing out the last ten percent or so had been excessive: we had greatly reduced the trustworthiness of our programs, and at the cost of a great expenditure of our scarcest resource, viz. our brainpower. It was only slowly that we learned that good taste and a systematic mind were greater assets for a reliable programmer than an inclination to solve puzzles. This much better understanding of the adequate demarcation between what Man should think and what Machine should compute could only emerge in the sixties, thanks to the improved hardware technology.

And what has happened now? Technology has so dramatically "improved" that programming is invited to make a gigantic leap: a leap of 25 years backwards! The paradoxical fact is that we are back where we were 25 years ago. Again the arithmetic is too slow or the store is too small; again we have machines with chaotic, unsystematic order codes, whose design has been influenced more by considerations of hardware technology than by the question whether a wise programmer would care to use them. Recently someone showed me an issue of one of the monthly magazines for the microprocessor hobbyist, and, I tell you, it was a severe shock to see the revival of all our old mistakes. It was frightening, it was depressing, it was sickening, and I hope never to see such magazines again.


*                *
*

A possible explanation for the lure of the microprocessor is that now each person can have his own computer and will be relieved from the burdens and risks of having to share a big monster with many others. For many, the sharing of a big installation has indeed been a most frustrating experience. But is the historical accident that a most influential computer manufacturer chose to market a hardware configuration for which the design of a decent operating system was impossible a sufficient justification for the conclusion that sharing a big computer is a bad idea? I think the conclusion rather rash: lovely operating systems have been built as well, you see.

How rash that conclusion is will be experienced by all who are seduced by the slogan that microprocessors are now so cheap that, when you need more processing power, you just hook a number of them together! The problem of how to distribute the computing load over a larger number of processors is, as far as I am aware, far from understood in its full generality. And even in those circumstances where the load distribution does not present a serious problem, the organization of the communication between the different processes is a problem of which only the fool talks lightly. The simplicity provided by a large central store is perhaps even more striking than the simplicity provided by a fast central processor.


*                *
*

In order to complete my tale of doom, let me tell you one further respect in which the so-called improved technology may set us back 25 years: be prepared for very unreliable hardware. First of all, the available techniques for proving that the design of a chip is flawless still seem to leave a lot to be desired, and, insofar as such techniques are available, it is far from certain that they are indeed applied. But secondly, even if the design is flawless, how do we convince ourselves that each of the thousands and thousands of chips made according to that design is, indeed, a faithful embodiment of that design? The problem of how to implement that final quality control is unsolved, and I expect chip manufacturers to resort to the same practices as the automobile industry, which gladly leaves the final stage of quality control to the poor customer: it is becoming rarer and rarer that one buys a new car that, as delivered, is in perfect condition. I fear that chip manufacturers are technically forced to adopt the attitude that a chip sold is perfect as long as the customer doesn't complain. And when he complains, just give him a new one and let him try again!

This imposes upon the customer the obligation to distrust the hardware he has bought and to check its proper functioning continually by means of redundant computations; and I do not know of an established discipline for achieving a sufficiently effective redundancy.
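
Since the manuscript leaves the form of those "redundant computations" open, here is a minimal sketch, in C, of one naive shape such a check could take; the example is hypothetical and mine, not the author's. The same sum is computed twice, in opposite orders, and the hardware is trusted only when the two answers agree exactly.

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical illustration, not from the paper: detect a faulty
       machine by performing the same computation along two different
       paths and comparing the results. */

    static long sum_forward(const long *a, int n)
    {
        long s = 0;
        for (int i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    static long sum_backward(const long *a, int n)
    {
        long s = 0;
        for (int i = n - 1; i >= 0; i--)
            s += a[i];
        return s;
    }

    int main(void)
    {
        long a[] = {3, 1, 4, 1, 5, 9, 2, 6};
        int n = (int)(sizeof a / sizeof a[0]);

        long s1 = sum_forward(a, n);
        long s2 = sum_backward(a, n);

        if (s1 != s2) {
            /* The two computations are mathematically identical, so a
               disagreement points at a hardware fault. */
            fprintf(stderr, "redundancy check failed: %ld != %ld\n", s1, s2);
            return EXIT_FAILURE;
        }
        printf("checked sum = %ld\n", s1);
        return EXIT_SUCCESS;
    }

Such a check catches only a transient fault that happens to strike one of the two runs; a systematic flaw deceives both runs identically, which is precisely why a "sufficiently effective redundancy" is so hard to achieve.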


*                *
*

In summary: the microprocessor will provide, at low material cost, a delightful outlet for the uneducated computnik with nothing better to do, but the possibility of mass production of these infernal gadgets carries with it the danger of draining our intellectual powers to an extent that no society can afford. In short:

"microprocessors are not great!"
Plataanstraat 5
5671 AL NUENEN
The Netherlands
prof.dr.Edsger W. Dijkstra
Burroughs Research Fellow