That American Technopoly has now embraced the computer in the same hurried and mindless way it embraced medical technology is undeniable, was perhaps inevitable, and is certainly most unfortunate.
This is not to say that the computer is a blight on the symbolic landscape; only that, like medical technology, it has usurped powers and enforced mind-sets that a fully attentive culture might have wished to deny it.
Thus, an examination of the ideas embedded in computer technology is worth attempting. Others, of course, have done this, especially Joseph Weizenbaum in his great and indispensable book Computer Power and Human Reason.
Weizenbaum, however, ran into some difficulties, as everyone else has, because of the “universality” of computers, meaning (a) that their uses are infinitely various, and (b) that computers are commonly integrated into the structure of other machines.
It is, therefore, hard to isolate specific ideas promoted by computer technology. The computer, for example, is quite unlike the stethoscope, which has a limited function in a limited context.
Except for safecrackers, who, I am told, use stethoscopes to hear the tumblers of locks click into place, stethoscopes are used only by doctors. But everyone uses or is used by computers, and for purposes that seem to know no boundaries.
Putting aside such well-known functions as electronic filing, spreadsheets, and word-processing, one can make a fascinating list of the innovative, even bizarre, uses of computers. I have before me a report from The New York Times that tells us how computers are enabling aquatic designers to create giant water slides that mimic roller coasters and eight-foot-high artificial waves.
In my modest collection, I have another article about the uses of personal computers for making presentations at corporate board meetings. Another tells of how computer graphics help jurors to remember testimony better. Gregory Mazares, president of the graphics unit of Litigation Sciences, is quoted as saying, “We’re a switched-on, tuned-in, visually oriented society, and jurors tend to believe what they see. This technology keeps the jury’s attention by simplifying the material and by giving them little bursts of information.”
While Mr. Mazares is helping switched-on people to remember things, Morton David, chief executive officer of Franklin Computer, is helping them find any word in the Bible with lightning speed by producing electronic Bibles. (The word “lightning,” by the way, appears forty-two times in the New International Version and eight times in the King James Version. Were you so inclined, you could discover this for yourself in a matter of seconds.)
This fact so dominates Mr. David’s imagination that he is quoted as saying, “Our technology may have made a change as momentous as the Gutenberg invention of movable type.”4 And then there is an article that reports on a computer used to make investment decisions, one that “helps you, among other things, to create ‘what-if’ scenarios,” although with how much accuracy we are not told.5 In Technology Review, we find a description of how computers are used to help the police locate the addresses of callers in distress; a prophecy is made that in time police officers will have so much instantly available information about any caller that they will know how seriously to regard the caller’s appeal for help.
One may well wonder if Charles Babbage had any of this in mind when he announced in 1822 (only six years after the appearance of Laënnec’s stethoscope) that he had invented a machine capable of performing simple arithmetical calculations.
Perhaps he did, for he never finished his invention and started work on a more ambitious machine, capable of doing more complex tasks. He abandoned that as well, and in 1833 put aside his calculator project completely in favor of a programmable machine that became the forerunner of the modern computer.
His first such machine, which he characteristically never finished, was to be controlled by punch cards adapted from devices French weavers used to control thread sequences in their looms.
Babbage kept improving his programmable machine over the next thirty-seven years, each design being more complex than the last.6 At some point, he realized that the mechanization of numerical operations gave him the means to manipulate non-numerical symbols.
It is not farfetched to say that Babbage’s insight was comparable to the discovery by the Greeks in the third century B.C. of the principle of alphabetization—that is, the realization that the symbols of the alphabet could be separated from their phonetic function and used as a system for the classification, storage, and retrieval of information.
In any case, armed with his insight, Babbage was able to speculate about the possibility of designing “intelligent” information machinery, though the mechanical technology of his time was inadequate to allow the fulfillment of his ideas. The computer as we know it today had to await a variety of further discoveries and inventions, including the telegraph, the telephone, and the application of Boolean algebra to relay-based circuitry, resulting in Claude Shannon’s creation of digital logic circuitry.
Today, when the word “computer” is used without a modifier before it, it normally refers to some version of the machine invented by John von Neumann in the 1940s. Before that, the word “computer” referred to a person (much as the word “typewriter” once referred to a typist) who performed some kind of mechanical calculation. As calculation shifted from people to machines, so did the word, especially because of the power of von Neumann’s machine.
Excerpt From: Neil Postman. “Technopoly: The Surrender of Culture to Technology.”