What ever happened to performance?
(Philosophy - how it ought to be)
by Will Fastie
We are told, over and over again and at every opportunity, that computer technology, specifically the computer technology the speaker is trying to sell us, has taken another giant leap forward. Although the claim is sometimes exaggerated for profit, it is nonetheless true: those computers sitting on our desks, whatever flavor or color they might be, are more powerful than the biggest computers of 30 years ago. For that matter, the calculators in our pockets and purses are more powerful than those monsters of three decades past.
Making this same claim for the ten years that Creative Computing has spanned is more difficult, but it is true as well. To see the improvement, however, it may be necessary to consider more complicated issues, such as the amount of work performed by computers ten years ago as compared with today, and the relative costs of each. To veteran readers of Creative, the answer is clear.
It is fine to know that we pay less today for more collective power. A more interesting question has to do with the power of an individual system today as compared with a similar computer of ten years ago. I find myself scratching my head on this one. Although I could never have afforded the rough equivalent of my basement IBM PC back then (not to mention the cost of the electricity to run it), I long for some of that old system's features and functions and miss, oh so sorely miss, its power. And what has been bothering me is that I don't know why I can't have it.
Photo 1 (recently taken) shows a system equivalent to the one I used professionally for software development in 1974, when I worked for a division of General Instrument Corporation. It consisted of a Data General Nova 1200 processor with 32K words (64K bytes) of core memory; two 45 inch per second (ips), 9-track, 1/2", 800 bits per inch (bpi) tape transports capable of handling reels of up to 2400 feet in length; one fixed-head disk drive storing 500K of data (within a year we had a whole megabyte!); and, as the 10 character per second (cps) console, the venerable Teletype model 33 KSR terminal, later replaced with the 30 cps DECwriter. We had three or four of these systems; one or two of them also had a 600 line per minute (lpm) line printer. Because the system was more or less equivalent to the configuration we sold, we could also use customer systems for development before they shipped, and we often did.
The system lived in a double-bay cabinet standing over six feet tall. It required about 12 square feet of floor space, not counting the terminal, printer, or access to the rear. The main power cable was 3/4" in diameter. Using 110-volt power, the system required its own 20-amp circuit. Even though the Nova had its own cooling fan, the cabinet included an integral ventilation system. The system operated with a muted roar.
At list price, the system cost about $50,000. The printer cost $20,000. I have about $6000 invested in my IBM PC. A quick comparison of the two can be seen in Table 1.

Raw Power
The instruction execution times for my old Nova and my new PC are not by themselves enough to measure their comparative raw power. Nonetheless, we will try. Table 2 shows the execution times for a few typical instructions on the Nova, the 8088 in the PC, and a theoretical faster PC. (There has been some speculation that a new, 8088-based IBM PC/XT will operate at 8 MHz instead of the current PC's 4.77, or that an 8086-based machine will emerge.)
The Nova fares pretty well, especially in the memory access category. For most of the processing I am thinking about, the primary activity is memory access. An assembler, for example, reads source code and translates mnemonic codes into binary numbers, that is, executable instructions. Although something of an oversimplification, this means the work consists mostly of moving things around in memory rather than performing arithmetic calculations.
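To make the point concrete, here is a minimal sketch in C of the inner loop of such an assembler; the sketch is mine, not drawn from any real assembler, and the mnemonics and encodings in the table are invented. Notice that the work is all comparisons and byte moves; no arithmetic appears.

/* Sketch of an assembler's memory-bound inner loop: look up a
   mnemonic in a table and copy out its binary encoding. The table
   entries are hypothetical, for illustration only. */
#include <stdio.h>
#include <string.h>

struct op { const char *mnemonic; unsigned short encoding; };

static const struct op optab[] = {
    { "LDA", 0x2000 },  /* invented encodings */
    { "STA", 0x4000 },
    { "JMP", 0x0000 },
};

/* Translate one mnemonic; returns 1 on success, 0 if unknown. */
static int assemble(const char *mnemonic, unsigned short *out)
{
    for (size_t i = 0; i < sizeof optab / sizeof optab[0]; i++) {
        if (strcmp(optab[i].mnemonic, mnemonic) == 0) { /* memory reads */
            *out = optab[i].encoding;                   /* memory write */
            return 1;
        }
    }
    return 0;
}

int main(void)
{
    unsigned short word;
    if (assemble("LDA", &word))
        printf("LDA -> %04X\n", (unsigned)word);
    return 0;
}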
Overall, the 8088 chip running at IBM's 4.77 MHz rate is much more powerful. The instruction set is rich compared to the older, somewhat primitive Nova. For example, the Nova has no multiply instruction and no instructions to manipulate bytes. The 8088, on the other hand, includes powerful string handling commands, an assortment of memory access instructions, and other special features which combine to make an effectively written program quite powerful. At the bottom line, I consider the PC more powerful than my Nova of ten years ago, by a comfortable margin.
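The missing multiply is worth a moment. Without the instruction, the Nova programmer built products out of shifts and adds by hand. Here is that classic routine rendered in C as an illustration (the 1974 original would, of course, have been hand-coded Nova assembly):

/* Shift-and-add multiplication, the standard substitute on machines
   like the Nova that lacked a multiply instruction. */
#include <stdio.h>

static unsigned mul16(unsigned a, unsigned b)
{
    unsigned product = 0;
    while (b != 0) {
        if (b & 1)       /* low bit of multiplier set: add it in */
            product += a;
        a <<= 1;         /* shift multiplicand left one place */
        b >>= 1;         /* shift multiplier right one place  */
    }
    return product;
}

int main(void)
{
    printf("%u\n", mul16(123, 45));  /* prints 5535 */
    return 0;
}

A loop like that, run once per bit of the multiplier, versus a single 8088 MUL instruction: that is much of what a "rich instruction set" means in practice.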
Yet, the performance of the PC is lacking. I find myself wishing for the Nova. What's wrong?

Comparable Software?
What's wrong is that today's software is not as optimized for performance as the more primitive programs I used on my Nova. Nor is it as small. The facts of the matter are presented in Table 3.
Why has this happened? Why are the new programs neither as fast nor as small as their older counterparts?
There are several reasons. The newer IBM PC has considerably more memory resources than the Nova. Nature abhors unused main memory, and programs grow accordingly. A related fact is that most of today's tools are written in high-level languages. These languages generate moderately efficient code that is nevertheless far from optimal.
In sharp contrast, the program writers of ten years ago had no high-level languages with which to work. In addition, main memory was so expensive ($80,000 per megabyte vs. $3000 today) that many systems were configured with less than the maximum supported. Even in such tight systems, the basic tools had to operate. Amazing as it might seem today, the complete assembler occupied less than 8000 bytes of memory and ran faster than a speeding bullet.
I think it may also be fair to point out that the assembler writer pored over that code for a long time, tightening things up, finding faster or cleverer ways to do things, and generally optimizing the bejabbers out of the program. Today, I suspect the major consideration is to get the program written as quickly as possible and working to a specified level of functionality without over-abundant concern for speed of operation. Programmers are expensive, more expensive than the computers upon which they work. That is a reversal from ten years ago, and it may account for the change.
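To illustrate the kind of tightening I mean, here is a small, invented example in C (not taken from that old assembler). Both routines copy n bytes; the second replaces the per-pass index arithmetic with pointer bumps, a standard hand optimization of that era.

#include <stddef.h>

/* The straightforward version: index arithmetic on every pass. */
void copy_plain(char *dst, const char *src, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i];
}

/* The hand-tightened version: bump two pointers, count down to zero. */
void copy_tight(char *dst, const char *src, size_t n)
{
    while (n--)
        *dst++ = *src++;
}

A good optimizing compiler now performs this transformation automatically, which is exactly why nobody pores over such code anymore.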
What ever happened to performance? I guess it just got lost in the shuffle. If I see it again, I think it will be the result of ever more powerful hardware and large chunks of cheap memory; tightly written, highly optimized code seems to be a thing of the past.