Who's in control? (Philosophy - how it ought to be) Peter Rodwell.
Britain's first microcomputer show worthy of the name took place back in 1978, about the time I was becoming seriously involved in the industry. I remember standing at one booth watching an enthusiastic salesman show a puzzled prospect the very latest piece of gee-whiz hardware, a circuit board covered in chips with a rat's nest of wires connecting it to an uncased keyboard, a cassette deck, and a screen, which was displaying an impressive hex dump. "But how do I get it to print out my invoices?" the punter asked in despair.
At a nearby booth, another smart-suited character was waving a cassette under a client's nose; they were standing in front of one of the first 8K Pets to arrive in Europe. ". . . and you can store details of up to 150,000 stock items on this one tape!" the salesman intoned.
Well, we have come a long way from those days! It took a long time, but the computer industry has finally recognized the importance of the "user interface." We had a brief spell when we all ran about proclaiming the necessity of computer literacy: everybody must learn about computers, we chanted. Then, of course, it dawned on us that if we were so bloody clever, why couldn't we make our computers people-literate instead? Suddenly our micro world was infested with mice and adorned with more icons than Leningrad's Hermitage Museum.
Over here in Europe, we like to think that several thousand miles of ocean and continent give us a balanced perspective on the latest micro-trends pouring forth from California. We haven't, for example, taken the IBM PC to our hearts at all--partly because it is expensive and partly because IBM delayed for 18 months before selling it over here, giving Victor and others the chance to move in with much nicer and cheaper computers.
The Victor 9000 (called the Sirius 1 here) still outsells the IBM, and IBM is reported to be very, very worried about the super new Apricot. Likewise, for price reasons, the Apple II was never a home/hobby machine here but was always a business computer--at first rather upmarket, now fading rapidly.

Going Overboard
When Lisa appeared, the world seemed to go bananas over mice. At least America did; we thought it was one answer, not the answer, and in fact I am increasingly skeptical about mice. For instance, like most writers I know, I work in total desktop chaos. Besides my Sirius and two boxes of disks, I have great mountains of paper and assorted micro-junk covering every horizontal surface, with only a little free space left for a notepad and a coffee cup. There simply isn't room to swing a mouse.
Icons, while quite nice to look at and probably fairly helpful to a raw beginner, can begin to grate after some experience with a system; I have a theory--as yet unproven--that the average executive might start to resent being treated like a child by the unspoken assumption, inherent in icons, that he can understand pictures but not words.
I don't want you to think I'm totally negative, however. I think mice and icons have a place, but it is a much smaller place than the current trend would have it. Personally, I think the touch screen--as on the HP 150--is a far better pointing and selecting device than the mouse. The operation is more natural--you look at the screen and point, all in one movement, while the mouse requires you to look at the screen, then at the mouse, then back at the screen while you maneuver the rodent and finally to press a button or a key to tell the computer you have chosen.
In 1984 at least one very low cost--but powerful--business computer with speech recognition built in will hit the market. There is nothing new about speech recognition in itself, of course, and it still has a long way to go before it is perfect, but it is interesting that we are now at the point where this facility can be offered as standard.
"Human engineering" is now the single most important area of microcomputing, and we have a long way to go yet! Take, for instance, the operating systems we micro users currently have at our disposal. The market leaders--PC-DOS/MS-DOS and the CP/M family--all leave a lot to be desired, but then so do the challengers, the UCSD p-System and Unix and its spinoffs. They all share a common fault: they were designed by programmers. They are all easy to use if you are computer literate, but they are progressively more awful (in the order in which I have named them) for the uninitiated. We need something new . . . but what?

The Ideal
I suggest a combination of what we have now, taking the best points of each and refining them. Now that 16-bit micros are the rule rather than the exception, we have a lot more space for providing refinements; a major advantage of more CPU bits is, after all, the ability to address more memory.
Let's start with the heart of the p-System, because it is a good idea and we need as much in the way of portability as we can get now. Let's build it up a bit by incorporating Digital Research's GSX system, expanded to provide a fully portable terminal handler and software printer interface, because GSX is undoubtedly the way to go with the graphics interface problem. (Microsoft's MSX standard is, in my opinion, silly; the last thing we need is hardware-dependent standardization.) And then let's wrap this up in a software interface that is as comprehensive as that of the Macintosh in terms of the number of facilities it gives the programmer.
What we now need is a common user interface, and here's where we enter more difficult ground. Personally, I like the simplicity of the CP/M and PC-DOS/MS-DOS approach when compared to the more complex systems, and I would like a more graphical refinement of the p-System interface, approaching that of the Macintosh/Lisa but accepting commands from whichever device the user happens to prefer--keyboard, speech, touch screen, or mouse. I think much more research needs to be done into developing a man-machine dialogue which the uninformed can use without needing to look at a manual but which still allows the more experienced user to work fluently and efficiently.
As if all this wasn't enough, the system must be written so that manufacturers can install it with minimal effort (just a machine-dependent core). It must allow extra device drivers to be added easily by third-party software writers. It must be able to run existing CP/M, PC-DOS, and MS-DOS software unaltered, and it must be ROM-able--all of it.
Can this be done? Yes, I'm sure it can, and it probably will be done one day. A major obstacle is the tendency to believe that because we have developed one new interface--icons, mice, windows, whatever--this is it and we can go overboard about it to the extent of blinding ourselves to its disadvantages and to other possible solutions. The first person to overcome this and to adopt a broad user interface perspective will be the one to do it--and I'll bet he or she turns out to be a self-taught programmer.
If you were to write such an operating system today, you would have the devil's own job selling it because the opposition has gained too strong a hold. My final suggestion, therefore, is that we will never see such a "perfect," universally standardized operating system unless its author gives it away! And, crazy though it may sound, I have a sneaky feeling that this might just happen.