This article in the premiere issue of Information Today was written while I was near the beginning of my bicycle adventure… I remember it feeling almost abstract as I held forth on matters of machine selection while my own life was focused on ultra-portability. It deals with some of the philosophical issues, and is amusing from the perspective of almost 40 years later…
by Steven K. Roberts
Buying a computer can be a somewhat unnerving experience. If you are new to the game, it can be a lesson in humility as computer-store sales personnel snow you with disk drive specifications and the subtle trade-offs of competing machines. If you are an old hand at the computing game, you may be shocked at the profusion of alien machines that have displaced the old standbys of yesteryear. You may well find that the simple desire to “find a computer for online use” sets you off on a long and maddening quest.
The problem, of course, is one of information. Nobody wants to plunk down a few thousand dollars for a mysterious contraption backed only by the exuberant claims of a commissioned salesperson, yet this is sometimes your only option. How do you decide?
This series of articles is intended to give you the basic background information you need to evaluate the micro field. But don’t panic! Although we’ll be considering all sorts of hardware and software matters, this will not degenerate into an arcane discussion of competing integrated circuits and design techniques. We don’t want to make your eyes glaze over with an overdose of technical jargon — we just want to demystify the technology. There’s a big difference.
These are user-friendly technical articles.
A fitting opening, perhaps, is a discussion of one of those great burning issues of system acquisition: “How many bits?” Ideally, you wouldn’t have to worry about such things, being more concerned with various machines’ abilities in the context of your actual needs, but the technology is yet young enough that you have to care.
So now that you care how many bits your machine has, let’s see what that means, and why it’s so important.
Von Neumann and his bottleneck
All computers currently available, with the exception of a few experimental systems developed for artificial intelligence applications, suffer from one basic structural limitation: all data must be funneled through a single site called the Central Processing Unit. This “CPU” might be impressively fast, yielding performance that is well nigh awesome, but still, everything going on in the system — be it a spreadsheet program or a video game — actually takes place sequentially. No matter how fast the machine is, the CPU still fetches one datum at a time from memory and then trundles off with it to turn a primitive logical crank.
This basic structure has been tagged the “von Neumann architecture” in honor of the computer pioneer who conceived it. Back then, it was pretty exciting stuff, but serious effort is now underway to make it obsolete forever and to conjure a new style of machine that can deal more effectively with “knowledge” and “inference” rather than “data” and “procedures.”
But that, alas, is not our concern. When you wander into a computer store with a purchase order, you are buying a von Neumann machine.
So that leaves us with an obvious question: if the CPU is indeed fetching one chunk of data at a time for processing, isn’t its performance dependent upon the size of that chunk? Indeed.
That size is expressed in bits — eight bits being the most common microsystem “bus width.” This evolved some years ago when it became obvious that it takes 7 bits to uniquely represent each key on a keyboard, since 7 bits can be arranged 128 different ways. Seven was inelegant from a logic-design standpoint, however, so 8 bits (yielding 256 possible combinations) became the standard. This pervaded the microcomputer industry so thoroughly that anything other than an 8-bit machine was considered exotic as recently as 1980.
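The arithmetic behind those combination counts is simply powers of two — each added bit doubles the number of distinct patterns. A quick illustration in Python (a modern language used here purely for demonstration, obviously not something available on the machines under discussion):

```python
# n bits can be arranged 2**n different ways.
for n in (7, 8):
    print(f"{n} bits -> {2 ** n} combinations")
# 7 bits -> 128 combinations
# 8 bits -> 256 combinations
```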
The width of a system’s data path has a number of ramifications besides the convenience of handling keyboard information. Among other things, it determines how easily the computer can deal with a large amount of memory. An 8-bit machine is intrinsically limited to 64K bytes of memory (about 40-45 double-spaced typewritten pages, for an appreciation of scale). Many systems include some method of expanding this through techniques such as “bank switching” and “paging,” but that is relatively crude in comparison to machines that can address large amounts of memory directly.
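The 64K figure comes from the same powers-of-two arithmetic: a typical 8-bit micro carries a 16-bit address bus, and 2 to the 16th is 65,536. A sketch of the page estimate follows — note that the 1,500-characters-per-page figure is my assumption for a double-spaced typewritten page, not a number from the article:

```python
memory_bytes = 2 ** 16   # a 16-bit address bus reaches 65,536 bytes (64K)
chars_per_page = 1500    # assumed size of a double-spaced typewritten page
print(memory_bytes, "bytes is roughly", memory_bytes // chars_per_page, "pages")
# 65536 bytes is roughly 43 pages
```

That lands comfortably inside the 40-45 page range quoted above.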
To complicate matters, some 16-bit machines also rely on such trickery to handle large amounts of memory. The Intel 8086, for example, can directly address only 64K at a time (one memory “segment”), just like its 8-bit ancestors. But it comes with built-in mechanisms for doing things that are a major pain with the 8-bangers, so you still come out ahead. Other 16-bit machines use different approaches, with the Motorola 68000 being the star performer as far as memory management is concerned: it allows use of 16 megabytes without any extra effort. This is about 250 times the amount we are used to in the micro world.
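That “about 250 times” follows directly from the address widths — the 68000 brings out a 24-bit address bus, reaching 16 megabytes, against the 64K ceiling of the 8-bit world. The calculation (in Python, as a modern aside):

```python
addr_68000 = 2 ** 24   # 24-bit address bus: 16,777,216 bytes = 16 MB
addr_8bit = 2 ** 16    # the 64K ceiling of a classic 8-bit micro
print(addr_68000 // addr_8bit)
# 256  -- hence "about 250 times"
```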
Look to the application
Details about memory addressing are moot, of course, unless you have an application that forces you to worry about such things. The typical single-user office machine gets along just fine with an 8-bit bus, only bogging down seriously when you try to do more than one thing at a time. Suppose, for example, you are trying to make the unit perform word processing and database manipulation simultaneously. Since the processor can only do one thing at a time, and since both tasks require a fair amount of memory, an interesting requirement appears. The computer must context switch every few thousandths of a second in order to give both tasks the illusion of a dedicated machine.
This is time-consuming and can be a major design headache with an 8-bit system. With limited memory, the programs and data associated with the different tasks must share the available space, raising the number of disk accesses required — as well as the frustration of the users. The bottom line: slow. So in this kind of environment, it starts to make sense to consider some of the more robust 16-bit systems.
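The time-slicing described above can be sketched as a trivial round-robin loop. The task names and tick counts here are purely illustrative (again in Python, strictly as a modern demonstration):

```python
# A single CPU creating the illusion of simultaneity: on each "tick"
# of the clock, the scheduler hands the processor to the next task.
tasks = ["word processing", "database query"]

def run(ticks):
    """Return the order in which tasks receive their time slices."""
    return [tasks[tick % len(tasks)] for tick in range(ticks)]

print(run(4))
# ['word processing', 'database query', 'word processing', 'database query']
```

Every one of those switches costs real time saving and restoring state — exactly the overhead that hurts on a memory-starved 8-bit system.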
The larger machines are also of considerable value if raw computing speed is an issue. This is not the case in a word-processing application, where the computer spends most of its time in a boring loop, waiting for the human to hit another key. But serious number-crunching, expert system applications, or extensive compilation and software development may well justify the added expense of a 16-bit system.
Cost is not the only negative aspect that you should consider when that salesperson is heralding the magic of the latest super-zoomer. 16-bit systems offer undeniably higher performance, but they haven’t been around very long, relatively speaking. In the 8-bit world, you have a significant resource at your disposal: thousands of well-debugged and inexpensive programs to choose from, as well as a massive user population to provide support and buying power. In the 16-bit world, you can find some truly remarkable computers, but the software is rare and costly, the local computer clubs are unlikely to yield much in the way of seasoned veterans who know your system, and you are very likely tied to a single vendor that may or may not remain cooperative after the sale. Caution is in order.
Somewhere in here, I should mention the IBM PC. Something of a hybrid between the 8- and 16-bit worlds, this system uses an 8-bit bus but performs internal calculations with 16 bits. It has quickly amassed a substantial body of hardware and software support — and it can be upgraded to full-scale 16-bit operation by an ambitious enough user.
So. As usual, life is a tyranny of trade-offs. Buying the hottest new products can be a costly mistake, unless you really need the added horsepower of the latest technology. In typical single-user online applications, such as smart database terminal packages and downloading/formatting programs, you can be very well served by a standard 8-bit machine and the wealth of available support that comes with it.
Watch this space
In future articles of this series, we will be considering all sorts of micro-related issues: Vendor relations, maintenance, insurance, peripheral selection, user education, accessories, file organization, communications features, and countless other subjects will occupy our attention. You are invited to write to me in care of Information Today — my intent here is to help demystify this crazy micro field in any way I can, and knowing your concerns will help immensely.
Next month: The System Environment.
About the author
Steven K. Roberts is the author of numerous magazine articles, as well as three books about microcomputer-related subjects including Creative Design with Microcomputers and The Complete Guide to Microsystem Management, both from Prentice-Hall. He travels full-time on a word-processor-equipped recumbent bicycle and communicates with his publishers via the CompuServe network.
Steve can be reached at CompuServe ID# 70007,362 or via postal mail care of this publication.