Digital Image Processors
Give your computer the gift of sight.
By Owen W. Linzmayer
A picture is worth a thousand words. At least I'm told that's how it was before inflation hit. Regardless, no one can dispute that the pictorial display of information is often invaluable. Until recently, however, the technology to display and process pictures on computers was available only to NASA and other high-priority scientific pursuits. Now there is a new crop of peripherals on the market that can turn your home computer into a digital image processor capable of taking snapshots of your friends or adding graffiti to still frames from an MTV video. Digital image processing involves capturing and manipulating computerized photographs.
Typically sold by third-party manufacturers as peripherals, digitizers are hardware devices that allow you to take the output of any standard video component and turn it into a graphics display. Most digitizers come with software that helps you capture a clear picture and then modify, enhance, merge, or manipulate it. Text can be superimposed onto digitized photographs, or you can put a bowler hat on the image of President Reagan delivering the State of the Union address. The possibilities are truly unlimited. Much could be written, and has been, about the many productive uses of digital image processors, but this is a tutorial on how computers have come to see the world as we do: visually.
In The Beginning
Let's begin with an object that you want to take a picture of and store in your computer. You place the object on your desk and focus the video camera on the object. Although in this example we are using a video camera, most of the digitizers available can be connected to any device, such as a laserdisc player or video cassette recorder, that outputs standard video format (more on this later). It is a good idea to split the signal coming out of the camera so that you can view the original image on a monitor while the computer digitizes it (see Figure 1 for an example set-up).
Light is reflected by the object, and the electromagnetic light waves travel to an image-sensing device within the camera. This device translates the brightness of the light at a given location on its surface into an electrical voltage. Synchronization pulses are added to the signal so that the receiving device knows where in the frame the incoming data belongs.
The camera rapidly makes hundreds of thousands of such translations every second to generate a complete video picture. At this point the original image of the object and its surroundings has undergone optical processing. The light intensity information is then sent from the camera to the closed-circuit monitor. Direct manipulation of the video signal by adjusting the contrast and brightness control knobs on the monitor is called analog image processing.
A Standard Emerges
Back in the 1950's, the Electronics Industries Association (EIA) developed the RS-170 specification, which prescribes all of the timing and voltage level requirements for standard video signals used in black and white television. Since then, the RS-170 standard has been modified to accommodate color signals. This color standard is commonly referred to as NTSC, which stands for National Television System Committee. In this article we concern ourselves with black and white digitizing only, but color digitizers work similarly. In fact, the Photocaster from Commsoft can actually do color digitization by scanning an image three times using red, green, and blue filters. The accompanying software then mixes the individual color images appropriately for a life-like color "photo."
The standard video format image is sent to the monitor line by line, starting in the top lefthand corner of the screen and working its way down. As the electron beam sweeps the phosphor-coated inside of your monitor or television screen, the phosphors become excited and glow when struck by the beam (see Figure 2). The voltage of the video signal corresponds to the strength of the beam and the brightness of the picture element (pixel). At the end of every horizontal sweep, a synchronization pulse is received, which moves the electron beam down to the beginning of the next line (see Figure 3). At the bottom righthand corner of the image, a longer vertical sync pulse is sent to re-position the beam to the upper lefthand corner of the monitor.
However, to make a complete screen image, the electron beam must sweep the entire screen twice, with the second pass filling in between the horizontal lines of the first. This process is known as interlacing. Each complete top-to-bottom pass is a field, and two fields make up a video frame. A video frame consists of 525 lines and is sent from the camera to the monitor 30 times a second, but interlacing gives the appearance of a new screen every 1/60 of a second.
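A rough sense of the timing involved follows directly from those figures. The short sketch below (Python, used purely for illustration; the constants are just the nominal RS-170 values quoted above) works out the field rate and the time the beam spends on each scan line.

    # Nominal RS-170 timing, derived from the figures in the text (illustrative only).
    LINES_PER_FRAME = 525      # two interlaced fields of 262.5 lines each
    FRAMES_PER_SECOND = 30     # complete frames sent each second
    FIELDS_PER_FRAME = 2       # interlacing: odd lines first, then even lines

    field_rate = FRAMES_PER_SECOND * FIELDS_PER_FRAME             # 60 fields per second
    line_time_us = 1e6 / (LINES_PER_FRAME * FRAMES_PER_SECOND)    # microseconds per scan line

    print(f"Field rate: {field_rate} fields per second")
    print(f"Line time:  {line_time_us:.1f} microseconds")

The roughly 63.5-microsecond line time is worth remembering; it comes up again when we look at how a digitizer decides which column of pixels to sample.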
Pixel This if You Can
The image that is produced on the closed-circuit monitor in our example is made up of shades of grey ranging from black to white and is called a continuous tone image because there is no discernible difference between two adjacent tones of grey. This continuous tone image must be chopped into small, discrete pieces of information that the computer can understand. The smallest element of a display that can be controlled individually is called a picture element, or pixel. The number of pixels that must be digitized is limited by the display resolution of the computer in question. Optimally, the number of pixels in our digitized image would equal the number of pixels in the original image.
A pixel can be described by its two characteristics: brightness and placement. Quantization is the process by which the digitizer chops up the original image and assigns brightness and coordinate values to each pixel. Exactly how the digitizer does this is described below.
The RS-170 signal enters most digitizers from a cable that is connected to the video source (in our case, a camera). Inside most digitizers the signal is then routed to both a sync extractor and an analog-to-digital (A/D) converter (see Figure 1). The extractor concerns itself exclusively with the synchronization pulses that must be used by the system controller for timing purposes. The A/D chip has the job of converting the voltage signal into digital values that the computer can handle.
Simply Brilliant
As mentioned above, the brightness of a pixel is determined by the voltage level of the RS-170 signal at a particular time. When a pixel is at its maximum brightness level, the signal has a certain voltage. Any time this voltage is supplied, the pixel in question is turned up to its maximum brightness. Inside the digitizer there is an analog comparator which compares the incoming voltage to a pre-set level that is called the threshold. The threshold can be set manually by a knob on the digitizer, as is the case with ComputerEyes from Digital Vision, or via software that comes with the package. If the incoming voltage is greater than the threshold, the computer knows that the pixel is brighter than the specified threshold level. To determine exactly how much brighter that pixel is, however, we need to sample its voltage several times with graduated threshold levels. These different thresholds correspond to the levels of grey that make up the grey scale of our digitized image.
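To make the idea of graduated thresholds concrete, here is a minimal sketch in Python. The voltage figures and the number of levels are invented for illustration, and a real digitizer does this with an analog comparator in hardware rather than in software, but the logic is the same: count how many thresholds the pixel's voltage exceeds.

    def grey_level(voltage, thresholds):
        """Count how many of the graduated thresholds the incoming voltage exceeds.
        Each comparison stands in for one pass of the analog comparator."""
        level = 0
        for threshold in thresholds:
            if voltage > threshold:
                level += 1
        return level

    # Hypothetical 3-bit example: seven evenly spaced thresholds between
    # black (0.0) and peak white (1.0) distinguish eight grey levels.
    thresholds = [step / 8 for step in range(1, 8)]
    print(grey_level(0.55, thresholds))    # a mid-grey pixel falls at level 4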
Grey Hairs, Grey Scales
If we were dealing with a high-contrast image (a line drawing, for example), we would need only one threshold and two grey levels: white and black (for blank paper and ink, respectively). Only one bit of memory per pixel would be required to store the brightness information in binary form. The more levels of grey you have, the smoother the transition from one level to the next. However, more grey levels mean more memory to accommodate the increased information. Each additional bit of brightness information doubles the number of grey levels you can define, but it also increases the memory needed to store the image and adds one more sample to the measurement of each pixel's brightness. Most digitizers use the successive approximation method of analog-to-digital conversion. Since this method resolves only one bit per comparison, a 3-bit digitization would require three scans. More expensive digitizers use flash A/D converters, which have individual voltage comparators for each grey level to be detected and converted. An 8-bit quantization with a resolution of 256 x 256 pixels is standard for high-end digitizers and allows for 256 different levels of grey. However, since such an image requires approximately 64K of memory, this level of sophistication is overkill for most home computer applications.
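The memory arithmetic is easy to check for yourself. The sketch below (Python, again purely for illustration) works out the storage needed for two of the cases just mentioned.

    def image_memory_bytes(width, height, bits_per_pixel):
        """Bytes needed to store one digitized image."""
        return width * height * bits_per_pixel // 8

    print(image_memory_bytes(256, 256, 1))   # 1-bit line art: 8,192 bytes (8K)
    print(image_memory_bytes(256, 256, 8))   # 8-bit grey scale: 65,536 bytes (64K)

At this resolution, every additional bit of grey-scale depth adds another 8K to the storage bill.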
Places, Everybody
Once the brightness level of a pixel has been determined, the digitizer must assign coordinates for the pixel so that it can be positioned correctly in the final digitized image. The digitizer software uses a timing delay in conjunction with the sync pulses of the video signal to determine where a pixel belongs. Since most microcomputers aren't fast enough to quantize and store each pixel as it is sent to the system from the video source, several digitizer manufacturers have decided to sample columns of pixels rather than rows. If the system waits 13 microseconds, for example, each time it receives a horizontal sync pulse, then it will always sample pixels in the same column (see Figure 4). By increasing the time delay by a fraction of a microsecond, the digitizer samples a different column. Each time the brightness of a pixel is determined, the software assigns coordinate values by plugging the delay information into a special algorithm. By gradually incrementing the delay after scanning each frame, the digitizer samples all of the columns of pixels that make up a screen. Once the digitizer has determined the brightness value and coordinates of a pixel, the information is placed into the memory of the computer.
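The bookkeeping that turns a delay into a column coordinate can be modeled in a few lines. The sketch below is a simplified illustration, not the routine from any particular digitizer; the 13-microsecond starting delay is the example figure used above, and the step between columns is an assumed value.

    # Illustrative model of column sampling.
    START_DELAY_US = 13.0    # delay after each horizontal sync pulse for column 0
    STEP_US = 0.2            # assumed increase in delay between adjacent columns

    def delay_for_column(column):
        """Delay (in microseconds) to wait after each sync pulse for a given column."""
        return START_DELAY_US + column * STEP_US

    def column_for_delay(delay_us):
        """Recover the column coordinate from the delay that was actually used."""
        return round((delay_us - START_DELAY_US) / STEP_US)

    for column in range(4):
        print(f"column {column}: sample {delay_for_column(column):.1f} microseconds after sync")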
Since flash A/Ds process images so much more quickly, the information would run into a transfer bottleneck if it were piped directly into main memory. Although quantization itself is fairly quick, the process is slowed down by the constant writing to memory needed to store the image. For this reason, most digitizers that use flash A/Ds are equipped with on-board memory banks that can accept the quantization information as quickly as the flash A/D can produce it. Such a configuration is called a frame grabber, because it can literally digitize an entire video frame in real time, in contrast to units that take two to ten seconds to digitize a static image.
To display the digitized image, the software must tell the digital-to-analog converter to translate the numeric values that are stored in memory into an electrical signal, to which the appropriate synchronization pulses are added by a sync mixer. This reconstructed RS-170 signal is then sent to the computer display as a digitized version of our original image. If you know something about computer graphics, you may know that most computers can't display levels of grey, but only black or white: on or off. You may be asking yourself how the computer can possibly display grey scales. Well, quite frankly, most can't.
New Year's Resolution
The Apple IIe computer, for example, has a maximum display resolution of 560 pixels by 192 lines, for a total of 107,520 pixels--each of which can be either on or off at any time (see Figure 5). A television, due to the limits of the RS-170 video specification, can display 184,300 pixels, yet each one can be a different brightness level and color. Getting the image that appears on the television screen onto the computer's display requires some software tricks.
Since you may have more information in memory than you can display, you must use the information that is most representative of a given location in the image. The computer takes an average brightness level of several adjacent pixels in the original image and uses this average intensity for the pixel in the digitized image (see Figure 6). A good package, however, saves the grey-scale information to disk rather than just the displayed image, because the displayed image is restricted by the resolution of the computer and its monitor. The grey-scale information may be very useful at a later date when using a printer capable of even greater resolution than that of the monitor.
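One common way of boiling several stored grey values down to a single on-or-off display pixel is simple block averaging followed by a threshold. The fragment below is a minimal sketch of that idea, assuming the grey-scale data is held as a list of rows of numbers; it is not taken from any particular digitizer package, and real packages may dither instead of thresholding.

    def average_block(image, row, col, size):
        """Average the grey values in a size-by-size block of the stored image."""
        total = 0
        for r in range(row, row + size):
            for c in range(col, col + size):
                total += image[r][c]
        return total / (size * size)

    def reduce_to_display(image, size, threshold):
        """Collapse each block to a single on/off display pixel."""
        rows = len(image) // size
        cols = len(image[0]) // size
        return [[1 if average_block(image, r * size, c * size, size) > threshold else 0
                 for c in range(cols)]
                for r in range(rows)]

    # A tiny 4 x 4 grey-scale image (0 = black, 255 = white) reduced 2:1.
    sample = [[  0,  64, 200, 255],
              [ 32,  96, 220, 240],
              [ 10,  20,  30,  40],
              [ 15,  25,  35,  45]]
    print(reduce_to_display(sample, 2, 128))   # [[0, 1], [0, 0]]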
Change is Constant
Now that you have the image up on the screen, you may wish to make slight, or even major, alterations. Images can be processed just as easily as words--provided you have the right software. As mentioned earlier, most manufacturers bundle image processing software with the digitizer. As you would expect, these packages range from marginally adequate to exceptionally useful. A spartan system is one which simply captures, displays, and then stores an image. More complex packages allow you to clean up the image with commands that smooth surfaces, highlight edges, and remove extraneous data. You can also take an image created by MacVision from Koala, for example, and alter it using the MacPaint program. The extent of digital image processing you can do is limited only by the manner in which the image was digitized (what information is retained) and the sophistication of the applications software you are using.
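As one small illustration of the kind of clean-up such commands perform, the sketch below smooths a stored grey-scale image by replacing each interior pixel with the average of its neighbourhood. It is a hypothetical example of the general technique, not the routine used by any of the commercial packages mentioned here.

    def smooth(image):
        """Replace each interior pixel with the average of its 3 x 3 neighbourhood."""
        rows, cols = len(image), len(image[0])
        result = [row[:] for row in image]      # copy; leave the border untouched
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                total = sum(image[r + dr][c + dc]
                            for dr in (-1, 0, 1)
                            for dc in (-1, 0, 1))
                result[r][c] = total // 9
        return result

    noisy = [[ 10,  10,  10,  10],
             [ 10, 200,  10,  10],
             [ 10,  10,  10,  10],
             [ 10,  10,  10,  10]]
    print(smooth(noisy)[1][1])   # the stray bright pixel is pulled toward its neighbours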
Hopefully, you now have a basic understanding of how computers have been given the gift of sight by video digitizers. The principles are the same for most digitizers, with specifics varying from one model to another. As digital image processing works its way into the mainstream of computing, expect to see more and more interesting applications for digitizers. Prices will drop and digitizing technology will advance--in speed, image quality, and color capabilities. So, take the blindfold off your computer's eyes and let it see the world in a whole new light.
Photo: Figure 1.
Photo: Figure 2.
Photo: Figure 3. RS-170 field interlacing.
Photo: Figure 4. A constant time delay ensures that you will sample all of the pixels in a given column (#105 in this example).
Photo: Figure 5. Pixel resolution.
Photo: Figure 6.