Famous Graphics Chips: IBM’s VGA

By Dr. Jon Peddie
Published 03/12/2019

It is said of airplanes that the DC-3 and the 737 are the most popular planes ever built, with the 737, in particular, the best-selling airliner ever. The same could be said of the ubiquitous VGA and its big brother, the XGA. The VGA, which can still be found buried in today's modern GPUs and CPUs, set the foundation for a video standard and an application programming standard. The XGA expanded that video standard to a higher resolution and greater performance.

On April 2, 1987, when IBM rolled out the PS/2 line of personal computers, one of the hardware announcements was the VGA display chip, a standard that has lasted for over 30 years. While the VGA was an incremental improvement over its predecessor, the EGA (1984), and remained backward compatible with the EGA as well as the earlier (1981) CGA and MDA, it is its forward compatibility that gives it such historical recognition.


Figure 1: IBM’s highly integrated motherboard-based VGA chip (Source: Wikipedia)

The IBM PS/2 Model 80 was the first 386 computer from IBM and was used to introduce several new standards. Most notable were the onboard VGA graphics with 256 KB of RAM, the 32-bit Micro Channel Architecture (MCA) bus, card identification and configuration by the BIOS, and RGB video signal pass-through. The MCA could accommodate the higher-end 8514/A graphics board, while the VGA chip sat on the motherboard.

One of the significant features of the VGA was the integration of the color look-up table (CLUT) and digital-to-analog converter (DAC). Before the VGA, LUT-DACs, as they were called, were separate chips supplied by Brooktree, TI, and others; those products were eventually rendered obsolete, though it didn’t happen overnight. The integrated logic of the VGA also contained the CRT controller and replaced five or more other chips; only external memory was needed. The VGA showed the path to future fully integrated devices.

The VGA also sparked a new wave of cloning and made the fortunes of several companies such as Cirrus Logic, S3, Chips & Technologies, and three dozen others.

The IBM 5162, more commonly known as the IBM PC XT/286, was a popular PC that used a 16-bit expansion bus, which allowed upgraded graphics boards to be plugged in to replace the IBM EGA board. Because the PS/2 used the MCA, some board manufacturers offered boards with two edge tabs, one for ISA and one for MCA. Shortly after, in 1988, the Extended Industry Standard Architecture (EISA) bus for IBM PC-compatible computers was introduced. It was developed by a consortium of PC clone vendors (the “Gang of Nine”) as a counter to IBM’s use of its proprietary MCA in the PS/2 series; EISA extended ISA signaling to 32 bits while remaining backward compatible with ISA cards, and boards appeared with tabs for both ISA and EISA sockets.



Figure 2: A VGA board with EISA tab (top) and ISA tab (bottom); note the VGA connector on each end of the board (Source: ELSA/Wikipedia)

The basic system video was generated by what IBM referred to as a Type 1 or Type 2 video subsystem, that is, VGA or XGA. The circuitry that provided the VGA function included a video buffer, a DAC, and test circuitry. Video memory was mapped as four planes of 64K by 8 bits (maps 0 through 3). The video DAC drove the analog output to the display connector. The test circuitry was used to detect the type of display attached, color or monochrome.

The video subsystem controlled access to video memory from the system and the CRT controller. It also controlled the system addresses assigned to video memory; up to three different starting addresses could be programmed for compatibility with previous video adapters. In the graphics modes, the mode determined the way video information was formatted into memory and the way memory was organized.
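In the 16-color graphics modes, those four planes sit at the same addresses, and software selects which planes a CPU write lands in. Below is a minimal C sketch of the pixel-to-address arithmetic for the 640 × 480 16-color mode (mode 12h); the code itself only computes the mapping, and the register sequence in the comments is the standard VGA sequencer Map Mask write, shown here as an assumption about how a DOS-era program would use the result.

```c
#include <stdint.h>
#include <stdio.h>

/* A minimal sketch of planar pixel addressing in the VGA's 640x480
   16-color mode (mode 12h). All four planes share the same byte
   offset; the plane(s) to write are selected through the sequencer's
   Map Mask register (index 2 at ports 0x3C4/0x3C5) before the write. */

#define BYTES_PER_SCANLINE 80   /* 640 pixels / 8 pixels per byte */

typedef struct {
    uint16_t offset;   /* byte offset into each 64 KB plane       */
    uint8_t  bitmask;  /* which bit within that byte is the pixel */
} PlanarAddr;

PlanarAddr planar_address(int x, int y)
{
    PlanarAddr a;
    a.offset  = (uint16_t)(y * BYTES_PER_SCANLINE + x / 8);
    a.bitmask = (uint8_t)(0x80 >> (x % 8));  /* leftmost pixel = bit 7 */
    return a;
}

int main(void)
{
    PlanarAddr a = planar_address(100, 50);
    printf("offset=%u bitmask=0x%02X\n",
           (unsigned)a.offset, (unsigned)a.bitmask);
    /* On real hardware one would then select the color's planes:
         outp(0x3C4, 2); outp(0x3C5, plane_mask);
       and read-modify-write the byte at A000:offset.               */
    return 0;
}
```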


In alphanumeric modes the system wrote the ASCII character code and attribute data to video memory maps 0 and 1, respectively. Memory map 2 contained the character font loaded by the BIOS during an alphanumeric mode set. The font was used by the character generator to create the character image on the display.

Three fonts were contained in ROM: an 8-by-8 font, an 8-by-14 font, and an 8-by-16 font. Up to eight 256-character fonts could be loaded into video memory map 2; two of those fonts could be active at any one time, allowing a 512-character font. In those days characters were an important feature and function, not considered just another bitmap.
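Although the hardware keeps characters in map 0 and attributes in map 1, odd/even addressing interleaves the two maps, so from the CPU's side the text buffer appears as character/attribute byte pairs. The C sketch below shows that view; the attribute bit layout is the standard one, and the segment value in the comment assumes the usual color text mode.

```c
#include <stdint.h>
#include <stdio.h>

/* A sketch of the CPU's view of VGA alphanumeric (text) mode: the
   ASCII code from map 0 and the attribute from map 1 appear as
   consecutive bytes, starting at segment B800h in color modes. */

#define COLUMNS 80

/* Byte offset of a cell's character byte; its attribute follows it. */
uint16_t cell_offset(int row, int col)
{
    return (uint16_t)((row * COLUMNS + col) * 2);
}

/* Attribute byte: bits 0-3 foreground color, bits 4-6 background,
   bit 7 blink (or bright background, depending on mode control). */
uint8_t make_attribute(uint8_t fg, uint8_t bg, int blink)
{
    return (uint8_t)((blink ? 0x80 : 0) | ((bg & 7) << 4) | (fg & 0x0F));
}

int main(void)
{
    printf("cell (10,40): char at offset %u, attr at %u\n",
           (unsigned)cell_offset(10, 40),
           (unsigned)cell_offset(10, 40) + 1);
    printf("white-on-blue attribute = 0x%02X\n",
           (unsigned)make_attribute(0x0F, 0x01, 0));
    return 0;
}
```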

The video subsystem formatted the information in video memory and sent the output to the video DAC. For color displays, the video DAC sent three analog color signals (red, green, and blue) to the display connector. For monochrome displays, the BIOS translated the color information into gray levels in the DAC, and the DAC drove the summed signal onto the green output. Thus, the green line became the default sync signal for monitors that still used BNC connectors.
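That gray-scale summing amounts to a weighted average of the three channels. As an illustration, assuming the classic 30/59/11 luminance weights, the translation looks like this in C:

```c
#include <stdio.h>

/* A sketch of gray-scale summing for monochrome displays: the three
   color components collapse to one weighted intensity (using the
   classic 30% red, 59% green, 11% blue luminance weights), which the
   DAC then drives out on the green line. */
unsigned char gray_sum(unsigned char r, unsigned char g, unsigned char b)
{
    /* Components are 6-bit DAC values (0-63). */
    return (unsigned char)((30u * r + 59u * g + 11u * b) / 100u);
}

int main(void)
{
    /* Full-intensity yellow (R=63, G=63, B=0) -> gray level 56. */
    printf("%u\n", (unsigned)gray_sum(63, 63, 0));
    return 0;
}
```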

The auxiliary video connector allowed video data to be passed between the video subsystem and an adapter plugged into the channel connector. This technique remained common until the late 1990s: companies offering higher-resolution and/or 3D-capable graphics chips would omit a VGA controller to save costs, assuming a VGA controller would already be in the system as a default. IBM didn’t provide any high-resolution graphics drivers for the VGA.



Figure 3: IBM VGA block diagram (Source: IBM)

The original VGA specifications deviated from previous controllers by not offering hardware support for sprites.

The on-board specification included 256 KB of video RAM (the very first systems could be ordered with 64 KB or 128 KB, at the cost of losing some or all of the high-resolution 16-color modes). It also supported 16-color and 256-color paletted display modes, with a 262,144-color global palette (6 bits, and therefore 64 possible levels, for each of the red, green, and blue channels via the RAMDAC).
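Those 6-bit channels are why VGA palette values run from 0 to 63. A minimal sketch of loading one palette entry follows; the arithmetic is the real conversion, while the port sequence in the comment uses the standard DAC write-index/data pair (0x3C8/0x3C9), and the outp() helper is the DOS-era one, noted here as an assumption about the build environment.

```c
#include <stdint.h>
#include <stdio.h>

/* A sketch of programming one entry of the VGA's 256-entry palette.
   Each channel is only 6 bits wide (0-63), which gives the
   262,144-color global palette: 64 x 64 x 64 levels. */

/* Scale an 8-bit component down to the DAC's 6-bit range. */
static uint8_t to_6bit(uint8_t c) { return (uint8_t)(c >> 2); }

void set_palette_entry(uint8_t index, uint8_t r, uint8_t g, uint8_t b)
{
    /* On real hardware (with a DOS-era outp() helper):
         outp(0x3C8, index);        select the DAC write index
         outp(0x3C9, to_6bit(r));   then write R, G, B in order
         outp(0x3C9, to_6bit(g));
         outp(0x3C9, to_6bit(b));
       Here we just print the values that would be written. */
    printf("entry %3u <- R=%2u G=%2u B=%2u\n",
           (unsigned)index, (unsigned)to_6bit(r),
           (unsigned)to_6bit(g), (unsigned)to_6bit(b));
}

int main(void)
{
    set_palette_entry(16, 255, 128, 0);  /* an orange at index 16 */
    return 0;
}
```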

The master pixel clock was selectable at 25.175 MHz or 28.322 MHz, but the usual line rate was fixed at 31.46875 kHz. The VGA’s CRT controller could be programmed for a maximum of 800 horizontal pixels and 600 lines, which was greater than the 640 × 480 monitors being offered at the time.
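Those timing numbers are easy to check: with 800 total pixel clocks per scanline (visible pixels plus blanking), the 25.175 MHz clock yields exactly the 31.46875 kHz line rate, and the conventional total line counts then produce the familiar refresh rates. A quick check in C (the total counts used below are the conventional ones and are an assumption here):

```c
#include <stdio.h>

/* A quick check of the standard VGA timing arithmetic. The totals
   include blanking intervals, which is why they exceed the visible
   resolution. */
int main(void)
{
    double pixel_clock = 25175000.0;           /* 25.175 MHz          */
    double line_rate   = pixel_clock / 800.0;  /* 800 clocks per line */

    printf("line rate : %.5f kHz\n", line_rate / 1000.0);  /* 31.46875 */
    printf("640x480   : %.2f Hz\n",  line_rate / 525.0);   /* ~59.94   */
    printf("720x400   : %.2f Hz\n",  line_rate / 449.0);   /* ~70.09   */
    return 0;
}
```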

Refresh rates could be as high as 70 Hz, with a vertical blank interrupt (which not all of the clone boards supported, in order to cut costs).
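On boards that omitted the vertical blank interrupt, software fell back to polling Input Status Register One at port 0x3DA, whose bit 3 indicates vertical retrace. A DOS-era sketch, assuming a compiler that provides inp() in <conio.h> as the old toolchains did:

```c
/* Wait for vertical retrace by polling instead of using the vertical
   blank interrupt (which many clone boards omitted). */
#include <conio.h>

#define INPUT_STATUS_1  0x3DA   /* Input Status Register One      */
#define VRETRACE_BIT    0x08    /* bit 3: vertical retrace active */

void wait_for_vsync(void)
{
    /* Wait for any retrace in progress to end... */
    while (inp(INPUT_STATUS_1) & VRETRACE_BIT)
        ;
    /* ...then wait for the next one to begin. */
    while (!(inp(INPUT_STATUS_1) & VRETRACE_BIT))
        ;
}
```

Games and demos used exactly this loop to synchronize page flips and palette changes with the display and avoid tearing.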

The chip supported a planar mode with up to 16 colors (four bit planes) and a packed-pixel mode with 256 colors (commonly referred to as Mode 13h). The chip did not have bit-blit capability (i.e., a blitter) but did support very fast data transfers via the “VGA latch” registers. There was some primitive raster-op support, a barrel shifter, and something IBM called hardware smooth-scrolling support, which was just a bit of buffering.
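Mode 13h owed its popularity with game programmers to its simplicity: one byte per pixel, laid out linearly from segment A000h, with no plane selection needed to plot a point. A minimal sketch of the addressing (the pokeb() call in the comment is hypothetical):

```c
#include <stdint.h>
#include <stdio.h>

/* A sketch of Mode 13h's packed-pixel addressing: 320x200 with one
   byte per pixel, laid out linearly from segment A000h. */
#define WIDTH 320

uint16_t mode13h_offset(int x, int y)
{
    return (uint16_t)(y * WIDTH + x);
}

int main(void)
{
    /* Under DOS one would store the color byte to A000:offset, e.g.
       pokeb(0xA000, mode13h_offset(x, y), color);  (hypothetical)   */
    printf("pixel (160,100) -> offset %u\n",
           (unsigned)mode13h_offset(160, 100));
    return 0;
}
```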

A barrel shifter is a digital circuit that can shift a data word by any number of bits in a single operation, using combinational logic rather than shifting one bit per clock. A common use of a barrel shifter is in the hardware implementation of floating-point arithmetic. Today’s modern GPUs contain thousands of 32-bit floating-point processors.
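Structurally, a barrel shifter is a cascade of multiplexer stages, each of which either passes the word through or shifts it by a fixed power of two. The C sketch below mimics that organization for a 32-bit word; in hardware all five stages are parallel combinational logic, so the whole shift completes in one pass.

```c
#include <stdint.h>
#include <stdio.h>

/* A sketch of a barrel shifter's organization: for a 32-bit word,
   five stages each either pass the word through or shift it by a
   fixed power of two (1, 2, 4, 8, 16), selected by one bit of the
   shift amount. */
uint32_t barrel_shift_left(uint32_t word, unsigned amount)
{
    unsigned stage;
    for (stage = 0; stage < 5; stage++) {
        if (amount & (1u << stage))    /* is this stage enabled?  */
            word <<= (1u << stage);    /* shift by 1, 2, 4, 8, 16 */
    }
    return word;
}

int main(void)
{
    printf("0x%08X\n", barrel_shift_left(0x0000000Fu, 12)); /* 0x0000F000 */
    return 0;
}
```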



Figure 4: The ubiquitous 15-pin VGA connector

The VGA specification included a resolution, a physical connector specification, and video signaling. It is still supported today; one can find projectors with VGA connectors, which require an adapter cable when used with newer computers or graphics boards.


Table 1: Some of the VGA clone suppliers

In addition to the clone chip suppliers, several other companies incorporated the VGA structure into their chips.
No other chip has had as profound an impact on the computer business as the VGA, and the industry owes a great debt to IBM for developing it; sadly, IBM didn’t profit as much from its invention as other suppliers did.
