What Is a Standard VGA Graphics Adapter?

by Rhian Hibner

Video Graphics Array, or VGA, is a legacy display technology on which all modern video cards are based. Every video card produced in the last two decades can output the standard VGA resolution of 640 by 480, though most can go much higher.

History

IBM first introduced VGA in 1987 with its PS/2 line of computers. It could display 16 colors at a time at the standard resolution, and up to 256 colors at a reduced 320 by 200 resolution. It was also the first graphics array that could be integrated directly onto the motherboard.

BIOS

While VGA has become obsolete for day-to-day use, it is still the standard display mode for a computer's BIOS. The BIOS (Basic Input/Output System) is the low-level firmware environment that allows a computer to boot up and load a full-featured operating system such as Microsoft Windows or Linux. For this reason, essentially every video card made since VGA was introduced can still function as a standard VGA adapter.

Other Uses

VGA doesn't just refer to aging video cards and backward-compatible low-level display modes. It is also used as the baseline when describing a display's resolution. For instance, many smartphones have displays described as HVGA (Half VGA, or 320 by 480), QVGA (Quarter VGA, or 320 by 240) or even WVGA (Wide VGA, or 800 by 480). Of course, these display modes do depart from the original VGA specification, because they can display a much larger range of colors.
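
The relationship between these formats and standard VGA is easiest to see by comparing pixel counts. The short Python sketch below simply tabulates the resolutions named above; the labels and dimensions come from this article, and the arithmetic is purely illustrative rather than tied to any particular graphics driver or library.

    # Rough comparison of VGA-derived display formats against
    # standard VGA (640 x 480). Resolutions as listed in the article.
    variants = {
        "VGA":  (640, 480),   # standard VGA
        "HVGA": (320, 480),   # "Half VGA"
        "QVGA": (320, 240),   # "Quarter VGA"
        "WVGA": (800, 480),   # "Wide VGA"
    }

    vga_pixels = 640 * 480   # pixel count of the original VGA mode

    for name, (width, height) in variants.items():
        pixels = width * height
        ratio = pixels / vga_pixels
        print(f"{name:5s} {width:4d} x {height:4d}  "
              f"{pixels:7d} pixels  ({ratio:.2f}x standard VGA)")

Running the sketch shows where the names come from: HVGA has exactly half the pixels of standard VGA, QVGA a quarter, and WVGA stretches the same 480-line height to a wider 800-pixel row.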

About the Author

Rhian Hibner has been writing professionally since 2004. He spent four years writing for the New Mexico "Daily Lobo," the student-run newspaper at the University of New Mexico, where he received a Bachelor of Arts in English. After graduating from college, he moved to Seattle and now does freelance writing.

Photo Credits

  • Technology - Graphics Card image by Rob Hill from Fotolia.com