Difference between VGA and SVGA

Updated: February 21, 2018

This article explains the difference between the VGA and SVGA display standards.

Summary Table

VGA | SVGA
Capable of 640×480 display resolution. | Capable of 800×600 display resolution when first released.
Developed by IBM. | Developed by several hardware and monitor manufacturers.

Definitions

[Image: An image displayed in VGA resolution]

VGA, or Video Graphics Array, is a display standard that was first used in IBM PS/2 computers in 1987. It uses analog signals, delivering a 640×480 resolution with 16 colors at a time and a refresh rate of 60 Hz. If the resolution is brought down to 320×200, however, a VGA adapter can show 256 colors. A computer booted in Safe Mode will usually fall back to this kind of display resolution. The term VGA also refers to the 15-pin connector and to the analog display standard itself.
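The trade-off between resolution and color count follows from how much pixel data fits in the adapter's video memory. The Python sketch below is a back-of-the-envelope illustration, assuming the commonly cited 256 KB of video RAM on the original VGA adapter (a figure not given in this article): 640×480 at 16 colors and 320×200 at 256 colors both fit, while 640×480 at 256 colors would not.

```python
# Rough framebuffer arithmetic for the VGA resolution-versus-colors trade-off:
# a display mode is only usable if one frame's worth of pixel data fits
# within the adapter's video memory budget.

VGA_MEMORY_BYTES = 256 * 1024  # commonly cited figure for the original VGA adapter (assumption)

def framebuffer_bytes(width: int, height: int, colors: int) -> int:
    """Bytes needed to store one frame at the given resolution and palette size."""
    bits_per_pixel = (colors - 1).bit_length()  # 16 colors -> 4 bpp, 256 colors -> 8 bpp
    return width * height * bits_per_pixel // 8

for width, height, colors in [(640, 480, 16), (320, 200, 256), (640, 480, 256)]:
    needed = framebuffer_bytes(width, height, colors)
    verdict = "fits" if needed <= VGA_MEMORY_BYTES else "does not fit"
    print(f"{width}x{height} with {colors} colors: {needed:,} bytes -> {verdict}")
```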

VGA was the last IBM graphics standard that most PC manufacturers followed, and it remained in wide use through the late 1990s. IBM tried to follow it up with XGA (Extended Graphics Array), which offered a resolution of 1024×768, but enhanced standards collectively called SVGA were released shortly thereafter by other manufacturers and overtook XGA. The VGA analog interface can carry 1080p (or higher) HD video and is still in use today; some picture-quality degradation may occur, but it can largely be avoided with a good-quality cable that is no longer than necessary.

[Image: A computer game in SVGA graphics]

SVGA (Super Video Graphics Array), also known as Ultra Video Graphics Array, is not a single standard but a group of display standards developed by many graphics card and monitor manufacturers. It is a step up from IBM's VGA and can display an 800×600 resolution with up to 16 million colors on 14-inch monitors.

Depending on the video memory installed in the computer, the system can support either 256 simultaneous colors or 16 million colors.
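As a concrete illustration of that dependence, the short sketch below (the modes listed are illustrative, not taken from this article) estimates how much video memory a single frame consumes: an 800×600 frame needs roughly 469 KB at 256 colors but about 1.4 MB at 16 million (24-bit) colors, which is why the installed video RAM dictates how many colors the card can offer.

```python
# Estimate the video memory one frame consumes at a given resolution and
# color depth; the installed VRAM caps which modes a card can actually use.

def frame_kilobytes(width: int, height: int, bits_per_pixel: int) -> float:
    """Kilobytes of video memory required to hold one frame."""
    return width * height * bits_per_pixel / 8 / 1024

modes = [
    (800, 600, 8, "256 colors (8-bit)"),
    (800, 600, 24, "16 million colors (24-bit)"),
    (1024, 768, 8, "256 colors (8-bit)"),
]

for width, height, bpp, label in modes:
    print(f"{width}x{height}, {label}: about {frame_kilobytes(width, height, bpp):,.0f} KB per frame")
```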

SVGA cards came out around the same time as VGA in 1987, but a common standard for programming SVGA modes was not established until 1989. The first version supported 800×600 with 4-bit pixels (16 colors) and was extended to 1024×768 with 8-bit pixels (256 colors) and beyond in the succeeding years. SVGA was originally supposed to be replaced by Super XGA, but as manufacturers dropped unique names for each upgrade, the majority of display systems manufactured from the late 1990s to the early 2000s were simply called SVGA.

VGA vs SVGA

So, what’s the difference between VGA and SVGA? Although both video display standards use analog signals and the same ports on a computer, their similarities end there.

The VGA display standard can support a maximum resolution of 640×480 pixels. SVGA monitors were capable of displaying 800×600 pixels when initially introduced, and their display capabilities have increased drastically ever since. IBM developed the VGA display standard, which became the default screen-resolution standard when it was released in 1987. SVGA, by contrast, is a collective term for the many upgrades to VGA developed by various hardware and monitor manufacturers.
