The information processed by the graphics card is ultimately output to the display. The output interface of the graphics card is the bridge between the computer and the display, responsible for sending the corresponding image signals to the display. Because of their design and construction, CRT monitors can only accept analog signal input, which requires the graphics card to output analog signals. The VGA (Video Graphics Array) interface, also called the D-Sub interface, is the graphics card's analog output interface. Although liquid crystal displays can receive digital signals directly, many low-end models still use VGA interfaces to remain compatible with VGA-equipped graphics cards.
The VGA interface is a D-type connector with a total of 15 pinholes, arranged in three rows of five. Besides two NC (Not Connected) signals, three display data lines, and five GND signals, the most important pins carry the three RGB color component signals and the two scanning synchronization signals, HSYNC and VSYNC. The color components of the VGA interface use the RS-343 level standard, whose peak-to-peak voltage is 1 V. The VGA interface is the most widely used interface type on graphics cards, and most cards provide one. Cards that lack a VGA interface but have a DVI (Digital Visual Interface) interface can still drive a VGA display through a simple adapter, and such cards usually ship with one.
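To make the pin layout concrete, the sketch below tabulates a commonly published pin assignment for the 15-pin VGA connector and shows the kind of level conversion a RAMDAC performs. The pin map follows the widely documented VESA DDC2 variant of the connector (older cards used some of these pins as monitor ID or left them unconnected), and the 0.7 V video swing used in `channel_to_volts` is an assumption based on common VGA practice rather than a figure from the text above.

```c
/*
 * Illustrative 15-pin VGA (D-Sub) pin map, per the commonly published
 * VESA DDC2 assignment. The ID/NC pins varied across card generations,
 * so treat this table as a sketch, not an authoritative reference.
 */
#include <stdio.h>

static const char *vga_pin[16] = {
    [1]  = "RED    (analog red video)",
    [2]  = "GREEN  (analog green video)",
    [3]  = "BLUE   (analog blue video)",
    [4]  = "NC / ID2",
    [5]  = "GND",
    [6]  = "RED GND",
    [7]  = "GREEN GND",
    [8]  = "BLUE GND",
    [9]  = "+5V    (DDC power; KEY on older connectors)",
    [10] = "SYNC GND",
    [11] = "NC / ID0",
    [12] = "SDA    (display data)",
    [13] = "HSYNC  (horizontal sync)",
    [14] = "VSYNC  (vertical sync)",
    [15] = "SCL    (display data clock)",
};

/*
 * RS-343 allows roughly 1 V peak-to-peak in total; in common VGA practice
 * about 0.7 V of that carries the video level. This maps an 8-bit color
 * component to its nominal analog voltage -- the job the card's RAMDAC
 * performs in hardware. The 0.7 V figure is an assumption, as noted above.
 */
static double channel_to_volts(unsigned char value)
{
    return (value / 255.0) * 0.7;
}

int main(void)
{
    for (int pin = 1; pin <= 15; pin++)
        printf("pin %2d: %s\n", pin, vga_pin[pin]);

    printf("8-bit value 255 -> %.2f V (full intensity)\n", channel_to_volts(255));
    printf("8-bit value 128 -> %.2f V (mid intensity)\n",  channel_to_volts(128));
    return 0;
}
```

Note how the categories in the paragraph above (2 NC + 3 display data + 5 GND + 3 RGB + 2 sync) account for all 15 pins in this assignment.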





