Video card vs. Video Graphics Array (VGA, a popular display standard)

a cooperator

Distinguished
Aug 7, 2012
Hi,
I know that a video card is an internal circuit board that allows a display device such as a monitor to display images from the computer.

However, I have a desktop computer with a monitor connected to its video card. Before I installed any drivers for the video card, Device Manager showed an exclamation mark in front of the Video Graphics Array (VGA) entry, VGA being a popular display standard. Even so, this Standard VGA adapter was functioning well without the video card's drivers installed.
Thus, my questions are:

First: Why do I need to install the video card's drivers at all, as long as the Standard VGA adapter (VGA being a popular display standard) is functioning well?
Standard VGA, on both my laptop and my desktop computer:

[screenshot]


[screenshot]


NVIDIA video adapter on my desktop computer:

[screenshot]

AMD Radeon 6770M (1 GB GDDR5) video adapter on my laptop computer:


Second: On my desktop computer, I have a video card connected to a monitor. However, I only installed the drivers for the video card; I didn't need to install any drivers for the monitor, even though a driver CD came with the ViewSonic monitor. Why is that?

Finally: Why is the Monitor category not shown in Device Manager until the drivers for the Standard VGA adapter are properly installed?




 

a cooperator

Distinguished
Aug 7, 2012
I am only wondering what the differences are between a video/graphics card and the Video Graphics Array (VGA, a popular display standard). That is, before installing the NVIDIA graphics driver there was a VGA controller in Device Manager with an exclamation mark; that VGA is called, I think, a Standard Video Graphics Adapter (Standard VGA). However, after installing the NVIDIA graphics driver, the exclamation mark disappeared and it is now listed as an NVIDIA video adapter.
 
How to put this...

The initial VGA is a standard, 800x600 or similar; it only gives you the option to show an image, without other possibilities like external display, clone display and so on.

The specific VGA (FX 5200) driver gives your rig the "possibility" to hand all the image processing to a specific device, in this case the FX 5200, and adds some extra functions like external display (at different resolutions), clone display, extended display... and so on.
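(Side note: once the specific driver is in place, Windows itself exposes those modes too. On Windows 7 or later, for instance, there is a small built-in tool, DisplaySwitch.exe, that switches between them from the command line; the lines below are only an illustration and assume a Windows 7-or-later install:)

    rem Duplicate the desktop on the built-in and the external display:
    DisplaySwitch.exe /clone

    rem Extend the desktop across both displays:
    DisplaySwitch.exe /extend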
 

a cooperator

Distinguished
Aug 7, 2012


Thanks a lot.
What I understand from your explanation:

First: You mean that without installing the driver for the VGA, and sticking with the Standard VGA, the HDMI port (which connects an optional video or audio device, such as a high-definition television or any compatible digital or audio component) and the external monitor port (which connects an external VGA monitor or projector) will not function at all, since they will not be identified by the computer?

Second: Only installing the specific VGA driver will get the HDMI port and the external monitor port recognised? I.e. there are no separate drivers to install for the HDMI port and the external monitor port, in addition to the VGA driver, which does have to be installed?

Could you please also answer these:

Second: On my desktop computer, I have a video card connected to a monitor. However, I only installed the drivers for the video card; I didn't need to install any drivers for the monitor, even though a driver CD came with the ViewSonic monitor. Why is that?

Finally: Why is the Monitor category not shown in Device Manager until the drivers for the Standard VGA adapter are properly installed?

 
First: Yes and no. Everything can be identified by your computer, but not with all features. For example, if you use the standard VGA driver and try to extend or duplicate your desktop onto another (external) display, you will find that there is no option to do that.

Second: Correct. The specific driver gives your rig the ability to manage all the ports and all of their features: higher resolutions, using external displays to project presentations, and so on.

Now, answering your questions:

Second: Correct. In this case, the most important thing is the GPU driver. The monitor driver gives you the chance to "recognize" the exact model of your monitor and, sometimes, to use some other functions of the monitor. For example, my monitor has USB ports on it and is 3D Vision Ready. The specific driver for your GPU is able to get the monitor's model and its main features (resolution, calibration...).

Finally: Because every monitor can display an 800x600 resolution, the initial OS installation starts there and then checks what maximum resolution your monitor can handle. Even on Windows 7 or later, on the first boot you see a big 800x600 screen until all the drivers are installed.
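If you are curious which monitor model the GPU driver has actually picked up, one quick way to check (a sketch, assuming Windows Vista/7 or later with PowerShell available) is to read the EDID name that Windows stores in WMI:

    # Print the model name read from each connected monitor's EDID
    # (run from an elevated PowerShell prompt)
    Get-WmiObject -Namespace root\wmi -Class WmiMonitorID | ForEach-Object {
        [System.Text.Encoding]::ASCII.GetString($_.UserFriendlyName).Trim([char]0)
    }

If that prints your ViewSonic's model, the driver has identified the monitor even though you never installed the monitor's own CD.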
 

a cooperator

Distinguished
Aug 7, 2012




Given that the initial VGA is a standard, 800x600 or similar, and only gives me the option to show an image, without other possibilities like external display or clone display:
I would say that these possibilities are not needed on almost all desktop computers, since they normally only have an external monitor that is plugged into the card's VGA output with a cable to show the picture (unlike laptops, which have a built-in display plus other ports, such as an HDMI port and an external monitor port). So, what other benefits are there from installing the card's specific driver, aside from resolution and calibration?

Also, I once actually disabled the VGA adapter in Device Manager (screenshot below). As a result I lost the image on the monitor completely, even though the OS was still running fine. So, in that case, how could I enable the VGA adapter again when there was no image at all on the monitor? I was told:
'There could be a solution, but you will need to borrow a graphics card.
Use it to boot Windows and get a picture, then re-enable your old video adapter (you might need to show all devices under Device Manager for it to be visible).'

However, if I didn't have an additional graphics card, what could I have done to get the VGA adapter enabled again? I thought of booting into the recovery environment and then using the command line, 'CMD' (second screenshot below), to run commands from there. However, I don't know of any command to run there that would re-enable the VGA adapter.
[screenshot]

[screenshot]
 


I am not sure if you can enable the VGA adapter using the command prompt; I would have to investigate that.

I would think that using Hiren's BootCD and its Mini Windows option could work; you should try that.
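For what it's worth, one approach that should not need a second card (a sketch, assuming a Windows 7-era install, and something to double-check rather than a guaranteed fix): from the recovery environment's command prompt you can force the next boot into Safe Mode, where Windows falls back to its basic display driver, so you get a picture again and can re-enable the adapter in Device Manager:

    rem From the recovery environment's command prompt:
    bcdedit /set {default} safeboot minimal

    rem After re-enabling the adapter in Device Manager, turn Safe Mode off again:
    bcdedit /deletevalue {default} safeboot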
 

a cooperator

Distinguished
Aug 7, 2012


Could you please reply to these:

Given that the initial VGA is a standard, 800x600 or similar, and only gives me the option to show an image, without other possibilities like external display or clone display:
I would say that these possibilities are not needed on almost all desktop computers, since they normally only have an external monitor that is plugged into the card's VGA output with a cable to show the picture (unlike laptops, which have a built-in display plus other ports, such as an HDMI port and an external monitor port). So, what other benefits are there from installing the card's specific driver, aside from resolution and calibration?

Also, I noticed that while using the default VGA driver I could increase the resolution from 800x600 to 1024x768 pixels, and change the 'colour quality' to 16-bit or 32-bit. So even though it is the initial, standard VGA, I could change the resolution.

[screenshot]

[screenshot]





Thus, my questions are:

First: What benefit is there from installing the card's specific driver, as long as I can already change the screen resolution between 800x600 and 1024x768 and switch the colour quality between 16-bit and 32-bit while using the default VGA driver, which means the icons and the display look OK?

Second: With the default VGA driver on any computer, are only resolutions between 800x600 and 1024x768 available? (I.e. with the default driver, can every monitor only be set somewhere between 800x600 and 1024x768?)