Disabling Integrated Video Chip

tvl

Distinguished
Jan 4, 2006
First, I am not quite sure where to post this question, so if this isn't the correct forum, please let me know! Thanks!

I have purchased an ASUS video graphics card and need to disable the integrated video chip. I have been told this can be done through the BIOS, but I cannot find ANY reference at all to the integrated video chip once I get into the BIOS settings.

However, if I go to Device Manager (Windows XP) and look under Display Adapters, I find the following: Intel 82945G Express Chipset Family. If I right-click on this adapter, there is an option to DISABLE. If I disable the adapter this way, is it the same as doing it through the BIOS?

If this is an appropriate method of disabling an integrated video chip, then I am still confused about the appropriate steps for installing the new graphics card. If I disable the integrated chip, I will lose the video output, making it impossible to do an orderly shutdown. And then, once the system is down, do I simply insert the new graphics card and boot up, and the system will find the new card automatically? As you can see, I am confused. Can someone please shed some light on this subject?

Thanks in advance!

Tony
 

sleepdeprived82

Distinguished
Dec 21, 2005
Technically, you don't have to disable the onboard graphics at all.

Here is how I used to do these upgrades:

Slot in your new graphics card, plug the monitor into the new card, and turn on the computer. The motherboard should see the new card, and Windows should find it and install drivers for it. There you go.

Notes:
1. Find the latest drivers from ATI or NVIDIA and install them once Windows has finished installing its own.

2. You should probably disable the onboard chip anyway, just to stop any potential conflicts, and yes, the Display Adapters method you described is fine (see the sketch below for one way to double-check what Windows sees).
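
Not something you strictly need, but if you want to confirm from inside Windows that both adapters are visible before you disable anything, here's a minimal sketch. It assumes Tim Golden's third-party wmi package for Python is installed (my assumption, not something mentioned in this thread); eyeballing Device Manager tells you the same thing:

    # Minimal sketch: list the display adapters Windows sees, so you can
    # confirm the new ASUS card shows up next to the Intel 82945G.
    # Assumes the third-party "wmi" package is installed.
    import wmi

    c = wmi.WMI()
    for adapter in c.Win32_VideoController():
        # Name matches what Device Manager shows under Display Adapters;
        # Status is normally "OK" for an enabled device.
        print("%s - %s" % (adapter.Name, adapter.Status))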
 

Pompeii

Distinguished
Dec 30, 2005
After the video card is installed using the steps sleepdeprived described, the rest of the work is within Windows itself.

Make sure the video card and its drivers are installed before you do this!

Right-click on the desktop and select Properties. Go to the Settings tab, and under the Display drop-down menu select your video card. After that, check "Use this device as the primary monitor." Once that is done, plug your monitor cable into your video card and you should be set.
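
If you want to double-check which adapter Windows actually treats as the primary after doing this, here's a small sketch of mine (not from this thread) in plain Python using ctypes and the Win32 EnumDisplayDevices call, so no extra packages are needed:

    # Small sketch: list Windows display devices and mark the primary one.
    import ctypes
    from ctypes import wintypes

    DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [("cb", wintypes.DWORD),
                    ("DeviceName", wintypes.WCHAR * 32),
                    ("DeviceString", wintypes.WCHAR * 128),
                    ("StateFlags", wintypes.DWORD),
                    ("DeviceID", wintypes.WCHAR * 128),
                    ("DeviceKey", wintypes.WCHAR * 128)]

    dd = DISPLAY_DEVICE()
    dd.cb = ctypes.sizeof(dd)
    i = 0
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
        tag = " (primary)" if dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE else ""
        # DeviceString is the adapter name, e.g. the onboard Intel or the new card.
        print("%s%s" % (dd.DeviceString, tag))
        i += 1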

Enjoy your new video card!
 

tvl

Distinguished
Jan 4, 2006
Thanks for the input!

I just received a response from Sony to the same question I posted at this forum. Their response was as follows:

"Thank you for contacting Sony Online Support.

We regret the difficulties you are experiencing. If the computer shipped with an onboard video adapter, the onboard video adapter cannot be disabled in the system BIOS. Because of this, the add-on PCI video card can only be used as a secondary video card, and not the primary one."

Not sure if this is important, but the new card is PCI-E, and based on what I have seen myself and learned here, this does not appear to be a true statement. Am I correct in my thinking? If I disable the onboard adapter through Device Manager, would not the onboard adapter "go away" and the new PCI-E card become the primary device?

If I am correct in my thinking, how could technical support from the manufacturer of the PC miss this option? Maybe I should steer clear of their support?

Thanks again!

Tony
 

sleepdeprived82

Distinguished
Dec 21, 2005
That will be a stock response.

1st: Are you sure you have a PCI-E slot? Some big-name companies will leave the slot out if they use onboard graphics (see the sketch after these points for one way to check without opening the case).

2nd: Yes, disabling the onboard in Windows will make your new card primary.

3rd: The only way you will know if it works in the end is just to go for it; if it doesn't work, try different combinations until you get it working. Onboard video can cause grief when putting in a new card, but you should be fine.
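
On the 1st point: here's a rough sketch along the same lines as the one earlier in the thread (same assumed third-party wmi package for Python) that asks the board what expansion slots it reports. Not every BIOS fills this table in accurately, so treat it as a hint, not gospel:

    # Rough sketch: list the expansion slots the motherboard reports.
    # Assumes the third-party "wmi" package; slot labels vary by vendor.
    import wmi

    c = wmi.WMI()
    for slot in c.Win32_SystemSlot():
        # SlotDesignation is the board's own label, e.g. "PCIE X16" or "PCI1".
        print("%s - %s" % (slot.SlotDesignation, slot.Status))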

Go on, go for it (if it goes wrong, I didn't say that) :lol: