How to install a video card

Status
Not open for further replies.

BlueCat57

Distinguished
Apr 7, 2009
430
4
18,815
OK, maybe I'm confused, maybe I'm not. :??:

I am installing a video card in my HTPC.

My motherboard has an NVIDIA GPU.

The video card is a Radeon, so I need to install different video drivers.

I use the NVIDIA RAID.

I have Windows 7 Professional.

What are the steps for installing the card?

The Quick Installation Guide says:
1. Disable the on-board graphics on the motherboard. (I assume that is through BIOS setup.)
2. Install the card.
3. Boot the system.
Now it becomes unclear.
4. Let Windows detect your card.
5. Restart your system.
6. Use the installation disk that came with your card to install the drivers.
(I'm confused since I found several recommendations to NOT use the disk that comes with the card.)
7. Update the drivers from Sapphire or ATI.

From what I have been reading about installation problems, the procedure that Tom's experts suggest would be:

Slap the card in your box, boot the system and full speed ahead, d@mn the torpedoes! :kaola:

Not really. Here is what I've gathered.

1. Download the latest driver package from the card vendor. (I'm guessing I should do this before I install the card since the installation may make connecting to the Internet and navigating to the vendor's site difficult.)
? Should I install the driver package now or after I've installed the card and booted up?

2. Uninstall the on-board GPU software.
? The Quick Install doesn't say this. Should I leave it installed in case the card fails? Would that cause conflicts?
? I'm not sure I can just uninstall the GPU software. If I have to uninstall the whole package then my RAID array may be affected.

I'm not sure what to do next, but I think this is the process.
3. Shut the system down.
4. Install the card.
5. Boot the system and enter BIOS setup.
6. Disable the on-board (internal) graphics and enable the card (external).
(On my mobo that is MCP7A Chipset Configuration. It calls the options Internal and External.)
7. Finish the boot and log into the OS.
I'm guessing again, but I assume Windows 7 would recognize the card and install some driver so that I get a display. Then I would proceed as follows.
8. Install the driver package downloaded in #1 if I haven't already done that.
9. Restart the system and pray everything works. :sweat:

At this point everything should be up-to-date and working.

Is my second set of instructions the better method? Have I missed anything?
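Whichever order you pick, it helps to confirm which adapter and driver Windows actually sees before and after the swap. On Windows 7, `wmic path win32_videocontroller get name,driverversion /format:csv` prints this as CSV. Below is a minimal Python sketch of parsing that output; the adapter names and driver versions in `sample` are made-up illustrations, not your hardware:

```python
import csv
import io

# Hypothetical sample of what `wmic path win32_videocontroller
# get name,driverversion /format:csv` might print (real wmic output
# begins with a blank line, which the strip() below removes).
sample = """Node,DriverVersion,Name
HTPC,8.17.12.6658,NVIDIA GeForce 9300
HTPC,8.920.0.0,AMD Radeon HD 6570
"""

def adapters(wmic_csv):
    """Return (name, driver_version) for each display adapter row."""
    rows = csv.DictReader(io.StringIO(wmic_csv.strip()))
    return [(r["Name"], r["DriverVersion"]) for r in rows]

for name, ver in adapters(sample):
    print(f"{name}: driver {ver}")
```

Run it before the swap and again after step 9: if the new Radeon shows up with the vendor's driver version rather than a generic one, the install took.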
 
Before installing the card you're supposed to uninstall the onboard (NVIDIA) video drivers: go to Control Panel > Add/Remove Programs. Then restart the PC, go into the BIOS and make sure the onboard graphics is disabled, slap the card in the box, connect the monitor to the new card, and cross your fingers that you get a signal. After it boots up, go online and download the newest AMD drivers.
 

Skippy27

Distinguished
Nov 23, 2009
366
0
18,860
1. Download the drivers.
2. Shut down the computer.
3. Install the new card.
4. Boot up the machine.
5. Run the installer you downloaded in step 1.
All will now be good.

You do not need to disable the onboard video; that will be done automatically when the system detects a video card upon boot.

Nor do you need to uninstall it before installing the other. This is a "legacy" step that many still use because of issues 10+ years ago. I personally have upgraded or installed probably over 100 cards and never once have I disabled or uninstalled drivers first. Feel free to do it if you like though. Maybe it will give you the same warm and cozy feeling all these other guys get.
 
Solution

swoz

Distinguished
Jan 4, 2012
29
0
18,540
First you should get the newest Radeon driver, then follow these steps in order:

1. Remove the current (NVIDIA) drivers through Device Manager (right-click the display adapter and uninstall/remove the driver)
2. Remove the NVIDIA management software in Programs & Features, then power down (make sure you unplug the AC input)
3. Remove the old card and install the new card (stay grounded at all times to prevent ESD)
4. Boot the machine - make sure the BIOS knows to use the PCIe card instead of the on-board graphics. This is usually auto-detected, though.
5. Log in and let Windows 7 install the auto-detect driver
6. Reboot
7. Install the newest Radeon driver for the card
8. Reboot

Done.
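For anyone who does want to verify the cleanup in step 1-2 actually took, Windows 7's `pnputil -e` lists the third-party driver packages still in the driver store. A sketch of scanning that output for leftover NVIDIA display packages follows; the `oemXX.inf` names and field text in `sample` are illustrative, not from a real machine:

```python
# Parse `pnputil -e`-style output (records separated by blank lines)
# and flag packages whose provider is NVIDIA in the display class.
# The sample text below is hypothetical.
sample = """Microsoft PnP Utility

Published name :            oem5.inf
Driver package provider :   NVIDIA
Class :                     Display adapters

Published name :            oem7.inf
Driver package provider :   Advanced Micro Devices, Inc.
Class :                     Display adapters
"""

def nvidia_display_packages(text):
    """Return the published names of NVIDIA display-driver packages."""
    hits = []
    for block in text.split("\n\n"):
        fields = {}
        for line in block.splitlines():
            if " : " in line:
                key, _, val = line.partition(" : ")
                fields[key.strip()] = val.strip()
        if (fields.get("Driver package provider") == "NVIDIA"
                and fields.get("Class") == "Display adapters"):
            hits.append(fields["Published name"])
    return hits

print(nvidia_display_packages(sample))
```

A flagged package could then be removed with `pnputil -d oem5.inf`, though as the replies above note, leaving it in place is harmless on Windows 7.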
 
1) Any graphics card has a default low resolution mode that will work without drivers.

2) It is not clear to me if your motherboard will allow both integrated and discrete graphics at the same time. Regardless, it is a moot point if you are attaching only one monitor. The normal default at boot is the integrated controller, so you should disable it if the motherboard does not automatically switch to wherever your monitor is attached.

3) Windows 7 allows multiple graphics drivers, so there is no real need to uninstall unused drivers.

4. Once Windows is loaded, you can install the appropriate graphics drivers. You could do it from the supplied driver CD.
Ultimately, you will want the latest driver, which will come from the web: either the graphics card vendor's site, or AMD or NVIDIA.
Navigating the web is a bit difficult in low res, but it can be done. It is easier if you download the appropriate driver in advance and make note of where it is stored.
 

BlueCat57

Distinguished
Apr 7, 2009
430
4
18,815
Thank you for the quick replies. I'm just about done with the installation. It should take less than 30 min total, after several hours of research and learning. I'll be back later to pick a best answer and to note what I've learned so others may benefit.
 

BlueCat57

Distinguished
Apr 7, 2009
430
4
18,815
OK, the card is in and it appears to be working. Almost Mac-like. Windows 7 actually works, despite what people say.
Here’s how I did the installation.
1. I downloaded the latest driver software from the card vendor website. Note: I updated my BIOS first and updated the on-board GPU software as well.
2. I did not unzip or install the software.
3. I shut the system down.
4. I installed the card.
5. I booted the system. I did not have to tell it to disable the on-board GPU.
6. Windows 7 installed a driver so after initially booting to 800x600 I was able to move up to 1024x768.
7. I installed the driver software. I didn’t even need to reboot for the software to recognize the monitor.
So far, so good.
The card moved my Windows Experience Index from 4.4 to 5.1 with Windows Aero still being the limiting factor. I wonder if there are some tweaks I can do to improve that score.
PCMark 7 (free edition) went from 291 to 1572. Not sure what scores changed since I apparently didn’t save the first test. I may pull the card out just so I can see what changed.
Again, thank you all for your input. Since most of what we see here at Tom's is the problems people have with card installations, I assumed it would be much harder than it was. Turns out it was pretty simple. Let's hope those aren't famous last words.
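On the Windows Experience Index question above: the base score is simply the lowest of the component subscores, which is why the Graphics ("Windows Aero") subscore caps the overall number no matter how the other parts improve. A minimal sketch of that rule, with made-up subscores for illustration:

```python
# The Windows Experience Index base score is the minimum subscore.
# These numbers are hypothetical, not the poster's actual results.
subscores = {
    "Processor": 6.3,
    "Memory": 5.9,
    "Graphics": 5.1,        # the "Windows Aero" subscore
    "Gaming graphics": 6.4,
    "Primary hard disk": 5.6,
}

def base_score(scores):
    """WEI base score: the lowest of the component subscores."""
    return min(scores.values())

print(base_score(subscores))  # → 5.1: Graphics limits the total
```

So tweaks would only raise the base score if they lifted the Graphics subscore specifically; re-running the assessment (`winsat formal` on Windows 7) shows the updated numbers.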
 