Forgive my ignorance, but I have a quick question.
On one of my older machines the video card went out, so I went and got a newer one. The old one was an NVIDIA 6600 GT; the new one is an e-GeForce 7200 GS. With the newer card installed, I turn the computer on but get nothing displayed on my monitor. I thought it might be a bad card, but I checked the label on the box and it says the card requires 300 watts and 18 amps on the +12V rail.
It's a cheaper PSU, but at 450 watts I figured I was fine there. However, when I checked the +12V rail amps, it's divided into two rails: 12V1 and 12V2. 12V1 has 17 amps and 12V2 has 16 amps. I assume you don't just add them up and use that total, so could this be why the card isn't working with this PC? Or is that enough power to run this older card? Thanks,
B
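
For reference, here's a quick sketch of the math I'm working from (watts = volts × amps per rail), assuming the two rails really can't just be summed; the 18 A figure is from the box label:

```python
# Sanity-check each +12V rail against the card's stated requirement.
# Assumes the rails are independent and cannot simply be added together.
CARD_REQUIRED_AMPS = 18  # from the 7200 GS box label
RAIL_AMPS = {"12V1": 17, "12V2": 16}

for rail, amps in RAIL_AMPS.items():
    watts = 12 * amps  # watts = volts x amps
    meets_spec = amps >= CARD_REQUIRED_AMPS
    print(f"{rail}: {amps} A ({watts} W) -> meets 18 A spec: {meets_spec}")
```

If that assumption is right, neither rail alone hits 18 A, which is what has me worried.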