Forgive my ignorance, but I've got a quick question.
On one of my older machines the video card went out, so I went and got a newer one. The old one was an Nvidia 6600 GT; the new one is an e-GeForce 7200 GS. With the newer card installed, I turn the computer on but get nothing on the monitor. I thought it might be a bad card, but then I checked the label on the box and it says the card requires a 300-watt PSU with 18 amps on the +12V rail.
It's a cheaper PSU, rated at 450 watts, so I figured I was fine there. However, when I checked the +12V amperage, it's split into two rails: 12V1 at 17 amps and 12V2 at 16 amps. I assume you can't just add them up and use the total, so could this be why the card isn't working in this PC? Or is that enough power to run this older card? Thanks,
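For what it's worth, the arithmetic can be sketched like this (all numbers taken from the labels quoted above; whether the two rails can actually be combined depends on the PSU's stated combined +12V rating, which this sketch doesn't know):

```python
# Sanity check on the PSU numbers from the question: two +12V rails
# at 17 A and 16 A, and a card asking for 18 A on the +12V rail.
RAIL_VOLTAGE = 12.0
rail_amps = {"12V1": 17.0, "12V2": 16.0}

# Per-rail wattage: P = V * I
rail_watts = {name: RAIL_VOLTAGE * amps for name, amps in rail_amps.items()}

# Does any single rail meet the card's 18 A spec on its own?
card_required_amps = 18.0
single_rail_ok = any(amps >= card_required_amps for amps in rail_amps.values())

print(rail_watts)      # {'12V1': 204.0, '12V2': 192.0}
print(single_rail_ok)  # False
```

So neither rail individually reaches 18 A, and the naive sum (33 A) only applies if the PSU's combined +12V rating allows it.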
The old GeForce card displays artifacts at the BIOS screen, and will only run in VGA mode (at the lowest settings) or in safe mode (with medium display settings). As diagnosed last week, it seemed safe to assume this card was dying, though it has been working like this for the last couple of months.
I just bought a cheap e-GeForce 7200 GS, and based on what I saw on the box, it should have worked perfectly with this machine. However, when I swap cards, the 7200 won't display anything, not even the BIOS and loading screens, the way my older card does.
Anyone have any kind of a clue what is going on here? Could this be more than just my video card? I'm getting frustrated; I want to get this thing working, and it's been a while. I'm tired of the choppiness when scrolling down webpages and of the low-quality colors in VGA mode. Any advice is appreciated. I'm getting to the point where I'm about to scrap this old beast and dip into some savings to get something new, though that's really a last resort. Thanks.