PC Not Recognizing My Card?

Cstomp

Distinguished
Aug 11, 2010
I will preface this thread with this: I understand that slimline PCs are poop... my old one broke and this was a birthday present... so no need to come in and tell me what I know.
Anyway, my new low-profile graphics card came today and I was very happy.
I took it out, put on the LP brackets, and popped it into my PC, with the monitor hooked up to the card's DVI output. I started in safe mode and went to Device Manager to disable the integrated graphics on my slimline PC, only to find that it was the only card listed under Display adapters... I went ahead and disabled it anyway and tried installing the drivers (this is my first time installing a card), and that didn't work.
Thinking maybe I hadn't tried hard enough, I restarted and went to the integrated section of the BIOS menu, only to find the option to disable the onboard graphics greyed out (as in, I couldn't select it).

Now I am stumped. I know the card is seated properly, my DVI cord works, and I have tried disabling the onboard card; as far as I can tell, my PC simply will not recognize the new one.
Thanks for any help you can offer!
 

Helltech

Distinguished
If all you had before was integrated graphics, you shouldn't need to do anything in the OS in terms of disabling it.

You MAY, note MAY, have to disable it in the BIOS, and/or set your PCI-e slot to be your dedicated graphics adapter.

And did you say you HAVE the card in the computer? If the card is in the computer and you have video, then you don't even need to disable the onboard graphics. Just install the latest drivers for that card from ATI/AMD's website and you're good to go.
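If you want to double-check what Windows is actually detecting, here is one quick way to dump the display adapters it sees (just a sketch, assuming you have Python installed; it only wraps the built-in wmic query you could also type straight into a command prompt):

    # List the display adapters Windows currently detects.
    # Assumes Windows with the built-in "wmic" tool available (XP and later).
    import subprocess

    result = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "Name,Status"],
        capture_output=True,
        text=True,
    )
    # If the new card is seated and detected, it should show up here
    # alongside (or instead of) the integrated adapter.
    print(result.stdout)

If the card doesn't show up there or in Device Manager at all, the OS isn't seeing it, and the problem is below the driver level (seating, power, or BIOS).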
 
What are the specs on your power supply? The wattage and amps are what you want to know. Also, exactly what model video card did you get? Can you post a link? Does the card require an additional power cable connected to it?

If it does and you didn't plug it in, or your PSU isn't up to the task of handling the new card, then that could be your problem.

Did you try removing the card and then reseating it? Be sure to secure it with the screws, or with the cover some cases use to hold all the PCI/PCI-E cards in place...
 

Cstomp

Distinguished
Aug 11, 2010
No, it's like my VGA port is running the integrated card (even when I disable it, the resolution just goes wonky), and my new card, which I did link to (and which I am 99% sure I have enough power for), is on the DVI output. So when I unplug my VGA cord, my screen goes blank.
So, for example, when I look in Device Manager, it's not even there.
Sorry if that's hard to understand, but it's like this: I put in the new card, which, AFAIK, isn't being recognized by my computer. If I don't use the VGA port on my PC (which I would imagine is the integrated graphics), then my screen goes blank/no signal off the DVI (which is where my ATI Radeon HD 5570 is).
Thanks again though, guys :)
 
As in my original post, I question your PSU. I looked back over your original post and I don't see a link to your video card. With the whacked-up ads on here, I may not be able to use the link if you changed the original link text.

Can you see the label on the side of the PSU to get the specs, or give us the model of the system and we can Google the specs...

 

Cstomp

Distinguished
Aug 11, 2010
The video card, IIRC, requires roughly 30-40 watts to run.
My PC has a 220-watt power supply.
Also, it won't let me select the integrated video option; I can see it but can do nothing with it.
 



If you remove the video card, can you still use the onboard video? You may have damaged your PSU trying to run that card on your system. And when you say it won't let you select it, did you mean the onboard video or the PCI-E card?...
 

zidane721

Distinguished
Apr 12, 2010
I had this problem with a slimline eMachine; it would not recognize my 8400GS. When the monitor cable was hooked up to the VGA output it displayed, but when hooked up to the DVI from the video card, no display appeared. I thought this was due to a small PSU, but the real problem was that I needed to update the motherboard BIOS; after that the card worked fine.
 

wh3resmycar

Distinguished


This is the first time I've seen this, but in my experience with Intel G31 boards (your board is newer, I know, and probably has a different BIOS), the BIOS gives me the option to select, not disable, the display device (e.g. Primary Graphics Adapter > Internal/PCI/PCIe). Do you have anything similar in your BIOS?
 

Helltech

Distinguished
A 5570 doesn't need a lot of power; however, even with a 220 W PSU you are pushing it, a lot. Just because you have a 220 W PSU doesn't mean it will give you 220 W of power either.

That particular PSU may not have been up to the task...
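To put rough numbers on it (these are ballpark assumptions, not readings from your actual hardware):

    # Back-of-envelope power check with assumed, ballpark figures.
    gpu_w = 39    # HD 5570 is rated around 39 W
    cpu_w = 65    # a typical desktop CPU in a slimline box (assumption)
    rest_w = 40   # motherboard, RAM, drives, fans (rough guess)
    total_w = gpu_w + cpu_w + rest_w
    print("Estimated system load:", total_w, "W")    # ~144 W

    # A 220 W slimline PSU usually rates only part of that on the +12 V rail,
    # e.g. 14 A x 12 V = 168 W; check the label for the real figure.
    rail_12v_w = 14 * 12
    print("Assumed +12 V capacity:", rail_12v_w, "W")
    print("Headroom:", rail_12v_w - total_w, "W")    # not a lot of margin

Not a definitive calculation, just a way of seeing why a 220 W nameplate can still be marginal once the whole system is drawing from it.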

Did you switch the BIOS to get video from the PCI-e slot? If so, you are going to have to reset the CMOS before you can use the integrated video again.
 

rikishi19

Distinguished
Jun 29, 2010
I haven't read that review; however, you shouldn't just go by one review. If that review says one thing and the one I found says another, then a third review is likely to recommend something else again.

However, here is AMD's website and minimum requirements:

http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5570/Pages/hd-5570-system-requirements.aspx

It says a 400-watt power supply is the minimum. And you should never settle for just the minimum requirement when it comes to power supplies; always go at least one grade higher.

There is also a link to certified power supplies for your gpu on that AMD page, so I'd recommend investing in one of those. Seasonic and Corsair are very good brands.