What is the correct graphics card for Windows XP?

Solution
There is no correct or default card. As long as the card was DirectX 7, 8, or 9 compliant and had a PCI, PCIe, or AGP slot, it could be used with XP. XP itself supported DirectX 9 but was backwards compatible with the two earlier versions via DirectX updates, the same way DirectX 11 is compatible with 10 and 9 today.

A typical DirectX 9 card was the 6600 GT to 7950 GT from Nvidia, or something like an X800 to X1950 XTX from AMD.

The sweet spot for performance was around the 6800 to 7800 GT cards, or the X1900 Pro from AMD, paired with a 2.6 GHz+ dual core and 2 GB of RAM.
Most people used XP 32-bit, as 4 GB wasn't really needed.
Hope that helps.
 
Solution
The sweet spot recommendations above are wrong. I run XP Pro right now with a GTX 660, and I could go bigger if I wanted. It depends on the hardware in your case and the resolution you're playing at. The DX version really doesn't have anything to do with what card you are using; DX is only an indication of what the card is able to render.
 
@swifty, mate, I was referring to usage back in 2007... you know, when XP was king.
Yeah, you can run modern cards on XP, but you're robbing yourself of performance if you game on XP with a DX10/11 card.
Don't get me wrong, you can put in whatever card you like because you know its usage, but to me as a gamer that's a waste.
 
You're not wasting performance per se, other than not being able to display DX10/11 effects. You'll still get more FPS and better "performance" in modern games versus using older, 2007-ish cards. 2007-ish cards couldn't play modern games smoothly enough at high settings and resolutions. It's kinda like slapping an R7 250 in a machine, trying to play Crysis 3 at 1080p maxed out, and expecting good results.
 
Mate, there are so many more reasons to switch your OS than just FPS.
You get to play DX10+ only games.
60 FPS at medium DX11 settings is still way better looking than 120 FPS on DX9.

Think about this...
Most games tie their packet data to the FPS until a max value is reached, typically 100 for FPS/RTS games. So even if you get 150+ FPS, only 100 of those frames per second are updated with accurate information from the data packets; at 150 FPS, 1 frame in 3 is a wasted duplicate rendered with the packet data of the previous frame.
Again, wasted performance.
My rule of thumb for my screen is 60 Hz, 60 FPS, 60 packets, at the best quality my card can handle (medium-high).
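
To make that arithmetic concrete, here's a rough sketch in Python. The hard 100 updates-per-second cap and the perfectly even frame/update timing are assumptions for illustration; real netcode varies from game to game:

```python
# Rough model: the game renders `fps` frames per second, but the server
# only sends `tick_rate` state updates per second. Any frame drawn with
# no new update available just re-renders the previous packet's data.

def stale_frames(fps: int, tick_rate: int = 100) -> tuple[int, float]:
    """Return (stale frames per second, fraction of frames that are stale)."""
    fresh = min(fps, tick_rate)  # at most one freshly-updated frame per tick
    stale = fps - fresh          # the rest repeat the previous game state
    return stale, stale / fps

for fps in (60, 100, 150):
    stale, frac = stale_frames(fps)
    ratio = f"1 in {round(fps / stale)}" if stale else "none"
    print(f"{fps:3d} fps @ 100 updates/s -> {stale:2d} stale frames/s ({ratio})")
```

At 150 FPS against a 100/s cap, that prints 50 stale frames per second, the "1 in 3" wasted duplicates mentioned above; at 60 or 100 FPS, every frame gets fresh data.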

There are exceptions to the rule: games like CS seem to send as many packets as your FPS allows, which gives a distinct advantage to high-FPS, low-ping players, but also causes more ping lag and rubber-banding because of dropped packets.
I hope that clarifies what I was getting at.