GTX 260 or Dual 9800 GTX+'s?

jmerithew87

Distinguished
Jul 15, 2008
7
0
18,510
I've noticed that prices on the GTX 260 have dropped to $300. For most current games, would I be better off running two 9800 GTX+'s in SLI or a single GTX 260? My motherboard is an EVGA 750i FTW, if that needs to be taken into consideration.
 

leo2kp

Distinguished
Wouldn't SLI'd 9800 GTXs cost about the same as one GTX 280?

I would go for the 280 if that's the case, because if you SLI 9800s you still only get 512MB of VRAM to use, not 1GB. With the 280 you'll have an advantage at higher resolutions over the SLI'd cards, in my opinion.
 

mathiasschnell

Distinguished
Jun 11, 2007
406
0
18,780
SLI'd 8800 GTs beat out the GTX 280 in most, if not all, cases, so 9800 GTX+'s in SLI will most definitely beat a GTX 260. However, I'd recommend a single HD 4870 or SLI'd 8800 GTs instead, because 9800 GTX+'s in SLI wouldn't be worth it, IMO. The '+' over the plain 9800 GTX isn't noticeable enough to justify the price bump, and a single 4870 or SLI'd 8800 GTs will soundly beat a single GTX 260.
 

jmerithew87

Distinguished
Jul 15, 2008
7
0
18,510
I do like the newer ATI cards, but I'm leaning more toward Nvidia because of the onboard physics support. It's a really nice feature.
 

jmerithew87

Distinguished
Jul 15, 2008
7
0
18,510
Currently it supports UT3, which I'll probably try, as well as a lot of games based on the Unreal Engine. And that was before Nvidia got the rights to PhysX. It's just a hunch, but most companies lean toward Nvidia, so I'm thinking more games in the near future will take advantage of it.
 

mathiasschnell

Distinguished
Jun 11, 2007
406
0
18,780
While that may be true now, I think companies are starting to inch away from Nvidia due to their recent screw-ups and troubles. Also, physics can be done on the CPU, so whether or not your graphics card supports physics processing, you'll still get it, and you probably won't notice any severe dips in performance.

Another thing to note is that Havok (which ATI uses on their cards) is more widely supported than PhysX (Nvidia's). However, I'm not sure if that's really an issue anymore. I'm not too savvy on how the physics stuff works in games.
 
I read that Diablo 3 will use physics too, in 2010.

Edit: I still remember my level 99 character in Diablo 2 who could kick every monster's behind without any problems; she still had to go around every little bush or rock or teleport over it, no way to just blast through :)
 
Actually, that would seal your choice with ATI. I meant physics as in "Havok-powered physics", not PhysX; sorry about the confusion.
See here: http://www.blizzard.com/diablo3/faq/ D3 will use Havok for physics.

No cards from either ATI or Nvidia support Havok right now, AFAIK, but the HD 4870 will support it in a few months once the drivers are done, according to this article:
http://www.guru3d.com/article/radeon-hd-4870-review--asus/2

I'm sure Nvidia will find a way to support Diablo 3 properly in future cards, since it's expected to be a massive bestseller and they still have a year and a half. But if you need to buy a video card now and D3 is important to you, get an HD 4870.

 

MxM

Distinguished
May 23, 2005
464
0
18,790
The 260 would be my choice for a very simple reason: power consumption. The 4870 consumes much, MUCH more power, and if your computer is in a relatively small room with standard air conditioning, be prepared to be soaked in sweat. And of course any SLI or CrossFire setup consumes even more power, though the 4870 is so power hungry that I'd guess some dual cards need less.
In my opinion an extra 6% frame rate from the 4870 isn't worth an extra 5°F in your room.
 

Links? I haven't seen this "much MUCH more". Also, do you know how this works, the heat/power usage? What comes out of your case, and what stays in it?
 

MxM

Distinguished
May 23, 2005
464
0
18,790

You can look even here on Toms Hardware http://www.tomshardware.com/reviews/radeon-hd-4870,1964-15.html
For example, at idle the 4870 consumes 34W more than the 260. That's quite a lot! And all the electrical power your computer consumes is converted into heat, and all of that heat sooner or later ends up in your room; there is simply nowhere else for it to go.

Also, according to this http://benchmarkreviews.com/index.p...sk=view&id=198&Itemid=1&limit=1&limitstart=11 it consumes 50W more under load than even Nvidia's engineering sample of the 280!
 
Where I live, 34W costs me $5 for every 1000 hours, so about 5 bucks a year. I wouldn't call that "quite a lot", TBH.
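For what it's worth, that figure is easy to sanity-check. A minimal sketch, assuming an electricity rate of roughly $0.15/kWh (the rate is my assumption, not stated in the thread):

```python
# Cost of an extra 34 W of idle draw over 1000 hours of use.
EXTRA_WATTS = 34          # idle power gap, HD 4870 vs GTX 260 (Tom's numbers)
HOURS = 1000              # the poster's 1000-hour yardstick
RATE_USD_PER_KWH = 0.15   # assumed electricity rate

extra_kwh = EXTRA_WATTS * HOURS / 1000   # watt-hours -> kilowatt-hours
cost_usd = extra_kwh * RATE_USD_PER_KWH

print(f"{extra_kwh:.0f} kWh extra -> ${cost_usd:.2f}")
# -> 34 kWh extra -> $5.10
```

So the "$5 per 1000 hours" claim checks out at any rate near 15 cents per kWh.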

Yes, the HD 4870 consumes 34W more than the GTX 260 at idle. So underclock it as far as it will go when you're not playing, if you care; my 8800 GTX, for example, runs 20 degrees cooler and with a slower fan when I do that.

When you're playing, the HD 4870 consumes only 13W more than the GTX 260, and you actually get more fps per watt from it.

But you've got a good point there. The additional heat goes into the room, attracts the cat even more than usual, and then I get cat hairs in the PC, etc.
 

MxM

Distinguished
May 23, 2005
464
0
18,790

You know, Tom's numbers show the lowest difference of any review I've seen. I gave you another link where the HD 4870 consumes 50W more power than the 280 (not even the 260) under load, and most of the data I've seen on other sites is about the same as in that second link. I don't know how to explain the gap between Tom's figures and the other sites'; maybe the particular application Tom's runs favors the 4870. But if the HD 4870 really does need 50W more under load than the GTX 260, it will significantly increase the temperature in the room. In my room (a small room on the second floor) it would be more than 5°F with the door closed, even with the home air conditioner on (I know because I've tested it with other cards). That means I couldn't use an HD 4870 with the door closed at all!
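To put a 50W load-power gap in air-conditioning terms, here's a quick conversion sketch. The 1 W = 3.412 BTU/hr factor is standard; how much the room actually warms up still depends entirely on its size and ventilation:

```python
# Extra cooling load from 50 W of additional GPU power draw.
WATTS_TO_BTU_PER_HR = 3.412   # standard conversion: 1 W = 3.412 BTU/hr

extra_watts = 50              # HD 4870 vs GTX 280 under load, per the linked review
extra_btu_hr = extra_watts * WATTS_TO_BTU_PER_HR

print(f"{extra_watts} W ~= {extra_btu_hr:.0f} BTU/hr of extra heat")
# -> 50 W ~= 171 BTU/hr of extra heat
```

That's a small but steady extra load on whatever air conditioning has to carry it, which is why a closed small room notices it while a whole-house system barely does.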

Your situation may be different (you may have a separate air conditioner in your room, or live somewhere with a colder climate), but for me the HD 4870 is simply a no-go for this reason alone, and I have a CrossFire-capable mobo. I wish I could have an HD 4870, because it's better in terms of performance per dollar, but its performance per watt simply sucks (no matter what their marketing says)!

BTW, in my price calculation I have to include the cost of the air-conditioner power needed to cool the whole house so that I can play in that room, or else spend an extra $400 on a separate air conditioner and put up with a noisy environment. Again, your mileage may vary; just pay attention to these factors when making the choice.