Titan X vs R9 295x2 CF

likes0079

Reputable
Jul 3, 2014
Now that benchmarks for the Titan X are out, and I've never owned any AMD/ATI cards, should I give two 295X2s a try over the upcoming Titan X?

I don't mind the heat output or the power draw; I just need pure performance for gaming and a little 3D modeling.
PSU: Corsair 1200AXi
Will also put a custom block on either the Titan X or both 295x2.
It seems the cost difference between these two setups is about $250–$350 (the 295X2 CF being more expensive).
 

chenw

Honorable
The second 295X2 will scale fairly poorly, because you are effectively combining two dual-GPU CrossFire cards into quad CrossFire. CrossFire/SLI scaling usually drops off very quickly after the second GPU.

Besides, CrossFire profiles from AMD have been lacking lately.
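To see why "drops off very quickly" matters, here is a minimal sketch of diminishing multi-GPU returns. The per-GPU efficiency factors and the 60 fps baseline are hypothetical numbers chosen for illustration, not measured data:

```python
# Rough illustration of diminishing multi-GPU returns.
# The per-GPU efficiency factors below are hypothetical, not measurements.
def effective_gpus(n, factors=(1.00, 0.85, 0.55, 0.35)):
    """Approximate 'effective GPU count' for an n-GPU setup."""
    return sum(factors[:n])

single_fps = 60  # assumed single-GPU frame rate
for n in range(1, 5):
    print(f"{n} GPU(s): ~{single_fps * effective_gpus(n):.0f} fps")
```

With factors like these, the jump from one GPU to two is large, but GPUs three and four each add far less, which is the usual experience with quad CrossFire/SLI.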
 

Sahaj Pal

Honorable
May 15, 2013
287
0
10,860
[attached: temperature benchmark charts]


As you can see, the R9 295X2 actually runs cooler than the GTX TITAN X.

However, it's louder than the Titan X, but that shouldn't be much of a concern. Just keep your setup at a distance lol :p
[attached: noise benchmark chart]


Hmm, interesting: the R9 295X2 is actually somewhat quieter than the GTX TITAN X at full load.
[attached: full-load noise chart]
 

likes0079



Thx, that's what I'm worried about, besides the potential PSU wattage problem.
Still, from the benchmarks I've generally read, in major titles like Crysis 3, BF4, etc., a single 295X2 can mostly outrun the Titan X, and CrossFiring two of them adds roughly a 30% increase. So in your opinion, I take it going with the Titan X would probably be the better idea, since $/performance sucks for the second 295X2?
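To put that $/performance intuition into rough numbers: a quick cost-per-frame sketch. All prices and fps figures here are hypothetical placeholders (real numbers vary by game and retailer); only the shape of the comparison matters:

```python
# Hypothetical prices and averaged fps figures, for illustration only.
def dollars_per_fps(price_usd, avg_fps):
    """Cost per frame of average performance."""
    return price_usd / avg_fps

setups = {
    "1x Titan X":    (999,  60),
    "1x 295X2":      (699,  65),
    "2x 295X2 (CF)": (1398, 85),  # ~30% over a single 295X2
}
for name, (price, fps) in setups.items():
    print(f"{name}: ${dollars_per_fps(price, fps):.2f} per fps")
```

Under assumptions like these, the second 295X2 doubles the price for only a ~30% fps gain, so its cost per frame is much worse than either single-card option.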




 

likes0079



Thank you for the input, but I don't really care about the temps or the noise, as they will be under custom blocks anyway.
I have two 480 rads, 16 AP-15s, and a D5 Vario on hand, and I don't mind mounting them externally in another room if needed.
 

likes0079



That's why I can't do 2x Titan X, since I'm on a budget of $1500 for GPUs.
Can't afford 3x 980s, and I don't own an X99 platform.
 

-Lone-

Admirable
Just get 1x Titan X and super OC it; since you'll water-cool it, temperatures shouldn't be much of a problem. One of them OC'ed can give you 50-70 fps at 4K ultra in most games. The only reason "they" think it is not a 4K-ready single-GPU solution is that the Titan X wasn't OC'ed in their tests, plus their obsession with turning on AA. AA is not needed at 4K at all. The main point of 4K is to get the majority of games running and displaying at a playable fps, instead of worrying about a single setting that is meaningless. So one Titan X will keep you happy for a long while; as time passes you'll save up more money, and you can get a second one down the road and finish 4K once and for all, until the next generation of DP 1.3 GPUs and displays.
 

-Lone-

Admirable
I didn't; let's just say my money is in their pockets right now, but they're holding onto the Titan X tight and not letting it go. They're 24 hours late from when it was supposed to have hit the market. The EVGA standard version is on Amazon right now at $999 for pre-order. There doesn't seem to be any point in getting the others, since EVGA has all the special variants; the rest are just the same specs as the reference model with another name on them. EVGA has SC, Hydro Copper, and possibly FTW, if Nvidia was right during the live chat. EVGA said the standard and SC variants are the same: you can OC the standard and it'll be exactly like the SC. So I'm not going to wait, and I'll just get the standard.

http://pcpartpicker.com/parts/video-card/#c=221
 

-Lone-

Admirable
2x Titan X will definitely be enough for 4K 60 fps in all games; no way it won't be, unless it's just very bad game optimization :) But yeah, definitely share the results. The first thing I'm going to do is a quick test in games without OC to see how high the temperature gets, even with 10 fans. Then it's time to OC and see real results. If it takes a few weeks until a single Titan X comes out, it'll be a very, very long few weeks for me, lol.
 

-Lone-

Admirable
Yeah, waiting that long without a PC is not happening for me, lol. Plus I want to see how Nvidia is doing these days, since I just switched sides :) AMD might possibly be better, but without DP 1.3, all that GPU power is pretty pointless.
 
A couple recent reviews had some good points...

Quote:
"Here is the big problem with the AMD Radeon R9 295X2 video card, it relies on CrossFire profiles to function and scale well in games. If there are no profiles for CrossFire in said game, you will not have CrossFire support. You can try to turn on the option to force CrossFire on games that have no profile, but it may not work, and if it does, there could be bugs. Even then, it won't be as efficient as AMD optimized profiles in drivers for new games.

AMD has been behind recently on providing CrossFire driver profiles for new games. The latest driver from AMD was released in early December of 2014. It is now mid-March 2015 and no new driver yet. That means all games released between then and now have had no optimized CrossFire driver profiles. Far Cry 4, which has been out since November of 2014 still has no CrossFire support. Dying Light, a new game, also has no CrossFire support officially.

It is this that makes the AMD Radeon R9 295X an unappealing product. When profiles are working right though, it can be a fast video card. In this evaluation we showed four games that CrossFire will work on, though in a roundabout way for Dying Light."
http://www.hardocp.com/article/2015/03/16/asus_rog_poseidon_gtx_980_platinum_vs_amd_r9_295x2/8#.VQpiC-lFCUk


Quote:
"That situation is put into stark contrast when we look at what’s happening with AMD’s driver development as of late. Not only has it been nearly four months since their last revision – which is an eternity in the PC gaming space- but the new release they gave us in time for the TITAN X review leaves much to be desired as well. At 1440P it performed admirably (albeit with the Metro: Last Light problems remaining unresolved months after they first reared up) but 4K compatibility was a hit and miss affair at best. Crossfire profiles were either broken, missing or under-performing in Metro: Last Light, Hitman Absolution, Dying Light and Far Cry 4. That’s four out of the nine games we included in this review and that poor showing ultimately pushed down the card’s 4K framerates. With that taken into account the R9 295X2 is simply not a viable alternative to the $999 TITAN X at this time, even at its current price of $699."
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/68992-nvidia-titan-x-performance-review-17.html
 
I never really understood this part. Why do we need a CrossFire/SLI profile for every single game? Why doesn't it work directly through some "program" in the driver itself? If they can incorporate G-Sync via add-in hardware for a monitor, they can incorporate this. The GPUs are their own, after all. And NVIDIA drivers do work even if the board is not reference (the board, I mean, not the cooler).
 

chenw

Honorable
Most likely because every game engine works differently, and they may differ enough that a general SLI/CrossFire profile just isn't feasible. G-Sync is different in that every G-Sync module is very similar, if not completely identical, and Nvidia directly controls what goes onto the G-Sync module, so there's no problem having a generic driver for all G-Sync monitors.

I'd say it's a matter of monitors needing to conform to their G-Sync module versus drivers needing to conform to each game engine.