Witcher 3 and Crossfire R9 290X's?

sinty

Honorable
Aug 8, 2012
192
0
10,680
Can't seem to find any solid videos or blogs about this pairing that give me decent information. I'd like to pick up a second 290x and see how the Witcher 3 runs. I want to absolutely max it out and achieve a steady 50-60fps. Max 4k res, all settings on high except Hairworks and sharpening.

Can the 290x in crossfire handle this at 4k resolution?
 

Rickinajijic

Honorable
Dec 4, 2013
13
0
10,510
I can't speak to your cards exactly, but for comparison I have two GTX 980 SCs in SLI at 2560x1440 on a ROG Swift monitor. At max settings I can run a constant 60 FPS (and much of the time above); I average 60 to 70 FPS in Witcher 3.
 

sinty

Honorable
Aug 8, 2012
192
0
10,680
Damn, I'm using a better 4k TV as my monitor that hits actual 4000x res. I can play at 2560 with a single gpu now and get 35-40 fps normally with certain graphics settings. I shot a vid of it. The choppy play is due to the recording and wasn't happening in game.

https://www.youtube.com/watch?v=QyPEnbMRaTs

But ya, I guess full 4k gaming isn't going to happen for me. I was really hoping xfire 290s would handle it at higher than 2500x res. :[
 

grandpamasaki

Distinguished
Nov 4, 2009
81
0
18,630
I'm only running at 2560x1440, but I saw a huge performance increase after adding a second 290x in CF. I was getting 22 to 30 fps with Hairworks enabled on one card, and that shot up to a 45 to 50 fps average in CF. The only settings on High are foliage draw distance and background characters.

Not sure how well that translates to 4k, but I imagine you'd have a pretty solid experience with 2x 290x and Hairworks OFF.

That said, my frametimes are out of control. I get a steady rate with one card, but in CF they're bouncing all over the place, creating a lot of in-game stutter. I wouldn't recommend a CF setup for the Witcher 3 until that gets sorted.
 

sinty

Honorable
Aug 8, 2012
192
0
10,680
hmm, do you play with blur and sharpening on? When I turn all that off, I get a nice fps boost. Blur is super annoying :p but ya, once I clock up to 3000x res, I get single-digit fps.
 

grandpamasaki

Distinguished
Nov 4, 2009
81
0
18,630
I do; I haven't made any modifications under PostProcessing.

I just realized that 4k is more than 2x the pixels of 2560x1440. I still think you'd get a playable experience if CF weren't a stuttery mess, but I'm now doubting you'd be able to hit the 50 to 60 range even with Hairworks off.
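For anyone curious, the pixel math checks out (assuming the standard 2560x1440 and 3840x2160 resolutions, since the exact panel resolutions weren't stated):

```python
# Quick sanity check of the pixel-count jump from 1440p to 4k.
qhd = 2560 * 1440   # 3,686,400 pixels
uhd = 3840 * 2160   # 8,294,400 pixels

print(uhd / qhd)    # 2.25 -- 4k pushes 2.25x the pixels of 1440p
```

So a setup averaging 45-50 fps at 1440p has to render more than twice the pixels per frame at 4k.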
 

sinty

Honorable
Aug 8, 2012
192
0
10,680
I am baffled that nobody credible has uploaded benchmarks or even a video of Crossfire 290s running 4k Witcher gameplay. Sad; I woke up this morning and googled it again, and naturally this thread I posted the day before is the primary result for this question.

Haha. Well, I paid full price for my r9 290x about 3 months ago when that GTX 970 nonsense happened. I wonder if the better route is to wait for the GTX 980 Ti release and buy that, or to buy a used second r9 290x somewhere. My XFX 290x sells like hotcakes on ebay and amazon; I'm just not sure if I should sell it and put the proceeds toward a single 980 Ti.

Ugh, if only someone else actually used their nice 4k tv as their monitor instead of a "gaming" monitor that maxes at 2560. Gahh...
 

Christ J

Reputable
Apr 20, 2015
2
0
4,510
I have my MSI 8GB 290x running in Crossfire with an XFX 390x, and my 4k framerates never drop below 30fps. I usually stay somewhere between 40-55 frames. I have everything set to ultra, with no motion blur or Hairworks. I can't remember whether I play with any anti-aliasing, but I doubt I'd notice it if I did.

Just for fun, I just ran from the square in Novigrad up to the whorehouse, and I didn't drop below 40fps. I don't know if you've just given up on this thread and opted for the 980 Ti, but my Crossfire setup beats it hands down at 4k. Don't listen to anyone who claims that 4k is exclusively limited to Titans and Furys. 290/390s and 970s in SLI and Crossfire handle it very well.


VERY impressed with these AMD cards at higher resolutions. Interestingly, when I get around 50fps at 4k I only get around 80fps at 1080p.
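That sub-linear scaling actually makes sense: if the GPUs are the bottleneck at 4k but something else (CPU or driver overhead) caps the framerate at 1080p, the pixel throughput numbers come out lopsided. A rough comparison using the quoted framerates and standard resolutions (assumptions, since exact settings weren't given):

```python
# Rough pixel throughput at the quoted framerates.
# Assumes standard 3840x2160 (4k) and 1920x1080 (1080p) resolutions.
uhd_rate = 3840 * 2160 * 50   # ~415M pixels/s at 4k, 50 fps
fhd_rate = 1920 * 1080 * 80   # ~166M pixels/s at 1080p, 80 fps

print(uhd_rate / fhd_rate)    # 2.5 -- the cards push 2.5x more pixels/s
                              # at 4k, so 1080p is limited elsewhere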

 


Will not happen. I have this setup and my rig is getting owned.

Use drivers 15.6... and enjoy the flickering... but at least you get really good performance at 4k, literally the best I can think of. I don't know many setups getting 50 FPS at 4k with everything maxed except the stupid and ugly Hairworks... but mine does. Honestly it's not the hardware, it's the drivers. AMD just gave up on the Witcher 3.

Honestly, this is totally unacceptable from AMD and CD Projekt. I just tried the new AMD drivers and it's the same broken CF profile as in 15.7.

Now I need to patch to 1.10 to see if there is a major difference.

So far, I might just stick to 15.6... and buy an Nvidia card when Pascal is released next year. For now, nothing will really give better performance on the market, which is sad.

 
Just did some tests and I can confirm that 15.6 is still the best driver for CF in the Witcher 3.

Still, I found the latest driver (15.9.1) with the latest patch offers a more stable experience at a constant 35 FPS, whereas 15.6 offers a 45 FPS average with 290x Crossfire, but with the flickering...

Pick your poison...
 

sinty

Honorable
Aug 8, 2012
192
0
10,680
Damn! Thanks for the testing though. I've opted to drop down to 2560x res and play in 2k instead of 4k on just one 290x. With vsync on and my 4k projector, the motion smoothing effect is addictive, and I've found it's not worth investing more into dual gpus. I've just built a mini-ITX rig with room for only one gpu. Maybe when 4k can be handled more efficiently by future gpus, I'll upgrade. For now, it seems futile to even try.

The difference between a vsynced 60fps and 30fps in certain games that I can run on ultra is just huge. Most might not be able to see it, but I am using Sony's new 350ES 4k projector, and the motion smoothing (gliding soap-opera effect) is immensely powerful and perfectly clean without the slightest haloing. What a shame GPUs are so behind on 4k needs; damn, I hope that changes next year.

 
