1440p 144Hz Gaming - Do I REALLY need a 1080 Ti?

Mick288

Prominent
Feb 19, 2017
9
0
510
Guys,

I recently built a system and purchased a 1440p 144Hz monitor, but held off on splashing out on a GPU because of the imminent release of the 1080 Ti.

I want to make use of the 1440p 144Hz (who wouldn't?), but do I need the 1080 Ti to achieve this in the newer AAA titles?

The games I'm likely to be playing are GTA V, Witcher 3, Rise of the Tomb Raider, BF1, Star Citizen, etc. I'd like to be able to play them at ultra.

My build consists of:
i7-7700k (no OC)
be quiet! Pure Rock Slim
ASUS STRIX Z270H Gaming
2x8GB Team Dark 3000MHz
250GB 850 Evo M.2
2 TB Seagate Barracuda
EVGA SuperNOVA P2 650W PSU
All in a NZXT S340 Elite with a
Acer XG270HU monitor.

One upgrade I will likely make in the future is an AIO cooler for the CPU when I start to OC it.

What do you reckon? Save and get the 1080 Ti or just buy a top end 1080 and save a few hundred bucks?

Cheers!
 
Solution
The issue is that maximum settings in modern games cripple your fps, especially if you want to hit 144 fps consistently at 1440p.

That's why I ended up with a second GPU and a Freesync monitor, as I like to play shooters at around 144 fps @ 1440p.

If you don't want to compromise on your fps or settings, you'll need to get a 1080 Ti, as it is the fastest card out there at the moment.

Personally, I just dial it down a little where necessary. Ultra vs. High or Very High makes no real difference to me.

JDubstep

Commendable
Aug 22, 2016
33
0
1,540
Rumor mill has it that the upcoming Vega launch (June) will be another bargain product from the red team (AMD), landing anywhere from 1-5% below the 1080 Ti in performance. The real question is the price difference. Something else to keep in mind if you're willing to wait. We could also quite possibly see 1080 Ti prices drop after the Vega release.
 

The0nlyGamer

Prominent
Apr 11, 2017
13
0
520
You absolutely don't need a 1080 Ti for 1440p at 144Hz. With a 1080 you could play every game comfortably at 1440p and 144Hz, but you might need to turn the settings down to high in some games, like The Witcher 3.
 

The0nlyGamer

Prominent
Apr 11, 2017
13
0
520


Go look at some benchmarks.
A lot of people fall into the trap of thinking they need to max out in-game settings. If you choose your settings somewhat carefully, you will see no difference in appearance, just better frame rates.
 

ddferrari

Distinguished
Apr 29, 2010
388
6
18,865

I completely agree that there is little to no visual difference between ultra and high settings- but GPUs certainly notice.
I'm gaming at 1440p as well, on a 120Hz monitor, and I fully expect that my Aorus Xtreme 1080 Ti will not even hit that mark in some games. In Rise of the Tomb Raider I'm dipping into the 70s at times, and since it was the first game I tried on this card, I was about to RMA it. Then I read that the game is poorly optimized.

So to calm my fears I just ran the Metro Last Light benchmark with everything maxed and scored a 119.7 fps average, and it's a very demanding title. I just replaced an SLI setup with this card, but I may eventually move to 4K and add a second card down the road. Sadly this is a triple-slot card, so a new motherboard might be in order to make room.