Can my PSU handle the RTX 2080?

derridada

Hi,

I'm looking to upgrade my GPU and plan to order an RTX 2080 in the coming weeks (Asus or EVGA, I reckon).
I'll also be upgrading one of my monitors in the near future to a 1440p model with a 144Hz refresh rate (seriously considering the MSI Optix MAG27CQ), so that might factor into the PSU question.

I currently have a Corsair RM650i as my PSU, with an Asus Strix GTX 970 as the GPU. My config:
Intel Core i7-6700K, OC'd to 4200 MHz
Asus Strix GTX 970
Asus Z170 Deluxe
Corsair Dominator Platinum 2666MHz 16GB (4x4GB)
Samsung 970 EVO NVMe M.2 1TB / Crucial MX500 SSD 2TB / Samsung 850 EVO SSD 500GB / Toshiba DT01ACA300 / WDC WD1003FZEX-00MK2A0
Corsair RM650i
Dell UltraSharp U2515H (DP)
Dell UltraSharp U2414H (DP)

I realize 650W is listed as the 'minimum' or 'recommended', but the EVGA power calculator gives me a surprising 600W recommendation for my specs with a 2080 as the GPU - surprising, since on their own site they list 650W as the minimum.

And, if I were to replace it, would getting a different Corsair unit allow me to keep my current cabling? I'm hoping to just slide the old unit out of my NZXT H440 chassis and plug the same cables into the new one.

Thoughts?
 
Personally, if I were overclocking, that alone would make me want more than the suggested minimum amount of power. And if I were buying a new, expensive video card AND monitor, I'd have a hard time telling myself I couldn't afford a better power supply. Power supplies are important, and that only becomes more true as your power needs increase.
 
650 watts is plenty; however, your PSU will run near full load and the fan will likely be at 100% while gaming. You should look into something in the 750-850 watt range down the line if it's affordable. I like having spare headroom so that my PSU isn't running near max all the time.
 

Barty1884

Retired Moderator
91W TDP CPU, 215W TDP GPU.
Even factoring in overclocking headroom & the balance of components, the RM650i is more than enough for that upgrade.



Calculators have to account for all sorts of junk PSUs, so they vastly overestimate.



With that configuration (replacing the 970 with a 2080), 100% load will be more like 350-400W, maybe a little more with OCing.
Nowhere close to 100%
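If you want to sanity-check that, here's a rough back-of-the-envelope sum. Every per-component wattage below is a ballpark assumption (nameplate TDP plus headroom), not a measurement:

```python
# Pessimistic full-load estimate for this build with an RTX 2080.
# All per-component figures are ballpark assumptions, not measurements.
components = {
    "i7-6700K @ 4.2GHz (91W TDP, OC'd)":       120,
    "RTX 2080 (215W TDP) + ~30W OC headroom":  245,
    "Motherboard, RAM, fans":                   50,
    "Five drives (1x NVMe, 2x SSD, 2x HDD)":    30,
}

total = sum(components.values())
print(f"Pessimistic full-load draw: ~{total}W")     # ~445W
print(f"Load on the RM650i: ~{total / 650:.0%}")    # ~68%
```

Even stacking pessimistic numbers, that lands around two-thirds of the RM650i's capacity.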
 


Do you really think Nvidia is going to let the 1080 Ti keep matching the 2080? They're probably keeping the 2080's performance near the 1080 Ti so their remaining 1080 Ti stock can be sold. Once that stock is gone, I bet they'll release game-optimized drivers that allow the 2080 to surpass the 1080 Ti by a significant margin.
 


You are of course allowed to believe what you want. And you might be right.
But.
I follow one simple rule: if I don't see it, it does not exist.

By that I mean that at this point (today) the 2080 and 1080 Ti are neck and neck. In a few instances the 1080 Ti even performs better than the 2080. Sure, the 2080 will (or at least should) improve some through driver optimization. But not enough to (and now I am quoting you) 'allow the 2080 to surpass the 1080 Ti by a significant margin'.

Reason being: what is a significant margin? 30% to 50%?
Let's be nice and say it's in the 30% range, since that is a 'small' significant margin. That would in fact bring the 2080 up into 2080 Ti territory. And that is not going to happen, because then the 2080 Ti would also need a boost of at least 30%, which would put it at what, 60-70% better than the 1080 Ti?
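Putting numbers on that compounding (the assumption here, roughly borne out by reviews, is that the 2080 Ti sits around 1.3x a 1080 Ti today):

```python
# Today: 2080 ~= 1080 Ti, and 2080 Ti ~= 1.3x a 1080 Ti (assumed from reviews).
driver_uplift = 1.30   # the hypothetical "significant margin"

rtx2080   = 1.00 * driver_uplift   # 1.30x a 1080 Ti -> 2080 Ti territory
rtx2080ti = 1.30 * driver_uplift   # 1.69x a 1080 Ti -> "60-70% better"

print(f"2080 after +30%:    {rtx2080:.2f}x a 1080 Ti")
print(f"2080 Ti after +30%: {rtx2080ti:.2f}x a 1080 Ti")
```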

Now, here is the question: do you really think or believe that, just because of a driver update, a card will perform at minimum 30% better across the board?

Quick answer: no, it will not.
Maybe a 5% to 10% increase. Maybe.
Still, if you ask me, that is not enough to make me say 'OK' to the price difference.
The 2080 should cost at least the same as, or even a bit less than, the 1080 Ti. Not more.

That is my opinion. I am not saying it is correct. But that is what I think and believe.

And as several others are saying, the whole 'oh, the holy grail of ray tracing'...
Come on, what is this, a new round of PhysX?
Sure, new tech is nice... but I call bullshit on this one.
 

All good points. Just curious: did you buy a used 1080 Ti on eBay? I'm seeing them used for almost half the price of a 2080.
 

derridada



Thanks for the comment.

For fans I have:
- Noctua NH-U12S, with 2x NF-F12 industrialPPC-2000 PWM (front and back of cooling block).
- Front: 3x Noctua NF-P12 redux-1700 PWM
- Case and exhaust: 3x Be Quiet! Silent Wings 2 140mm PWM
- Currently connected to an NZXT Grid+, which I'm going to remove, using some splitters to connect them to the mobo instead. (I already removed the Hue+ unit - the CAM software is just too unreliable.)

As to the monitor: isn't the PG35VQ a TN panel? My current main monitor, the U2515H (dual setup, DP daisy-chained with a U2414H), has an IPS panel. It's not a gaming monitor by any stretch of the imagination, but it's adequate, since I use my PC for text editing, DTP, watching movies, photo editing, etc. I'm not sure TN handles non-gaming use well. The MSI Optix MAG27CQ has an SVA panel, which seems like a good compromise for multi-purpose use. As to G-Sync, I'm not sure it's that important; the refresh rate will make a difference, I reckon, coming from a 60Hz U2515H. I'm always open to suggestions, though (but that seems like a thread of its own). So is HDR the future the same way ray tracing is? ;)
 

derridada



Yes, I am considering the 1080 Ti as well. My reasoning in favour of the RTX 2080 is that I'm coming from a GTX 970 and I only replace my GPU every 4-5 years or so, so it's quite a commitment. That makes me lean toward the more future-oriented unit (the RTX), though granted, that 'future' looks more like wishful thinking/marketing on Nvidia's part. Plus, the 2080 seems to have a very slight edge in performance and appears more power-efficient (and with driver updates, who knows what else can be optimized - though maybe that's wishful thinking too).
But I am still open to the GTX 1080 Ti, for sure.

 
SgtScream: "sadly" I don't have the 1080 Ti but the regular 1080 :)
I don't need the power of the 1080 Ti since I am still on a 1080p @ 60Hz monitor, so that would really be throwing money and fps out the window. :D

derridada: If you are going for upgrades every 4-5 years, you should in fact lean more towards the 2080, since (and I hate this word) it's more future-proof. :/
Sure, the 2080 has ray tracing and a couple of other bells and whistles that the 1080 Ti does not have. But we still don't know how well it will perform versus the Pascal series with them enabled... That makes it kind of hard to make a good GPU buy right now.
On raw specs, here is where the 1080 Ti beats the 2080:
Memory: 11GB vs 8GB
Memory bandwidth: 484 GB/s vs 448 GB/s (not a big enough difference to matter)
CUDA cores: 3584 vs 2944
TMUs: 224 vs 184
ROPs: 88 vs 64
TFLOPS: 11.3 vs 10.6

The only wildcard is the darn RTX and DLSS... and there is no way to know yet how big of a game-changer that will be.

Because from all the tests and benchmarks I have seen, all I see is an average difference of around 6 fps...
Again, the wildcard is how big the difference will be with RTX and DLSS enabled.
Then again, what if games 2-3 years from now use more than 8GB of VRAM? Then the 1080 Ti will have a clear advantage over the 2080.

So the reason I say 'go for a 1080 Ti' is as I have said:
you know how it performs, and how it will perform, since future games will let you toggle RTX and DLSS on and off anyway.
IF, and I mean IF, game developers even make use of RTX and DLSS in the future... or it will be PhysX all over again.

But anyhow, 1080 Ti or 2080 = still a great gaming experience no matter how you look at it. Does money matter? 1080 Ti. Money does not matter? 2080. Plain and simple in my eyes.

But that is how I look at it :)
 
I ran Cooler Master's power supply calculator and it put the draw at 487W under load, including your mild CPU overclock. Since that doesn't factor in a possible future overclock of the Nvidia RTX 2080, my advice is to purchase a 750-850W power supply, if only because power supplies are designed to be most efficient right around the 50% load mark. Sure, your existing power supply will work in a pinch, but it will be running close to full load should you ever decide to overclock the RTX 2080. If you're just going to keep a mild CPU overclock, never overclock your RTX 2080, and game casually, then you'd be fine keeping your RM650i. Just make sure the case isn't on a rug and the power supply is getting sufficient airflow. As for the monitor, yes, it's HDR technology and can push 200Hz at 1440p. However, it's also $2,000. That's a lot of computer hardware!
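For what it's worth, the load percentages behind that advice work out like this (taking the calculator's 487W figure at face value; it's an estimate, not a measurement):

```python
load_estimate = 487   # W, the Cooler Master calculator's output (an estimate)

for psu_watts in (650, 750, 850):
    print(f"{psu_watts}W PSU -> {load_estimate / psu_watts:.0%} load")
# 650W PSU -> 75% load
# 750W PSU -> 65% load
# 850W PSU -> 57% load
```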
 

Barty1884

Retired Moderator
With the CPU overclock, the OP will be lucky to see 400W of power draw, not ~487W, SgtScream.

The RTX 2080's overclocking headroom isn't vast at all. You're looking at ~30W additional power draw from an OC'd 2080.
[Image: RTX 2080 / 2080 Ti power-draw graphs]

https://www.overclockers.com/nvidia-geforce-rtx-2080-and-rtx-2080-ti-review/

They're not going to be pushing anywhere near the full load of an RM650i, even with a (mild) OC on the CPU and a maxed-out GPU OC.

Now, yes, a PSU is most efficient at ~50% load... but playing the efficiency card is negligible here.
You're comparing ~90% efficiency at 50% load vs ~87% at higher loads.

Even assuming 24/7 running at full load, and assuming near-worst-case electricity costs ($0.20/kWh), buying a new PSU just doesn't make sense.

For argument's sake, round off to 500W full load.

500W is 575W from the wall (87% efficiency)
575W x 24 hours x 365 days = 5,037,000 watt-hours
/1000 = 5,037 kWh
5,037 kWh @ $0.20 = $1,007/year

If you wanted to run at ~50% load, you'd need to look at a ~1000W PSU.
For argument's sake, let's just assume 90% efficiency on an 850W PSU.

500W is 556W from the wall (90% efficiency)
556W x 24 hours x 365 days = 4,870,560 watt-hours
/1000 = 4,871 kWh
4,871 kWh @ $0.20 = $974/year

So the difference is ~$33/year

Of course, that assumes 24/7 use @ 100% load. A realistic pattern might be 12 hours a day @ 75% load on average, maybe?
The price difference would be <$10/year in that case.

Even then, that's based on $0.20/kWh, which is substantially higher than average.
For most people, the price difference would probably be <$5/year.
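A quick sketch to check or adapt those numbers (the load, duty cycle, and $0.20/kWh price are the assumptions to change):

```python
def annual_cost(load_w, efficiency, hours_per_day=24, price_per_kwh=0.20):
    # Yearly electricity cost for a given DC load at a given PSU efficiency.
    wall_watts = round(load_w / efficiency)          # draw at the wall, as above
    kwh_per_year = wall_watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# Worst case from above: 500W load, 24/7, $0.20/kWh
cost_87 = annual_cost(500, 0.87)   # RM650i at high load
cost_90 = annual_cost(500, 0.90)   # hypothetical bigger unit nearer 50% load
print(f"87% efficient: ${cost_87:,.0f}/yr")              # $1,007
print(f"90% efficient: ${cost_90:,.0f}/yr")              # $974
print(f"Difference:    ${cost_87 - cost_90:,.0f}/yr")    # ~$33

# Realistic case: 12h/day at 75% load (375W)
diff = annual_cost(375, 0.87, 12) - annual_cost(375, 0.90, 12)
print(f"Realistic difference: ~${diff:,.0f}/yr")
# ~$12 with these fixed efficiencies; in practice smaller still, since at
# 375W the RM650i is already near its own efficiency sweet spot.
```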


So, if the OP runs out and buys a new 750-850W unit (of good quality), that's going to cost... say $100?
He sells the RM650i for $50 at best (a used PSU, after all)... there's a $50 net spend right there.

At $5 of savings annually, that PSU 'upgrade' would take 10 years to pay for itself. And it's totally unnecessary from the outset.
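The payback arithmetic, using the assumed prices above:

```python
new_psu_price  = 100   # assumed price of a quality 750-850W unit
rm650i_resale  = 50    # optimistic resale value for the used RM650i
yearly_savings = 5     # realistic-use electricity savings from above

net_spend = new_psu_price - rm650i_resale
print(f"Payback period: {net_spend / yearly_savings:.0f} years")   # 10 years
```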
 