What is better: 780 Ti SLI or 1070?

Solution
The 1070 by a large margin.

EDIT: To clarify, the 1070 is faster than the 780 Tis even when they have proper SLI profiles, and it will be at least 40% faster in games that don't have SLI profiles. Also, at 2K you are going to hit some games that do a lot worse with the 780 Tis in SLI. So all around, pick the 1070; there is really no downside and you won't have to mess with SLI issues.
 
Solution
As you can see here... the 1070 is about 5% faster than two 970s in SLI:

[Image: perfrel_2560_1440.png (relative performance, 2560x1440)]


The 780 Ti was about 5% faster than the 970.

So it's pretty much a wash speed-wise. However, some games will benefit from more VRAM than the 780 Ti has available... so the move will show some benefit.
 

iamacow

Admirable
I went from 780 Ti SLI to a single 1070. In games that used SLI profiles they were nearly even; in games that did not, the 1070 crushed the SLI pair. Factor in that when playing at 2K, some games like Ghost Recon, Dawn of War 3, Batman, and GTA5 all use more than 3 GB (more like 5 GB) when maxing out settings. The 1070 is hands down the winner and worth the upgrade if you play at 2K or above.
 
Be aware that there is no way to actually measure what VRAM is being used. It's kinda like your credit rating: when you get a credit card with a $5,000 limit and you owe $500 on it, then apply for a car loan, the amount of liability reported by the credit agency is $5k... even though you only "use" $500.

Same thing... a game looks at what's available and says, OK, you've got 8, so give me 5... "just in case I need it". Alienbabeltech first made this case when they tested 2 GB and 4 GB 770s and could find no difference in performance between the 2 GB and 4 GB models at 5760 x 1080 across 40+ games. Max Payne wouldn't even install at that resolution... until after they installed the 4 GB card, which allowed the 5760 x 1080 setting... then they swapped the 2 GB card back in and it ran at the same fps with the same image quality. The only performance difference they found was that the 4 GB outperformed the 2 GB when settings were pushed so high the game was unplayable (well under 30 fps). Extremetech, and others, have repeated this testing over the years with the same results. To date, the only cases where they have been able to observe a problem are when:

a) Running a high-demand game at 4K with settings maxed, they were able to observe performance differences between 4 GB and 8 GB, but **only** when settings were turned up so danged high that the game was simply unplayable either way. Does it really matter if 8 GB gets you 22 fps and 4 GB only 17 fps? This confirmed what Alienbabeltech found with the 7xx series cards years before.

b) Working hard to create situations that lie way outside the realm of normal usage. This was common when folks tried to make an issue of the 970's 3.5 + 0.5 GB arrangement, but the claim was dispelled when test sites tried to duplicate the problems and couldn't without trying really hard... and when they did, the 4 GB 980 produced the same result, which again blew the "deficient 970" claim outta the water.

http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

If you assume you can judge what VRAM a particular game uses by firing up a utility like GPU-Z, you have been misinformed.

http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x

GPU-Z: An imperfect tool

GPU-Z claims to report how much VRAM the GPU actually uses, but there’s a significant caveat to this metric. GPU-Z doesn’t actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly, whether it’s GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”
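You can see the same caveat outside GPU-Z. Here's a minimal sketch using NVIDIA's NVML bindings (the pynvml package; using NVML here is my own choice, since GPU-Z itself isn't scriptable). The "used" figure NVML returns is driver-side allocation, the same number GPU-Z and Afterburner surface, not a measurement of what the game actively touches:

```python
# Minimal sketch, assuming an NVIDIA GPU and `pip install pynvml`.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # total / free / used, in bytes

gib = 1024 ** 3
print(f"total:     {info.total / gib:.1f} GiB")
print(f"allocated: {info.used / gib:.1f} GiB")  # what GPU-Z calls "used":
                                                # memory SET ASIDE, not touched
print(f"free:      {info.free / gib:.1f} GiB")

pynvml.nvmlShutdown()
```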

In GTAV, we see that at no time does the game even **allocate** above 4 GB of VRAM, and while the Titan and 980 Ti kick butt at 1080p and 1440p, the Fury X has the higher performance at 4K despite only having 4 GB:
https://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x/4

So how can we determine how much VRAM a game is using? We can't, because there's simply no tool available that provides this information. We do know that any assumption of VRAM usage based upon what GPU-Z or any other utility tells us is not "real". We do know that claims of "X GB is not enough" have been proven wrong time and time again when actually tested... games reported to need more VRAM turned out to show no observed performance differences in fps or image quality. Alienbabeltech refuted this with the 7xx series; Extremetech and others refuted it with the 9xx series (two more below):

http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

There are games that break this mold... notably poor console ports, which consume VRAM like free peanuts in a beer hall.

Here's GTAV with twin (*3.5* GB) 970s up against the 1070... the same 5% difference:

[Image: perfrel_2560_1440.png (relative performance, 2560x1440)]


Now all that being said, I am **not** saying VRAM makes no difference... I am saying that reviewers who test with GPU-Z and then claim a certain card is no good at a certain resolution, because they tested a card with 8 GB and it **used** more than 4 GB, are simply misinformed, because the utility they are basing these claims on is not capable of reporting what they think it is.

But here's a good way to look at this... the 1060 3GB and 1060 6GB differ by more than the amount of VRAM: the 3 GB model has 1152 shaders and the 6 GB has 1280. With more shaders, the 6 GB should be faster than the 3 GB at any resolution. So how can we compare performance between 3 GB and 6 GB "all things being equal"? Let me propose the following. I don't think anyone would select this card for GTAV at 4K, but if the game truly needed more than 3 GB, it stands to reason that as we move from 1440p to 2160p (4K) the advantage of the 6 GB card over the 3 GB card should widen substantially. We'll rule out 1080p because I think all would agree 3 GB is enough for 1080p, and the factory OC will have an impact there.

So looking at GTAV at 1440p, we see that the 6 GB reference card delivers 72.7 fps to the 3 GB MSI card's 69.9, which gives the 6 GB card a 4% speed advantage. Now if we take this to 4K, where logic (and GPU-Z) dictates GTAV **needs** more than 3 GB of VRAM to perform, we should see this lead widen substantially. It doesn't.

The 6 GB card delivers 36.1 fps, the 3 GB card delivers 34.6 fps... the exact same 4% advantage... clearly, GTAV has no issue with 3 GB of VRAM. Going from 3 GB to 6 GB clearly brings nothing to the table in this game... even at 4K. Now if we look at later releases, we will start to see impacts. ROTR, for example, shows a huge impact from 3 GB to 6 GB. However, the 6 GB card is 32% faster than the 3 GB card at 1080p and yet only 16% faster at 1440p.
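If you want to check that arithmetic yourself, here's the calculation from the fps figures quoted above:

```python
# Relative advantage of the 6 GB 1060 over the 3 GB 1060 in GTAV,
# using the TechPowerUp fps figures quoted above.
fps = {
    "1440p": {"6GB": 72.7, "3GB": 69.9},
    "4K":    {"6GB": 36.1, "3GB": 34.6},
}
for res, f in fps.items():
    advantage = f["6GB"] / f["3GB"] - 1
    print(f"{res}: 6 GB card is {advantage:.1%} faster")
# 1440p: 6 GB card is 4.0% faster
# 4K:    6 GB card is 4.3% faster  <- the gap does not widen at 4K
```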



 

iamacow

Admirable


I'm gonna stop you right there. Some games tell you how much is used. GPU-Z tells you, and so does Afterburner. So you have three ways to see how much VRAM you actually use.

I can tell you for a FACT that games that go over the amount of VRAM available severely drop in frame rate. This isn't a rumor or misinformation. It is a FACT. End of story.
 
Being in denial won't change anything. There are multiple industry-recognized experts quoted above who explained in detail why GPU-Z and Afterburner (and everything else, since they are based upon the same code) don't tell you what you think they are telling you.

Why didn't you add GPU Tweak from Asus, EVGA Precision, and all the other utilities that come from the various card vendors? Having multiple versions of the same thing doesn't strengthen your position when they are all based upon the same RivaTuner code. Multiple versions of the same thing aren't more proof.

[Image: gtav_2560_1440.png (GTAV fps, 2560x1440)]


Your FACT doesn't seem to be borne out by the fps numbers in the game you chose as the basis for your position. Look at the picture above... where is this massive frame drop? The 3 GB 1060, with fewer shaders, was 4% slower than the 6 GB model at 1440p... it is STILL only 4% slower at 4K... so how does going from 4% to 4% represent a "massive frame drop"?

So on one side we have

Nvidia’s Brandon Bell ... who gets paid to do what he does
Extremetech's Joel Hruska ... who gets paid to write about this stuff
Dozens of published test reports from authors putting their real names and reputations on the line
The published fps numbers on GTAV, **your chosen proof**, which not only don't show a "massive performance drop" from 6 GB to 3 GB... they show no performance drop whatsoever.

And on the other side, represented by "iamacow", we have not a single reference saying that GPU-Z does what you think it does, nor any test data showing this massive drop in GTAV.

The proof is in the pudding, as the saying goes... we know that the 6 GB 1060 is faster than the 3 GB 1060 at any resolution because it has more shaders... so if, as you say, there is a massive frame drop going from 6 GB to 3 GB at 1440p, we should see that in the test results... so how come we don't? Why don't the test results show the 6 GB card pulling massively ahead as resolution increases?

https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/23.html

Game - 6 GB advantage at 1080p / 1440p (* means under 30 fps, aka unplayable)

Anno 2205 - 1.03 / 1.03*
Battlefield 4 - 1.03 / 1.04
Batman AK - 1.03 / 1.04
COD: BO - 1.08 / 1.05
Deus Ex - 1.03 / *
Doom - 1.03 / 1.03
F1 2016 - 1.02 / 1.02
Fallout 4 - 1.04 / 1.05
Far Cry Primal - 1.04 / 1.06
GTAV - 1.03 / 1.04
Hitman - 1.19 / 1.26
Just Cause 3 - 1.04 / 1.05
No Man's Sky - 1.04 / 1.05
Rainbow 6: Siege - 1.03 / 1.03
ROTR - 1.32 / 1.16
Witcher 3 - 1.03 / 1.03
Total War: Warhammer - 1.05 / 1.04

Pick a game ...

Let's look at W3... at 1080p, the 6 GB with more shaders is 3% faster than the 3 GB... at 1440p, it retains that same 3% advantage... where is the "severe drop in frame rate"?

Let's look at GTAV... at 1080p, the 6 GB with more shaders is 3% faster than the 3 GB... at 1440p, it has a 4% advantage... does that 1% difference qualify as a "severe drop in frame rate"?

How about your other example of "proof"? Let's look at Batman AK... at 1080p, the 6 GB with more shaders is 3% faster than the 3 GB... at 1440p, it has a 4% advantage... does that 1% difference qualify as a "severe drop in frame rate"? The 6 GB card delivers an advantage because it has more shaders, and yet adding 3 GB of VRAM at 1440p puts the 6 GB model less than 3 fps above the 3 GB model... 1%, or 3 fps, can in no way be described as massive or severe. Those are simply the numbers, and they are incontrovertible... You offered these two games as examples of why this is so, and the numbers fail to bear out the position.


 


There are a couple of people here who have reported problems, so 3 GB is an issue in the experience of these users.

It is true that MSI Afterburner and GPU-Z just show what the game decided to allocate for use, not what is required for a good, smooth experience. Just because the game takes 5 GB of space because it's available doesn't mean it needs that much to perform well.

But as said by a few users with experience with 3 GB cards, it is an issue in some games today. Just as 2 GB was an issue for me a couple of years ago.
 
Buying something for today is a different question from buying something today that you will still be using 3-4 years from now. But saying that we need 4 GB at 1080p is the same as saying anything less than 16 GB is not enough at 4K, since 4K is four 1080p screens. Our rule of thumb for what to buy today for use between now and 2020 is 1.5 times the number of pixels in millions on the low end and 2.0 times on the high end:

1920 x 1080 = 2,073,600 ~ 3 - 4 GB
2560 x 1440 = 3,686,400 ~ 6 - 8 GB
3840 x 2160 = 8,294,400 ~ 12 - 16 GB

As for what you can get by with for another 12-18 months or so, you'll be just fine with 1.0 - 1.5 times (a quick sketch of this arithmetic follows the list below):

1920 x 1080 = 2,073,600 ~ 2 - 3 GB
2560 x 1440 = 3,686,400 ~ 3 - 4 GB
3840 x 2160 = 8,294,400 ~ 8 - 12 GB
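Expressed as a quick sketch (the helper name is mine, and the printed figures get rounded to capacities cards actually ship in):

```python
# Rule-of-thumb sketch: recommended VRAM (GB) = megapixels x factor.
# Factor 1.5-2.0 for a 3-4 year horizon, 1.0-1.5 for the next 12-18 months.
def vram_rule_of_thumb(width, height, low=1.5, high=2.0):
    megapixels = width * height / 1_000_000
    return megapixels * low, megapixels * high

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    lo, hi = vram_rule_of_thumb(w, h)
    print(f"{w}x{h}: ~{lo:.1f} - {hi:.1f} GB")
# 1920x1080: ~3.1 - 4.1 GB   (the 3 - 4 GB above)
# 2560x1440: ~5.5 - 7.4 GB   (rounded up to the 6 - 8 GB above)
# 3840x2160: ~12.4 - 16.6 GB (rounded to the 12 - 16 GB above)
```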

But as you see in the above... none of the games in TechPowerUp's test suite are any more playable with 6 GB than they are with 3 GB.

 


Many times the issues take the form of stuttering, not lower FPS. Anyway, that method of determining the needed VRAM is way off. For starters, 8,294,400 is about 8 million, so the pixel count maps to MB, not GB. The actual frame buffer is quite small compared to what is needed for rendering and storing the data sets, and it doesn't scale directly with resolution, since much of the required VRAM stores textures, which are the same size at all resolutions.
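A back-of-the-envelope check of that point (my numbers, assuming a plain 32-bit color buffer):

```python
# The raw frame buffer at 4K is tiny next to total VRAM; textures and
# other assets, which don't grow with resolution, are what fill the card.
width, height = 3840, 2160
bytes_per_pixel = 4                                 # 32-bit RGBA color buffer
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes / 2**20:.0f} MiB per frame")   # ~32 MiB
# Even double- or triple-buffered with a depth buffer, that's in the
# low hundreds of MiB -- a small slice of even a 3 GB card.
```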
 


I thought I made it very clear when I said:

Now all that being said, I am **not** saying VRAM makes no difference... I am saying that reviewers who test with GPU-Z and then claim a certain card is no good at a certain resolution, because they tested a card with 8 GB and it **used** more than 4 GB, are simply misinformed, because the utility they are basing these claims on is not capable of reporting what they think it is.

So, yes, there can be problems; most of those problems... as has been reported... come from trying to use a card that is really not suitable for that resolution / those settings.

I am saying that ...

1. When ya use a card like the 970 at 4K and complain about performance problems because of VRAM, face the reality that the card itself is ill-suited for 4K, and therefore the amount of VRAM is irrelevant. This was driven home quite hard when, in order to cause ANY problem with 4 GB, testers had to use settings (4K and high settings) which made the game unplayable even when cards with more VRAM were used.

2. GPU-Z and every other utility in existence do not report VRAM in use. If you have an 8 GB card and GPU-Z says 6 GB, that doesn't mean the game is using 6 GB; it means that of the 8 available, 6 has been set aside.

Neither of those in any way suggests that you can't have VRAM issues ... but most are simply misdiagnosed.

 


I didn't say that 8,294,400 = anything. If I said that one should take 1 gummy bear vitamin for each 50 pounds of body weight, I am not saying that 3 gummy bears weigh 150 pounds.

I am saying to use X GB for each 1,000,000 pixels.


 


And I'm saying that because much of the stored data (textures and other assets) does not change with resolution, the needed VRAM does not scale like that.
 


Not to put it too crudely, but my seat-of-the-pants gauge says without any doubt that new major-release games do indeed play much worse with 3 GB of video RAM vs 8 GB... the measuring tools may not be precise, but as a general rule they get the job done.
 

iamacow

Admirable
Hey JackNaylorPE, I'm just stating my own results from multiple systems and video cards over the years. Going over the VRAM usually results in lower FPS. Very few games are unaffected; most of them will experience heavy stuttering. It has become less of a problem because DDR4 and SSDs are around now for fast memory swapping, but the fact doesn't change. Some games just have less of an issue.

People think lowering the resolution fixes this problem, but really all you need to do is turn down the texture details and AF. Easy fix, but it's still not max settings. So I really don't care what "other" sources say; I have run my own benchmarks across maybe 30+ games over the years with the same results. The flip side of this is that even if you, say, have an imaginary GTX 780 Ti with 6 GB of RAM (cough, Titan Black), at a certain point the GPU is the weak link, and no matter how much VRAM you have, it will not make your FPS go up.

So let me know when you run your OWN benchmarks with games like Ghost Recon, Batman, Shadow of Mordor, and Dawn of War 3, and tell me you're not experiencing heavy stuttering because you went over the VRAM of the card.

***MIC DROP***