Does Chipset-to-GPU Matching Matter?

pschmid

Distinguished
Dec 7, 2005
333
0
18,780
Experienced users might already know that Nvidia and ATI chipsets support competing-brand graphics cards, but is there a performance penalty to such mismatched combinations? We put the latest chipsets and graphics cards to the test to find out.
 

hergieburbur

Distinguished
Dec 19, 2005
1,907
0
19,780
Haven't had a chance to read anything other than the intro and conclusion, but haven't we known for quite some time that there is no need for, or real gain from, matching the chipset to the GPU?
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
Yeah, a decent article to reaffirm our belief that the only reasons to buy a certain chipset or motherboard are its overclocking abilities and SLI/Crossfire capability, not whether your motherboard and graphics card both say ATI on them.
Unless you really think all that other fluff is worth it. Dual Ethernet?
 

prolfe

Distinguished
Jan 9, 2005
252
0
18,780
I've never heard definitively that there was no correlation between chipset and GPU, so I appreciated this article, which, in my mind at least, put an urban legend to bed with clear empirical data. Cheers to THG and Patrick for a job well done!

As an aside, I've never known another hardware site with so many pissy readers! If any of us had to write an article under a deadline that would be scrutinized by thousands or millions of readers, each able to analyze every word at their own leisure, I'm sure we would all make mistakes too! Give THG a break, please.
 

TSIMonster

Distinguished
Mar 10, 2006
1,129
0
19,280
GREAT article. I am one of those matching noobs; it just seems logical in my head. This article assured me that it doesn't matter.

It's a shame to see the ATI chipset consistently perform quite a bit below the competition.
 

jeff_2087

Distinguished
Feb 18, 2007
823
0
18,980
Yeah, especially when last fall they were hyping it up as a champion performer with a memory controller possibly better than Intel's.
 

sandmanwn

Distinguished
Dec 1, 2006
915
0
18,990
Shockingly good article by THG.

There seems to be a general performance issue with the ATI solution, but given how recently it launched, it may very well come down to drivers. It seems they spent most of their driver development getting the IGP performance numbers up and need to spend a little more work raising the numbers for card-based solutions.

Anyway, great work THG
 

trinitron64

Distinguished
Jun 25, 2006
302
0
18,780
I am surprised, to say the least. I think if ATI or NVIDIA gains enough of a market-share advantage over the other, it would only be smart business for the chipset manufacturers to optimize brand-specific boards for those high-performing cards,

thus forcing their own market share to follow the coattails of the winning GPU.

I think it makes too much sense not to happen. I fear the day when someone with a suit and a fancy pen wakes up to that realization and we are all forced to lay down our hard-earned cash to play in their monopoly.

But as it stands right now, the consumer wins (relatively speaking).
 

hergieburbur

Distinguished
Dec 19, 2005
1,907
0
19,780
Don't get me wrong, I am not complaining about the article, and as I said, I didn't get the chance to read the whole thing to critique it. I was just surprised to see an entire article dedicated to the subject when I have seen several other articles draw the same conclusion (though they were usually about other topics). I guess the point was to collate all the data into one place.
 

randomizer

Champion
Moderator
Won't have any problems with flaming this time at least; nice article. Regardless of how many times this topic is answered in the forums, people really like a "professional" opinion. Now when this topic comes up again (which it will), we can just link to this article.
 

Multiplectic

Distinguished
Apr 17, 2006
1,029
0
19,280
Just nitpicking here...

- In all three gaming tests, 1024x768 was used as the "default" resolution. Shouldn't they have used a higher resolution, thus stressing the video cards and showing the chipsets' ability to feed them enough data? Just asking...

- Only three gaming tests? If they're analyzing chipset-to-GPU matching, I think more gaming tests would have been appropriate.

- Rendering tests... I'm OK with those. The GPU could be involved in some way (think of Gelato, e.g.), but... encoding? Synthetic (CPU, memory)? Those tests (IMO) proved only one thing: the combination that matters is CPU-chipset-memory (concerning those tests only).


I must commend THG for taking on a topic that has been discussed so many times in the forums but never analyzed seriously. But I think they slipped a little bit. :wink:

Keep up the good work Thomas! :D


Edit:
PS: A little typo over here... ?? :?

Several Nvidia marketing partners have chosen the company's reference motherboard, from high-end graphics card brands like BFG and ECS to budget-conscious motherboard brands like Biostar and yes, ECS.
 

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
This is a good article, but I was disappointed that you didn't include any numbers for the 975X. As you point out, it's one of only two viable options for someone wanting an R600 Crossfire system, and, if you're not a rabid overclocker, it's still a pretty good chipset in its own right. Useful for people with IDE drives too! Is there any chance you could run the benchmarks for the 975X as well and add them in?

It would also have been interesting to examine SLI and Crossfire performance a bit. It's rumoured, for example, that the RD600 is somehow optimised for Crossfire - more so than 975X is. It would also be interesting to see just how crippled Crossfire is by the P965 chipset, and how much difference there is between SLI performance on (say) 680i and 650i.

I also agree with the earlier poster who suggested including some higher-resolution benchmarks. Accepted wisdom has it that differences in performance between motherboards are more pronounced at lower resolutions, because at higher ones the graphics card is the bottleneck. That's a very plausible theory, but the whole purpose of this article is to put accepted wisdom to the test; I would like to have seen some numbers confirming that the PCI Express implementation (and Crossfire/SLI handling) doesn't have any influence on performance.

Still, a good article.
 

Flying-Q

Distinguished
Feb 20, 2006
643
7
19,065
Firstly, thank you Thomas Soderstrom and THG for producing an article with some meat in it. It is a welcome and refreshing change.

Next,
- In all three gaming tests, 1024x768 was used as the "default" resolution. Shouldn't they have used a higher resolution, thus stressing the video cards and showing the chipsets' ability to feed them enough data? Just asking...

- Only three gaming tests? If they're analyzing chipset-to-GPU matching, I think more gaming tests would have been appropriate.

I have to agree with Multiplectic. As this is intended as a comparison between GPU makers and chipset makers, could we see an update with the gaming tests done at higher resolutions, such as those found on 19" and 21" LCD panels? This would fall in line with the more prevalent upgrade philosophy outlined in the article's conclusion, and, as Multiplectic said, it would stress the throughput of data from chipset to GPU more. After all, it is the communication between chipset and GPU that is really being tested here.

Finally, this subject needs ongoing monitoring, as the manufacturers will read this article and will likely react with driver enhancements and eventually board revisions to address shortfalls and gain advantages.

Q
 

Flying-Q

Distinguished
Feb 20, 2006
643
7
19,065
This is a good article, but I was disappointed that you didn't include any numbers for the 975X. As you point out, it's one of only two viable options for someone wanting an R600 Crossfire system, and, if you're not a rabid overclocker, it's still a pretty good chipset in its own right. Useful for people with IDE drives too! Is there any chance you could run the benchmarks for the 975X as well and add them in?

It would also have been interesting to examine SLI and Crossfire performance a bit. It's rumoured, for example, that the RD600 is somehow optimised for Crossfire - more so than 975X is. It would also be interesting to see just how crippled Crossfire is by the P965 chipset, and how much difference there is between SLI performance on (say) 680i and 650i.

I agree. This is a good suggestion for an article but I feel a separate one from that published, as it had a well defined and narrow focus.

I also agree with the earlier poster who suggested including some higher-resolution benchmarks. Accepted wisdom has it that differences in performance between motherboards are more pronounced at lower resolutions, because at higher ones the graphics card is the bottleneck. That's a very plausible theory, but the whole purpose of this article is to put accepted wisdom to the test; I would like to have seen some numbers confirming that the PCI Express implementation (and Crossfire/SLI handling) doesn't have any influence on performance.

Still, a good article.

Hehe, I was writing my other post as you were writing yours - but I type more slowly.

Q

Edit: typo
 

Flying-Q

Distinguished
Feb 20, 2006
643
7
19,065
Edit:
PS: A little typo over here... ?? :?

Several Nvidia marketing partners have chosen the company's reference motherboard, from high-end graphics card brands like BFG and ECS to budget-conscious motherboard brands like Biostar and yes, ECS.

Perhaps Thomas was emphasising that ECS operates in both the high-end and budget markets?

Just a thought, as that's how I read it.

Q
 

choknuti

Distinguished
Mar 17, 2006
1,046
0
19,280
Hurrah!! This is more like the THG of old. :D Great article, and as so many others have pointed out, it's something we can point to whenever the question comes up.
 

Nossy

Distinguished
Apr 5, 2005
216
0
18,680
Agreed. They should have included the 975X chipset, as it is intended for the high-end market, even though it's been out for a while. 975X boards fall in the same price range as the 680i, while P965 boards are generally cheaper and sit in the same market segment as the 650i. And in general, benchmarks around the web have consistently shown that the 975X is a better performer than the 965 at stock settings.

I'd also point out that they should make the effort to compare chipsets competing in the same market segment and price range, and run higher resolutions, since as we all know a performance advantage does not always show up at low-resolution settings. However, I do believe that Nvidia's claim about matching memory (EPP profiles), motherboard, and graphics card is a marketing tool.

Despite what I just said, I'd recommend the 975X or 965 over any Nvidia chipset for anyone who doesn't give a crap about a dual-graphics setup (in other words, SLI). Better RAID and data performance, a cooler chipset, lower power consumption, and good overclockers.
 

Chil

Distinguished
Feb 20, 2006
192
0
18,680
The only thing I got from that comparison was that both graphics cards can play three games at well over 150 fps, with slight variations that may or may not be due to the paired chipset. Benchmarking on more modern games (Oblivion) would be better for seeing just how big the gaps between chipsets are, instead of making assumptions based on the 2% gain from 164 to 168 fps.

However, it seems that of the three, the RD600 chipset consistently can't keep up, whether that is the fault of drivers or the fact that DFI's board is very fickle about the components put in it.