Crown back to Team Red, 3870X2

January 22, 2008 1:07:44 AM

http://img3.pconline.com.cn/pconline/0801/20/1210234_08...

3DMark06 at 1280*1024 scores 18467 with a 3.6GHz Intel quad-core :bounce: 

12578 @ 2560*1600

Real power, huh? :D 


I wonder what toy our green team's gonna show us. :hello: 
January 22, 2008 1:13:01 AM

A heavily OCed 8800GTS 512MB gets 17200+ in 3DMark06 with the 3.6GHz quad CPU.

I won't be sure about this until I see some real benchmarks.
January 22, 2008 1:14:56 AM

I think ATI's dual-chip solution > nVidia's, mainly because of heat dissipation and the way it is interconnected.

I can't wait to see RV770.
January 22, 2008 1:15:45 AM

I like the black PCB. Now if only they'd change that red shroud!
January 22, 2008 1:17:25 AM

Color shouldn't matter, really. Performance>>>color

Unless you show it off, of course.
January 22, 2008 1:33:44 AM

cnumartyr said:
I can't wait to see RV770.


Yeah, I'm interested to see how this turns out as well, even though I just got the card in my sig about a month ago. lol

Best,

3Ball
January 22, 2008 1:34:41 AM

I can't wait to see some more thorough benchmarks. I wonder if it will dethrone the GTX?
January 22, 2008 1:36:23 AM

TSIMonster said:
I can't wait to see some more thorough benchmarks. I wonder if it will dethrone the GTX?


The word is it's faster than the Ultra. I don't know if that's in optimized setups or what... I'm really not sure how the CrossFire scaling works on it, or if it's better than 2 HD3870s in scaling.

Either way... the initial reports of the HD3870X2 have me really excited for the summer, with the competition heating up with the G100 and RV770.
January 22, 2008 1:45:05 AM

Evilonigiri said:
A heavily OCed 8800GTS 512MB gets 17200+ in 3DMark06 with the 3.6GHz quad CPU.

I won't be sure about this until I see some real benchmarks.

Where'd you get that? I'm curious as to how they got that number. Do you remember the link?
January 22, 2008 1:51:39 AM

JAYDEEJOHN said:
Where'd you get that? I'm curious as to how they got that number. Do you remember the link?

Yep. The proof is the person right above you. ^^

EDIT: Maybe I should have said 17000+
January 22, 2008 1:53:58 AM

JAYDEEJOHN said:
Where'd you get that? I'm curious as to how they got that number. Do you remember the link?




I think he got it from my Photobucket. All air, FTW.
January 22, 2008 1:56:19 AM

Bragging as always :kaola: 
January 22, 2008 2:00:19 AM

Evilonigiri said:
Bragging as always :kaola: 


You brought it up this time!
January 22, 2008 2:07:45 AM

cnumartyr said:
You brought it up this time!

And you fell for it :p 
January 22, 2008 2:16:48 AM

I would really like to see a direct comparison of an HD3870X2 versus 2 HD3870s in SLI on an X38.

I would really like to see if the Dual Chip PCB makes an improvement over the dual card solution. (Yes this means overclocking both HD3870s to match the X2 since the X2 stock clock is 825 MHz).
January 22, 2008 2:18:36 AM

cnumartyr said:
I would really like to see a direct comparison of an HD3870X2 versus 2 HD3870s in SLI on an X38.

I would really like to see if the Dual Chip PCB makes an improvement over the dual card solution. (Yes this means overclocking both HD3870s to match the X2 since the X2 stock clock is 825 MHz).

I didn't know 3870's can do SLI...
January 22, 2008 2:20:26 AM

Evilonigiri said:
I didn't know 3870's can do SLI...


Sorry I don't live on the West Coast. It's 11:20 here and I'm tired. CROSSFIRE. YOU'LL GET CAUGHT UP IN THE ... CROSSFIRE.. CROSSFFFFFFFIIIIIIREE.

January 22, 2008 2:22:02 AM

Evilonigiri said:
:( 

You broke off the guns, remember?


Just makes the game that much more interesting with balls flying everywhere.
January 22, 2008 2:23:01 AM

cnumartyr said:
Just makes the game that much more interesting with balls flying everywhere.

Go paintball or something then.
January 22, 2008 2:24:42 AM

Evilonigiri said:
Go paintball or something then.


You know as well as I do I have too many other expensive hobbies already.

Anyways... HD3870X2 vs 2 HD3870s would be a fun test, as long as all clocks were equal, to see if the dual-GPU design has any interconnect advantages.

I'd also like to see a 9800 GX2 vs 2 8800 GTS G92s, all at the same clocks, to see the same kind of thing from the nVidia camp.
January 22, 2008 2:27:08 AM

That game was awesome. Remember that kid with the leather jacket flying around on a hover-skateboard? Dude was badass. ATI needs to get the rights to the game and intermix them... think about it, RV770 would be able to shoot lead balls at nVidia fanboys etc.
January 22, 2008 2:27:42 AM

Look, I calculated that the 9800 GX2 that is coming out will be about 11% faster than the 3870 X2 with no AA on.

But 2 3870 X2s in CrossFire will be about 24% faster than 2 9800 GX2s in SLI with no AA on.

I came up with these numbers after hours of meditation and about 10 minutes of calculations.

So the crown will go both ways, depending on how you look at it!
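
For what it's worth, here's what those two guesses imply together, as a quick Python sanity check (the percentages are my estimates above, not benchmarks):

```python
# Just arithmetic on the guesses above -- estimates, not measurements.
gx2_vs_x2 = 1.11             # one 9800 GX2 ~11% faster than one 3870 X2 (no AA)
dual_x2_vs_dual_gx2 = 1.24   # two 3870 X2s ~24% faster than two 9800 GX2s (no AA)

# If perf(2 cards) = scaling * perf(1 card) for each camp, then:
#   s_crossfire * perf(X2) = 1.24 * s_sli * perf(GX2)
#                          = 1.24 * s_sli * 1.11 * perf(X2)
ratio = dual_x2_vs_dual_gx2 * gx2_vs_x2
print(f"CrossFire scaling would need to be {ratio:.2f}x SLI scaling")  # ~1.38x
```

In other words, both claims can only be true at once if CrossFire scales about 38% better than SLI does.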
January 22, 2008 3:05:26 AM

It would appear that if there were no advantage, there would be no real need to do it. I did read somewhere that ATI claimed there were advantages because the signals could be optimized between the two GPUs, since they are on the same PCB.

I do not mean to get into a fight here, but CrossFire scales much better than SLI, and if you figure in the historical cost structure between the two, then the 3870 should win all the way around, unless nVidia breaks from their premium pricing structure.

Fry's is going to be selling the 3870x2 for $499 (may change). They have been selling the 3870 for $279 and have just recently lowered the price to $259. It also appears that at this time Best Buy is NOT going to carry the 3870x2. With CompUSA out of the loop, finding a card locally may be very difficult. Sooooo... if you want one of these beasts, the keyword is "PREORDER".

My dilemma is that I want to be able to drive 4 monitors on my game machine because it will do double duty for work, but I do not want to spend $1000 on video cards, so I may have to just get 2 3870s and take a hit on game performance.
January 22, 2008 4:26:57 AM

I'd reckon that the interconnect between two GPUs on the same PCB carries less latency than one between two separate cards, if implemented correctly in terms of load sharing and sync. The issue here would be heat, regardless of fabrication; I'm only saying this as I'm very opposed to noisy coolers due to substandard cooling.
January 22, 2008 11:26:22 AM

Whatever... even the 2900 did excellent in 3DMark06 but still got bitch-slapped in game performance. You ain't proved nothing.
January 22, 2008 11:56:39 AM

The_King said:
It does beat the 8800 Ultra

http://en.expreview.com/?p=219#more-219

And I'm going to buy 2 of these and run them in X-Fire


Put your money where your mouth is. I believe that all the ATI cards have an advantage in 3DMark, but if you look at the game performance... the Ultra still beats the 3870x2, so it seems it isn't much of an improvement.

Let them battle; I'll wait for benchmarks and buy a new card in 2 years. My GTS 512 plays Crysis at high... everything at high, without AA, at 30-40 fps @ 1680x1050.

Sick and tired of people putting up 3DMark benchmarks: oh no, look at my 3DMark, it's so high... a few moments later... hey, how come you have 30 fps and I have 20 while playing Crysis? My 3DMark showed different. This has been going on for a long time now.

Just hold your horses and wait for game benchmarks that show the 9800 GX2 and 3870x2...
January 22, 2008 12:50:15 PM

radium69 said:
Put your money where your mouth is. I believe that all the ATI cards have an advantage in 3DMark, but if you look at the game performance... the Ultra still beats the 3870x2, so it seems it isn't much of an improvement.

Let them battle; I'll wait for benchmarks and buy a new card in 2 years. My GTS 512 plays Crysis at high... everything at high, without AA, at 30-40 fps @ 1680x1050.

Sick and tired of people putting up 3DMark benchmarks: oh no, look at my 3DMark, it's so high... a few moments later... hey, how come you have 30 fps and I have 20 while playing Crysis? My 3DMark showed different. This has been going on for a long time now.

Just hold your horses and wait for game benchmarks that show the 9800 GX2 and 3870x2...


Did you look at the link? Scroll down below the 3DMark 2006 results. The 3870x2 handily beats the 8800 Ultra in COD4 (even with 4xAA and 16xAF) and is only a few FPS behind in Crysis. The newest Cat drivers were used, with Crysis 1.1, on Vista.

Looking forward to full reviews on this card.
January 22, 2008 3:18:26 PM

So roughly an average of 12.5% faster than an 8800 Ultra (I threw out the highest and lowest scores, and disregarded 3DMark06).
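
Roughly how I got that figure, as a quick Python sketch (the per-game ratios below are placeholders to show the method, not the review's exact numbers):

```python
# Per-game performance ratios: 3870 X2 fps / 8800 Ultra fps.
# Placeholder values to illustrate the method -- not the review's data.
ratios = [1.30, 1.20, 1.15, 1.12, 1.10, 1.05, 0.95]

trimmed = sorted(ratios)[1:-1]                 # throw out highest and lowest
advantage = sum(trimmed) / len(trimmed) - 1.0  # 3DMark06 excluded up front
print(f"average advantage: {advantage:.1%}")   # 12.4% with these placeholders
```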
January 22, 2008 5:32:39 PM

radium69 said:
Put your money where your mouth is. I believe that all the ATI cards have an advantage in 3DMark, but if you look at the game performance... the Ultra still beats the 3870x2, so it seems it isn't much of an improvement.

Let them battle; I'll wait for benchmarks and buy a new card in 2 years. My GTS 512 plays Crysis at high... everything at high, without AA, at 30-40 fps @ 1680x1050.

Sick and tired of people putting up 3DMark benchmarks: oh no, look at my 3DMark, it's so high... a few moments later... hey, how come you have 30 fps and I have 20 while playing Crysis? My 3DMark showed different. This has been going on for a long time now.

Just hold your horses and wait for game benchmarks that show the 9800 GX2 and 3870x2...


http://www.pconline.com.cn/diy/graphics/reviews/0801/12...

Although the review is in Chinese, the 3870 X2 beat the 8800 Ultra in 7 out of 9 GAMING benchmarks, excluding 3DMark.

I'm sick and tired of people who think they know it all and keep talking about the 9800 GX2 (sorry but you deserved it) :kaola: 
January 22, 2008 6:33:43 PM

radium69 said:
Put your money where your mouth is. I believe that all the ATI cards have an advantage in 3DMark, but if you look at the game performance... the Ultra still beats the 3870x2, so it seems it isn't much of an improvement.

Let them battle; I'll wait for benchmarks and buy a new card in 2 years. My GTS 512 plays Crysis at high... everything at high, without AA, at 30-40 fps @ 1680x1050.

Sick and tired of people putting up 3DMark benchmarks: oh no, look at my 3DMark, it's so high... a few moments later... hey, how come you have 30 fps and I have 20 while playing Crysis? My 3DMark showed different. This has been going on for a long time now.

Just hold your horses and wait for game benchmarks that show the 9800 GX2 and 3870x2...


ATI cards do well in 3DMark06 'cos they tear through the last test, 'Deep Freeze'. The Canyon Flight results alone might be a better indicator of gaming performance, as they come in a bit below nVidia cards in the same price bracket there.
January 22, 2008 7:43:35 PM

The_King said:
http://www.pconline.com.cn/diy/graphics/reviews/0801/12...

Although the review is in Chinese, the 3870 X2 beat the 8800 Ultra in 7 out of 9 GAMING benchmarks, excluding 3DMark.

I'm sick and tired of people who think they know it all and keep talking about the 9800 GX2 (sorry but you deserved it) :kaola: 

6 out of 9, the NFS graph is wrong, but let's wait for the 9800GX2!
January 22, 2008 8:05:25 PM

So basically you have a dual-video-card-in-one-package setup beating a single-card setup, and you figure this is spectacular for some reason. Now if it were just a single card against a single card, that would be something, but this is like when nVidia came out with the 7950, dual cards in one setup. It would be something to cheer about if it weren't for the fact that it's hilarious.

R Collins
January 22, 2008 8:08:54 PM

spoonboy said:
ATI cards do well in 3DMark06 'cos they tear through the last test, 'Deep Freeze'. The Canyon Flight results alone might be a better indicator of gaming performance, as they come in a bit below nVidia cards in the same price bracket there.


A better indicator of what gaming performance? Current only, or future?

The thing about 3DMark is that it's NEVER been a great game predictor; it's been a better feature and hardware tester within its own products.

Optimizations and floptimizations negate any real comparison. Perhaps the HD series is that much better; however, because games are optimized and sold the way they are, it's irrelevant, because they play differently than the benchmark runs.

Unlike what some people like Radium seem to suggest, it's not built to favour either company. If anything, in the GF6 & 7 generation 3Dmk favoured nV because of their ability to use DST (which was rarely a game feature); now both companies offer the support (plus Fetch4), so really it's just the see-saw back and forth between designs. The main thing is that the standard for 3Dmk involves massive workloads without much regard to AA performance, which reflects poorly on most other tests/games, but it may reflect the future performance of shader-based games, so neither is a cut-and-dried route.

As always, the best thing 3Dmark shows is..... which is best at 3Dmark.

That being said, it sounds like 3DMark Vantage will give an even narrower picture, which may break more people's insistence on using it as a game predictor.

Don't get me wrong, 3Dmark and its bungholio scores have a very good place in people's toolbox, but too often they're used in ways they were never intended to be used, and to predict/say things they aren't capable of.
January 22, 2008 8:20:37 PM

Lol, I get much more than that with my 18-month-old 8800 GTX SLI setup.
January 22, 2008 9:32:49 PM

kg4icg said:
So basically you have a dual-video-card-in-one-package setup beating a single-card setup, and you figure this is spectacular for some reason. Now if it were just a single card against a single card, that would be something, but this is like when nVidia came out with the 7950, dual cards in one setup. It would be something to cheer about if it weren't for the fact that it's hilarious.


Actually it's single card versus single card. The GX2 was dual cards in a single slot, so it's not the same. As long as the software supports it, the same physical limits that apply to the Ultra apply to the X2, whereas the GX2 is limited in both cooling options and bridge scaling options. So as long as the price/performance is worth it, it's a relevant player, although mainly for the e-wang stuff, not for sensible solutions.

It's not as impressive as a single-chip solution, but considering the future both companies and Intel are headed toward with their more modular designs, this is far from laughable, and really more of a preview of things to come.

The thing that would be laughable is if it becomes a dead end similar to the GX2, where a lot of their advancement/work gets lost.

Not my cup of tea by any means, but this is likely the future of high-end graphics, so anyone interested in such things had better get used to it.
January 22, 2008 10:30:56 PM

Having read the only English review so far: they said CCC doesn't detect it as a CrossFire setup but as a single card, which I found quite interesting. As I've said in another post, if this review is anything to go by, all the reviews will be done using an outdated driver.

It isn't as impressive as a single-chip solution, but it is a step closer to multi-GPUs. Think of it as a 2P server with two single-core CPUs: there is a nice performance increase, but it's just not as efficient as a 1P dual-core server.

As I have previously said, and Ape said, the R600/RV670 core still has a lot of potential in it for games specifically optimised to use all its shader power. It might even be that in the future, if programmers take the heavy-shader approach, the R600/RV670 core will suddenly see a performance boost.
January 22, 2008 11:13:53 PM

I'm just getting the Gigabyte GV-RX387512H Radeon HD 3870 512MB to hold me over until the midrange R700 at the end of the year.

The 3870x2 card looks good though. It has two GPUs on one PCB. I'm sure it will do better than the 9800GX2, but what we all really need from ATI and nVidia are dual-, triple- and quad-core GPUs on one die. Unless of course there's something about GPU architecture that makes that unlikely.

When is the Evil Netburst Empire getting into the discrete GPU game? They've routed the rebels with their C2D onslaught, so their next target is the neutral planet ruled by Princess Envidia, whose main rival recently joined the Rebel Alliance. Not since the days Princess Envidia was stood up by that loutish FX has she been in such dire danger. :) 
January 22, 2008 11:41:18 PM

This is certainly not the GTX killer I had been hoping for.
January 23, 2008 4:02:56 AM

Checked the THG review already. The 3870 X2 is sweet; I just hope they come in under $500.

The thing that intrigues me the most is that it's over a year already and ATI still can't outperform a year-old architecture by a large margin. My theory is that not even nVidia can outdo their old architecture themselves... it's been so stagnant, argh!
January 23, 2008 4:06:32 AM

niz said:
Lol, I get much more than that with my 18-month-old 8800 GTX SLI setup.


Yes, and your setup cost almost 4 times the price.
January 23, 2008 9:10:04 AM

TheGreatGrapeApe said:
Actually it's single card versus single card. The GX2 was dual cards in a single slot, so it's not the same. As long as the software supports it, the same physical limits that apply to the Ultra apply to the X2, whereas the GX2 is limited in both cooling options and bridge scaling options. So as long as the price/performance is worth it, it's a relevant player, although mainly for the e-wang stuff, not for sensible solutions.

It's not as impressive as a single-chip solution, but considering the future both companies and Intel are headed toward with their more modular designs, this is far from laughable, and really more of a preview of things to come.

The thing that would be laughable is if it becomes a dead end similar to the GX2, where a lot of their advancement/work gets lost.

Not my cup of tea by any means, but this is likely the future of high-end graphics, so anyone interested in such things had better get used to it.


You might be correct that it is the future:

http://www.techpowerup.com/

"According to a news story by the Chinese newspaper Commercial Times, ATI's RV770 GPU is due to be commercially released in late Q2 2008. Two of such chips are projected to power AMD’s new high-performance ATI R700 graphics card."

So another 3870x2-style card is coming out this summer.

On the 3DMark point, I meant that taking the test 3 results would be a better indicator of gaming performance than the aggregate score, as ATI comes in slightly behind comparable nVidia cards in this test, which reflects real life, as they are behind on average in gaming.
January 23, 2008 1:06:16 PM

MikosNZ said:
Yes, and your setup cost almost 4 times the price.

Actually it would be 2 times as much, since it's $500 vs $1000.
January 23, 2008 6:12:02 PM

spoonboy said:

So another 3870x2-style card is coming out this summer.


Depends on how you look at it.
A lot of people are confusing multi-chip with multi-VPU, and what has always been mentioned as the goal for the R7xx generation is modularity on a more simplified setup (a massive dual-GPU board is not simplified for the AIBs).

This probably gives the best description of the options and some of their benefits/drawbacks from a processor fab perspective (try to ignore CPU vs VPU for a second);
http://www.tomshardware.com/2005/10/10/intel_moves_from...



Now, yipsl is talking about multi-core on a single die, which would essentially be similar to what we have for most current multi-core CPUs and would be detailed by the Athlon on that page. However, this isn't as effective for the GPU, which is already multifaceted and has a ton of parallel processors; the best would of course be the giant Itanium-style core with little duplication, but that increases costs significantly.

Now, the other two options easily seen are either the Smithfield or the Presler; however, from the early talk it sounds like the RV770 is using a Smithfield-type dual-core die, with a dual-die Presler packaging idea (see above left) for the R700.

So what you end up with is 4 cores easily modularized for 3 different SKU lines;

HD4800 = 2 RV770 dies for a total of 4 cores on a single R700 'Presler-style' package/VPU (still a single socket).
HD4600 = 1 RV770 die with both cores functioning, for a total of 2 cores.
HD4200 = 1 binned RV770 with one core disabled, giving you 1 core.

This should allow very good scaling, because all of the communication is done on-die and not through long wire traces on the PCB, and there is no need to duplicate memory, etc.
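
To make the modularity concrete, here's a toy sketch of that binning scheme in Python (the SKU names, die counts, and core counts are my speculation above, nothing official):

```python
# Toy model of the speculated lineup above -- SKU names, die counts and
# core counts are speculation from this post, not anything from AMD.

RV770_CORES_PER_DIE = 2  # assuming a Smithfield-style dual-core die

def sku_cores(dies: int, disabled: int = 0) -> int:
    """Active cores on a package built from `dies` RV770 dies,
    with `disabled` cores fused off during binning."""
    return dies * RV770_CORES_PER_DIE - disabled

lineup = {
    "HD4800": sku_cores(dies=2),              # R700 'Presler-style' package: 4 cores
    "HD4600": sku_cores(dies=1),              # one fully working die: 2 cores
    "HD4200": sku_cores(dies=1, disabled=1),  # binned die, one core off: 1 core
}

for name, cores in lineup.items():
    print(f"{name}: {cores} core(s) from a single chip run")
```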

However, should you find yourself behind the nV solution again, you could take a page from the HDxxxxxX2, put 2 R700 packages/VPUs together, and have 8 cores on one card to make the HD4900 or whatever.

To me, that lets AMD satisfy almost all of their production needs with just 1 chip run, which should be far more cost-saving than having 3 separate chips. Of course, you need to learn to effectively manage the cores. However, if successful, you could probably satisfy all of your customers with that chip plus a low-power, high-functionality IGP-style chip that would be good for HTPCs, OEMs, and IGP solutions.

Gluing single cores together is also doable, but the scaling and interface surfaces offer different limitations; sticking one dual-core to another dual-core is far easier than sticking 4 single cores together.

Anywhoo, we'll have to wait and see, but to me that's the best way to approach the future and keep costs low and yields high (compared to the yield of one giant 1-2 billion transistor chip).

As always, just my two frames' worth from the cheap seats.


More pics to get a feel for what 2 dies on one package look like;

http://www.tomshardware.pl/business/20050825/images/pre...
http://images.tomshardware.com/2005/12/03/presler_proce...
http://ismailfaruqi.files.wordpress.com/2007/05/presler...
http://img158.imageshack.us/img158/6462/2ja4.jpg


Consider each die dual-core and you get what I mean.
January 23, 2008 6:46:30 PM

Anyone know, if you CrossFire a 3870 with a 3870x2, whether it will scale nicely or be limited by the RAM and clock rate of the single 3870?

I am thinking of picking one up, but I am not sure if it will scale properly. The X2 has 1GB of RAM, but I am not sure if it is split across both GPUs OR if it is shared memory that both GPUs can access.

Any insight would be appreciated.

Jay