
Updated post - 9800GX2, 9800GTX specs and 3DMark06 from VR-Zone.

Last response: in Graphics & Displays

Which card do you think will be top dog?

Total: 144 votes (23 blank votes)

  • ATI's 3870x2: 37 %
  • Nvidia's 9800GX2: 64 %
February 20, 2008 7:33:10 AM

******UPDATED AGAIN******

Well, it seems I have more news on the 9800GX2 and the 9800GTX. Please don't accuse me of posting fake pictures or links; we are individuals, which means we must come to our own decisions on whether these links are truthful. I guess my 3DMark06 scores & GPU-Z pictures were not fakes after all.

http://www.vr-zone.com/articles/GeForce_9800_GTX_Card_P...

http://www.vr-zone.com/articles/GeForce_9800_GX2%2C_980...

http://www.vr-zone.com/articles/GeForce_9800_Series_Lin...

http://www.vr-zone.com/articles/GeForce_9800_GX2_Launch...

Quote: From VR-Zone.com


GeForce 9800 GTX Card Photos & Specs Unveiled: VR-Zone has gotten some details and photos of the GeForce 9800 GTX (G92-P392) card. It comes with a 12-layer PCB measuring 4.376" by 10.5". This card is clocked at 673MHz for the core and 1683MHz for the shaders, while the memory clock is yet to be determined. The memory interface is 256-bit with 512MB of 136-pin BGA GDDR3 memory onboard. It comes with two DVI-I outputs and one HDTV-out. There are two SLI connectors and two 6-pin PCIe power connectors. The card employs the CoolerMaster TM67 cooler, whose fan is rated at 0.34A, 4.08W, 2900rpm, 34dBA. The total board power is 168W.



Quote: From VR-Zone.com



VR-Zone has gotten some preliminary 3DMark06 scores on the upcoming GeForce 9800 GX2 and 9800 GTX cards. The setup: a Core 2 Quad Q6700 2.66GHz processor on a P965 board with Forceware 173.67 drivers and Catalyst driver version 8.451. The GeForce 9800 GX2 scored 14225 and the 9800 GTX scored 13167, while the Radeon HD 3870 X2 scored 14301. Nvidia is still tuning up the drivers and the clock speeds aren't finalized yet, so we should be seeing some improvements at launch. Currently, the 9800 GX2 is slated for a March 18th launch while the 9800 GTX is slated for the end of March.


Quote: From VR-Zone.com



GeForce 9800 Series Line-Up Unveiled; GX2, GTX, GTS, GT, GS: The beta Forceware 174.xx we received revealed many interesting derivatives based on the G92 architecture. We already knew that G92-450 is the GeForce 9800 GX2 and G92-420 is the GeForce 9800 GTX. We heard Nvidia is trying to complete their GeForce 98xx series line-up, so G92-350, G92-250 and G92-240 could probably be the GeForce 9800 GTS, GT and GS. There are other interesting G92 SKUs too, like G92-750, G92-650 and G92-600 for mobile graphics and G92-890, G92-850, G92-950 and G92-985 for workstation graphics.


By systemlord at 2008-03-02

It was only a matter of time before this info leaked. The 9800GX2 is almost here, guys, and the clocks don't seem to be as low as we thought. I don't believe it's going to sell for $449.99 though; it will be higher for sure. A picture is worth a thousand words, as they say.

http://www.tcmagazine.com/comments.php?shownews=18247&c...


By systemlord at 2008-02-20






By systemlord at 2008-02-26





Quote:
The cooler, GPU-Z and 3DMark06

Our colleagues from Expreview.com have posted some new information about Nvidia's upcoming 9800GTX. This new info includes a GPU-Z screenshot, cooler design rumors and the 3DMark06 score.

The first bit shown on this website is the GPU-Z screenshot. GPU-Z 0.1.7 didn't like the 9800GTX, so after W1zzard from TechPowerUp made some modifications to it, Expreview.com published the first screenshot with full info on this card. The card's core clock stands at 675MHz, which is only 25MHz higher than the G92-based 8800GTS 512MB.

The 9800GTX's memory is a different story, since it goes all the way up to 1100MHz. According to that same screenshot, the card has 16 ROPs, 128 unified shaders working at 1688MHz, and 512MB of GDDR3 memory on a 256-bit memory bus with 70.4 GB/s of bandwidth.
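The quoted bandwidth figure is internally consistent: a 256-bit bus moves 32 bytes per transfer, and GDDR3 is double data rate, so it transfers twice per memory clock. A quick sketch of that arithmetic (illustrative only, not from the article):

```python
def memory_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float,
                         transfers_per_clock: int = 2) -> float:
    """Peak memory bandwidth in GB/s.

    bus_width_bits / 8 gives bytes per transfer; GDDR3 is double
    data rate, so it performs two transfers per memory clock.
    """
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * mem_clock_mhz * transfers_per_clock / 1000

print(memory_bandwidth_gbs(256, 1100))  # 70.4 GB/s, matching the GPU-Z screenshot
```

The same formula reproduces other cards' numbers too (e.g. the 8800GT's 256-bit bus at 900MHz gives 57.6 GB/s).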

As for the cooler design, Expreview.com has posted a story saying the cooler is actually the same one found on the 8800GTS 512MB. It uses the same heatsink, the same heatpipe placement and the same fan, all made by CoolerMaster. The only difference is the shell, or cover, of the cooler, which has been redesigned to fit the new length of the 9800GTX. Of course, nobody at Nvidia will bother to deny or confirm these rumors, so we will have to wait and see.

The Futuremark 3DMark06 result has also been posted by the same site, showing a score of 14014 marks. Nothing special, really, but if those SLI scaling scores of the 9600GT are true, the 9800GTX will be a killer when placed in an SLI or 3-way SLI scenario.


You can find the GPU-Z screenshot here, 3DMark06 score is here and the cooler design story is here.

Specs for the 9800GTX http://www.vr-zone.com/articles/GeForce_9800_GTX_Card_P...

GPU-Z screenshot http://en.expreview.com/2008/02/26/9800gtx-gpu-z-screen...

3DMark06 score http://en.expreview.com/2008/02/26/9800gtx-3dmark06-sco...

Cooler design http://en.expreview.com/2008/02/25/9800gtx-got-a-8800gt...
February 20, 2008 7:50:08 AM

Looks neat, though I won't buy it. lol
February 20, 2008 8:26:25 AM

Nice find.
Can't wait to see some benchmarks.

I would love to see how this scales compared to standard SLI.
February 20, 2008 9:35:10 AM

What the feck??? 2.4GHz GDDR3!!! This thing will scream (as it bursts into flames due to its memory clock speed).
February 20, 2008 9:48:28 AM

If it's legit, I don't know many cases that thing will fit into. Just look at the length of that thing...
February 20, 2008 10:13:45 AM

Yep, noticed the length first thing. Nvidia should start making cases to house their cards; I had to get a new case for my system to house the 8800GT. The specs look good on paper. Waiting for Tom's Hardware/AnandTech to review it.
February 20, 2008 10:22:19 AM

If I remember correctly, the G92 is drop-in compatible with the 7950's PCB, so it should be the same length as the 7950GX2.
At least it looks like they upped the cooling a little...

Not so sure about the clock speeds though.
I had thought it was using two 8800GT cards in a single slot.
As a single GT is clocked at 600 core / 900 (1800) memory, I doubt this card will be much, if any, higher.

Wish the company firewall would let me read the article though :( 
February 20, 2008 10:59:00 AM

It's the same length as a GTX, so why all the fuss? People all said the 3870X2 was huge too, but again it's the same as a GTX. People really should think before posting...

It isn't a GT, as it has more raw hardware: 128 SPs, for example.


I'll most likely be going for a similar one from another brand and water-cooling it, as water-cooling should theoretically let it clock up to the 800MHz level. As long as there is a decent amount of AA it should scale well, even at lower resolutions.
February 20, 2008 11:19:31 AM

I am a little confused. I thought each board has 512MB of RAM on it (thus 512MB per board, two boards, which could equal 1GB). But because this is just two G92 boards put into SLI, shouldn't it just be 512MB x2?

You can't add up the RAM in SLI... or is it 1GB x2 of GDDR3?

Does this make sense?
February 20, 2008 12:09:37 PM

My GeForce 7950 GX2 registers 1GB of video RAM in dxdiag. I'll look to see what GPU-Z says when I get home (I'm 90% sure it says 1GB as well).
February 20, 2008 1:29:24 PM

There is 1GB on the card, regardless of how it's used.
February 20, 2008 1:43:56 PM

OK... if those specs are correct, I'm buying one. EVGA Step-Up here I come!
February 20, 2008 4:23:13 PM

If you SLI two 8800GTs together (or any cards), it does not double your video memory. I am just a little confused.
February 20, 2008 4:46:01 PM

If you install two 8800GTs, you do have 1GB of video memory.

This new card will not be able to hold more items in memory.
Rather, each GPU will have a dedicated 512MB of RAM.
Still, the card has 1GB of memory.
All 1GB, however, will not be used by a single GPU.
February 20, 2008 5:01:06 PM

spaztic7 said:
If you SLI two 8800GTs together (or any cards), it does not double your video memory. I am just a little confused.


Yup, and that is the idea behind it too. Confusion = more sales, in the minds of the ad writers. If they can make n00bs believe that they are getting "more" buffer, they will get said n00b to upgrade. (Not saying you are a n00b, as you identified their ruse... just giving my take on their reasoning.)

You are correct though. While "technically" the board has a full 1GB of memory, each GPU only has 512MB to operate with. Each frame buffer is a mirror of the other in SLI, so 3-way SLI on this would "technically" be 1.5GB but still only behave like a 512MB frame buffer.

That is why SLI with two 256MB cards (thus a total of 512MB) does not benefit you if a game requires more than 256MB of buffer for large textures or high AA levels.

They can get away with it in advertising because there is physically 1GB there, but in reality a game or other app only gets half of that. IMO it is a lie and/or deceptive in practice... but such is life. That is why it is good to stay informed. ;) 
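The point above can be put in code: in AFR-style SLI each GPU mirrors the same frame buffer, so the usable buffer is the per-card amount, not the advertised total. A toy sketch (the function name and return shape are illustrative, not any Nvidia API):

```python
def sli_framebuffer_mb(per_card_mb: int, num_gpus: int) -> dict:
    """Advertised vs. effectively usable VRAM when each GPU in SLI
    mirrors the same frame buffer contents."""
    return {
        "advertised": per_card_mb * num_gpus,  # what the box says
        "usable": per_card_mb,                 # what a game actually gets
    }

print(sli_framebuffer_mb(512, 2))  # {'advertised': 1024, 'usable': 512}
print(sli_framebuffer_mb(512, 3))  # 3-way SLI: 1536 advertised, still 512 usable
```

This also captures the 256MB example: two 256MB cards advertise 512MB but still behave like a single 256MB buffer.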
February 20, 2008 5:21:13 PM

Enough of this Oreo cookie crap, I want to see the GTX Model
February 20, 2008 5:25:20 PM

You guys are missing one thing though: a GX2 card is not SLI. It's different from SLI. I've probed Nvidia for the past few days about the GX2 cards (the 7950GX2, to be precise) and they've assured me that they are not SLI: no special in-game requirements, and both GPUs (including the memory) are fully available to any game regardless of SLI support (or lack thereof). It's seen as a single card by the computer and the game. All of the work of splitting the information between the GPUs is done on the card/driver itself.
February 20, 2008 5:46:26 PM

OK, this makes more sense now. I know with SLI and Crossfire you should only see the amount of memory from one card (if you have two 512MB cards, your computer only reads that you have 512MB of video memory). I understand that physically you have more; I just wanted to clear that up.

I just wanted to know if they are marketing this with the physical memory available or the usable memory available (as in, how much each GPU can use).

Oh, and where the hell is the GDDR4? ATI's R700 class is targeting GDDR5, for Christ's sake!
February 20, 2008 5:58:43 PM

leo2kp said:
You guys are missing one thing though: a GX2 card is not SLI. It's different from SLI. I've probed Nvidia for the past few days about the GX2 cards (the 7950GX2, to be precise) and they've assured me that they are not SLI: no special in-game requirements, and both GPUs (including the memory) are fully available to any game regardless of SLI support (or lack thereof). It's seen as a single card by the computer and the game. All of the work of splitting the information between the GPUs is done on the card/driver itself.


lol, "different" in the sense that it is a single PCIe-slot card, yes. "Different" in that it does not require an SLI mobo, yes.

It is still SLI though. It has a PCIe chip onboard (at least the 7950 does) that handles the switching for the SLI traffic. There is a "riser" connector between the two boards that facilitates the SLI connection.

In fact, AFAIK you can even disable SLI in the driver and run it as a "single" card. I do not own one, just going off what I remember from initial reviews. (That may be gone now...) Here is the Tom's review with that:
http://www.tomshardware.com/2006/06/05/geforce_7950_gx2/page4.html
I also remember that at early stages some games performed no better on it than a single 7900GT... but I could be remembering something else on that.

Regardless... it is SLI without the SLI hangups of specific mobo and dual PCIe slot requirements.

How it is "seen" by a game is irrelevant. How it actually performs IS relevant, however. (Current drivers may mask how it is seen... but it is still SLI regardless of name or perception.)

Finally, saying that both GPUs are "fully available" is as technically true as saying it has 1GB of memory. Yes, they are available... but is the game actually USING the second one? Would you even notice if it is used or not? (Without a direct comparison to another machine, that is.) lol... marketers will say anything to get you to buy, man. ;) 

rock on.
February 20, 2008 6:55:55 PM

I'm all confused now, so I'm revoking my statement.
February 20, 2008 9:46:13 PM

So? Define SLI...

You can use two cards in a mobo but not SLI them. The bridge is what makes SLI worthwhile, and this card has a bridge. It is SLI, just on one card rather than taking two slots.

The motherboard doesn't really communicate with the second slot anyway; the cards use the bridge. This card is two GPUs working together just like any two-card SLI setup, it just takes one slot. The mobo talks to it and the GPUs pass data between each other.

Also, even with the GX2, there are two physical cards in there if you look it up in the device manager, and the same will likely be true with this one.
February 20, 2008 9:49:25 PM

spaztic7 said:
Oh, and where the hell is the GDDR4? ATI's R700 class is targeting GDDR5, for Christ's sake!


You want GDDR4 or GDDR5 with higher latencies? Remember DDR3 having little to no impact on performance, while its latency is absurd at CAS 8-9 @ 2000MHz-plus. The 1GB GDDR3 frame buffer on the 9800GX2 is purely marketing: there is a total of 1GB of GDDR3 on the card, but there are still two separate 512MB frame buffers, just as SLI would have it. I think Nvidia should have done what ATI did: one 12-layer PCB for both GPUs and the 1GB of GDDR3.
February 20, 2008 9:57:19 PM

Remember that this may just be similar to the 7900GX2, and if it follows that trend it may get a revision.

I don't really have a problem with how it works though. What would have been better is if they had placed the cores not opposite each other, so they could have used two blocks of copper/aluminium. Some heatpipes on the stock cooler wouldn't have hurt either.

Water FTW in this case.
February 20, 2008 9:59:34 PM

660MHz? Wonder how hot it's going to run.
February 20, 2008 10:08:11 PM

Well, considering you have virtually single-slot cooling on a chip that usually gets dual-slot cooling, quite hot :D 


Water Ftw :D 
February 20, 2008 10:25:09 PM

Wow, I didn't know they made 2400Mhz GDDR3.
February 20, 2008 10:52:02 PM

It's still more old crap thrown together if you still can't SLI three of them. They're really milking it. Where's the wonder card that will bring Crysis to its knees?
February 20, 2008 11:11:37 PM

swifty_morgan said:
It's still more old crap thrown together if you still can't SLI three of them. They're really milking it. Where's the wonder card that will bring Crysis to its knees?


Pretty far away from now, and since they don't use SLI very well, not anytime soon. You'll have to wait for a single new card to run Crysis in its full glory.
February 20, 2008 11:14:20 PM

Is the 1GB all usable? I remember reading that the 3870x2 uses only 512MB.
February 20, 2008 11:48:17 PM

How much faster is this than the 8800GT, and do you still think the 8800GT will be the best value for money by mid-'08? Or will there be a new value king?
February 21, 2008 12:31:29 AM

Well, it's based on the GTS obviously, and people clock the memory on them up to 2400 all the time. Mine went that high without artifacting, but I turned it down because it just sounds crazy.
February 21, 2008 12:48:43 AM

Hatman said:
So? Define SLI...


SLI needs driver and game support to work properly. It would seem that the GX2 does not.
February 21, 2008 12:50:40 AM

skittle said:
SLI needs driver and game support to work properly. It would seem that the GX2 does not.

It doesn't? The 7950GX2 sure did, and that was one of its big problems. SLI is more mature now though.

February 21, 2008 3:55:20 AM

nkarasch said:
Well, it's based on the GTS obviously, and people clock the memory on them up to 2400 all the time. Mine went that high without artifacting, but I turned it down because it just sounds crazy.

2400 on a 8800GT?
I don't know about that.
February 21, 2008 11:36:20 AM

skittle said:
SLI needs driver and game support to work properly. It would seem that the GX2 does not.



How can it seem not? You don't think it'll need drivers? Of course it does! Need game support? Of course!!!


It needs both, like all SLI setups. There isn't a normal driver and then a separate SLI driver, is there? Really, your posts aren't thought out at all. This is just made up.

The GX2 is two cards joined together using SLI. It uses an SLI BRIDGE; whether you want to admit it or not, it is pretty obvious to almost everyone.
February 21, 2008 11:44:36 AM

kpo6969 said:
2400 on a 8800GT?
I don't know about that.


Maybe the GPUs are based on the 8800GTX GPU, with the full 24 ROPs and 128 SPs.
February 21, 2008 12:22:54 PM

anybody think this thing will sell like crazy?

I highly doubt it.
February 21, 2008 12:32:03 PM

swifty_morgan said:
It's still more old crap thrown together if you still can't SLI three of them. They're really milking it. Where's the wonder card that will bring Crysis to its knees?



Sadly, ATI may have it with the R770, if it is as good as it's claimed to be.

I must add, remember the promises of the 2900... :pfff: 
February 21, 2008 12:58:37 PM

teh_boxzor said:
anybody think this thing will sell like crazy?

I highly doubt it.



For high-end users, yes it will. Not many want a crappy Nvidia motherboard, tbh, and this card offers SLI capability on Intel motherboards.


I know I'll be getting one to water-cool.
February 21, 2008 11:16:51 PM

you would buy it at any cost? why would you pick it over the 3870x2?
February 21, 2008 11:38:19 PM

teh_boxzor said:
you would buy it at any cost? why would you pick it over the 3870x2?

Um...it's faster?
February 21, 2008 11:50:28 PM

homerdog said:
Um...it's faster?

I agree, most likely.

If it is $450, it's a better deal than 2 8800GTS 512MB.
February 22, 2008 2:51:13 AM

It could be a tough sell though: if you go with an Intel chipset, you are throwing away your chance to SLI two 9800GTXes, which will probably be totally badass and only, like, double the price of a 9800GX2, LOL.
February 22, 2008 6:38:42 AM

teh_boxzor said:
you would buy it at any cost? why would you pick it over the 3870x2?


I know I would choose the 9800GX2, because it will stomp the 3870x2, and it's Nvidia, not ATI.
February 22, 2008 7:26:18 AM

I keep on hearing news of March 11th as the release date for this bad boy. This is awesome, since my EVGA Step-Up Program window ends on March 16th.

That is the main reason I want to grab this one, just so I can use the Step-Up Program and feel all special :) 
February 22, 2008 7:29:10 AM

You'd better be all over that, since the release was already pushed back once (if not twice). Cutting it close.
February 22, 2008 8:49:27 AM

Meh, I just want it because it's the most powerful card for my P35. Good overclocking and SLI'd cards; can't complain about that.


Water-cooling FTW
February 22, 2008 9:09:17 AM

Yeah, I know. When I originally bought my 8800GT, Nvidia was still supposed to release the new cards in February. As long as I can get in there and snatch it, even if it goes on backorder, I should be fine. If I miss the date, I have a friend who would be willing to purchase it from me... I hope.
February 22, 2008 10:02:04 AM

Hatman said:
It's the same length as a GTX, so why all the fuss? People all said the 3870X2 was huge too, but again it's the same as a GTX. People really should think before posting...

It isn't a GT, as it has more raw hardware: 128 SPs, for example.


It should beat my new MSI overclocked 3870x2 in some benchies, but wait for the 4870x2! Though Nvidia's CEO doesn't like dual-GPU cards, they're the wave of the future at the high end. What Nvidia needs to do, though, is put two GPUs on one PCB like ATI did.

As for fitting in cases, the 3870x2 barely fit into an old Raidmax case, and I didn't think the 5 80mm fans provided enough cooling, so I got an Antec Nine Hundred, and it fits just fine.

What are the PSU requirements of the Nvidia card? I'm using an ATI certified Antec Neo 650 with my dual GPU card.

spaztic7 said:


Oh, where the hell is the GDDR4? ATi R700 class is targeted with GDDR5 for Christ sake!


Yes, but the current 3870x2 has GDDR3. So I don't think the 9800GX2 will suffer all that much. Nordic Hardware reported GDDR5 for the R770 in June (i.e. the 4870 and 4870x2), but that hasn't been confirmed by AMD.

sojrner said:

I also remember at early stages that some games performed no better on it than a single 7900gt... but I could be remembering something else on that.

regardless... it is sli without the sli hangups of specific mobo and dual PCIe slot requirements.

How it is "seen" by a game is irrelevant. How it actually performs IS relevant however. (Current drivers may mask how it is seen... but it is still sli regardless of name or perception)


That's the case with the 3870x2. It's internally Crossfire, and the single-PCB design is better than Nvidia's, but a game still has to benefit from dual GPUs to get the full advantage of the card; at least that's how reviewers at AnandTech, Tom's Hardware and Xbit Labs are reporting it.

So, if a game doesn't support SLI, the 9800GX2 should perform like a similarly clocked single-GPU card (what is it, a G92?).


