Nvidia 8800 GTX Details Unveiled ...Asus EN8800GTX Details

tolemi

Distinguished
Oct 20, 2006
[Pictures: Asus EN8800GTX graphics card (asus_gf8800_pic1.jpg, asus_gf8800_pic2.jpg, asus_gf8800_pic3.jpg)]


An Asian web-site has published several details concerning the Nvidia GeForce 8800 GTX graphics card from Asustek Computer. The details do not reveal any performance or micro-architectural peculiarities of the chip code-named G80, but they do shed light on the graphics cards that will be powered by Nvidia’s forthcoming graphics processing unit (GPU).

The pictures published by the MyChat web-site show that the GeForce 8800 GTX-based graphics card from Asus looks a little different from the alleged Nvidia G80 picture published earlier. While the board does look large, it does not feature a liquid cooling system, and its components are placed somewhat differently compared to the previous version. According to the web-site, which claims that the information comes from sources within Asus, one of the world’s largest makers of graphics cards, Nvidia’s yet-to-be-announced G80 graphics chip will support Microsoft DirectX 10 Shader Model 4.0, Nvidia’s GigaThread technology, Nvidia Quantum Effects technology, floating-point high dynamic-range (HDR) lighting and so on.

Asus, the web-site claims, will release two versions of the GeForce 8800-series graphics cards: EN8800GTX/HTDP/768M and EN8800GTS/HTDP/640M. The naming scheme implies that there will be two versions of the G80, the GeForce 8800 GTX and the GeForce 8800 GTS, which will differ not only in performance but also in the amount of installed memory and the memory interface: the GTX boards will carry 768MB on a 384-bit interface, whereas the GTS cards will have 640MB on a 320-bit interface. The higher-end 8800 GTX model will have its chip clocked at 575MHz and GDDR3 memory operating at 1.80GHz.

It is unclear why Asustek and Nvidia use GDDR3 memory for the new top-of-the-range products: GDDR3 memory operating at 1.70GHz and above is fairly rare, whereas GDDR4 memory easily operates at 2.0GHz and beyond.

Specifications of the G80 chip are not clear. Some sources indicate that Nvidia’s first DirectX 10 chip will incorporate 48 pixel shader processors and an unknown number of vertex shader/geometry shader processors. Others, however, claim that the G80 has 32 pixel and 16 vertex and geometry shader processors. Yet another source has indicated that the G80 will have a unified shader architecture and will consist of 700 million transistors.
http://www.xbitlabs.com/news/video/display/20061023142704.html
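
Quick back-of-the-envelope on those memory numbers, by the way. A rough Python sketch (the GTS memory clock and the 256-bit GDDR4 comparison point are my assumptions, not from the article):

# Peak memory bandwidth = bytes per transfer x effective transfers per second
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(bandwidth_gb_s(384, 1800))  # GTX rumour: 384-bit GDDR3 @ 1.80GHz -> ~86.4 GB/s
print(bandwidth_gb_s(320, 1800))  # GTS rumour, assuming the same 1.80GHz -> ~72.0 GB/s
print(bandwidth_gb_s(256, 2000))  # hypothetical 256-bit GDDR4 @ 2.0GHz -> ~64.0 GB/s

So a wide 384-bit bus with "slow" GDDR3 still out-runs a conventional 256-bit board with 2.0GHz GDDR4, which may be exactly why they went that route.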

That is a heck of a card......
 

quantumsheep

Distinguished
Dec 10, 2005
Still looks like nVidia is lagging behind ATI in the technology stakes! (Unified shaders, GDDR4, etc.)

If the specs here are anything to go by, I believe the R600 will be a fair amount better.
 

Mike995

Distinguished
Mar 14, 2006
Yes, the R600 should destroy that card if the rumoured spec of 64 unified shaders is true. Unified shaders are better anyway and better optimised for the DX10 standard.
 

skyguy

Distinguished
Aug 14, 2006
A GRAW graphic?? Puhlease.....pick something generic......or at least a popular game. Yes, I'm nitpicking...... ;)

I think I'll wait for the inevitable bugs to surface and get worked out before I jump into DX10....not to mention that hardly any DX10 games will be available for quite some time.

I think they're putting the cart before the horse, as it were.......
 

3lfk1ng

Distinguished
Jun 28, 2006
I've said it before and I'll warn you all again: if you have a mid-tower case, take out a tape measure and measure an extra 2" past your motherboard (the one you plan on putting a G80 into). If anything in your mid-tower case interferes, you will not be able to use a G80; they are way too long. If that is the case with your mid-tower, you might wanna start searching for a decent full-tower case on Newegg.

Rumors say the R600 will be just as beastly, so if you plan on going with the R600, you may still need to find a full-tower chassis.

I'm just letting people know, because I know some of you already found that your current-gen card(s) was/were hard enough to get into your case. Someone about 2 weeks ago was talking about how they had to position the end of their graphics card between their hard drives. If that's the case, you definitely won't be able to fit a G80.

If your mid-tower has a removable hard drive tray directly behind your graphics card(s) at the front of the case, you can probably get away with taking that out and mounting the hard drive(s) in any spare 5.25" bay(s).
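
If you'd rather check with numbers than eyeball it, here's a trivial sketch (the ~10.5" board length is the rumoured G80 figure, and the half inch of cable clearance is my own guess):

CARD_LENGTH_IN = 10.5   # rumoured G80 board length (assumption, not confirmed)

def card_fits(slot_to_obstruction_in, cable_clearance_in=0.5):
    # True if the card plus a little room for power cables fits
    return slot_to_obstruction_in >= CARD_LENGTH_IN + cable_clearance_in

print(card_fits(9.5))    # False -- pull the drive tray or go full tower
print(card_fits(11.5))   # True -- typical full-tower clearance

Measure from the PCI-E slot's rear bracket to whatever sits in front of it (drive cage, fan, whatever) and plug that number in.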

Silencer
Disk Twins

For full towers:
Full towers at newegg

Anyways....
@Tolemi

Nice find
 

raven_87

Distinguished
Dec 29, 2005
Well, I"m assuming it should have applied physics processing as I see its being pushed right along with GRAW. hmmm neat.

@ Gam3ra

Kinda figured that. But what are you going to have..... 4 PCI-E 16x slots??
 

tolemi

Distinguished
Oct 20, 2006
Thanx for the info, dude... but I am using a TT Tai-Chi with 6 internal 12cm fans and 3 other 9cm ones, and a TT Toughpower 750W PSU, so I think I am cool enough for the DX10 cards.
 

melarcky

Distinguished
Mar 23, 2006
What a card. A little power hungry, and yes, why not GDDR4 is a good question. But whatever, that card will smoke anything out there right now. Also, even though there are no DX10 games out, that doesn't mean it won't play DX9 games... Oblivion players will love this card, and Ghost Recon players too!
 

shinigamiX

Distinguished
Jan 8, 2006
Right, this card will need two 6-pin connectors to run. This is ridiculous. One of these cards is estimated to require 300 watts. For SLI, Nvidia recommends an 800W PSU, minimum. Does that mean they will recommend a 2kW PSU for Quad SLI? They are overestimating how far people will go for pure FPS, IMO.
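
For what it's worth, the figures roughly follow from that 300W estimate. A quick sketch (the 300W per card is the rumour above; the 200W for the rest of the system is my own rough guess):

CARD_W   = 300   # estimated worst-case draw per 8800 GTX (from the rumour above)
SYSTEM_W = 200   # rough guess for CPU, drives, fans, motherboard (assumption)

def total_load_w(num_cards):
    # Worst-case system draw with num_cards installed, before PSU headroom
    return num_cards * CARD_W + SYSTEM_W

print(total_load_w(2))   # 800W -- matches Nvidia's SLI minimum exactly
print(total_load_w(4))   # 1400W -- add normal PSU headroom and 2kW isn't absurd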
 

tool_462

Distinguished
Jun 19, 2006
No such thing as overestimating how far people will go to have the best of the best. There is always a market for ridiculously overpowered and overpriced merchandise in any field, from cars to computers to any electronic you can imagine.
 
I think the amount of pixel pipes will make up for it if the shaders aren't unified.
Official G80 specs from Megagames

Those aren't official specs; heck, they aren't even worthy of being called an official rumour. It's just a plain CRAP botch job of the rumours.

"128 PIXEL PIPELINES"?

Yeah sure, on a 2 billion transistor chip.

Even if they mean 128 stream processors, that's still just another rumour at this point, as the Xbit article points out.
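
To put a number on that: a crude linear scaling from the last generation (G71's figures are public; the constant per-pipe transistor cost is obviously a very rough assumption):

G71_TRANSISTORS = 278e6   # GeForce 7900 GTX (G71), public figure
G71_PIPES       = 24      # pixel pipelines in G71

per_pipe = G71_TRANSISTORS / G71_PIPES   # ~11.6M transistors per pipe
print(128 * per_pipe / 1e9)              # ~1.48 billion transistors for 128 full pipes

Against the 700-million-transistor rumour from the Xbit article, 128 full pixel pipelines looks like fantasy; 128 stream processors is a far more plausible reading of the budget.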