
Nvidia 8800 GTX Details Unveiled ...Asus EN8800GTX Details

October 24, 2006 10:22:21 AM





Quote:
An Asian web-site has published several details concerning the Nvidia GeForce 8800 GTX graphics card from Asustek Computer that do not reveal any performance or micro-architectural peculiarities of the chip code-named G80, but do provide some details about the graphics cards powered by Nvidia's forthcoming graphics processing unit (GPU).

The pictures published by the MyChat web-site show that the GeForce 8800 GTX-based graphics card from Asus looks a little different from the alleged Nvidia G80 picture published earlier. While the board does look like a large one, it does not feature a liquid cooling system and has its components placed somewhat differently compared to the previous version. According to the web-site, which claims that the information comes from sources within Asus, one of the world's largest makers of graphics cards, Nvidia's yet-to-be-announced G80 graphics chip will support Microsoft DirectX 10 Shader Model 4.0, Nvidia's GigaThread technology, Nvidia Quantum Effects technology, floating point high dynamic-range (HDR) lighting and so on.

Asus, the web-site claims, will release two versions of the GeForce 8800-series graphics cards: EN8800GTX/HTDP/768M and EN8800GTS/HTDP/640M. The naming scheme implies that there will be two versions of the G80: the GeForce 8800 GTX and the GeForce 8800 GTS, which will differ not only in performance, but also in the amount of installed memory and the width of the memory interface: the GTX boards will carry 768MB on a 384-bit interface, whereas the GTS cards will have 640MB on a 320-bit interface. The higher-end 8800 GTX model will have its chip clocked at 575MHz and GDDR3 memory operating at 1.80GHz.

It is uncertain why Asustek and Nvidia use GDDR3 memory for the new top-of-the-range products, as GDDR3 memory operating at 1.70GHz and above is pretty rare, whereas GDDR4 memory easily operates at 2.0GHz and beyond.

Specifications of the G80 chip are not clear. Some sources indicate that Nvidia's first DirectX 10 chip will incorporate 48 pixel shader processors and an unknown number of vertex shader/geometry shader processors. Others, however, claim that the G80 has 32 pixel and 16 vertex and geometry shader processors. Yet another source has indicated that the G80 will have a unified shader architecture and will consist of 700 million transistors.

http://www.xbitlabs.com/news/video/display/200610231427...
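For a sense of what those memory figures mean, here is a rough bandwidth sketch in Python based only on the numbers quoted above (384-bit bus, 1.80GHz GDDR3, and the 2.0GHz GDDR4 rate the article mentions); it assumes the quoted clocks are effective data rates, which is how GDDR3/GDDR4 speeds are usually advertised.

# Rough peak-bandwidth arithmetic from the figures quoted above; treats the
# quoted clocks as effective (data) rates, which is the usual convention.

def bandwidth_gb_s(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times data rate in GT/s."""
    return (bus_width_bits / 8) * effective_clock_ghz

print(bandwidth_gb_s(384, 1.80))  # 8800 GTX as described: ~86.4 GB/s
print(bandwidth_gb_s(384, 2.00))  # hypothetical GDDR4 at the article's 2.0GHz: ~96.0 GB/s
# The GTS bus is 320-bit, but its memory clock isn't given in the article.

On those assumptions, GDDR4 at 2.0GHz would only buy roughly 10% more bandwidth on the same bus, which may be part of why GDDR3 was considered good enough.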

That is a hack of a card......
October 24, 2006 11:10:20 AM

Quote:
That is a hack of a card......


Do you mean hack - a botched job - or heck - commonly taken to mean very good?
October 24, 2006 11:16:03 AM

Hack as in, "Can you hack it?"
October 24, 2006 11:34:03 AM

Still don't see the point in using GDDR3 memory instead of GDDR4.
October 24, 2006 12:14:07 PM

Quote:
Hack as in, "Can you hack it?"


break it?
October 24, 2006 12:33:03 PM

Quote:
Still don't see a point in using the GDDR3 memory instead of GDDR4.


Maybe ATI had something to do with it :? . I dunno.

Quote:
That is a hack of a card......


Don't you mean it's a hull of a card? :p 
October 24, 2006 12:38:09 PM

Still looks like nVidia is lagging behind ATI in the technology stakes! (Unified shaders, GDDR4, etc.)

If the specs here are anything to go by, I believe that the R600 will be a fair amount better.
October 24, 2006 1:15:40 PM

Yes, the R600 should destroy that card if its specs of 64 unified shaders are true. Unified shaders are better anyway and more optimised for the DX10 standard.
October 24, 2006 3:42:19 PM



It's a huge chip :roll:
October 24, 2006 3:45:44 PM

Two external interlink connections?????
October 24, 2006 3:47:27 PM

A GRAW graphic?? Puhlease.....pick something generic......or at least a popular game. Yes, I'm nitpicking...... ;) 

I think I'll wait for the inevitable bugs to surface and get worked out before I jump into DX10... not to mention that hardly any DX10 games will be available for quite some time.

I think they're putting the cart before the horse, as it were.......
October 24, 2006 3:52:15 PM

Quote:
Two external interlink connections?????
QuadSLI
October 24, 2006 4:27:09 PM

I've said it before and I'll warn you all again... if you have a mid-tower case, take out a tape measure and measure an extra 2" past your motherboard (in the case you plan on putting a G80 into). If anything interferes within that space in your mid-tower case, you will not be able to use a G80... they are way too long. If that is the case with your mid-tower, you might wanna start searching for a decent full-tower case on Newegg.
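If it helps, here is a minimal sketch of that clearance check in Python; the only figure taken from this post is the "extra 2 inches" rule of thumb, the ATX board depth is the standard 9.6" spec, and the example measurements are just placeholders.

# Minimal clearance check following the advice above. The 2" figure is the
# rule of thumb from this post; the example measurements are placeholders.

ATX_BOARD_DEPTH_IN = 9.6   # standard ATX motherboard depth
EXTRA_CLEARANCE_IN = 2.0   # "measure an extra 2 inches past your motherboard"

def likely_to_fit(space_to_first_obstruction_in: float) -> bool:
    """True if the space from the rear slot bracket to the first obstruction
    (drive cage, fan, etc.) meets the board-depth-plus-2-inch rule of thumb."""
    return space_to_first_obstruction_in >= ATX_BOARD_DEPTH_IN + EXTRA_CLEARANCE_IN

print(likely_to_fit(10.5))  # False: a typical mid-tower may come up short
print(likely_to_fit(12.0))  # True: full-tower-style clearance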

Rumors say that the r600 will be just as beastly. So if you plan on going with the r600, you may still need to find a full tower chassis.

I'm just letting people know, because I know some of you already found that your current-gen card(s) was/were hard enough to get into your case. Someone about 2 weeks ago was talking about how they had to position the end of the graphics card between their hard drives. If that's the case, you definitely won't be able to fit a G80.

If your mid-tower has a removable hard drive tray directly behind your graphics card(s) at the front of the case, you can probably get away with taking that out and mounting the hard drive(s) in any spare 5.25" bay(s).

Silencer
Disk Twins

For full towers:
Full towers at newegg

Anyways....
@Tolemi

Nice find
October 24, 2006 4:45:55 PM

Well, I'm assuming it should have applied physics processing, as I see it's being pushed right along with GRAW. Hmmm, neat.

@ Gam3ra

Kinda figured that. But what are you going to have..... 4 PCI-E 16x slots??
October 24, 2006 5:25:56 PM

Quote:
Two external interlink connections?????
QuadSLI
But if you look closely I think there are also ->2<- PCI-E power connectors on there?!?!?!

If that's the case, this is just too much...
October 24, 2006 6:37:23 PM

Thanks for the info, dude... but I am using a TT Tai-Chi with 6 internal 12cm fans and 3 other 9cm fans, and a TT Toughpower 750W, so I think I am cool enough for the DX10 cards.
October 24, 2006 7:12:56 PM

lol, nice dig =)
October 24, 2006 7:24:19 PM

Tolemi, it wasn't specifically aimed at you... it's information for everybody. Sorry if I confused you.
October 24, 2006 8:07:33 PM

What a card. A little power hungry, and yes, why not GDDR4 is a good question. But whatever, that card will smoke anything out there right now. Also, even though there are no DX10 games out, that doesn't mean it won't play DX9 games... Oblivion players will love this card, and Ghost Recon players too!
October 24, 2006 8:29:38 PM

Right, this card will need two 6-pin connectors to run. This is ridiculous. One of these cards is estimated to require 300 watts. For SLI, Nvidia recommends an 800W PSU, minimum. Does that mean they will recommend a 2kW PSU for Quad SLI? They are overestimating how far people will go for pure FPS, IMO.
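To put some rough numbers on that, here is the arithmetic as a quick Python sketch; the per-card 300W and the 800W SLI recommendation come from the post above, while the 200W base system draw is purely an assumption chosen to make the SLI case line up.

# Back-of-the-envelope PSU sizing from the figures in the post above.
# CARD_WATTS comes from the ~300W estimate; the base system draw is an
# assumption, not a figure from the thread.

CARD_WATTS = 300          # estimated draw per G80 card (from the post)
BASE_SYSTEM_WATTS = 200   # assumed CPU/board/drive draw (placeholder)

def estimated_psu_watts(num_cards: int) -> int:
    return BASE_SYSTEM_WATTS + num_cards * CARD_WATTS

for n in (1, 2, 4):
    print(n, "card(s):", estimated_psu_watts(n), "W")
# 1 card: 500W, 2 cards: 800W (matching the quoted SLI recommendation),
# 4 cards: 1400W -- hefty, though still short of a 2kW unit.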
October 24, 2006 9:00:12 PM

No such thing as overestimating how far people will go to have the best of the best. There is always a market for ridiculously overpowered and overpriced merchandise in any field, from cars to computers to any electronic you can imagine.
October 25, 2006 6:06:56 AM

Quote:
I think the amount of pixel pipes will make up for it if the shaders aren't unified.
Official G80 specs from Megagames


Those aren't official specs; heck, they aren't even worthy of being called an official rumour. It's just a plain CRAP botch job of the rumours.

"128 PIXEL PIPELINES"?

Yeah sure, on a 2 billion transistor chip.

Even if they mean 128 stream processors, that's still just another rumour at this point, like the Xbit article points out.
October 25, 2006 6:12:38 AM

Quote:
Two external interlink connections?????


People are so myopic; it isn't for "Quad SLi" but for more open-ended SLi.

The current limit of SLi is 2 (their current quad is really just 2x + 2x). The new connectors, and likely new on-chip SLi support, will likely allow you to daisy-chain multiple cards (whatever their max is, 64 like ATi?), but the limit would no longer be due to the hardware in the same way it currently is. It was expected that Xfire would work that way too, but they never implemented it; it may technically be possible, but this shows that nV is serious about pursuing that solution, and realistically it may allow them to maintain the top spot if ATi can only do 2 cards in Xfire and the G80 can do 4 or more right out of the box.

It just gives them more future options.
October 25, 2006 7:01:01 AM

So, 4x8800GTX...now where the hell am I going to find a 2000w PSU?
October 25, 2006 11:05:23 AM

Quote:


It's huge chip :roll:


That looks wicked!
October 25, 2006 11:31:24 AM

It could be dual core, or physics, as I'm hoping it would be. Still, the power requirement is just crap. They should follow the CPU path and go for the low-watt, high-performance goal.
October 25, 2006 1:20:30 PM



Maybe?
October 25, 2006 2:05:01 PM

Could that be the HDCP chip?
October 25, 2006 2:08:50 PM

Quote:
Oh F*ck nVidia!
Damn, the IHS. Just as I was praising in the CPU forum how ATi and nVidia used the superior shim, they slapped an IHS on! :evil: 
Maybe they do have two cores on one package...


Well that may just be the engineering sample, but it's not like someone won't pry it off within the first week.

Quote:
Note the dot on top-left corner.
And notice the distinctive 8+4 RAM configuration due to the 512bit+128bit wide bus.


You meant 256+128, right? :wink:
October 25, 2006 2:22:35 PM

That last pic looks Photochopped for sure; that package's bridges look out of proportion, as does the laser inscription, but it is a low-res pic.
October 25, 2006 3:02:21 PM

It's Photoshopped around the card; I'm not touching the card! It's a real 8800.
October 26, 2006 4:17:44 AM

I know it's a real GF8800, but that 'FPU' doesn't look right at all, and I doubt it's what's actually there so much as what someone thinks or hopes is there. If you have the card, then a better detailed picture of the laser script would be good, or, fearing too much info, a tilted off-angle shot would help show chop or not.
October 26, 2006 11:17:35 AM

Quote:


Maybe?


I assume this is the GT version, since it has only one 6-pin power socket? In the other pic, the GTX or whatever it is, is longer and has two 6-pin power connectors, and that "part" is blocked.
October 26, 2006 1:03:22 PM

Quote:

"128 PIXEL PIPELINES"?

Yeah sure, on a 2 billion transistor chip.

Even if they mean 128 stream processors, that's still just another rumour at this point, like the Xbit article points out.


Yep. Marketing. Well, others call it propaganda, yet basically it's marketing. I'm really curious to see what a "pipeline" is to those sales people. :lol: 
October 26, 2006 4:01:58 PM



Still doesn't look quite right. :?:

Based on the speeds listed in that ATiTool-type OC'er (520MHz memory), that would have to be the GT, not the GTX as it's being identified; that's nowhere near the 1.7+GHz effective (850+MHz double pumped). So either it's a weird reference model, or 3DMark is reading the chips wrong, or the reference designs didn't ship with exotic memory yet, or it's some more grist for the rumour mill.

Interesting no one else noticed that in all the excitement. :wink:

Looking interesting though.
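For anyone puzzling over that mismatch, it is just base clock versus effective (double-pumped) rate; a quick Python sketch shows the gap, using the 520MHz reading and the 850MHz figure from the post above.

# Monitoring tools usually report the base memory clock, while GDDR3 speeds
# are quoted at the double-pumped (effective) rate.

def gddr3_effective_mhz(base_mhz: float) -> float:
    """Two data transfers per clock."""
    return base_mhz * 2

print(gddr3_effective_mhz(520))  # 1040 MHz effective -- far below the rumoured 1.8GHz
print(gddr3_effective_mhz(850))  # 1700 MHz effective -- the "850+MHz double pumped" case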
October 29, 2006 6:11:29 PM

Well, it's been revealed what that extra chip is for: VIDEO I/O;

http://www.theinquirer.net/default.aspx?article=35385

It seems strange; didn't nV promote that they would have HDCP included at the chip level? Now not only is HDCP external, but so are the RAMDACs and TMDS, which are also usually internal (at least one internal TMDS, usually).

Very strange. Sounds like either they ran out of transistor room, or some of the asynchronous features of the G80 didn't mesh with the speeds of the RAMDACs and TMDSs. Or they've decided to make a completely separate video solution to avoid issues across generations. It also looks like a solution that helps with whatever the early rumours were about broken HDCP support.
October 29, 2006 10:58:37 PM

Well, the GTX won't even fit in my current case :( 