Some info: G80 specs (read somewhere)

mayouuu

Distinguished
Jan 24, 2006
124
0
18,680
* Unified Shader Architecture
* Supports FP16 HDR+MSAA
* Supports GDDR4 memory
* Close to 700M transistors (G71 - 278M / G70 - 302M)
* New AA mode : VCAA
* Core clock scalable up to 1.5GHz
* Shader Performance : 2x Pixel / 12x Vertex over G71
* 8 TCPs & 128 stream processors
* Much more efficient than traditional architecture
* 384-bit memory interface (256-bit+128-bit)
* 768MB memory size (512MB+256MB)
* Two models at launch : GeForce 8800GTX and GeForce 8800GT
* GeForce 8800GTX : 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. US$649
* GeForce 8800GT : 6 TCPs chip, 320-bit memory interface, fan cooler. US$449-499
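
For scale, a rough bandwidth back-of-envelope on that 384-bit bus. The ~2 GHz effective GDDR4 data rate below is a guess, since the leak gives no memory clock:

    bus_width_bits = 384               # rumored 384-bit interface
    effective_data_rate_hz = 2.0e9     # assumed ~2 GHz effective GDDR4 (a guess)
    bandwidth_gb_s = bus_width_bits / 8 * effective_data_rate_hz / 1e9
    print(bandwidth_gb_s)              # 96.0 GB/s, vs ~51 GB/s on a 7900 GTX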
 

mayouuu

Distinguished
Jan 24, 2006
124
0
18,680
Yes, it sounds more realistic... but if more complex circuits are needed, don't be surprised by 700M transistors... and we're already hearing about those power consumption figures...
 

theaxemaster

Distinguished
Feb 23, 2006
375
0
18,780
Yeah, it is a dual-core chip. They're still low on pixel shader units, if you ask me. It will be interesting to see how Nvidia's dual-core solution stacks up against ATI's single-core solution.

The graphics card companies have said it themselves, though: the upcoming generation of cards is going to be the most power-hungry and hottest ever. After that, I've read, they're going to work on improving performance without increasing transistor count and heat on the scale they have been. The G80/R600 is the generation to skip if you want to keep power requirements down.
 

pauldh

Illustrious
I don't put a lot of faith in what VR-Zone had up there, but it could pan out to be accurate. I'm still thankful they translated and passed the info on to us (rumors are fun). I'm sure if someone wants to see it, the Anand, [H], or Beyond3D forums would have a working link to it.
 

raven_87

Distinguished
Dec 29, 2005
1,756
0
19,780
I'm just looking at the memory size and the interface... those are some interesting numbers. :?:

BTW: Didn't Nvidia already state that G80 will NOT be a unified shader design?
 

Doughbuy

Distinguished
Jul 25, 2006
2,079
0
19,780
It's kind of sad when one of the deciding factors for getting an apartment is whether or not electricity is included...

If I add up all the power my gaming system would use... it would come to around 1.2 kW average... sigh =*(
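
For what it's worth, at that draw the monthly bill is easy to estimate; the around-the-clock duty cycle and $0.10/kWh rate here are assumptions, not the poster's figures:

    avg_draw_kw = 1.2                # the estimated average draw above
    hours_per_month = 24 * 30        # assuming it runs around the clock
    rate_usd_per_kwh = 0.10          # assumed electricity rate
    print(avg_draw_kw * hours_per_month * rate_usd_per_kwh)  # ~86 USD/month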
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
Yeah, it is a dual-core chip. They're still low on pixel shader units, if you ask me. It will be interesting to see how Nvidia's dual-core solution stacks up against ATI's single-core solution.

The graphics card companies have said it themselves, though: the upcoming generation of cards is going to be the most power-hungry and hottest ever. After that, I've read, they're going to work on improving performance without increasing transistor count and heat on the scale they have been. The G80/R600 is the generation to skip if you want to keep power requirements down.


* Unified Shader Architecture
* Supports FP16 HDR+MSAA
* Supports GDDR4 memory
* Close to 700M transistors (G71 - 278M / G70 - 302M)
* New AA mode : VCAA
* Core clock scalable up to 1.5GHz
* Shader Performance : 2x Pixel / 12x Vertex over G71

They label it a unified shader architecture, but then go on to imply 2x24=48 pixel shaders and 12x8=96 vertex shaders... that's a decent amount of shaders if you ask me. Considering Nvidia has had fewer shaders than ATI in the last generation, I'm not surprised at this count, although I'm still confused about how they can have "unified shaders" yet still quote separate pixel and vertex shader counts.
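
The arithmetic behind that reading, using G71's known unit counts as the baseline (the "equivalents" framing is an interpretation of the rumor, not anything in the leak):

    g71_pixel_shaders = 24   # G71's pixel shader count
    g71_vertex_shaders = 8   # G71's vertex shader count
    print(g71_pixel_shaders * 2)    # 48 pixel-shader equivalents (the 2x claim)
    print(g71_vertex_shaders * 12)  # 96 vertex-shader equivalents (the 12x claim)
    # On a truly unified design these aren't separate fixed units; one pool of
    # stream processors gets scheduled for pixel or vertex work as needed.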
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
It almost sounds like they are half-assing a second core into the G80 in order to cope with the geometry shader requirement, or whatever. I am not totally versed in DX10, nor do I claim to be. However, with all this crap they are putting in the cards, they had better damn well give me some good loving for all the heat and electricity I am going to have to pay for.

These DX10 cards are getting out of hand... well, I say that before I've seen any performance reviews. If they double my G71 speeds, then we can talk. However, if it falls short and lands anywhere under 45-55%, I will say F it and wait for the second revision of DX10. Then again, I could just WC them and really have some fun. We are just commenting on rumors, so we shall see what really pans out.

I am still trying to wrap my head around having a 2nd, smaller core with a 128-bit memory interface, 1/2 the RAM, and other half-assed specs. Hmmm, it makes me wonder.

Doughbuy: I use on average 1500 kWh in my 1-bedroom loft, lol, and that's with my sig rig turned off half the day.
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
Uh oh, 30k posts, congrats. So what you're saying is, in effect, 7900 GTX + 7900 GT? Obviously not what is actually happening, but the concept is similar in nature.

That sounds strange, but I am guessing nVidia went through some rationalization process and decided that having 2 unequal cores is a good idea. That is one weird way of designing a core; of course, ATi's 250W R600 (if that rumor holds) is just as ridiculous. This reminds me of the days when the 6800 Ultra came out and just plain walked all over the 9800 Pro. Doubled the pipes, upped the memory (well over the normal 9800 Pro :wink: ).

In any case, I don't like the idea of paying $600+ for a GPU. I was reluctant to pay for a 7800 GTX, and I paid $350 for it new in December, then upgraded to a 7900 GTX for shipping cost only. Oh well, we shall see how the market reacts to this shift towards insane power consumption... we all know how that panned out for CPUs. OH BURN!

No, not really; chipmakers just began to differentiate market segments with lukewarm CPUs and boiling CPUs.
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
OK, I am on board with that. It is kinda what I was thinking, but clearly I did not articulate it well, which sometimes happens, meh. So you're saying the constraints of process size and PCB size limit the second core to 1/2 the memory and memory bus width? That's the feeling I'm getting, anyway. Time to do some research... as if I don't do enough of that with this damn accounting degree!!! lol

This whole unified shader architecture should be interesting. In theory it sounds like a wonderful idea; in implementation it might prove a programmer's worst nightmare. Maybe one day I will audit AMD/Nvidia and I can give you a real look at what a graphics card will be :wink:

Oh yeah, except for that whole NDA and confidentiality clause I sign when I go to work on a new engagement. :?
 

lasseht

Distinguished
Apr 4, 2006
28
0
18,530
I am still trying to wrap my head around having a 2nd, smaller core with a 128-bit memory interface, 1/2 the RAM, and other half-assed specs. Hmmm, it makes me wonder.

In-game physics? Or could it swap between that and GPU work as needed?

Just wondering too... or dreaming :?: 8O
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
I am still trying to wrap my head around having a 2nd, smaller core with a 128-bit memory interface, 1/2 the RAM, and other half-assed specs. Hmmm, it makes me wonder.

In-game physics? Or could it swap between that and GPU work as needed?

Just wondering too... or dreaming :?: 8O

No, I'm guessing that the smaller core is actually a DX9 GPU, because the card still needs to support older games, but the unified shader architecture is DirectX 10 only.
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
I am still trying to wrap my head around having a 2nd, smaller core with a 128-bit memory interface, 1/2 the RAM, and other half-assed specs. Hmmm, it makes me wonder.

In-game physics? Or could it swap between that and GPU work as needed?

Just wondering too... or dreaming :?: 8O

No, I'm guessing that the smaller core is actually a DX9 GPU, because the card still needs to support older games, but the unified shader architecture is DirectX 10 only.

Doubt it.
 

ReliReli

Distinguished
Jul 4, 2006
55
0
18,630
Well, here's the reason I invested in that Enermax Galaxy 1kW power supply for my next build. It's do-or-die time now!

I'm perfectly happy to wait on G80/R600 for now, though. I mean, between the price and whatever new architecture gets released in that time... not yet, thanks.
 

Vinny

Distinguished
Jul 3, 2004
402
0
18,780
I keep hearing about the need for 1kW PSUs, and then others saying that the newer GPUs will use up to 300W. My system currently uses 273W without the GPU, so... maybe I can run a DX10 card without upgrading. Why would I need a 1kW PSU? I'm probably missing something. :lol:
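
That math checks out for a single card. A quick sketch; the 300 W figure is the rumor, and keeping a PSU under ~80% of its rating is a common rule of thumb, not a spec:

    system_without_gpu_w = 273   # the measured draw above, without a GPU
    rumored_gpu_w = 300          # rumored worst case for one DX10 card
    total_w = system_without_gpu_w + rumored_gpu_w
    print(total_w)               # 573 W sustained load
    print(total_w / 0.8)         # ~716 W PSU keeps the load under ~80% of rating

A 1kW unit only starts to make sense if you're planning two of these in SLI, where the same math pushes the load past 800 W.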
 

yakyb

Distinguished
Jun 14, 2006
531
0
18,980
What about the hybrid water/air cooling? How would that work on one cooler? Surely if you didn't use water cooling, the space left for the water would lead to big cooling inefficiencies when running on air alone.
 

GavinLeigh

Distinguished
Jun 30, 2006
333
0
18,860
Actually, rumour has it that not only will you need a 1kW power supply, but you'll also need a water faucet connection for the card's water cooling supply. They'll provide a splitter in the box to hook up to your washing machine hose.

Alternatively, Zalman will offer a 40-gallon Reserator (it's a copper water tank painted blue) for a tiny investment of $1100. The water pump is a 125cc two-stroke lawn mower engine. It's a little noisy at 105dB, but it can run in stealth mode at 75dB. Oh, and the best thing is that they give you a free baseball cap with "Zalman... We're Cool" written on it.

No really... it's true... HONEST... I know I read it somewhere.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
This, in my personal opinion, is just obscene. Making a two-GPU card the STANDARD flagship? No, I don't think I need that high a price for my machine. What happened to a good, solid single-GPU card?

And as for memory... If they're identical GPUs, but the second has both less memory and a narrower memory bus, won't that limit the card's ability to run well? After all, in a parallel GPU setup you need an identical copy of the texture set in each GPU's VRAM.
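
A sketch of that worry, assuming the rumored 512 MB + 256 MB split and classic SLI-style texture mirroring (how this card would actually share memory is unknown):

    gpu_a_vram_mb = 512   # rumored primary pool
    gpu_b_vram_mb = 256   # rumored secondary pool
    # If every texture must be mirrored into both pools, the shared texture
    # budget is capped by the smaller pool, not by the 768 MB total:
    print(min(gpu_a_vram_mb, gpu_b_vram_mb))   # 256 MB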

And as for the shader counts... From what I see, it's apparently 24 shaders per core, which I find to be a disappointment. Even though it might be nice that they're unified for once.

This just reminds me of the dread people felt when the GeForce 6800 Ultra was introduced, and people tensely waited for ATi to unveil its X800 XT... The 6800 Ultra brought us a two-slot card that ran noisy, hot, and consumed so much power that the (then modern) AGP version needed TWO power connectors from the PSU... PLUS they introduced SLi with that card.

I just hope ATi will come back like they did back in 2004, bringing us a SANE card to compare with nVidia's crazy card. I know that I for one will want no part of the GeForce 8800 series like that...
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
This, in my personal opinion, is just obscene. Making a two-GPU card the STANDARD flagship? No, I don't think I need that high a price for my machine. What happened to a good, solid single-GPU card?

And as for memory... If they're identical GPUs, but the second has both less memory and a narrower memory bus, won't that limit the card's ability to run well? After all, in a parallel GPU setup you need an identical copy of the texture set in each GPU's VRAM.

And as for the shader counts... From what I see, it's apparently 24 shaders per core, which I find to be a disappointment. Even though it might be nice that they're unified for once.

This just reminds me of the dread people felt when the GeForce 6800 Ultra was introduced, and people tensely waited for ATi to unveil its X800 XT... The 6800 Ultra brought us a two-slot card that ran noisy, hot, and consumed so much power that the (then modern) AGP version needed TWO power connectors from the PSU... PLUS they introduced SLi with that card.

I just hope ATi will come back like they did back in 2004, bringing us a SANE card to compare with nVidia's crazy card. I know that I for one will want no part of the GeForce 8800 series like that...

Why not put 2 GPUs in the flagship card? Hasn't it always been that the flagship must represent the company in all its glory and fame? Does an Enzo let you down? No. Should an 8800GTX? No. They bring out the baddest and the best at the highest price... if you want a solid single-GPU card, go to the midrange, because that's where it's headed.

As far as the memory bus width goes, the second core is for a different set of calculations, I believe, like physics, while the first core does the same thing graphics cards have been doing for a while. If anything, you should be thankful, as it is a "bonus" over the regular card and saves you from having to buy a PhysX card that would also burn your wallet for another $280.