
Some Info: G80 Specs (read somewhere)

September 20, 2006 2:43:34 PM

* Unified Shader Architecture
* Supports FP16 HDR + MSAA
* Supports GDDR4 memory
* Close to 700M transistors (G71 - 278M / G70 - 302M)
* New AA mode : VCAA
* Core clock scalable up to 1.5GHz
* Shader Performance : 2x Pixel / 12x Vertex over G71
* 8 TCPs & 128 stream processors
* Much more efficient than traditional architecture
* 384-bit memory interface (256-bit+128-bit)
* 768MB memory size (512MB+256MB)
* Two models at launch : GeForce 8800GTX and GeForce 8800GT
* GeForce 8800GTX : 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. US$649
* GeForce 8800GT : 6 TCPs chip, 320-bit memory interface, fan cooler. US$449-499


September 20, 2006 2:51:10 PM

Sounds amazing. Too bad we don't have concrete info on either G80 or RD600.
September 20, 2006 3:00:35 PM

Yes, it sounds more realistic... but if more complex circuits are needed, don't be surprised by 700M transistors... and then there's the power consumption we keep hearing about...
September 20, 2006 3:04:34 PM

yeah, my 450W power supply will be obsolete in 6 months
September 20, 2006 3:37:46 PM

... Looks impressive, but the price... it's killing me... looks like it's time to donate my other nut...
September 20, 2006 3:40:40 PM

Yeah it is a dual core chip. They're still low on pixel shader units if you ask me though. It will be interesting to see how nvidia's dual core solution will stack up to ati's single core solution.

The graphics card companies have said it themselves, though: the upcoming generation of cards is going to be the most power-hungry and hottest ever. After that, I've read, they're going to work on improving performance without increasing transistor count/heat on the scale they have been. G80/R600 is the gen to skip if you want to keep the power requirements down.
September 20, 2006 3:51:57 PM

I don't put a lot of faith in what VR-Zone had up there, but it could pan out to be accurate. I'm still thankful they translated and passed the info on to us. (Rumors are fun.) I'm sure if someone wants to see it, the Anand, [H], or Beyond3D forums would have a working link to it.
September 20, 2006 3:57:01 PM

I'm just looking at the memory size and the interface... those are some interesting numbers. :?:

BTW: Didn't Nvidia already state that G80 will NOT be a unified shader design (USD)?
September 20, 2006 4:56:04 PM

It's kind of sad when one of the deciding factors for getting an apartment is whether or not electricity is included...

If I add up all the power my gaming system would use, it comes to around 1.2kW on average... sigh =*(
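
For scale, here is a minimal sketch of that arithmetic, assuming the 1.2 kW figure is a round-the-clock average and a hypothetical $0.10/kWh rate (both placeholders, not from the thread):

```python
# Rough sketch (my numbers, not from the thread) of what a sustained
# 1.2 kW average draw works out to per month. The $0.10/kWh rate is a
# placeholder; substitute your own utility rate.
avg_draw_kw = 1.2          # claimed average draw of the gaming system
hours_per_month = 24 * 30  # assume it runs around the clock
rate_usd_per_kwh = 0.10    # hypothetical electricity rate

monthly_kwh = avg_draw_kw * hours_per_month    # 864 kWh
monthly_cost = monthly_kwh * rate_usd_per_kwh  # about $86
print(f"{monthly_kwh:.0f} kWh/month, roughly ${monthly_cost:.2f}")
```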
September 20, 2006 5:35:50 PM

Man, that's one fast video card 8O ... way faster than any of today's video cards.
September 20, 2006 9:27:49 PM

Quote:
Yeah it is a dual core chip. They're still low on pixel shader units if you ask me though. It will be interesting to see how nvidia's dual core solution will stack up to ati's single core solution.

The graphics card companies have said it themselves, though: the upcoming generation of cards is going to be the most power-hungry and hottest ever. After that, I've read, they're going to work on improving performance without increasing transistor count/heat on the scale they have been. G80/R600 is the gen to skip if you want to keep the power requirements down.



* Unified Shader Architecture
* Supports FP16 HDR + MSAA
* Supports GDDR4 memory
* Close to 700M transistors (G71 - 278M / G70 - 302M)
* New AA mode : VCAA
* Core clock scalable up to 1.5GHz
* Shader Performance : 2x Pixel / 12x Vertex over G71

They label it a unified shader architecture but then go on to imply 2x24 = 48 pixel shaders and 12x8 = 96 vertex shaders... that's a decent amount of shaders if you ask me. Considering Nvidia has had fewer shaders than ATI in the last generation, I'm not surprised at this count, although I'm still confused about how they can have "unified shaders" but still quote separate pixel and vertex shader counts.
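
As a sanity check on those multipliers, here is a minimal sketch assuming G71's commonly cited unit counts (24 pixel shaders, 8 vertex shaders) as the baseline for the rumored "2x Pixel / 12x Vertex" figures:

```python
# Minimal sketch: implied unit counts if the rumored "2x Pixel / 12x Vertex
# over G71" multipliers are applied to G71's commonly cited shader counts.
g71_pixel_shaders = 24    # GeForce 7900 GTX pixel shader units
g71_vertex_shaders = 8    # GeForce 7900 GTX vertex shader units

implied_pixel = 2 * g71_pixel_shaders     # 48
implied_vertex = 12 * g71_vertex_shaders  # 96
print(f"Implied: {implied_pixel} pixel, {implied_vertex} vertex shaders")
```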
September 20, 2006 9:59:42 PM

It almost sounds like they are half-assing a second core into the G80 in order to cope with the geometry shader requirement, or whatever. I am not totally versed on DX10, nor do I claim to be. However, with all this crap they are putting in the cards, they had better give me some good loving for all the heat and electricity I am going to have to pay for.

These DX10 cards are getting out of hand, though I say that before I see any performance reviews. If they double my G71 speeds, then we can talk. However, if it falls short and lands anywhere under 45-55%, I will say F it and wait for the second revision of DX10. Then again, I could just WC them and really have some fun. We are just commenting on rumors, so we shall see what really pans out.

I am still trying to wrap my head around having a 2nd smaller core with a 128bit memory interface, with 1/2 the ram and other 1/2 assed specs. Hmmm, it makes me wonder.

Doughboy: I use on average 1500 kWh in my one-bedroom loft lol, and that's with my sig rig turned off half the day.
September 21, 2006 1:09:24 AM

Uh oh, 30k posts, congrats. So what you're saying is, in effect, 7900GTX + 7900GT? Obviously not what is happening in actuality, but the concept is similar in nature.

That sounds strange, but I am guessing nVidia went through some rationalization process and decided that having two unequal cores is a good idea. That is one weird way of designing a core; of course, ATi's 250W RD600 (if that rumor holds) is just as ridiculous. This change seems akin to the days when the 6800 Ultra came out and just plain walked all over the 9800 Pro. Doubled the pipes, upped the memory (well over the normal 9800 Pro :wink: ).

In any case, I don't like the idea of paying $600+ for a GPU. I was reluctant to pay for a 7800GTX; I paid $350 for it new in December, and upgraded to a 7900GTX for shipping cost only. Oh well, we shall see how the market reacts to this shift towards insane power consumption... we all know how that panned out for CPUs. OH BURN!

No, not really; chipmakers just began to differentiate market segments with lukewarm CPUs and boiling CPUs.
September 21, 2006 1:28:01 AM

OK, I am on board with that. It is kinda what I was thinking, but clearly I did not articulate it well, which sometimes happens, meh. So you're saying the constraints of the process size and PCB size limit the second core to half the memory and memory bus width? That's the feeling I'm getting. Time to do some research... as if I don't do enough of that with this damn accounting degree!!!!!! lol

This whole unified shader architecture should be interesting. In theory it sounds like a wonderful idea; in implementation it might prove a programmer's worst nightmare. Maybe one day I will audit AMD/Nvidia and I can give you a real look at what a graphics card will be :wink:

Oh yeah, except for that whole NDA and confidentiality clause I sign when I go to work on a new engagement. :?
September 21, 2006 2:34:25 AM

Wusy... I'm not even going to ask about your sig. 8O
September 21, 2006 1:55:50 PM

Quote:

I am still trying to wrap my head around having a 2nd smaller core with a 128bit memory interface, with 1/2 the ram and other 1/2 assed specs. Hmmm, it makes me wonder.


In-game physics? Or can it swap between that and GPU work as needed?

Just wondering too.... or dreaming :?: 8O
September 26, 2006 10:22:47 PM

Quote:
... Looks impressive, but the price... it's killing me... looks like it's time to donate my other nut...


Yeah, you don't need kids anyway. They just take all your game time away.
September 26, 2006 10:28:38 PM

Quote:

I am still trying to wrap my head around having a 2nd smaller core with a 128bit memory interface, with 1/2 the ram and other 1/2 assed specs. Hmmm, it makes me wonder.


In-game physics? Or can it swap between that and GPU work as needed?

Just wondering too.... or dreaming :?: 8O

No, I'm guessing that the smaller core is actually a DX9 GPU, because the card still needs to support older games, and unified shaders are DirectX 10 only.
September 27, 2006 1:09:42 AM

Quote:

I am still trying to wrap my head around having a 2nd smaller core with a 128bit memory interface, with 1/2 the ram and other 1/2 assed specs. Hmmm, it makes me wonder.


In-game physics? Or can it swap between that and GPU work as needed?

Just wondering too.... or dreaming :?: 8O

No, I'm guessing that the smaller core is actually a DX9 GPU, because the card still needs to support older games, and unified shaders are DirectX 10 only.

doubt it
September 27, 2006 6:29:11 AM

Well, here's the reason I invested in that Enermax Galaxy 1kW power supply for my next build. It's do-or-die time now!

I'm perfectly happy to wait for G80/R600 for now, though. I mean the price, along with whatever new arch. gets released in that time...not yet, thanks.
September 27, 2006 3:03:21 PM

I keep hearing about the need for 1kW PSUs, and then others saying that the newer GPUs will use up to 300W. My system currently uses 273W without the GPU, so... maybe I can run a DX10 card without upgrading; why would I need a 1kW PSU? I'm probably missing something. :lol:
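
To put numbers on that, here is a minimal sketch of the PSU sizing math, assuming a hypothetical rule of thumb of keeping sustained load under about 80% of the supply's rating (the 80% figure is my assumption, not from the thread):

```python
# Rough PSU sizing sketch using the figures from the post above.
# The 0.8 loading factor is a hypothetical rule of thumb, not a spec.
system_draw_w = 273   # measured system draw without the GPU
rumored_gpu_w = 300   # rumored worst-case DX10 GPU draw
loading_factor = 0.8  # keep sustained load under ~80% of the PSU rating

total_draw_w = system_draw_w + rumored_gpu_w     # 573 W
suggested_psu_w = total_draw_w / loading_factor  # ~716 W
print(f"Estimated draw {total_draw_w} W -> ~{suggested_psu_w:.0f} W PSU")
```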
September 29, 2006 9:48:14 AM

What about the hybrid water/air cooling? How would that work on one cooler? Surely if you didn't use water cooling, the space left for the water would lead to big cooling inefficiencies when only running on air.
October 2, 2006 3:57:17 PM

Actually, rumour has it that not only will you need a 1kW power supply, but you'll also need a water faucet connection for the card's water cooling supply. They'll provide a splitter in the box to hook to your washing machine hose.

Alternatively, Zalman will offer a 40-gallon Reserator (it's a copper water tank painted blue) for a tiny investment of $1100. The water pump is a 125cc two-stroke lawn mower engine. It's a little noisy at 105dB, but it can run in stealth mode at 75dB. Oh, and the best thing is that they give you a free baseball cap with "Zalman.. We're Cool" written on it.

No really.... it's true... HONEST... I know I read it somewhere.
October 2, 2006 6:16:05 PM

This, in my personal opinion, is just obscene. Making a two-GPU card the STANDARD flagship? No, I don't think I need that high a price for my machine. What happened to a good, solid single-GPU card?

And as for memory... If they're identical GPUs, but the second has both less memory and a narrower memory bus, won't that perhaps limit the card's ability to run well? Since, after all, in running a parallel GPU setup, you need an identical copy of the texture set in each GPU's VRAM.

And as for the shader counts... From what I see, it's apparently 24 shaders per core, which I find to be a disappointment. Even though it might be nice that they're unified for once.

This just reminds me of the dread people felt when the GeForce 6800 Ultra was introduced, and people tensely waited for ATi to unveil its X800XT... The 6800U brought us a two-slot card that ran noisy, hot, and consumed so much power that for the (then modern) AGP version, you needed TWO power connectors from the PSU... PLUS they introduced SLI with that card.

I just hope ATi will come back like they did back in 2004, bringing us a SANE card to compare with nVidia's crazy card. I know that I for one will want no part of the GeForce 8800 series like that...
October 2, 2006 7:54:24 PM

Quote:
This, in my personal opinion, is just obscene. Making a two-GPU card the STANDARD flagship? No, I don't think I need that high a price for my machine. What happened to a good, solid single-GPU card?

And as for memory... If they're identical GPUs, but the second has both less memory and a narrower memory bus, won't that perhaps limit the card's ability to run well? Since, after all, in running a parallel GPU setup, you need an identical copy of the texture set in each GPU's VRAM.

And as for the shader counts... From what I see, it's apparently 24 shaders per core, which I find to be a disappointment. Even though it might be nice that they're unified for once.

This just reminds me of the dread people felt when the GeForce 6800 Ultra was introduced, and people tensely waited for ATi to unveil its X800XT... The 6800U brought us a two-slot card that ran noisy, hot, and consumed so much power that for the (then modern) AGP version, you needed TWO power connectors from the PSU... PLUS they introduced SLI with that card.

I just hope ATi will come back like they did back in 2004, bringing us a SANE card to compare with nVidia's crazy card. I know that I for one will want no part of the GeForce 8800 series like that...


Why not put two GPUs in the flagship card? Hasn't it always been that the flagship must represent the company in all its glory and fame? Does an Enzo let you down? No. Should an 8800GTX? No. They bring out the baddest and the best at the highest price... if you want a solid single GPU, then go to the midrange, because that's where it's going.

As far as the memory bus width goes, the second core is for a different set of calculations, I believe, like physics, while the first core does the same thing graphics cards have been doing for a while. If anything, you should be thankful, as it is a "bonus" over a regular card and saves you from having to buy a PhysX card that would burn your wallet another $280.
October 2, 2006 8:35:37 PM

Quote:
Why not put two GPUs in the flagship card? Hasn't it always been that the flagship must represent the company in all its glory and fame? Does an Enzo let you down? No. Should an 8800GTX? No. They bring out the baddest and the best at the highest price... if you want a solid single GPU, then go to the midrange, because that's where it's going.

As far as the memory bus width goes, the second core is for a different set of calculations, I believe, like physics, while the first core does the same thing graphics cards have been doing for a while. If anything, you should be thankful, as it is a "bonus" over a regular card and saves you from having to buy a PhysX card that would burn your wallet another $280.

You missed a key word, even though it was in ALL CAPS: "STANDARD." This is the same realm that was demonstrated in cards anywhere from the GeForce 2 Ultra, to the Radeon 9700 Pro, to the GeForce 7800GTX. As a "flagship," it's supposed to represent the pinnacle of their GPU engineering... whereas dual-GPU cards like the 7950GX2 and the 8800GTX are not that.

Your analogy using a Ferrari Enzo is FATALLY flawed; an automobile is more like a complete PC: it's not just the V-12 or W-16 engine block itself, but also the transmission, the drivetrain, and the body's aerodynamics, all working together to hurl the vehicle down the road as fast as possible.

Hence, "flagship gamer PCs" will be loaded all-out. However, a graphics card is just a single part. A flaghsip part demonstrates technology above all; nVidia's GeForce 6 flagship was the 6800ultra, not SLi.

And if one has to go to the mid-range or lower to avoid spending $650-700US for a card not using crippled chips, that speaks of a very terrible future for gaming...
October 2, 2006 8:42:31 PM

If the 8800GTX works as if it's one core, why are you getting all bent out of shape about it actually being two?
October 2, 2006 9:02:16 PM

We are bound to see multiple cores on GPUs in the future, especially once the smaller fabrication processes come online.

I imagine that the 8800 will be a good card but not the perfect implementation on the PCB. Remember how big and power hungry the 6800 ultra was, or how the 5800 needed a huge fan and howled. I think this will be a stop-gap DX10 card and once things are better developed you might see all this technology wrapped up into a single chip, with a lower power consumption ( hopefully ).

As always these expensive "flagship" pieces will only appeal to those with money to burn. The mainstream has always been where the big money earners are for the graphics card manufacturers. Give it another 3 months and we'll have some nice, affordable DX10 units out there in the stores.

Personally I think the next few months may be pretty exciting on the hardware front.

8)
October 2, 2006 9:08:06 PM

Quote:
Why not put two GPUs in the flagship card? Hasn't it always been that the flagship must represent the company in all its glory and fame? Does an Enzo let you down? No. Should an 8800GTX? No. They bring out the baddest and the best at the highest price... if you want a solid single GPU, then go to the midrange, because that's where it's going.

As far as the memory bus width goes, the second core is for a different set of calculations, I believe, like physics, while the first core does the same thing graphics cards have been doing for a while. If anything, you should be thankful, as it is a "bonus" over a regular card and saves you from having to buy a PhysX card that would burn your wallet another $280.

You missed a key word, even though it was in ALL CAPS: "STANDARD." This is the same realm that was demonstrated in cards anywhere from the GeForce 2 Ultra, to the Radeon 9700 Pro, to the GeForce 7800GTX. As a "flagship," it's supposed to represent the pinnacle of their GPU engineering... whereas dual-GPU cards like the 7950GX2 and the 8800GTX are not that.

Your analogy using a Ferrari Enzo is FATALLY flawed; an automobile is more like a complete PC: it's not just the V-12 or W-16 engine block itself, but also the transmission, the drivetrain, and the body's aerodynamics, all working together to hurl the vehicle down the road as fast as possible.

Hence, "flagship gamer PCs" will be loaded all-out. However, a graphics card is just a single part. A flaghsip part demonstrates technology above all; nVidia's GeForce 6 flagship was the 6800ultra, not SLi.

And if one has to go to the mid-range or lower to avoid spending $650-700US for a card not using crippled chips, that speaks of a very terrible future for gaming...

Man oh man,
OK, if you feel as though my analogy was flawed, that's just because you refuse to look at it the way I did. I was simply stating that a flagship is the pinnacle of all design (i.e., a Ferrari Enzo and the soon-to-be 8800GTX). What you do NOT understand is that the 7950GX2 is TWO 7900GTs put together, which is two completely separate video cards, as in two completely separate GPUs, each with its own RAM.

The 8800GTX is two cores on one BOARD, not two boards like the 7950GX2. Think of it as a dual-core CPU, but a dual-core GPU. What is wrong with that? I don't know why you are complaining so much.

And the fact that the 7950GX2 was even spawned shows how well Nvidia incorporates all their technology. It is a piece of the puzzle, and it shows even more since ATi has not released such a product at all.

When the 8800GTX comes out, it will be like a 6800 Ultra or a 7900GTX: it is the "standard," as you call it, just 10 (random guess) or so times more powerful. It is not the combining of two boards to make one single-slot graphics card; it is ONE graphics card.

I'm sure Nvidia will release such a graphics card, though, to keep that high-price segment covered: something like an 8800GX2, which would in theory connect two 8800GTs. This would still give them the ultra-high-end quad-SLI segment that they have absolutely no competition in.

It's typical that prices are going that high, since those are launch prices. You are getting spoiled by the prices of late: since it is the end of a generation, all prices are at a low. You used to have to pay $300 for a 7900GT; now you can get one for $230 or so.

The same goes for all ATi and Nvidia cards: they will ask a high price in the beginning, but prices always come down. And if Nvidia releases their DX10 cards first, then they will have the only product offered in that area, and I'm sure they will take advantage of it price-wise. The prices for the X1900XTX and 7900GTX were insanely high when released too, but at that time they offered superior graphics processing power and were in high demand.

The same thing will happen with the 8800GTX: it will be the most powerful graphics card on the market at the time, therefore it will command the highest price.
October 2, 2006 9:15:47 PM

Just some questions....

I know there are gonna be 8800GTX and 8800GT but...

are there gonna be any 8800GS? 8600GT? Because the 8800GTX is gonna be $500 as always and the 8800GT is gonna be around $400. So I expect to get an 8800GS, which would be about $300, or maybe an 8600GT that will be around $200-$250.
October 2, 2006 9:16:19 PM

Quote:
We are bound to see multiple cores on GPUs in the future, especially once the smaller fabrication processes come online.
GPUs have had multiple cores for years, but I know what you mean. ;) 
October 2, 2006 10:19:33 PM

Quote:
Just some questions....

I know there are gonna be 8800GTX and 8800GT but...

are there gonna be any 8800GS? 8600GT? Because the 8800GTX is gonna be $500 as always and the 8800GT is gonna be around $400. So I expect to get an 8800GS, which would be about $300, or maybe an 8600GT that will be around $200-$250.


If you follow how Nvidia released their previous generation, the 8800GTX, 8800GT, and 8600GT will come first, followed by the 8600GS and 8300, then later an 8800GS and 8300GT... and a rumor that I'll start right here: there will be an 8800GX2 combining two 8800GTs together =)

After that will obviously come the 8900GTX and GT later on.
October 10, 2006 11:16:43 PM

Quote:
Just some questions....

I know there are gonna be 8800GTX and 8800GT but...

are there gonna be any 8800GS? 8600GT? Because the 8800GTX is gonna be $500 as always and the 8800GT is gonna be around $400. So I expect to get an 8800GS, which would be about $300, or maybe an 8600GT that will be around $200-$250.


If you follow how Nvidia released their previous generation, the 8800GTX, 8800GT, and 8600GT will come first, followed by the 8600GS and 8300, then later an 8800GS and 8300GT... and a rumor that I'll start right here: there will be an 8800GX2 combining two 8800GTs together =)

After that will obviously come the 8900GTX and GT later on.

It will be REALLY interesting to see if there is an 8800GX2, because the GTX is already dual-core. That means the GX2 would be four cores on one card! Imagine two GX2s in SLI!!!
October 11, 2006 12:48:19 AM

Pointless for someone like me? Nice to think about though! 8O
October 11, 2006 1:37:19 AM

My wallet is already getting angry at me and I haven't even bought one yet :( 