
First DirectX 10-compliant graphics chip, Nvidia 8800, coming in November

Tags:
  • Graphics Cards
  • Chip
  • Graphics
  • Product
October 3, 2006 1:07:41 PM

http://www.digitimes.com/mobos/a20061002A2007.html

Can't wait, my 6800 is getting old; just waiting to get a DX10 one.

Pic of a liquid-cooled 8800:
http://www.engadget.com/2006/10/02/nvidia-busting-out-l...


October 3, 2006 1:26:12 PM

Quote:
http://www.digitimes.com/mobos/a20061002A2007.html

Can't wait, my 6800 is getting old; just waiting to get a DX10 one.

Pic of a liquid-cooled 8800:
http://www.engadget.com/2006/10/02/nvidia-busting-out-l...


Meh, it will be the fastest for about 2 months. Not really worth buying until Vista is released, as the R600 will arrive at the same time as Vista. And we all have that eerie suspicion that the R600 will trounce the G80 in the performance department.
October 3, 2006 1:48:11 PM

300 watts for the G80? Holy smokes! (Literally, if the cooling fails.)

Any word on the ATI wattage? If they can beat the performance and require lower wattage, then I will jump off the Nvidia bandwagon when I need to get DX10.

Cheers.
October 3, 2006 1:55:17 PM

Quote:
300 watts for the G80? Holy smokes! (Literally, if the cooling fails.)

Any word on the ATI wattage? If they can beat the performance and require lower wattage, then I will jump off the Nvidia bandwagon when I need to get DX10.

Cheers.


ATi's R600 VPU should require even more wattage. It is going to be the largest, most powerful VPU ever conceived. Heck, the R600 will start on an 80nm process and quickly migrate to a 65nm process. So it will be pretty darn hot-running.

But the performance should be astounding. The R600 will be far more threaded in design than the R520/580. The one downside is that it will likely run warmer than anything we have ever seen.

So if performance per watt is your thing, then the R600 definitely isn't your cup of tea... lol.
October 3, 2006 1:57:35 PM

Yeah, I have that sneaking suspicion also. I also have a sneaking suspicion the R600 will be atomically powered and liquid nitrogen cooled (can't really decide if I'm being sarcastic here, especially because of how much wattage it'll use and the heat it'll give off)...

Either way, I'm holding off until both of them come out. Maybe 8 case fans aren't enough... where can I dremel a few more holes?
a b U Graphics card
October 3, 2006 2:20:03 PM

Quote:

Either way, I'm holding off until both of them come out. Maybe 8 case fans aren't enough... where can I dremel a few more holes?


On the bottom and the top of your case? :lol: 

Anyway, you all have to know those pics are probably just a prototype of the card, so when the consumer version arrives, let's just hope it's smaller, better looking, and hopefully not power hungry.

If it is power hungry, then the card manufacturers should provide us with a special PSU just for GPUs. :p 
October 3, 2006 2:21:33 PM

If it's even true that it sucks down 300 W, that's effing ridiculous. They really need to make their sh*t more efficient. My entire household doesn't even average a 0.3 kW draw.

And that single card is supposed to suck that much?
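If the 300 W figure is real, the running cost is easy to ballpark. A quick sketch in Python; the hours of use and the $0.10/kWh rate are assumed examples, not anyone's real numbers:

```python
# Back-of-the-envelope running cost for a hypothetical 300 W card.
# All three inputs below are assumptions for illustration.
watts = 300            # rumored card draw under load
hours_per_day = 4      # assumed daily gaming time
rate_per_kwh = 0.10    # assumed electricity rate in $/kWh

kwh_per_day = watts / 1000 * hours_per_day
cost_per_month = kwh_per_day * 30 * rate_per_kwh
print(f"{kwh_per_day:.1f} kWh/day, about ${cost_per_month:.2f}/month")
```

So even at a constant 300 W, the electricity itself is only a few dollars a month at that rate; the bigger hit is the PSU and cooling needed to handle it.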
October 3, 2006 2:42:45 PM

Quote:


So if performance per watt is your thing, then the R600 definitely isn't your cup of tea... lol.


Good thing that my only "thing" is, as Jeremy Clarkson always says, MORE POWER!
October 3, 2006 2:59:15 PM

Why can't they go for a low-watt, low-temp graphics card? It seems like the cards getting hotter and hotter means more money for the price of the card itself, the PSU, and cooling upgrades.
October 3, 2006 3:04:05 PM

I was planning on getting an 8800 GT next month, but it looks like the better option would be just to wait until Vista is released and see what my best option is then. So will Vista ship with DirectX 10? I am kind of confused :( .
October 3, 2006 4:04:35 PM

Quote:
I was planning on getting an 8800 GT next month, but it looks like the better option would be just to wait until Vista is released and see what my best option is then. So will Vista ship with DirectX 10? I am kind of confused :( .


8800 GT? Why not get the 8850 GX2. :roll: We don't even know what they're going to end up releasing next month, and you've already got your part picked out? The article in the link above states that they don't believe DX10 will be ready when Vista ships, and says it will probably arrive later as an update.
October 3, 2006 4:25:37 PM

Quote:
I was planning on getting an 8800 GT next month, but it looks like the better option would be just to wait until Vista is released and see what my best option is then. So will Vista ship with DirectX 10? I am kind of confused :( .


8800 GT? Why not get the 8850 GX2. :roll: We don't even know what they're going to end up releasing next month, and you've already got your part picked out? The article in the link above states that they don't believe DX10 will be ready when Vista ships, and says it will probably arrive later as an update.


It's called history, dude. For the past 3 generations, Nvidia's first release has been a flagship card with an *800 Ultra (or GTX) designation and a slightly lower card with an *800 GT.

Personally, I'm waiting for the 8900 GT :wink:
October 3, 2006 4:35:06 PM

And why do you have that suspicion ElMoIsEviL ??!
October 3, 2006 4:41:36 PM

I understand that, but nobody knows exactly what will be available next month. To say you're going to buy that particular card next month is a bit premature.
October 3, 2006 4:53:44 PM

300 W just for a GPU... 8O I think they need some "PERFORMANCE PER WATT" innovation.
October 3, 2006 5:16:54 PM

If I had deep pockets like the rest of you, I would just have the parts for the new video card shipped to me and then hire someone to put the thing together for me. I will probably upgrade to a 7900 or X1900 card in December. XP is gonna be my main OS for at least 2 more years, or at least until they stop supporting it :wink:
October 3, 2006 5:37:14 PM

Quote:
And why do you have that suspicion ElMoIsEviL ??!


Well, I used to work for ATi, and I still have friends who work there. When I worked there, the engineers were always optimistic about matching whatever nVIDIA was releasing. Now there's this aura of confidence that nothing nVIDIA releases will be able to counter the R600 in the performance category. It's almost an arrogance; quite hard to explain. It's the same aura that even the tech media outlets (TGDaily, DailyTech, the Inquirer and DigiTimes) seem to be projecting.

It's like everyone already knows R600 will be the superior performing product. It's an eerie feeling if you ask me.

I don't doubt that the G80 will be extremely powerful, but even ATi's arrogant remarks that the R600 will be the fastest DX9 card ever, period, no ifs, ands, or buts, and their remarks that it will extend its lead even further under DX10, are not ATi-like either. You'd have to have worked for ATi to understand why the R600 is different from any other product they've released. It's like the R600 is to ATi what the K8 was to AMD.
October 3, 2006 5:49:38 PM

Quote:
And why do you have that suspicion ElMoIsEviL ??!


Well, I used to work for ATi, and I still have friends who work there. When I worked there, the engineers were always optimistic about matching whatever nVIDIA was releasing. Now there's this aura of confidence that nothing nVIDIA releases will be able to counter the R600 in the performance category. It's almost an arrogance; quite hard to explain. It's the same aura that even the tech media outlets (TGDaily, DailyTech, the Inquirer and DigiTimes) seem to be projecting.

It's a similar sort of thing to Sony at the moment. Yesterday, when asked what he thought of the competition the Wii and 360 would offer, Ken "The Father of PlayStation" Kutaragi said, "I don't care." This shows supreme arrogance in what can only be considered a true three-horse race.
October 3, 2006 7:03:47 PM

Quote:
I understand that, but nobody knows exactly what will be available next month. To say you're going to buy that particular card next month is a bit premature.


That's true. I guess I should say I'm waiting for whatever 8-series card matches the 7900 GT for price/performance, and hopefully that card boosts efficiency like the 7900s did too.
October 3, 2006 7:38:01 PM

Quote:
Meh, it will be the fastest for about 2 months. Not really worth buying until Vista is released, as the R600 will arrive at the same time as Vista. And we all have that eerie suspicion that the R600 will trounce the G80 in the performance department.
That's probably a likely scenario. If I get a G80, it won't be for DirectX 10 but for improved DirectX 9 performance.
October 3, 2006 7:39:33 PM

Quote:
Meh, it will be the fastest for about 2 months. Not really worth buying until Vista is released, as the R600 will arrive at the same time as Vista. And we all have that eerie suspicion that the R600 will trounce the G80 in the performance department.
That's probably a likely scenario. If I get a G80, it won't be for DirectX 10 but for improved DirectX 9 performance.

Same as me, then. Games aren't going to be taking full advantage of DX10 for at least a year. Heck, DX9 has barely been fully exploited! Look at the Unreal 3 engine; I think it looks phenomenal, and it is based on DX9.
October 3, 2006 7:45:48 PM

Hmm... does it come with an optional power brick? Because I don't wanna fork over cash for another power supply. I have an old 300 W PSU; I could plug it into the card and then short the green and black wires... or mod my board to power up both PSUs when I press the power button. Also, does the video card box come with a pump/cables/etc.? I'd hate to get a water-cooled card only to have to buy the rest of a water-cooling set...
October 4, 2006 1:03:48 PM

I just remembered seeing this a while back: Fortron Source has a drive-bay power supply which provides an additional 300 W of power (400 W peak).

Not sure if there are others, but this is one possible solution.

Cheers
October 4, 2006 1:24:03 PM

Hopefully both of these cards will run on a 500 W PSU, or I guess I'll be upgrading that too when I pick up one of these cards :( .
October 4, 2006 1:45:44 PM

You know, if ATI and Nvidia wanted to make money, they'd just make bigger graphics cards that took more heat and power, and then start selling PSUs and heatsinks just for their cards. That way, they make money off the card, the PSU, and the additional cooling. Horizontal integration FTW.

If only I was CEO, we'd be rolling in dough.
October 4, 2006 2:03:04 PM

And I would be poor ;(
October 4, 2006 3:02:00 PM

Well... it looks like we'll need octuple PCIe connectors if they want to do quad-SLI...

Either way, things are getting ridiculous. I'm lucky that I don't have to pay for electricity yet (the dorm handles that), but it's a sad state of affairs when one of the determining factors of my apartment search is whether utilities are included...
October 4, 2006 3:02:35 PM

2 power connectors wtf
October 4, 2006 3:10:35 PM

Quote:
2 power connectors wtf


Oh, for the love of Christ, how many times do I have to say it? It is a dual-core/dual-GPU card. Go look up the G80 specs if you don't believe me: 700 million transistors = 2 x 350 million-transistor chips. So, since you have two GPUs on the same card, you need... you guessed it, one power connector for each GPU.
October 4, 2006 3:53:46 PM

Wow, so I guess if you have a 700 W+ PSU you could run the first card off the main PSU, then get a 5.25" PSU for the second card 8O ... but if you're running 4x4, 700 W might not cut it 8O 8O

At this rate we will need a dedicated 30 A 220 V service for our systems soon. Good thing my computer room backs up to the garage, right where there happens to be a 30 A 220 V outlet for an electric dryer, and I just happen to be using a gas dryer :p  Looks like I'm set 8) Now I just need to rob a bank to get the hardware and pay the electric bill :lol: 
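For what it's worth, the dryer-circuit joke checks out on paper. A rough sketch in Python; the 400 W system-overhead figure is just an assumption:

```python
# How many hypothetical 300 W cards a 30 A, 220 V dryer circuit could feed.
volts, amps = 220, 30
circuit_watts = volts * amps          # 6600 W available on the circuit
card_watts = 300                      # rumored per-card figure from this thread
system_overhead = 400                 # assumed CPU/board/drives budget
cards_supported = (circuit_watts - system_overhead) // card_watts
print(circuit_watts, cards_supported)
```

Real electrical code would derate a continuous load to 80% of the breaker rating, but the point stands: one dryer outlet dwarfs anything a PSU can pull.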
Anonymous
October 4, 2006 8:18:56 PM

I agree with the dual core in a single package, but I don't see it translating into a dual power connector. GPUs have been working with quads for quite a while, and there wasn't a connector for each...

And I think that since it's a single package (die?) you can't say there are 2 GPUs, but that's just being picky 8)

My question is the following: how much juice can go through a single PCIe power connector? Is it around 200 watts? If so, the 300 W figures we have seen floating around would lead you to believe that, yeah, you need two of them...
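To put numbers on that question: the PCI Express spec allows up to 75 W from the x16 slot itself, and a 6-pin auxiliary PCIe connector is rated for another 75 W, not 200 W. A quick budget check in Python; the 300 W card draw is just the rumor from this thread, not a confirmed spec:

```python
# Power budget for a rumored 300 W card against PCIe spec limits.
SLOT_W = 75        # max delivered by a PCI Express x16 slot
SIX_PIN_W = 75     # rating of one 6-pin auxiliary power connector

card_draw = 300                       # rumored figure, not a confirmed spec
available = SLOT_W + 2 * SIX_PIN_W    # slot plus two 6-pin connectors
print(available, card_draw <= available)
```

So even with two 6-pin connectors, the in-spec budget is only 225 W; either the 300 W rumor is inflated, or the card would be drawing beyond the connector ratings.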