
Finally, we need an external PSU for those DX10 babes... Dah...

September 13, 2006 9:39:46 AM

Yes, I know that's gonna happen, but the question is: how many watts will they consume?

External PSU and November tech day

Any opinions? Will the external PSU be capable of handling SLI, CF, or even Quad SLI?
September 13, 2006 3:22:39 PM

The speculations had already been addressed back on June 5th, 2006.

Computex 2006 Article

Quote:

ATI and NVIDIA have been briefing power supply manufacturers in Taiwan recently about what to expect for next year’s R600 and G80 GPUs. Both GPUs will be introduced in late 2006 or early 2007, and while we don’t know the specifications of the new cores we do know that they will be extremely power hungry. The new GPUs will range in power consumption from 130W up to 300W per card. ATI and NVIDIA won't confirm or deny our findings and we are receiving conflicting information as to the exact specifications of these new GPUs, but the one thing is for sure is that the power requirements are steep.

Power supply makers are being briefed now in order to make sure that the power supplies they are shipping by the end of this year are up to par with the high end GPU requirements for late 2006/early 2007. You will see both higher wattage PSUs (1000 - 1200W) as well as secondary units specifically for graphics cards. One configuration we’ve seen is a standard PSU mounted in your case for your motherboard, CPU and drives, running alongside a secondary PSU installed in a 5.25” drive bay. The secondary PSU would then be used to power your graphics cards.

September 13, 2006 4:00:57 PM

Quote:
The speculations had already been addressed back on June 5th, 2006.

Computex 2006 Article


ATI and NVIDIA have been briefing power supply manufacturers in Taiwan recently about what to expect for next year’s R600 and G80 GPUs. Both GPUs will be introduced in late 2006 or early 2007, and while we don’t know the specifications of the new cores we do know that they will be extremely power hungry. The new GPUs will range in power consumption from 130W up to 300W per card. ATI and NVIDIA won't confirm or deny our findings and we are receiving conflicting information as to the exact specifications of these new GPUs, but the one thing is for sure is that the power requirements are steep.

Power supply makers are being briefed now in order to make sure that the power supplies they are shipping by the end of this year are up to par with the high end GPU requirements for late 2006/early 2007. You will see both higher wattage PSUs (1000 - 1200W) as well as secondary units specifically for graphics cards. One configuration we’ve seen is a standard PSU mounted in your case for your motherboard, CPU and drives, running alongside a secondary PSU installed in a 5.25” drive bay. The secondary PSU would then be used to power your graphics cards.



300 watts for a graphics card? This is just getting ridiculous.
September 13, 2006 4:05:37 PM

Quote:
300 watts for a graphics card? This is just getting ridiculous.


I really hope nobody would suggest quad SLI with those kinds of cards, or we'd have to shut almost everything down in the house before we boot our PCs :lol: 
September 13, 2006 4:07:05 PM

Quote:
300 watts for a graphics card? This is just getting ridiculous.


I really hope nobody would suggest quad SLI with those kinds of cards, or we'd have to shut almost everything down in the house before we boot our PCs :lol: 

That just made me realize... 1200 watts?!?! Jesus Christ! That is a LOT of power. Probably more power than every other appliance in my house combined!
September 13, 2006 4:14:27 PM

Quote:
The speculations had already been addressed back on June 5th, 2006.

Computex 2006 Article


ATI and NVIDIA have been briefing power supply manufacturers in Taiwan recently about what to expect for next year’s R600 and G80 GPUs. Both GPUs will be introduced in late 2006 or early 2007, and while we don’t know the specifications of the new cores we do know that they will be extremely power hungry. The new GPUs will range in power consumption from 130W up to 300W per card. ATI and NVIDIA won't confirm or deny our findings and we are receiving conflicting information as to the exact specifications of these new GPUs, but the one thing is for sure is that the power requirements are steep.

Power supply makers are being briefed now in order to make sure that the power supplies they are shipping by the end of this year are up to par with the high end GPU requirements for late 2006/early 2007. You will see both higher wattage PSUs (1000 - 1200W) as well as secondary units specifically for graphics cards. One configuration we’ve seen is a standard PSU mounted in your case for your motherboard, CPU and drives, running alongside a secondary PSU installed in a 5.25” drive bay. The secondary PSU would then be used to power your graphics cards.



Isn't this crap just unbelievable? 300W PER (video) CARD!?! My ENTIRE SYSTEM runs on barely more than that! The shit is definitely getting out of hand!

Fortunately, after reading the entire article, I discovered that the card makers are going to ratchet down the power requirements by mid-2007. According to the article, it seems that the competition to release these new cards was so great that they threw power-consumption concerns out the window.

And just think how expensive all of this will be: I doubt many people here have that kind of headroom available from their current power supplies to handle the new cards, so that means buying an overpriced, untested (immature drivers and stability) video card PLUS another power supply (and those might get harder to find, and pricier, if demand for the new cards is really high).
September 13, 2006 4:43:54 PM

Look at it this way. With that much system power being used, there will be heat - you can mod your PC exhaust to the water heater in your home or, if you prefer water cooling, just loop your tubes to the shower for hot water. I see this as a real boon to folks in Minnesota in January - a couple of hours of Oblivion will take care of the whole house and you can keep the spa warm at the same time.
September 13, 2006 5:06:06 PM

If manufacturers want their product to go faster, they have a few choices. One is to increase the amount of power (electricity), and the other is to create a fab for smaller CPUs such as 90mm, 45mm, 30mm... It tends to be easier to increase the power, and consequently the power consumption, than to create an entire new fab plant when there is still much more potential in the product they have. 300 watts today, 500 watts tomorrow. It's the price we pay to have technology progress, without actually progressing.
September 13, 2006 5:11:31 PM

Quote:
If manufacturers want their product to go faster.......the other is to create a fab for smaller CPUs such as 90mm, 45mm, 30mm...


You mean 90nm, 45nm and 32nm right??
September 13, 2006 5:13:49 PM

Well, of course I do, for CPUs. The scheme doesn't apply to other circuit technologies, but that's what I meant.
September 13, 2006 5:25:18 PM

I'm naive when it comes to differences in the design and architecture of CPUs vs. GPUs. Is it feasible for manufacturers to come up with a more efficient architecture and still dramatically increase performance? Both AMD and Intel have already done so with their CPUs, especially with the latest Intel Conroe (more speed, less power consumption). Why not go the same route with GPUs?

PCs requiring a 1200W PSU are absurd. I can no longer b$tch at my wife for running her hairdryer for 20 minutes if I'm running a 1200W computer for 3 hours.
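For the curious, here's a rough back-of-the-envelope comparison in Python; the 1200 W figure is worst case (PSU fully loaded, which real systems rarely hit), and the 1500 W hairdryer and $0.10/kWh rate are just assumptions:

# Back-of-the-envelope energy comparison; all figures are assumptions, not measurements.
PC_WATTS = 1200           # worst case: the whole 1200 W PSU fully loaded
HAIRDRYER_WATTS = 1500    # typical hairdryer rating
RATE_PER_KWH = 0.10       # assumed electricity price, dollars per kWh

pc_kwh = PC_WATTS * 3 / 1000                    # 3 hours of gaming
dryer_kwh = HAIRDRYER_WATTS * (20 / 60) / 1000  # 20 minutes of drying

print(f"PC, 3 h:         {pc_kwh:.2f} kWh -> ${pc_kwh * RATE_PER_KWH:.2f}")
print(f"Hairdryer, 20 m: {dryer_kwh:.2f} kWh -> ${dryer_kwh * RATE_PER_KWH:.2f}")

So on paper the 3-hour gaming session burns about 3.6 kWh against the blow-dry's 0.5 kWh, roughly seven times as much.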
September 13, 2006 5:30:57 PM

That is the goal for all manufacturers, but developing a new architecture takes a long time (relatively), and when you're dealing with video cards, they want new ones every 6 months. They cannot re-engineer the GPU every 6 months and put it into production, so they usually just up the power, or add minor things that usually end up requiring more watts to run.
September 13, 2006 6:08:29 PM

Quote:
Yes, I know that's gonna happen, but the question is: how many watts will they consume?

External PSU and November tech day

Any opinions? Will the external PSU be capable of handling SLI, CF, or even Quad SLI?


I bet ATI & nVidia will go cheap on the PSUs, which means each will only have enough watts and connectors for one card, adding to the spaghetti mess of cables already at the back of most PCs, and making it look just like an oversized console.

For SLI we'll probably end up with no option but to have 2 power bricks outside the PC. Hopefully dual-GPU cards will only need one brick each, not two, otherwise Quad SLI systems will need 4 bricks hanging off the back.

I hope they provide the option to run them in brick-less mode if you have a beefy enough PSU already.
September 13, 2006 6:11:21 PM

They do not provide this option, as none in the past have either. I can't wait until everything requires an external brick, like a 150 W modem, a 220 W 10/100 NIC, a 500 W Audigy 5, and so on.
September 13, 2006 9:09:58 PM

Quote:
If manufacturers want their product to go faster, they have a few choices. One is to increase the amount of power (electricity), and the other is to create a fab for smaller CPUs such as 90mm, 45mm, 30mm... It tends to be easier to increase the power, and consequently the power consumption, than to create an entire new fab plant when there is still much more potential in the product they have. 300 watts today, 500 watts tomorrow. It's the price we pay to have technology progress, without actually progressing.


How about this: Stop making a new card every 5 minutes! I mean, c'mon here: we all fall into three distinct categories. They are:

1. The gamer who "absolutely" needs every last frame in Quake;
2. The average user who occasionally games and does some multimedia stuff;
3. Those who write emails, letters, code, and post ridiculous items in forums.

Guy 1 "needs" a high-end card.
Guy 2 requires a mid-range card.
Guy 3 could get by with an abacus, pen/paper/stamps, and friends. Oh, and a low-end card.

So the video card manufacturers could do us all a favor by concentrating on those markets. Then each year (or 18 months), they can release new ones, and because people will always want the newest shizzle, they will buy them. This will also give them time to properly make and test those cards, as well as ensure that supplies of existing ones are adequate.

Think about it: what if a car manufacturer released a new car every week? You buy a Vette in October, but in January they release the new XZR-4LT. Two weeks later, they release the XZR-4XX! And it's basically the same damned thing!
September 13, 2006 9:12:34 PM

Quote:
Look at it this way. With that much system power being used, there will be heat - you can mod your PC exhaust to the water heater in your home or, if you prefer water cooling, just loop your tubes to the shower for hot water. I see this as a real boon to folks in Minnesota in January - a couple of hours of Oblivion will take care of the whole house and you can keep the spa warm at the same time.



That's great for you saps in the snow belt, but hell, I live in the deep South, baby. My power bill was already over $300 this month! Trust me, I ain't in no need of any additional heat!

In other news, I can now eat PB&J sandwiches with my ramen, now that gas has fallen in price lately.
September 13, 2006 9:48:20 PM

Quote:
If manufacturers want their product to go faster, they have a few choices. One is to increase the amount of power (electricity), and the other is to create a fab for smaller CPUs such as 90mm, 45mm, 30mm... It tends to be easier to increase the power, and consequently the power consumption, than to create an entire new fab plant when there is still much more potential in the product they have. 300 watts today, 500 watts tomorrow. It's the price we pay to have technology progress, without actually progressing.


How about this: Stop making a new card every 5 minutes! I mean, c'mon here: we all fall into three distinct categories. They are:

1. The gamer who "absolutely" needs every last frame in Quake;
2. The average user who occasionally games and does some multimedia stuff;
3. Those who write emails, letters, code, and post ridiculous items in forums.

Guy 1 "needs" a high-end card.
Guy 2 requires a mid-range card.
Guy 3 could get by with an abacus, pen/paper/stamps, and friends. Oh, and a low-end card.

So the video card manufacturers could do us all a favor by concentrating on those markets. Then each year (or 18 months), they can release new ones, and because people will always want the newest shizzle, they will buy them. This will also give them time to properly make and test those cards, as well as ensure that supplies of existing ones are adequate.

Think about it: what if a car manufacturer released a new car every week? You buy a Vette in October, but in January they release the new XZR-4LT. Two weeks later, they release the XZR-4XX! And it's basically the same damned thing!


Yes and no. I can choose to buy the 500 W card, or the mid-range, reasonable 80 W card. If they don't keep updating the cards quickly, people will lose out, AND they wouldn't update so quickly if people didn't buy the cards.

As for that last bit of frame rate for online FPSers: I play in the top leagues in SOF2 (basically Quake 3 + CS) and I do just fine with 50 fps. I have a card that can push the game past 300 fps, but I'm too lazy to type in seta com_maxfps 300. 3 extra frames does not make you a better player. It's easier to boast to your online friends and lie, not to mention cheaper.
September 13, 2006 10:16:52 PM

The video for my old Commodore 64 required, like, 0 watts. It was just some small guy inside my monitor with a candle who would furiously draw things on the screen.
September 13, 2006 10:24:59 PM

C64s ruled for gaming in their day. Those load times were downright unbearable, though.
September 13, 2006 10:33:41 PM

Anyone play the C64 game Jill of the Jungle???
That is the first computer game I remember playing.
September 13, 2006 10:56:07 PM

Well, I don't know what you guys, NV, and ATI are thinking, but I will never buy a 300 W card! Never!! Ever!!

I don't know how many of you are gonna buy those cards. Let's do a survey to see.

Post your opinion, Yea or Nay, and I'll keep count until they are finally released. I bet they won't sell well.

Have some marketing/management/tech/design guys in both companies already eaten sxxt??
September 13, 2006 10:58:36 PM

The forum member base here is hardly an accurate sample of the video-card-buying demographic. How will a few replies of yes and no determine whether the 300 W cards sell or not?



How about we say YES, they will sell, because if they wouldn't sell, they wouldn't be made.
September 13, 2006 11:15:31 PM

Old news.
September 13, 2006 11:16:39 PM

Quote:
Well, I don't know what you guys, NV, and ATI are thinking, but I will never buy a 300 W card! Never!! Ever!!

Ok, you play games on obsolete technology...
September 13, 2006 11:32:09 PM

No, I like new technology, but I don't go along with their current path.

Good technology should bring us less power consumption (e.g., Intel and AMD). I would prefer a $1000 high-end single card that consumes 100 W over a $500 card that consumes 300 W. A big short-term price rather than a big long-term price (i.e., the electric bill).

You just misunderstood me :) 
September 13, 2006 11:38:16 PM

Maybe I'm just paranoid. The last card I remember with an external PSU was the Voodoo 5 6000 or 5500??

I wanted one of those sooooo bad. Fortunately I didn't have the cash and didn't buy something from a defunct company.
September 13, 2006 11:45:59 PM

Quote:
The speculations had already been addressed back on June 5th, 2006.

Computex 2006 Article


ATI and NVIDIA have been briefing power supply manufacturers in Taiwan recently about what to expect for next year’s R600 and G80 GPUs. Both GPUs will be introduced in late 2006 or early 2007, and while we don’t know the specifications of the new cores we do know that they will be extremely power hungry. The new GPUs will range in power consumption from 130W up to 300W per card. ATI and NVIDIA won't confirm or deny our findings and we are receiving conflicting information as to the exact specifications of these new GPUs, but the one thing is for sure is that the power requirements are steep.

Power supply makers are being briefed now in order to make sure that the power supplies they are shipping by the end of this year are up to par with the high end GPU requirements for late 2006/early 2007. You will see both higher wattage PSUs (1000 - 1200W) as well as secondary units specifically for graphics cards. One configuration we’ve seen is a standard PSU mounted in your case for your motherboard, CPU and drives, running alongside a secondary PSU installed in a 5.25” drive bay. The secondary PSU would then be used to power your graphics cards.



300 watts for a graphics card? This is just getting ridiculous.

How much power does your 7900GT take? Also, what kind of cooling are you using? Water?
September 13, 2006 11:51:48 PM

Quote:
No, I like new technology, but I don't go along with their current path.

Good technology should bring us less power consumption (e.g., Intel and AMD). I would prefer a $1000 high-end single card that consumes 100 W over a $500 card that consumes 300 W. A big short-term price rather than a big long-term price (i.e., the electric bill).

You just misunderstood me :) 

So you'll pay much more up front for technology that will be old in 6 months?

You lose money either way...
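A quick sketch of the trade-off, using the hypothetical prices and wattages tossed around above (the electricity rate and daily hours are assumptions too):

# When does the pricier, lower-power card break even on electricity alone?
# All numbers are hypothetical, taken from the discussion above.
EFFICIENT = {"price": 1000, "watts": 100}
HUNGRY = {"price": 500, "watts": 300}
RATE_PER_KWH = 0.10    # assumed electricity price, dollars per kWh
HOURS_PER_DAY = 3      # assumed daily gaming time

extra_price = EFFICIENT["price"] - HUNGRY["price"]   # $500 more up front
saved_watts = HUNGRY["watts"] - EFFICIENT["watts"]   # 200 W less while gaming
saved_per_day = saved_watts * HOURS_PER_DAY / 1000 * RATE_PER_KWH

print(f"Electricity saved per day: ${saved_per_day:.2f}")
print(f"Break-even after about {extra_price / saved_per_day / 365:.0f} years")

At roughly six cents a day saved, the pricier card would take decades to pay for itself on the electric bill alone, so yes: you lose money either way.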
September 13, 2006 11:56:10 PM

Whatever manufacturers do, they can't increase power consumption forever: in the end, they will be limited by cooling systems.

Increasing GPUs' consumption is a simple way to get more performance, but it is only possible as long as the heat can be dissipated by a reasonable cooling system. And reasonable means not too expensive.
Most graphics cards now require quite heavy HSFs with heatpipes, and watercooling is becoming more and more common among overclockers... but what comes after that? What will they have to do to dissipate all those watts? Bigger fans on a radiator on a watercooling loop on Peltier elements that eat even more amps?

When watts become more of a problem than a solution, they'll change... but that's not for now, even though we could have hoped that card manufacturers would follow the Intel/AMD marketing battle about "performance per watt".

Anyway, it's not all so dark... there already are powerful GPUs in some laptops, and their architecture is designed to minimize consumption. Hopefully those optimizations will come to desktop PCs, just like SpeedStep and Cool'n'Quiet did.

Please forgive this French guy's approximate English.
September 13, 2006 11:59:27 PM

I forgot to mention:

There's no way in hell that DX10 cards will need that much power. It just doesn't make economic sense for ATI/nV. If the cards need that much power, they will lose all the people with pre-built systems who want to upgrade their video card on a 350W PSU. I know that I won't buy a DX10 card if it requires me to buy a new 800W PSU.
September 14, 2006 12:03:16 AM

If they require new huge PSUs, then the price of the cards has to be low enough for people to buy them.

They will lose money if cards cost $600 and you need to buy a $300 PSU to use them. No way in hell will people pay $900 to upgrade a video card.
September 14, 2006 12:28:05 AM

One thing I forgot to mention is noise... GPU coolers are getting noisier and noisier... if they get bigger again, there'll be more noise at a LAN party than on an aircraft carrier's bridge...

" - What's that noise?? An F-14 started its engines??
- No, my PC is booting... don't stand near the fans, it's burning!
- Hey, guys! Bring the meat here, we can cook it behind his PC!"

NV and ATI will really have to choose whether they make graphics cards, jet engines, or barbecues.
September 14, 2006 2:04:45 AM

Quote:
If they require new huge PSUs, then the price of the cards has to be low enough for people to buy them.

They will lose money if cards cost $600 and you need to buy a $300 PSU to use them. No way in hell will people pay $900 to upgrade a video card.


My point exactly.

Quote:
One thing I forgot to mention is noise... GPU coolers are getting noisier and noisier... if they get bigger again, there'll be more noise at a LAN party than on an aircraft carrier's bridge...


In general, the larger fans get, the quieter they are: a large, slow-spinning fan can move the same amount of air as a small, hyper-spinning one.
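A rough sketch of why, using the simplified fan affinity laws (airflow scaling with rpm times diameter cubed); real fans deviate from this, and the 3000 rpm figure is just an assumed starting point:

# Simplified fan affinity laws: for geometrically similar fans,
# airflow scales roughly with rpm * diameter^3. Illustrative only.
def rpm_for_same_airflow(rpm_small, d_small_mm, d_large_mm):
    """RPM a larger fan needs to move roughly the same air as a smaller one."""
    return rpm_small * (d_small_mm / d_large_mm) ** 3

rpm_80 = 3000  # assumed speed of a small 80 mm GPU cooler fan
rpm_120 = rpm_for_same_airflow(rpm_80, 80, 120)
print(f"A 120 mm fan needs ~{rpm_120:.0f} rpm to match an 80 mm fan at {rpm_80} rpm")

Roughly 900 rpm instead of 3000: far lower blade speed, hence the quieter ride.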
September 14, 2006 3:18:23 AM

Quote:
Well, I don't know what you guys, NV, and ATI are thinking, but I will never buy a 300 W card! Never!! Ever!!

I don't know how many of you are gonna buy those cards. Let's do a survey to see.

Post your opinion, Yea or Nay, and I'll keep count until they are finally released. I bet they won't sell well.

Have some marketing/management/tech/design guys in both companies already eaten sxxt??


Newbie votes NAY.
September 14, 2006 5:59:50 AM

I only have a damn 300W PSU. Still haven't got enough to upgrade my good old Athlon XP. Want to go big this time.
September 14, 2006 7:35:34 AM

Personally, I'll remember the difference between an FX 5800 'Dustbuster' and a 6600... The first was a power hog; the second can work very well, even overclocked, with passive cooling.

I'm planning to replace my 6600 when the G8x series rolls out with:
- a second core revision, leading to improved performance,
- a finer process node, leading to lower heat at the same clock speed and reduced power consumption.

Of note: careful software modding of your graphics card can already lead to 'adaptive' power use (I know that RivaTuner, for example, allows me to reduce my 6600 to 120 MHz when no 3D is used; it then jumps back to 300 MHz when computing power is required). It would just be nice for it to be more thorough, like you see with CPUs.

Maybe for the second revision of the G8x series...
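Rough math on how much a clock drop like that can buy, assuming the usual dynamic-power scaling (power roughly proportional to frequency times voltage squared); the voltages here are pure assumptions, since I don't have real 6600 figures handy:

# Rough dynamic-power scaling: P_dynamic ~ frequency * voltage^2 (leakage ignored).
# Clock numbers are the 6600 figures above; the voltages are assumptions.
def relative_power(f_mhz, volts, f_ref_mhz, volts_ref):
    return (f_mhz / f_ref_mhz) * (volts / volts_ref) ** 2

# 2D clock at 120 MHz vs 3D clock at 300 MHz, same (assumed) voltage:
print(f"Clock drop only:      {relative_power(120, 1.4, 300, 1.4):.0%} of full dynamic power")
# If the driver could also drop the core voltage in 2D (hypothetical 1.1 V vs 1.4 V):
print(f"Clock + voltage drop: {relative_power(120, 1.1, 300, 1.4):.0%} of full dynamic power")

So the clock drop alone cuts dynamic power to about 40%, and adding a voltage drop (which is what CPUs do) would push it down to roughly a quarter.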
September 14, 2006 8:38:29 AM

Quote:
In general, the larger fans get, the quieter they are: a large, slow-spinning fan can move the same amount of air as a small, hyper-spinning one.


True, but it's not very easy to place a huge fan on a video card... and even if they place an 8cm fan on a card, it will be a slim fan that has to spin faster than a standard 80x80x25mm fan. Another solution would be to make cards that use 3 slots, but they should remember that enthusiasts don't need ONLY graphics cards in their machines... they also like to have a good sound card (which means not integrated) or a good RAID controller, and whatever else.

And imagine an SLI rig with 2 three-slot cards...
1) it would be ridiculous to kill 6 slots just for 2 cards...
2) mobos don't have enough space between PCI-E 16x slots, and I don't see NV & ATI asking every 6 months, "Please Mr Asus, could you put the 2nd 16x slot a bit lower?"

Zalman ZM-80D Power!!! OK, OK, it was for the GeForce 4... but I have a GF4 Ti4200, so...
September 14, 2006 8:52:52 AM

Quote:
There's no way in hell that DX10 cards will need that much power. It just doesn't make economic sense for ATI/nV. If the cards need that much power, they will lose all the people with pre-built systems who want to upgrade their video card on a 350W PSU. I know that I won't buy a DX10 card if it requires me to buy a new 800W PSU.

But that's the whole point: it won't. The cards will require vast amounts of power, but they will come with their own stand-alone, external PSU. They won't draw any power at all from the main PSU. That means the drain on the computer's main PSU will actually be much less than it is in a present-day system. People with 300W power supplies will be able to use cutting-edge graphics hardware in the way that they actually can't at the moment.
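A toy budget to illustrate the point; every wattage here is an assumption for the sake of the example, not a measured spec:

# Toy budget for a pre-built box with a 300 W supply; every wattage is an assumption.
MAIN_PSU_WATTS = 300
system_load = {"CPU": 90, "motherboard+RAM": 50, "drives": 30, "fans/misc": 20}
gpu_watts = 130  # low end of the rumoured 130-300 W DX10 range

on_main_psu = sum(system_load.values()) + gpu_watts   # card fed by the internal PSU
on_own_brick = sum(system_load.values())              # card fed entirely by its external brick

print(f"GPU on the main PSU:  {on_main_psu} W of {MAIN_PSU_WATTS} W")   # 320 W: over budget
print(f"GPU on its own brick: {on_own_brick} W of {MAIN_PSU_WATTS} W")  # 190 W: plenty of headroom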
September 14, 2006 9:09:23 AM

Quote:
There's no way in hell that DX10 cards will need that much power. It just doesn't make economic sense for ATI/nV. If the cards need that much power, they will lose all the people with pre-built systems who want to upgrade their video card on a 350W PSU. I know that I won't buy a DX10 card if it requires me to buy a new 800W PSU.

But that's the whole point: it won't. The cards will require vast amounts of power, but they will come with their own stand-alone, external PSU. They won't draw any power at all from the main PSU. That means the drain on the computer's main PSU will actually be much less than it is in a present-day system. People with 300W power supplies will be able to use cutting-edge graphics hardware in the way that they actually can't at the moment.

So, instead of buying ONE big PSU, you buy TWO smaller ones... even if the 2nd one is bundled with your graphics card, you'll have to pay for it... and internal or external isn't the problem. The problem is that the way they upgrade their cards is not the right way: imagine if car manufacturers only ever increased power by fitting bigger engines...
September 14, 2006 9:17:38 AM

Oh yeah?

I don't care whether I lose money or not.

The problem is, I am renting my house with 4 other people who only use laptops for surfing/downloading. They don't play games. So I don't care about shelling out $1000 for my next high-end card, but I do care if other people have to pay a huge bill for me without using most of the electricity themselves.

You see, I don't want to feel guilty every time I play games.
September 14, 2006 9:30:40 AM

Start RivaTuner. Go to software overclocking. It should show you a 2D clock speed and a 3D clock speed. Reduce the 2D clock speed. There you are.
September 14, 2006 9:39:52 AM

Quote:
Yes, I know that's gonna happen, but the question is: how many watts will they consume?

External PSU and November tech day

Any opinions? Will the external PSU be capable of handling SLI, CF, or even Quad SLI?


The average American household circuit, from memory, is 20 amps... Yeah? (hehe... we're lucky, we're on 220 V, so we get twice the watts for the same rated circuit...)

P = I x E, so watts = 20 A x 120 V, or 2400 watts... Think about it. Just for a second...
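Sketching that out, with every figure assumed from the numbers tossed around in this thread:

# How much of one wall circuit would a hypothetical quad setup of 300 W cards eat?
# Every figure is an assumption pulled from this thread, not a real DX10 spec.
CIRCUIT_AMPS = 20
MAINS_VOLTS = 120                            # US mains; ~230 V elsewhere doubles the headroom
circuit_watts = CIRCUIT_AMPS * MAINS_VOLTS   # 2400 W, as above

cards_watts = 4 * 300        # four 300 W cards
rest_of_rig = 400            # assumed CPU, board, drives, monitor
total_watts = cards_watts + rest_of_rig

print(f"Circuit capacity: {circuit_watts} W")
print(f"Estimated rig:    {total_watts} W ({total_watts / circuit_watts:.0%} of the breaker)")

That's around 1600 W, or two thirds of the breaker, before the lights, speakers, or A/C.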

Then dismiss the thought, as this is an old article from the (in)famous INQ.

Has anyone actually seen the specs for a DX10 card?
September 14, 2006 9:43:05 AM

About Oblivion: I don't know, I don't play it (I got fed up with the Elder Scrolls series at III), but I heard you can tweak rendering by editing an INI file. Have a look at it.
September 14, 2006 10:04:09 AM

Quote:
they only exist in the lab and the demented minds of the Inquirer staff. 8O :p 
Along with their dreams of K8L
September 14, 2006 10:05:43 AM

I'm just wondering if there is any possibility that in the near future, the high-end graphics solution will be like:

One special port at the back of the motherboard (really high bandwidth) + a completely independent (external) graphics card with its own PSU

OR maybe:

a separate graphics unit case (with its own motherboard and PSU) sitting right on top of your computer case :D  :D  :D 

just kidding......
September 14, 2006 10:09:49 AM

Quote:
So, instead of buying ONE big PSU, you buy TWO smaller ones... even if the 2nd one is bundled with your graphics card, you'll have to pay for it... and internal or external isn't the problem. The problem is that the way they upgrade their cards is not the right way: imagine if car manufacturers only ever increased power by fitting bigger engines...

No one is forcing you to upgrade, are they? You sound as if you think Nvidia and ATI are deliberately producing high-power solutions out of sheer spite when they could just as easily produce something equally fast that uses a tenth of the power.

This, frankly, is b*llocks. If Nvidia and ATI choose to release high-consumption products, the reason is that right now it is the only way they can dramatically boost performance.

You therefore have a choice: do you buy a high-power, high-speed solution, or a lower-power, lower-speed solution? If Nvidia and ATI don't produce the high-power/high-speed option at all (which is what you're advocating) then no one will have that choice any more. If you personally choose the lower-power, lower-speed version then that's absolutely fine: it's your decision. But it is extremely presumptuous of you to suggest that other people should not be allowed to make that decision for themselves.
September 14, 2006 10:25:16 AM

"I can see a nuclear reactor in every house from now and beyond" 8)
September 14, 2006 10:33:09 AM

Quote:
"I can see a nuclear reactor in every house from now and beyond" 8)


....and booommmmm it goes for the whole neighbourhood if there's a fluctuation.....don't blame the Iraqis/Iranians/any "terrorist" group when that happens...... :wink:


-->Should we wear something protective in case anything bad happens :?:
September 14, 2006 10:42:09 AM

Quote:
....and booommmmm it goes for the whole neighbourhood if there's a fluctuation.....

Quote:
a nuclear reactor in every house


A nuclear reactor in every house. When it goes boom... you have nothing left... not even your DX10 display cards.
September 14, 2006 1:48:29 PM

Quote:
If Nvidia and ATI choose to release high-consumption products, the reason is that right now it is the only way they can dramatically boost performance.

You therefore have a choice: do you buy a high-power, high-speed solution, or a lower-power, lower-speed solution? If Nvidia and ATI don't produce the high-power/high-speed option at all (which is what you're advocating) then no one will have that choice any more. If you personally choose the lower-power, lower-speed version then that's absolutely fine: it's your decision. But it is extremely presumptuous of you to suggest that other people should not be allowed to make that decision for themselves.


I don't mean to decide whether people should buy those cards or not...
You seem to think I don't agree with people who buy high-end cards... if somebody needs a 7950GX2 to run his games, why would that be bad? There's no problem with that. What I don't agree with is the choice the manufacturers made. They've chosen to release a new GPU series about every 6 months, but such a pace does not allow them to make time-consuming upgrades like a brand-new architecture. The big problem is that once this pace is set, if NV or ATI decides to take the time to release something really new, the other one will dominate the market during the period when they didn't release anything. So, in one way you're right: at the moment there can't be more performance without more watts... but in my opinion, that's for economic reasons only.

And when you talk about "dramatically" increasing performance, I think the major performance gains come more from new architectures than from clock increases. For example, look at the GeForce 1: its frequencies were almost comparable with the TNT2's... the difference came from hardware T&L. A more recent example, for CPUs, is the Core 2 Duo. It IS possible to raise performance and reduce consumption, but it requires a lot of R&D, and GPU manufacturers can't afford it right now!