The Graphics State of the Union

July 21, 2006 2:38:30 PM

Tom's Hardware graphics presidente Polkowski is concerned about the 3D arms race. Power and heat dissipation are skyrocketing, but external graphics boxes could eliminate the imminent need for 1,000 W power supplies.

Speak out in the Tom's Hardware reader survey!


July 21, 2006 2:58:34 PM

Quote:
Tom's Hardware graphics presidente Polkowski is concerned about the 3D arms race. Power and heat dissipation are skyrocketing, but external graphics boxes could eliminate the imminent need for 1,000 W power supplies.


Interesting article. I got a kick out of the comment about AMD pulling the Antec 550 watt PS out of their game development lab and how they are now using 200-300 buck units! Many people on this forum swear by PS calculators and don't realize that these are just a starting point. PS failure can be ugly and I predict we'll be seeing an epidemic of failures when DX-10 cards roll out.

Where can I get one of those external GPUs? My laptop is jealous!
July 21, 2006 3:12:00 PM

Why don't vid cards start using external power supplies? Why not a power brick that just plugged into the wall and then into the back of the vid card? It would be much easier than having to upgrade your power supply just so you can get a new vid card.
July 21, 2006 3:22:52 PM

Quote:
Why don't vid cards start using external power supplies? Why not a power brick that just plugged into the wall and then into the back of the vid card? It would be much easier than having to upgrade your power supply just so you can get a new vid card.


If we start that way, you will have a whole array of additional sockets and power bricks laying and standing around - do you really want to go that way? There are internal PSUs available, so you do not have to upgrade your main PSU. And imagine vid cards coming out with additional equipment - they would get even more expensive than they are right now.

Give the developers some time to realize that power consumption might be something valuable in relation to advertising and tests - and they will start designing cards that use less power, just as the upcoming generation of CPUs already shows.
July 21, 2006 3:36:21 PM

Quote:
Why don't vid cards start using external power supplies? Why not a power brick that just plugged into the wall and then into the back of the vid card? It would be much easier than having to upgrade your power supply just so you can get a new vid card.


If we start that way, you will have a whole array of additional sockets and power bricks laying and standing around - do you really want to go that way?

It's not a bad upgrade path, especially if available as an option (as it now is). I tend to advise people to get an overkill PS, but if you have a rig and want to get more GPU power, an outboard PS is a decent solution at least for short to mid term.
July 21, 2006 3:38:49 PM

I wish what Intel did with Core 2 Duo would happen to nVidia/ATi.

The X1900XTX can use all the way up to 352W on load! Holy f'ing shite! Wrong. I'm completely wrong on that. Thank you all for correcting this horrid mistake. The X1900XTX consumes 121W on load.

I'm not deleting it, so there's no argument later that I deleted it and then claimed I never said it, so again: I'M WRONG. IT IS 121 WATTS ON LOAD.

The Core 2 Extreme X6800 consumes a meek 66W on load.



What is wrong with the world?!?!?!

~Ibrahim~
July 21, 2006 3:40:11 PM

And everyone (including myself) said I was crazy for getting a 680 W PSU, but hey, I always go by a price/performance ratio, in this case price per watt. I got that 680 W for 120 bucks and it has active PFC, so now I can justify that 680 W. Although it doesn't come close to the price per watt of my brother's PSU, a 580 W for 30 bucks, I didn't feel safe going that cheap on a PSU.
July 21, 2006 3:42:49 PM

I'd like to know where they get their PC&C power supplies for $400.00. Never Never land?
July 21, 2006 3:42:52 PM

Quote:
If we start that way, you will have a whole array of additional sockets and power bricks laying and standing around - do you really want to go that way?

I'm with you on that! I got enuff wires, power packs, and cords galore tucked and stuffed behind the machine and under my desk, let alone the idea of an external GPU on the desktop or external GPU power supply on the floor.

In light of this article, the idea of a socketed GPU on the mobo seems like a better way to go. The mobo wouldn't be much different from a dual-socket mobo now: one socket for the CPU with its own memory slots and then a second socket for the GPU with its own dedicated memory slots. Think about the upgradability and expandability of that platform!

As far as needing to upgrade the PSU just to accommodate a GPU, that would be ridiculous. There should be no reason that a good quality 550 or 600 watt PSU with plenty of amps on the 12v rail(s) shouldn't be able to handle everything. Ironic considering all the talk and competition between Intel and AMD over who has the most power-efficient procs; it's almost as if GPUs are going in the opposite direction.
July 21, 2006 3:44:21 PM

Yet another reason to stick with xp, or God forbid I learn the penguin.

I'm thinking the decision not to go with a smaller processing unit will haunt the graphics manufacturers. But it does come with a huge price tag, so I guess they'll hold on to the current technology till it can't put out any more.
July 21, 2006 3:44:25 PM

Quote:
Why don't vid cards start using external power supplies? Why not a power brick that just plugged into the wall and then into the back of the vid card? It would be much easier than having to upgrade your power supply just so you can get a new vid card.


I don't mean to be rude, but that's a ridiculous idea. I bought a $160 PSU to power one computer, and now you want me to deal with another power brick just for a GPU? Now that's just plain dumb, imo. Look at the Xbox 360 power brick; do you really want to lug that around just to power a GPU? Not even the entire computer, just a GPU.

I know you're going to say "but it's a desktop, nobody cares," but it's quite the opposite. The immense power consumed by a computer has a direct effect on power bills, cooling bills, home comfort, noise problems and the like. The more power is drawn, the more heat is generated (generally; let's not get into nitpicking here), and that heat has to go somewhere. Most people won't go with water cooling because it's risky and not worth it (to them), so let's just toss a few more fans on there, why not? Wrong. Those fans start adding noise that is like a swarm of mosquitoes buzzing around your head all day long. It might just drive a person mad.

Once you get the heat out of the case it has to go somewhere, and guess where? Into your bedroom, living room, or wherever, then it has to be cooled. Guess what? More power is needed to cool the room now.

You might say "hey this guy is retarded, computers don't heat up rooms that much," to which I reply "Are you still using a 386? Ok, thought so, now shut it" lol :tongue: . I used to have a P4 Northwood with a 6800GT in it, and within an hour of gaming I could tell the ambient temp rose 4-6 degrees, it was ridiculous. I had a 15x15 or so bedroom with good air conditioning but the room just got damn hot.

All this power consumption is ridiculous; it needs to be dealt with. They need to reduce the overall power draw of a system. CPUs have already taken a sizeable step towards this. AMD now has 35 W desktop CPUs and Intel is down to 65 W (last I checked), and Intel used to run 130 W!

There has to be a stopping point, because at this rate we will all be using phase change to cool our computers within a decade (an exaggeration, but you get the picture).

Someone please backhand ATI and Nvidia.

/rant
July 21, 2006 3:47:07 PM

As with anything, the manufacturers won't spend dollars optimizing power consumption until it's in their best interest.

Until people stop buying the high-power-eating cards, it doesn't hurt their pocketbook. So we, the geek consumers, need to start raising the uproar and make it a topic that gets attention. Only when it's embarrassing to them and they perceive it as having a sales impact will we see progress.

Personally I'm hoping that one company makes the mistake of advertising their power consumption as being better than the other's. That'll be throwing down the marketing gauntlet and force a power war between them, just like AMD and Intel had.
July 21, 2006 3:50:30 PM

Quote:
If we start that way, you will have a whole array of additional sockets and power bricks laying and standing around - do you really want to go that way?

I'm with you on that! I got enuff wires, power packs, and cords galore tucked and stuffed behind the machine and under my desk, let alone the idea of an external GPU on the desktop or external GPU power supply on the floor.

In light of this article, the idea of a socketed GPU on the mobo seems like a better way to go. The mobo wouldn't be much different from a dual-socket mobo now: one socket for the CPU with its own memory slots and then a second socket for the GPU with its own dedicated memory slots. Think about the upgradability and expandability of that platform!

As far as needing to upgrade the PSU just to accommodate a GPU, that would be ridiculous. There should be no reason that a good quality 550 or 600 watt PSU with plenty of amps on the 12v rail(s) shouldn't be able to handle everything. Ironic considering all the talk and competition between Intel and AMD over who has the most power-efficient procs; it's almost as if GPUs are going in the opposite direction.

This is why I dropped about $160 for my FSP Group 600w Epsilon: four 12v rails at 15 A apiece, hmmm...... sexy! Not to mention it's blue. I mean, come on, how cool is that!?
July 21, 2006 3:52:26 PM

Quote:
You might say "hey this guy is retarded, computers don't heat up rooms that much," to which I reply "Are you still using a 386?


Well, I try to avoid giving labels like "retarded," but regardless, I do know how computers heat rooms up. I set up an imaging workstation lab once and we had to triple the amount of air conditioning capacity for the room - it had previously been a two-person office space. My home office area has two desktops and a laptop, and those certainly do heat the area.

Quote:
Someone please backhand ATI and Nvidia.
/rant


They are going for performance per dollar right now. Supposedly, the 7900 series cards were to run cool and quiet, but that was just more hype. Give it time; power requirements for GPUs are bound to drop in the future.
July 21, 2006 3:53:35 PM

Quote:
Personally I'm hoping that one company makes the mistake of advertising their power consumption as being better than the other's. That'll be throwing down the marketing gauntlet and force a power war between them, just like AMD and Intel had.


Bingo!
July 21, 2006 4:00:43 PM

I have a Dual-Xeon server with four hard drives in a tiny closet. I swear it feels like the desert when you get in, and it's as noisy as a swarm of bees...

~Ibrahim~
July 21, 2006 4:01:28 PM

Quote:
You might say "hey this guy is retarded, computers don't heat up rooms that much," to which I reply "Are you still using a 386?


Well, I try to avoid giving labels like "retarded," but regardless, I do know how computers heat rooms up. I set up an imaging workstation lab once and we had to triple the amount of air conditioning capacity for the room - it had previously been a two-person office space. My home office area has two desktops and a laptop, and those certainly do heat the area.

Quote:
Someone please backhand ATI and Nvidia.
/rant


They are going for performance per dollar right now. Supposedly, the 7900 series cards were to run cool and quiet, but that was just more hype. Give it time; power requirements for GPUs are bound to drop in the future.

Yeah, it's just getting out of hand, imo. I hate to use the word "retarded" in a public sense, but this problem really bugs me. Luckily I've been in a loft apartment for the past two years, so the problem has greatly diminished due to 20-30 ft ceilings. Argh, go marketing mistakes, go!!!!
July 21, 2006 4:30:11 PM

OK, so I too am sick of this ridiculous power consumption, which leads me to ask: why are we still rendering using old algorithms based on a z-buffer, etc.?

In the latest Scientific American there is an article about RPUs, or ray tracing processing units. Basically, by using new RT algorithms and a chip clocked at a measly 66 MHz they have managed to get reasonable framerates. One of the huge advantages of ray tracing is that it "auto occludes" everything which the rays (or light) do not hit, and not only that, but because of this it doesn't need as much memory to store and process all the information about every single polygon in the visible scene. If you want to read the article itself, it is in the August edition of SciAm, which should be on the shelves shortly.

I realize that this is "too far" in the future, but it seems like it could solve a number of the problems stated in this article, not by changing the problem itself (the cards) but by attacking the root cause of the problem: our current inefficient methods for rendering scenes.
July 21, 2006 4:40:31 PM

Quote:
I wish what Intel did with Core 2 Duo would happen to nVidia/ATi.

The X1900XTX can use all the way up to 352W on load! Holy f'ing shite!

What is wrong with the world?!?!?!

~Ibrahim~


And some people wonder why I use a 680 W PSU. With my X1900XTX and the water cooling sucking up watts, I'm running close to the edge as it is. If power requirements go up, well, I live in a desert already, so what's a little more heat?
July 21, 2006 4:41:39 PM

Slightly off topic:

I'm really not liking the hype surrounding Windows Vista. The day will come when MS doesn't support XP anymore. And when that day comes, if I have to choose between Vista and Macintosh, I will be choosing Macintosh.

And if DirectX 10 doesn't let me pull a Willy Wonka candy bar out of my TV screen, I would just as soon keep playing DirectX 9.0c games.

Has anyone noticed Apple's quarterly earnings? Now I know why they're so high.
July 21, 2006 4:42:29 PM

Quote:
Tom's Hardware graphics presidente Polkowski is concerned about the 3D arms race. Power and heat dissipation are skyrocketing, but external graphics boxes could eliminate the imminent need for 1,000 W power supplies.


Interesting article. I got a kick out of the comment about AMD pulling the Antec 550 watt PS out of their game development lab and how they are now using 200-300 buck units! Many people on this forum swear by PS calculators and don't realize that these are just a starting point. PS failure can be ugly and I predict we'll be seeing an epidemic of failures when DX-10 cards roll out.

Where can I get one of those external GPUs? My laptop is jealous!

I 2nd your opinion on the failures
July 21, 2006 4:51:05 PM

Let's see...

If we go down that path we'll have all sorts of extra bricks laying around? Why? What other single component requires you to go buy a new PS before you can upgrade it? You'd have more available power anyway once you took the vid card off the internal PS.

Bricks are hot and ugly... Sure, I can agree with that, but one more brick laying around is, to me personally, far less annoying than having to go out and buy a new PS just to upgrade a vid card.

The card makers will never learn and they'll just keep producing power hogs... I'd have to see a cost-per-year comparison of internal vs. externally powered cards. I suspect it's not a budget-busting amount.
July 21, 2006 4:55:22 PM

Hah! I have it!!
I say we develop a solar panel system that powers our graphics cards! Or even the whole system! How about that? One solar panel as a PSU, with the energy stored in a battery. Plug the cable into your system from the battery and ta-da...

I'm sure someone from Tom's Hardware can put this into the think tank and make it work..........
July 21, 2006 5:14:41 PM

Quote:
Slightly off topic:

And when that day comes, if I have to choose between Vista and Macintosh, I will be choosing Macintosh.


Oh, let's NOT start this $hit :roll: again.....
July 21, 2006 5:43:14 PM

I've used the Vista beta RC2. It looks like MS thinks, one, we're all blind (the GUI is childish in appearance) and, two, only MS knows what's best for your security, constantly asking "do you really want to use the start button?" A web browser that's just a rebadged Firefox. I see nothing that interests me; benchmarks show it's much slower than XP, either 32 or 64 bit, in everything. I've got no use for it and probably won't buy it. XP is doing just fine and it'll be supported for several years to come. So when it comes to MS saying you need a new graphics card or they won't play nice, I say let them take their OS and go home :) 
July 21, 2006 5:52:14 PM

It's gotten to the point for me, where i just want to put duct tape over all the vents of my case and let that heatburst roast in its own oven.
July 21, 2006 5:54:59 PM

Quote:
I've used the vista beta rc2.


I haven't tried it yet, but I did use beta versions of 2k/XP .. and they were all slower than the final release. So, I had the same feelings when it came to making the switch from 98SE.

I will give Vista a try soon enough and I'm sure I will like it and use it. However, the hardware requirements that are being touted do bother me.

I'm still using a GF4 Ti4200 64megger and really don't feel like being bothered to upgrade.
July 21, 2006 6:13:30 PM

I too will not use Vista when it comes out, just like I didn't use XP when it first came out. Why? It's buggy, and there will be no software (that I would need to re-buy) for about a year.

On the flip side, I also know that, just like with other beta software, you cannot determine what the finished product will be like from the beta. That's not very smart. Do you think MS will give you the goods for free? I know better than that.
July 21, 2006 6:14:15 PM

I think it's a great idea for laptops. When you're at home you can connect the graphics card and play games, then disconnect it when you go to college or work. It's a wonderful idea and it could be just the solution for mobile users who also like gaming, since they don't have to buy gaming desktops or ultra-expensive gaming laptops. It's brilliant.
July 21, 2006 6:18:54 PM

I realize it is early yet but it seems that the graphics manufacturers (nVidia and ATI) have been less anxious to release info about their forthcoming cards than they have been in the past.

I'm really hoping to see a dual-GPU DX10 card like the 7950 from both manufacturers. Going to a smaller chip design would be sweet to help eliminate some of the power/heat. I would also hope they make the card easy enough for third-party silent GPU solutions such as Zalman to make an alternative to the dustbuster fans they use for stock.
July 21, 2006 6:24:41 PM

The graphics industry won't change until users stop demanding a few more FPS and start demanding cooler GPUs! It would be nice if they made a card that had power-saving features. I mean, does the GPU need to run at 500 MHz when all you are doing is surfing the web?
July 21, 2006 6:28:47 PM

Quote:
I wish what Intel did with Core 2 Duo would happen to nVidia/ATi.

The X1900XTX can use all the way up to 352W on load! Holy f'ing shite!



The Core 2 Extreme X6800 consumes a meek 66W on load.



What is wrong with the world?!?!?!

~Ibrahim~


Dude, read the chart title. That's total system power consumption. That means the CPU, the hard drive(s), fans, GPU, optical drive(s), and RAM (2 GB). The GPU alone doesn't consume that much power. Even in the article, Tom's said that a DX10 card only consumes 150 W and the maximum may be 250 W+. So then why do manufacturers recommend that whoever uses the X1900XTX needs at least a 450 W PSU? Because they assume that if you have enough money to buy this monster, your system must be high-end. That means two or more hard drives and a monster CPU (like Netburst or an FX-60). They would rather use the worst-case scenario than tell consumers they only need 350 W for their systems. Also, if you overclock like crazy, it needs even more power. But the thing is, most people don't do that much overclocking.

You can take the Dell XPS 400 as an example: it only uses a 375 W PSU with a 12 A rail, and it still supports the X1900XTX (configuration).

I think a 1,000 W PSU is overkill, even for a DX10 GPU. If you focus mostly on the GPU, then it is too much. But of course, if you plan to have two RAID 0 arrays of 150 GB Raptor X drives and all the best things money can buy, then you need that much power. For normal gamers, a 600 W PSU for a DX10 card is enough. Also, CPUs now use less power; the Core 2 Duo will consume substantially less power than the FX-62.
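
Just to put some back-of-the-envelope numbers on that (the per-component wattages below are my own rough guesses, not figures from the article):

# Rough system power estimate. The per-component wattages are illustrative
# guesses, not measured values from Tom's article.
components = {
    "GPU (X1900XTX class, load)": 121,
    "CPU (90 nm dual core, load)": 130,
    "Motherboard + RAM":            40,
    "Hard drives (x2)":             30,
    "Optical drive":                15,
    "Fans and misc.":               15,
}

load_watts = sum(components.values())   # ~351 W, close to the 352 W heavy-load figure
headroom = 1.3                          # ~30% margin so the PSU isn't run on the ragged edge

print(f"Estimated load: {load_watts} W")
print(f"Suggested PSU with headroom: {load_watts * headroom:.0f} W")   # ~456 W, nowhere near 1,000 W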
July 21, 2006 6:48:41 PM

Quote:
Slightly off topic:

I'm really not liking the hype surrounding Windows Vista. The day will come when MS doesn't support XP anymore. And when that day comes, if I have to choose between Vista and Macintosh, I will be choosing Macintosh.

And if DirectX 10 doesn't let me pull a Willy Wonka candy bar out of my TV screen, I would just as soon keep playing DirectX 9.0c games.

Has anyone noticed Apple's quarterly earnings? Now I know why they're so high.


If you don't want to play games, then go ahead, but don't give out your opinion about Macintosh vs. PC, since Macintosh sucks at games (it's good at graphics design and video, though). If, when people start to have DX10 games, you say you'll still play DX 9.0c games, then you could have said the same thing about DX8. And you know how much it sucks not to be playing the really good games.

Finally, I like Apple and their services, but Macintosh computers are not what Apple makes the most money from. If they hadn't had the iPod savior, they were on the verge of going bankrupt (Steve Jobs wasn't at Apple at that time; he owned Pixar). Michael Dell even said, "Give the money back to the shareholders."
July 21, 2006 6:51:54 PM

Quote:
So then why do manufacturers recommend that whoever uses the X1900XTX needs at least a 450 W PSU? Because they assume that if you have enough money to buy this monster, your system must be high-end. That means two or more hard drives and a monster CPU (like Netburst or an FX-60). They would rather use the worst-case scenario than tell consumers they only need 350 W for their systems.


I talked to an eVGA designer about that. He told me their recommendation was made to keep from having to RMA every card that died due to being run on the ragged edge till it lost a chip in the voltage regulation circuit.

Quote:
You can take the Dell XPS 400 as an example: it only uses a 375 W PSU with a 12 A rail, and it still supports the X1900XTX (configuration).


Using Dell to prove any point about PS configuration is a losing battle.

Quote:
I think a 1,000 W PSU is overkill, even for a DX10 GPU.


Probably so. But if you put together a quad SLI system, it's probably about right.
July 21, 2006 6:56:10 PM

Of course quad SLI will use that much power. But as I stated in my second paragraph, that's only for the best system you can get for your money. Not everyone has that money or is crazy enough to use a quad SLI system, since most games don't benefit that much from quad SLI until you go beyond the highest-end resolutions (1600x1200 still doesn't show that much benefit).

I pointed out the XPS 400 as an example for anyone who wants to play games but doesn't do much overclocking (normal gamers), because as we all know, you can't overclock a Dell or HP system.
July 21, 2006 7:04:13 PM

Quote:
I talked to an eVGA designer about that. He told me their recommendation was made to keep from having to RMA every card that died due to being run on the ragged edge till it lost a chip in the voltage regulation circuit.


That's right, but if you don't use an FX-62 with a Raptor X and don't play games while burning movies (CPU + optical drive) and scanning for viruses (CPU power + hard drive), then the power will never reach 352 W (heavy load). As I said, the crazy gamers don't make up more than 10% of the PC-owning population.
July 21, 2006 7:15:10 PM

Quote:
While I do agree that normal gamers (what Tom's here called mid-range gamers) do not need that much overclocking, you are missing one small point that was spoken out loud in the article: sooner or later, those of us who bought a mid-range GC will end up buying what was once considered a high-end card, which consumed A LOT of electricity.

Quad today is considered high-end; the high-end of today is the mid-to-low range of the future, which you and I will end up buying.


(A few months ago I got an X800; two years from now I'll get the X1900; five years from now I will end up with an older yet better quad...)


That's why I said mid-range gamers only need a good 400 W PSU (not one of those $20 ones). The chart from Tom's review shows the system only needs 352 W under heavy load, which means everything in the system is running at full capacity. I normally don't scan for viruses since it's annoying (I set it up to run while I'm sleeping) and sometimes I burn movies, but not often. Also, trust me, you will never use quad if you don't play games at insanely high resolutions, for which you need a huge LCD screen (30 inch); you will end up with one good GPU rather than combining a bunch of low-end GPUs (an X1800 XT, five years from now). Everyone knows two 7600 GTs don't prove that much different from one 7600 GT, so stick with an X1800 XT or X1900 XT for the same money.
July 21, 2006 7:17:06 PM

Quote:
I wish what Intel did with Core 2 Duo would happen to nVidia/ATi.

The X1900XTX can use all the way up to 352W on load! Holy f'ing shite!



The Core 2 Extreme X6800 consumes a meek 66W on load.



What is wrong with the world?!?!?!

~Ibrahim~


The 352 W figure is total system power. That means everything, including the CPU.

The X1900XTX consumes 121 W of power at stock speed.

The X1900XT consumes 109 W of power at stock speed.
July 21, 2006 7:19:53 PM

Quote:
I wish what Intel did with Core 2 Duo would happen to nVidia/ATi.

The X1900XTX can use all the way up to 352W on load! Holy f'ing shite!



The Core 2 Extreme X6800 consumes a meek 66W on load.



What is wrong with the world?!?!?!

~Ibrahim~


The 352 W figure is total system power. That means everything, including the CPU.

The X1900XTX consumes 121 W of power at stock speed.

The X1900XT consumes 109 W of power at stock speed.

Agreed; people, don't be confused between those two (GPU power and total system power).
July 21, 2006 7:32:10 PM

If we get Linux on the PS3, I know what I'm going to use for an external graphics card. It may even be possible to simulate geometry shaders using the SPUs.
July 21, 2006 7:36:21 PM

Quote:

That's why I said mid-range gamers only need a good 400 W PSU (not one of those $20 ones). The chart from Tom's review shows the system only needs 352 W under heavy load, which means everything in the system is running at full capacity.


Actually, it's better to look at how many amps the 12 V rails will provide, since most of the components draw power from those rails. RAM draws power from the 3.3 V rail. The hard drives draw power mainly from the 12 V rails, and maybe 3 W to 5 W from the 5 V rails. The motherboard most likely draws power from the 3.3 V, 5 V and 12 V rails.
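
For anyone who wants to do the math themselves: a rail's capacity is just volts times amps, so a quick sketch like this (using the four 15 A rails of the 600 W Epsilon mentioned earlier as the example) shows how much 12 V power is actually on tap:

# P = V * I per rail. Example ratings: four 12 V rails at 15 A each,
# as on the FSP Epsilon mentioned above.
rails_12v_amps = [15, 15, 15, 15]

per_rail_watts = [12 * amps for amps in rails_12v_amps]
combined_12v_watts = sum(per_rail_watts)

print(per_rail_watts)        # [180, 180, 180, 180]
print(combined_12v_watts)    # 720 W of nominal 12 V capacity

# Note: real PSUs usually cap the combined 12 V output below this nominal sum,
# and the GPU's 6-pin connector still has to hang off a rail with enough amps.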
July 21, 2006 7:40:04 PM

Quote:


That's why I said mid-range gamers only need a good 400 W PSU (not one of those $20 ones). The chart from Tom's review shows the system only needs 352 W under heavy load, which means everything in the system is running at full capacity. I normally don't scan for viruses since it's annoying (I set it up to run while I'm sleeping) and sometimes I burn movies, but not often. Also, trust me, you will never use quad if you don't play games at insanely high resolutions, for which you need a huge LCD screen (30 inch); you will end up with one good GPU rather than combining a bunch of low-end GPUs (an X1800 XT, five years from now). Everyone knows two 7600 GTs don't prove that much different from one 7600 GT, so stick with an X1800 XT or X1900 XT for the same money.


Yes, but what about next year? Will you not move to an X1900?
And the year after, will you not get a 2x X1900 (or an SLI solution),
and a year after that, quad?

Yes, today we need "only" a 350-400 W PSU (by the way, all GPUs today need at least 350 W; two years ago it was 250 W; two years from now it's 500 W... do the math), but you are still stuck in today. Think ahead - what about tomorrow?

This article does not deal with the present - it deals with the future. Please take note, my friend. :)

You can go ahead and stick in two X1900 XTs; I'd rather buy a better GPU, since SLI or CF needs a huge resolution to prove its superiority. If you read other threads by people on this forum, they never advise anyone to buy two 7600 GTs over an X1900 XT.

The article didn't deal with DX9 GPUs, my friend; my post is just to point out that the guy who said 352 W is the X1900 XTX's power consumption was misleading people. Even though advancement in the GPU industry leads to more power consumption, everything has its own limit. I doubt there will ever be a 2,000 W system even though graphics will get better, because nobody will buy it; energy will become more and more costly. Don't take today's trend and stick it to 10 years later. When ATI and Nvidia create GPUs that consume a certain amount of power, they will need to compete on efficiency, like Intel and AMD today, or no one will buy their crap. I do agree that GPUs will consume more and more power, but it will reach a limit.
July 21, 2006 7:43:01 PM

Quote:

That's why I said mid-range gamers only need a good 400 W PSU (not one of those $20 ones). The chart from Tom's review shows the system only needs 352 W under heavy load, which means everything in the system is running at full capacity.


Actually, it's better to look at how many amps the 12 V rails will provide, since most of the components draw power from those rails. RAM draws power from the 3.3 V rail. The hard drives draw power mainly from the 12 V rails, and maybe 3 W to 5 W from the 5 V rails. The motherboard most likely draws power from the 3.3 V, 5 V and 12 V rails.

Yeah, I said that in my other post: the 12 V rail and total power consumption both need to be considered, not just the PSU wattage alone. If you have a cheap 450 W PSU without a decent 12 V rail, you can kiss the X1900 XTX goodbye, or any card that needs a 6-pin connector. But if you have a decent 12 V rail on a 400 W PSU and plan to have an array of two Raptor Xs, an FX-60 and a bunch of other high-end stuff, then you are dreaming.
July 21, 2006 7:48:06 PM

:oops:  I thought those numbers seemed kind of high. *Smack Head Repeatedly* Thank you both for pointing that out. Need to quit the pot.. :oops: 

OK, still: if it consumes 121 W, that is still around DOUBLE the CPU. And, forgive my ignorance, but doesn't the CPU have more processing power? Where are all these watts going?

I'm editing my post right now to avoid any more confusion. Again, sorry.

~Ibrahim~
July 21, 2006 7:51:36 PM

Quote:
:oops:  I thought those numbers seemed kind of high. *Smack Head Repeatedly* Thank you both for pointing that out. Need to quit the pot.. :oops: 

OK, still: if it consumes 121 W, that is still around DOUBLE the CPU. And, forgive my ignorance, but doesn't the CPU have more processing power? Where are all these watts going?

I'm editing my post right now to avoid any more confusion. Again, sorry.

~Ibrahim~


Look at the FX-60 (130 W) or Pentium D 820 (135 W), since both use 90 nm technology. Don't compare against the new Core 2 Duo, since it has the latest technology (65 nm). GPU makers are still in transition to 90 nm technology (the same as the FX); ATI still uses 130 nm (or 110 nm, I forget) for the X1900, which is why the die is gigantic to fit all 352 million transistors, and thus it needs the most power.

Also, you said the CPU has more processing power, but that doesn't mean the GPU needs less power. It uses power to render graphics, and power consumption here means electricity, not processing or rendering. And I do believe that in the future, ATI and Nvidia will need to compete on efficiency like Intel and AMD.
July 21, 2006 8:00:47 PM

Quote:
:oops:  I thought those numbers seemed kind of high. *Smack Head Repeatedly* Thank you both for pointing that out. Need to quit the pot.. :oops: 


LOL, very funny. At first I didn't understand wtf "quit the pot" meant; then I looked at your picture. Clever.
July 21, 2006 8:03:39 PM

Top-of-the-line gaming rig (homebuilt): $2,500
Power supply (not included): $400
Monthly electric bill: $450
The look on your wife's face as you try to explain why that little disk in your electric meter is turning faster than the platters in your Raptor(s): PRICELESS
July 21, 2006 8:08:33 PM

Quote:
Top-of-the-line gaming rig (homebuilt): $2,500
Power supply (not included): $400
Monthly electric bill: $450
The look on your wife's face as you try to explain why that little disk in your electric meter is turning faster than the platters in your Raptor(s): PRICELESS


If you have time, go to eBay and look at some of the custom-built computers. I remember one guy had to sell his computer (a top-of-the-line one) because his wife told him to. LOL.

Btw, did you send that in to the MasterCard competition? I think it's good, very computer-geek.
July 21, 2006 8:26:04 PM

Bring on the external graphics cards.

MHz for MHz, the Nvidia and ATI GPUs are a lot more powerful than the Intel and AMD CPUs.

Yes ATI and Nvidia need to take time and develop more energy efficient chips.

Why not put both GPU and CPU in one enclosure? Then if you want to watercool them it's easier.

Either that, or we need a major overhaul of the current motherboard and case designs. Power supplies are going to need venting straight up out of the case and may be long enough to fill up most cases currently available. I already say that motherboards are too short for my needs. I want 4 RAM slots, 2 CPU sockets, 2 PCIe x16 slots (that have space for dual-slot GPU coolers and don't cover any other slots), 2 PCIe x1 slots for an SB X-Fi and one Gigabyte RAM drive, 2 PCIe x4 slots for an Ageia card and one available slot. Then of course, one PATA connector, 6 SATA2 connectors, 2 FireWire connectors, 10 USB2 connectors, and PS/2 mouse and keyboard.
July 21, 2006 8:29:28 PM

The thing is, ATI/Nvidia won't change their policy until there is demand for lower-wattage cards. The demand right now is for speed, not lower power bills. You think people running dual 7900 GTXs in SLI are worried about their power bill? I think not. When people demand more power-efficient cards and stop buying power-hungry ones, then we might see a change. Until then, it's all about speed! Then again, who knows what will happen with the DX10 cards? They may make a Pentium M-style card that shuts itself down when not being used. As in, the GPU throttles back to a lower, power-saving level.
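
Just to sketch what that kind of Pentium M-style throttling could look like (purely my own illustration, nothing ATI or Nvidia has announced; the clock steps and thresholds are made up):

# Toy sketch of demand-based GPU clock throttling, SpeedStep-style.
# Clock steps and load thresholds below are invented for illustration.
CLOCK_STEPS_MHZ = [150, 300, 500]    # idle / light 3D / full 3D

def pick_clock(gpu_load_percent):
    """Pick a clock speed based on how busy the GPU is."""
    if gpu_load_percent < 10:        # web surfing, desktop work
        return CLOCK_STEPS_MHZ[0]
    if gpu_load_percent < 60:        # video playback, light 3D
        return CLOCK_STEPS_MHZ[1]
    return CLOCK_STEPS_MHZ[2]        # full-on gaming

for load in (2, 35, 95):
    print(f"{load:>3}% load -> {pick_clock(load)} MHz")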