The Graphics State of the Union

clue69less

Splendid
Mar 2, 2006
3,622
0
22,780
Tom's Hardware graphics presidente Polkowski is concerned about the 3D arms race. Power and heat dissipation are skyrocketing, but external graphics boxes could eliminate the imminent need for 1,000 W power supplies.

Interesting article. I got a kick out of the comment about AMD pulling the Antec 550 watt PS out of their game development lab and how they are now using 200-300 buck units! Many people on this forum swear by PS calculators and don't realize that these are just a starting point. PS failure can be ugly and I predict we'll be seeing an epidemic of failures when DX-10 cards roll out.
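For what it's worth, this is roughly all those calculators do under the hood - a toy sketch where the component wattages and the 30% headroom are guesses of mine, not measured figures:

```python
# Toy PSU sizing estimate - the component wattages below are illustrative guesses,
# not measured figures; real calculators use per-part databases.
components = {
    "cpu": 90,            # assumed CPU draw under load, watts
    "gpu": 120,           # assumed GPU draw under load, watts
    "motherboard": 40,
    "drives_and_fans": 50,
}

base_draw = sum(components.values())
headroom = 1.3  # ~30% margin for capacitor aging, rail imbalance, and transients

recommended_watts = base_draw * headroom
print(f"Estimated load: {base_draw} W, suggested PSU: {recommended_watts:.0f} W")
```

The headroom factor is exactly the part the calculators gloss over, which is why I call them a starting point.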

Where can I get one of those external GPUs? My laptop is jealous!
 

Stout

Distinguished
Aug 11, 2005
3
0
18,510
Why don't vid cards start using external power supplies? Why not a power brick that just plugged into the wall and then into the back of the vid card? It would be much easier than having to upgrade your power supply just so you can get a new vid card.
 

HonestIago

Distinguished
Jul 11, 2006
21
0
18,510
Why don't vid cards start using external power supplies? Why not a power brick that just plugged into the wall and then into the back of the vid card? It would be much easier than having to upgrade your power supply just so you can get a new vid card.

If we start that way, you will have a whole array of additional sockets and power bricks laying and standing around - do you really want to go that way? There are internal PSUs available, so you do not have to upgrade your main PSU. And imagine vid cards coming out with additional equipment - they would get even more expensive than they are right now.

Give the developers some time to realize that power consumption might be something valuable in relation to advertising and tests - and they will start designing cards that use less power, just like the upcoming line of CPUs already shows.
 

clue69less

Splendid
Mar 2, 2006
3,622
0
22,780
Why don't vid cards start using external power supplies? Why not a power brick that just plugged into the wall and then into the back of the vid card? It would be much easier than having to upgrade your power supply just so you can get a new vid card.

If we start that way, you will have a whole array of additional sockets and power bricks laying and standing around - do you really want to go that way?

It's not a bad upgrade path, especially if available as an option (as it now is). I tend to advise people to get an overkill PS, but if you already have a rig and want more GPU power, an outboard PS is a decent solution, at least for the short to mid term.
 

ikjadoon

Distinguished
Feb 25, 2006
1,983
44
19,810
I wish what Intel did with Core 2 Duo would happen to nVidia/ATi.

The X1900XTX can use all the way up to 352W on load! Holy f'ing shite! Wrong. I'm completely wrong on that. Thank you all for correcting this horrid mistake. The X1900XTX consumes 121W on load.

I'm not deleting it, so that we don't end up in an argument where I deleted it and then claimed I never said it. So again: I'M WRONG. IT IS 121 WATTS ON LOAD.

The Core 2 Extreme X6800 consumes a mere 66W on load.

[attached image: power-2.png]


What is wrong with the world?!?!?!

~Ibrahim~
 

samir_nayanajaad

Distinguished
Feb 22, 2006
331
0
18,780
And everyone (including myself) said I was crazy for getting a 680W PSU, but hey, I always go by a price/performance ratio - in this case price per watt - and I got that 680W for 120 bucks with active PFC, so now I can justify it. Although it doesn't come close to the price per watt of my brother's PSU, a 580W for 30 bucks, I didn't feel safe going that cheap on a PSU.
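Just to put numbers on that price-per-watt comparison (using the prices quoted above):

```python
# Price-per-watt comparison using the prices mentioned above.
my_psu = {"watts": 680, "price_usd": 120}
brothers_psu = {"watts": 580, "price_usd": 30}

for name, psu in [("my 680W", my_psu), ("brother's 580W", brothers_psu)]:
    print(f"{name}: ${psu['price_usd'] / psu['watts']:.3f} per watt")
# ~$0.176/W vs ~$0.052/W - the cheap unit wins on paper,
# but the ratio says nothing about build quality.
```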
 
If we start that way, you will have a whole array of additional sockets and power bricks laying and standing around - do you really want to go that way?
I'm with you on that! I've got enough wires, power packs, and cords galore tucked and stuffed behind the machine and under my desk already, never mind the idea of an external GPU on the desktop or an external GPU power supply on the floor.

In light of this article, the idea of a socketed GPU on the mobo seems like a better way to go. The mobo wouldn't be much different from a dual-socket mobo now: one socket for the CPU with its own memory slots, and then a second socket for the GPU with its own dedicated memory slots. Think about the upgradability and expandability of that platform!

As far as needing to upgrade the PSU just to accommodate a GPU, that would be ridiculous. There should be no reason that a good quality 550 or 600 watt PSU with plenty of amps on the 12V rail(s) shouldn't be able to handle everything. It's ironic considering all the talk and competition between Intel and AMD over who has the most power-efficient procs; it's almost as if GPUs are going in the opposite direction.
 

chewbenator

Distinguished
Jul 5, 2006
246
0
18,680
Yet another reason to stick with XP, or God forbid I learn the penguin.

I'm thinking the decision not to go with a smaller processing unit will haunt the graphics manufacturers. But it does come with a huge price tag, so I guess they'll hold on to the current technology till it can't put out any more.
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
Why don't vid cards start using external power supplies? Why not a power brick that just plugged into the wall and then into the back of the vid card? It would be much easier than having to upgrade your power supply just so you can get a new vid card.

I don't mean to be rude, but that's a ridiculous idea. I bought a $160 PSU to power one computer, and now you want me to deal with another power brick just for a GPU? Now that's just plain dumb, imo. Look at the Xbox 360 power brick - do you really want to lug that around just to power a GPU? Not even the entire computer, just a GPU.

I know you're going to say "but it's a desktop, nobody cares" - quite the opposite. The immense power consumed by a computer has a direct effect on power bills, cooling bills, home comfort, noise problems and the like. The more power is drawn, the more heat is generated (generally - let's not get into nitpicking here), and that heat has to go somewhere. Most people won't go with water cooling because it's risky and not worth it (to them), so let's just toss a few more fans on there, why not? Wrong - those fans start adding noise that is like a swarm of mosquitoes buzzing around your head all day long. It might just drive a person mad.

Once you get the heat out of the case it has to go somewhere, and guess where? Into your bedroom, living room, or wherever, then it has to be cooled. Guess what? More power is needed to cool the room now.

You might say "hey, this guy is retarded, computers don't heat up rooms that much," to which I reply "Are you still using a 386? Ok, thought so, now shut it" lol :tongue: . I used to have a P4 Northwood with a 6800GT in it, and within an hour of gaming I could tell the ambient temp rose 4-6 degrees - it was ridiculous. I had a 15x15 or so bedroom with good air conditioning, but the room just got damn hot.
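If anyone wants to sanity-check the room-heating claim, here's a crude worst-case estimate. The 400W draw and the sealed, un-cooled room are assumptions of mine, so treat it as an upper bound, not a measurement:

```python
# Rough upper bound on room temperature rise from a gaming PC.
# Assumptions: ~400 W total system draw, 15x15 ft room with an 8 ft ceiling,
# sealed room with no air conditioning or wall losses (worst case).
power_w = 400.0                        # assumed system draw under load, watts
hours = 1.0
room_volume_m3 = 4.57 * 4.57 * 2.44    # 15 ft x 15 ft x 8 ft in metres
air_density = 1.2                      # kg/m^3
air_heat_capacity = 1005.0             # J/(kg*K)

heat_joules = power_w * hours * 3600
air_mass_kg = room_volume_m3 * air_density
delta_t = heat_joules / (air_mass_kg * air_heat_capacity)
print(f"Worst-case temperature rise after {hours:.0f} h: {delta_t:.1f} C")
# Real rooms leak heat and have AC, so the actual rise is far smaller,
# but a few hundred watts is literally a space heater on its lowest setting.
```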

All this power consumption is ridiculous; it needs to be dealt with. They need to reduce the overall power draw of a system. CPUs have already taken a sizeable step towards this. AMD now has 35W desktop CPUs and Intel is down to 65W (last I checked), and Intel used to run 130W!

There has to be a stopping point, because at this rate we will all be using phase change to cool our computers within a decade (an exaggeration, but you get the picture).

Someone please backhand ATI and Nvidia.

/rant
 

reebo

Distinguished
May 3, 2006
36
0
18,530
As with anything, the manufacturers won't spend dollars optimizing power consumption until it's in their best interest.

Until people stop buying the high-power-eating cards, it doesn't hurt their pocketbook. So we, the geek consumers, need to start raising the uproar and make it a topic that gets attention. Only when it's embarrassing to them and they perceive it as having a sales impact will we see progress.

Personally I'm hoping that one company makes the mistake of advertising their power consumption as being better than the other's. That'll throw down the marketing gauntlet and force a power war between them, just like AMD and Intel had.
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
If we start that way, you will have a whole array of additional sockets and power bricks laying and standing around - do you really want to go that way?
I'm with you on that! I've got enough wires, power packs, and cords galore tucked and stuffed behind the machine and under my desk already, never mind the idea of an external GPU on the desktop or an external GPU power supply on the floor.

In light of this article, the idea of a socketed GPU on the mobo seems like a better way to go. The mobo wouldn't be much different from a dual-socket mobo now: one socket for the CPU with its own memory slots, and then a second socket for the GPU with its own dedicated memory slots. Think about the upgradability and expandability of that platform!

As far as needing to upgrade the PSU just to accommodate a GPU, that would be ridiculous. There should be no reason that a good quality 550 or 600 watt PSU with plenty of amps on the 12V rail(s) shouldn't be able to handle everything. It's ironic considering all the talk and competition between Intel and AMD over who has the most power-efficient procs; it's almost as if GPUs are going in the opposite direction.

This is why I dropped about $160 for my FSP Group 600W Epsilon - four 12V rails at 15A apiece, hmmm... sexy! Not to mention it's blue; I mean come on, how cool is that!?
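For what it's worth, the back-of-the-envelope rail math on a unit like that (assuming 15A really is the per-rail limit):

```python
# Per-rail arithmetic for a PSU with four 12 V rails rated at 15 A each.
rails = 4
amps_per_rail = 15
volts = 12

per_rail_watts = amps_per_rail * volts    # 180 W per rail
summed_watts = rails * per_rail_watts     # 720 W if every rail were maxed at once
print(f"{per_rail_watts} W per rail, {summed_watts} W summed")
# The summed figure exceeds the 600 W label because real units enforce a lower
# combined 12 V limit - you can't actually pull all four rails to 15 A together.
```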
 

clue69less

Splendid
Mar 2, 2006
3,622
0
22,780
You might say "hey this guy is retarded, computers don't heat up rooms that much," to which I reply "Are you still using a 386?

Well, I try to avoid giving labels like "retarded," but regardless, I do know how computers heat rooms up. I set up an imaging workstation lab once and we had to triple the air conditioning capacity for the room - it had previously been a two-person office space. My home office area has two desktops and a laptop, and those certainly do heat the area.

Someone please backhand ATI and Nvidia.
/rant

They are going for performance per dollar right now. Supposedly the 7900 series cards were to run cool and quiet, but that was just more hype. Give it time - power requirements for GPUs are bound to drop in the future.
 

clue69less

Splendid
Mar 2, 2006
3,622
0
22,780
Personally I'm hoping that one company makes the mistake of advertising their power consumption as being better than the other. That'll be throwing down the marketing gauntlet and force a power war between them just liek AMD and Intel had.

Bingo!
 

ikjadoon

Distinguished
Feb 25, 2006
1,983
44
19,810
I have a dual-Xeon server with four hard drives in a tiny closet. I swear it feels like the desert when you get in, and it's as noisy as a swarm of bees...

~Ibrahim~
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
You might say "hey this guy is retarded, computers don't heat up rooms that much," to which I reply "Are you still using a 386?

Well, I try to avoid giving labels like "retarded," but regardless, I do know how computers heat rooms up. I set up an imaging workstation lab once and we had to triple the air conditioning capacity for the room - it had previously been a two-person office space. My home office area has two desktops and a laptop, and those certainly do heat the area.

Someone please backhand ATI and Nvidia.
/rant

They are going for performance per dollar right now. Supposedly the 7900 series cards were to run cool and quiet, but that was just more hype. Give it time - power requirements for GPUs are bound to drop in the future.

Yeah, it's just getting out of hand, imo. I hate to use the word retarded in a public sense, but this problem really bugs me. Luckily I've been in a loft apartment for the past two years, so the problem has greatly diminished thanks to 20-30 ft ceilings. Argh, go marketing mistakes, go!!!!
 

Hyperion2010

Distinguished
Jul 21, 2006
11
0
18,510
Ok, so, I too am sick of this ridiculous power consumption, which leads me to ask: why are we still rendering using old algorithms based on a z-buffer, etc.?

In the latest Scientific American there is an article about the RPU, or ray-tracing processing unit. Basically, by using new RT algorithms and a chip clocked at a measly 66 MHz, they have managed to get reasonable framerates. One of the huge advantages of ray tracing is that it "auto occludes" everything which the rays (or light) do not hit, and not only that, but because of this it doesn't need as much memory to store and process all the information about every single polygon in the visible scene. If you want to read the article itself, it is in the August edition of SciAm, which should be on the shelves shortly.

I realize that this is "too far" in the future, but it seems like it could solve a number of the problems stated in this article, not by changing the problem itself (the cards) but by attacking the root cause of the problem: our current inefficient methods for rendering scenes.
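To make the "auto occlusion" point concrete, here's a toy ray-sphere test of my own - nothing to do with the actual RPU hardware or the SciAm article's algorithms - showing that geometry a ray never reaches simply never costs any shading work:

```python
import math

# Minimal ray-sphere intersection: the core primitive of a ray tracer.
# Geometry a ray never hits contributes no shading work and is occluded "for free",
# unlike a z-buffer rasterizer that still processes every submitted triangle.
def hit_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses: no further work at all
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# Cast one ray per pixel of a tiny 4x4 "image" at a single sphere.
for y in range(4):
    row = ""
    for x in range(4):
        direction = (x - 1.5, y - 1.5, 4.0)   # crude pinhole camera
        row += "#" if hit_sphere((0, 0, 0), direction, (0, 0, 8), 2.0) else "."
    print(row)
```

Scale that up to millions of rays and an acceleration structure and you get the idea: the cost follows what the rays actually see, not everything in the scene.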
 

sailer

Splendid
I wish what Intel did with Core 2 Duo would happen to nVidia/ATi.

The X1900XTX can use all the way up to 352W on load! Holy f'ing shite!

What is wrong with the world?!?!?!

~Ibrahim~

And some people wonder why I use a 680W PSU. With my X1900XTX and the water cooling sucking up watts, I'm running close to the edge as it is. If power requirements go up, well, I live in a desert already, so what's a little more heat?
 

rezzzman

Distinguished
Jun 23, 2006
9
0
18,510
Slightly off topic:

I'm really not liking the hype surrounding Windows Vista. The day will come when MS doesn't support XP anymore. And when that day comes, if I have to choose between Vista and Macintosh, I will be choosing Macintosh.

And if DirectX 10 doesn't let me pull a Willy Wonka candy bar out of my TV screen, I would just as soon keep playing DirectX 9.0c games.

Anyone noticed Apple's quarterly earnings? Now I know why they're so high.
 

amddiesel

Distinguished
Feb 24, 2006
265
0
18,780
Tom's Hardware graphics presidente Polkowski is concerned about the 3D arms race. Power and heat dissipation are skyrocketing, but external graphics boxes could eliminate the imminent need for 1,000 W power supplies.

Interesting article. I got a kick out of the comment about AMD pulling the Antec 550 watt PS out of their game development lab and how they are now using 200-300 buck units! Many people on this forum swear by PS calculators and don't realize that these are just a starting point. PS failure can be ugly and I predict we'll be seeing an epidemic of failures when DX-10 cards roll out.

Where can I get one of those external GPUs? My laptop is jealous!

I 2nd your opinion on the failures
 

Stout

Distinguished
Aug 11, 2005
3
0
18,510
Let's see...

If we go down that path we'll have all sorts of extra bricks laying around? Why? What other single component requires you to go buy a new PS before you can upgrade it? You'd have more available power anyway once you took the vid card off the internal PS.

Bricks are hot and ugly... Sure, I can agree with that, but one more brick laying around is, to me personally, far less annoying than having to go out and buy a new PS just to upgrade a vid card.

The card makers will never learn and they'll just keep producing power hogs... I'd have to see a cost-per-year comparison of internal vs. externally powered cards. I suspect it's not a budget-busting amount.
 

NeonDeon

Distinguished
Jan 16, 2006
113
0
18,680
Hah! I have it!!
I say we develop a solar panel system that powers our graphics cards! Or even the whole system! How about that? One solar panel as a PSU, with the energy stored in a battery. Plug the cable into your system from the battery and taddow...

I'm sure someone from Tom's Hardware can put this into the think tank and make it work..........
 

4Aces

Distinguished
Jul 18, 2006
102
0
18,680
I've used the Vista beta RC2. It looks like MS thinks, one, we're all blind - the GUI is childish in appearance - and two, only MS knows what's best for your security, constantly asking "do you really want to use the Start button?" The web browser is just a rebadged Firefox. I see nothing that interests me, and benchmarks show it's much slower than XP, either 32- or 64-bit, in everything. I've got no use for it and probably won't buy it; XP is doing just fine and it'll be supported for several years to come. So when it comes to MS saying "you need a new graphics card or we won't play nice," I say let them take their OS and go home :)