The only reason I can guess at is that while CPU makers are held accountable (they sell to everyone, and get blamed for lots of stuff), they must keep their product efficient...
GPU makers sell their product to a high-end elite, and have the market cornered. They can do what they want with their product because they know there is no other option for high-end video gamers, and apparently, they do...
Likewise, while there aren't many high-end GPU options for buyers, there are PLENTY of options for CPUs...
EDIT: There are around the same number of high-end CPUs as GPUs, but the fact is the average person buys a high-end CPU, not a high-end GPU...
I don't know why you think that CPUs are 'faster' or 'more powerful' or 'more efficient' than GPUs.
GPUs do a very specific task, and they do it very quickly in parallel, i.e. doing lots of it at the same time. CPUs have only just gotten (in the past 2 years or so) to doing two things at the same time. If you want a comparison of the power of a GPU versus a CPU, think about this.
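A rough way to picture that difference (a hypothetical Python sketch, not real GPU code; the function names and the toy "brighten a frame" task are made up for illustration): a GPU applies the same simple operation to huge arrays of data elements at once, while a CPU traditionally steps through them one at a time.

```python
def cpu_style_brighten(pixels, amount):
    # CPU pattern: a sequential loop, one pixel per step.
    result = []
    for p in pixels:
        result.append(min(p + amount, 255))
    return result

def gpu_style_brighten(pixels, amount):
    # GPU pattern: the same operation applied to every element.
    # Simulated here sequentially, but because each pixel is independent,
    # a GPU can run thousands of these computations at the same time.
    return [min(p + amount, 255) for p in pixels]

frame = [10, 100, 250, 60]
print(cpu_style_brighten(frame, 20))  # [30, 120, 255, 80]
print(gpu_style_brighten(frame, 20))  # same result, but parallelizable
```

Both produce the same answer; the point is that the per-pixel work has no dependencies between elements, which is exactly the kind of job a GPU's hundreds of parallel units are built for and a general-purpose CPU is not.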
ATI (definitely) and Nvidia (I believe) are planning to use spare GPU processing power to do physics calculations in response to Ageia. These are calculations that the CPU can't do quickly enough, which is what created the need for Ageia in the first place. Yes, at the moment the Ageia drivers are not very good, but they are improving, and a CPU on its own can only run the demos with the eye candy turned off.
If you are referring to the fact that GPUs run hotter and need more power, then yes, they do, but doing so much work in parallel generates a lot of heat. Also, with a GPU most of the data changes every 1/60th of a second, whereas for a CPU running a couple of apps most of the data is constant, and even moving data to and from memory creates heat.
You can run most things with a low-end card; I'm running XP with an old Ti4800 and it's very happy. But... to play games you will need significant amounts of processing power, hence the arms race between Nvidia and ATi to provide the greatest video processing power on a card. So the fact that people want to play games is what forces the high-end GPU to exist.
You chose to buy a 1900XT; it wasn't needed to run the PC, or in fact to run most games. Most of them are perfectly happy on much older hardware. I'm running BF2 and Oblivion at 1280x1024 on a 6600GT and they look fine to me.
I couldn't agree more about the arms race between Nvidia & ATi. You can run a number of games at 1280x1024 with a card that is a year or two old, but there are people who want more: faster, brighter, crisper.
Years ago, when 3dfx was fiddling with the idea of a separate power supply for their top-end graphics card, people thought they were insane. Now, with the frame rates and high resolutions that are available, it almost sounds reasonable.
It all boils down to whether or not the public will buy the next generation of high-temperature, high-power-consumption GPUs... my guess is they will.