R600 a power failure - a total failure for the watts

Okay guys, I have just been reading some sites like AnandTech and HotHardware, etc., and the amount of wattage the 2900 is pulling is crazy.

I mean, even a single GTX is not pulling that kind of wattage.

It's just crazy. Looking at the wattage this card is going to be pulling and the electricity bill it's going to run up, I have to say that it is an absolute failure.

ATI have definitely stumbled, if not failed.


I mean, the wattage is crazy :roll: Nvidia 1, ATI 0 :tongue:
  1. Yeah, we know the power draw is insane for a video card that is designed for DX10 and has very immature drivers at this point. The 2900 XT 80nm is what you could call a failure, but the core overclocks like mad if you can keep it cool, like way cool. The 65nm refresh should be very competitive.

    Could we stop posting threads about the 2900 XT 80nm being a failure? We've seen enough...

    Peace
    Shargrath :tongue:
  2. People have a right to know :lol:

    They have waited so long for this 8O
  3. Quote:
    People have a right to know :lol:

    They have waited so long for this 8O


    Reading all the reviews, it pulls 15-30 watts more than the 8800 GTX.

    I suppose adding that many features, like the onboard HD sound, the Rage Theater chip and all that crap, made it worse XD
  4. Quote:
    People have a right to know :lol:

    They have waited so long for this 8O


    Reading all the reviews, it pulls 15-30 watts more than the 8800 GTX.

    I suppose adding that many features, like the onboard HD sound, the Rage Theater chip and all that crap, made it worse XD

    Yeah, I know, the wattage is crazy for a weaker card. Also, what is the point of all the extra chips if it's going to make the card a weaker product... a 700 watt PSU is nuts :D
  5. Rename the thread to:
    the "we love Nvidia, we get sexual pleasure when we bash ATI" thread.
    --
    Performance/watt is something that Steve Jobs invented when he forced Apple to switch from real computers (PPC) to that '70s technology, x86.

    Core Duo is as fast as a dual-core G5.
    Except that the G5 was 4 years earlier.

    So: how do you manipulate the masses? Performance/watt.
  6. You, my friend, have no understanding of heat and watts.

    Get a 2900, sit back :lol: and wait for the electricity bill hahahahahahaha.


    NVIDIA RULEZ :lol: :lol: :lol: go home ati boy :lol: :lol:
  7. Matrox is the best. Matrox 1, nVidia 0, ATI 0 :lol: :lol:
  8. The transition to 65nm should hopefully lower power consumption for Nvidia and ATI alike.
  9. I would never call it a complete failure though, not a success either. The XT was aimed at the GTS and is competing with it well; the only problem is that it draws that amount of power. Also, ATI have made a more complete package with the HDMI audio on board, which in my eyes is quite an achievement.....

    Quote:
    NVIDIA RULEZ go home ati boy


    Can anyone say noob?? Or fanboy??
  10. Nvidia rules, blah blah blah blah. Until I see solid results, it's all blah blah. Obviously fanboys. Look at how long the Nvidia cards had to mature..............

    wait
    they are still not mature..........just like half the posters in this forum :twisted:
  11. Who the heck is ever going to use HDMI on a freaking PC??


    And you can buy DVI-to-HDMI adapters for any other card.
  12. Yes, but if you use those adapters you are stuck with HDMI video with no sound, and the same goes for DVI: you will get video with no sound. So, since I do run a computer through an HDTV, I think it is important to have. Also, I have decided to buy an HD 2900 XT, as I have seen that it is on par with the GTS and in some cases can beat the GTX, and I think there is more to get from the drivers.
  13. I don't think it's too big of a deal; all a person would have to do is remember to turn off the lights after using them in the house and balance is restored :P

    Second: God damn, you ATI fanboys are wussies... "please stop, it's too painful to keep hearing about its faults, wah wah wah drivers." Really, suck it up and don't act like such a baby. To the Nvidia fanboys: don't gloat or karma will blow up your card and force you to buy an R600 ;) The market changes, and I know the ATI boyz will have their time to shine, so really ease up a little.
  14. Quote:
    I don't think it's too big of a deal; all a person would have to do is remember to turn off the lights after using them in the house and balance is restored :P

    Second: God damn, you ATI fanboys are wussies... "please stop, it's too painful to keep hearing about its faults, wah wah wah drivers." Really, suck it up and don't act like such a baby. To the Nvidia fanboys: don't gloat or karma will blow up your card and force you to buy an R600 ;) The market changes, and I know the ATI boyz will have their time to shine, so really ease up a little.


    And I agree with that. I bought a BFG 8800 GTS 2 months ago and I'm happy with my buy. The market changes, there's no winner. Or yes, there is a winner: we customers pay less for more performance when there's competition. Even a fanatic NV fanboy needs ATI to shift gears in order to buy an 8800 GTX/Ultra for less cash.

    As someone once wrote on this forum, "I don't buy brand, I buy performance (per dollar)".
  15. Sounds like power actually being put to use to me. Dual DVI for large screens: 2900 yes, 8800 no. Sound out: 2900 yes, 8800 no. Yes it leaks, yes it's hotter. But as has been said, there is more to it. If someone doesn't want HD-DVD playback on their computer, then go ahead, stay where you are. In 2 years all this changes, and HD will be common. Just because ATI realised this before nVidia, don't slam them for it. If you don't want the newest, then buy nVidia at the top end. Even nVidia's mid and low end is doing it, so whoever said they don't need it / want it shouldn't buy nVidia either and should go with Matrox.
  16. Quote:

    As someone once wrote on this forum, "I don't buy brand, I buy performance (per dollar)".


    ahhh, the voice of reason... :)
  17. Quote:

    As someone once wrote on this forum, "I don't buy brand, I buy performance (per dollar)".


    ahhh, the voice of reason... :) I agree 100%. If you're paying for something you won't or don't use, then the extras of the 2900 aren't for you. There are some people who game on their HDTVs. For them, the 2900 offers an excellent choice. Those and others watch movies on their rigs, with a monitor or HDTV: another use for the 2900. All games can be played well in DX9 by both the 8800 series and the 2900, so really the best bang for the buck has just been widened by the 2900.
  18. Quote:
    Yeah, we know the power draw is insane for a video card that is designed for DX10 and has very immature drivers at this point. The 2900 XT 80nm is what you could call a failure, but the core overclocks like mad if you can keep it cool, like way cool.


    I can't see drivers affecting power consumption :P

    The core only overclocks about 100MHz, or 13%, without outlandish cooling, while the 8800 GTS will overclock 20-25% on the stock cooler.

    Hopefully the 2950 XTX will rock. I need something to upgrade my 8800 GTX to, and it needs to be ATI, because SLI won't be supported on Bearlake X38 :(
  19. slashzapper, it only needs a 550W PSU. The 700W claim from Fudzilla was for the Crossfire setup. (A rough power budget is sketched below.)
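    As a sanity check on that, here is a minimal single-card power-budget sketch in Python. Every component figure below is a rough assumption for illustration, not a measured value from any review:

    ```python
    # Rough single-card load budget; every figure is an assumed round number,
    # not a measurement.
    parts = {
        "HD 2900 XT (load)": 215,  # assumed board power under load
        "CPU (load)":         90,  # assumed dual-core CPU
        "Motherboard + RAM":  50,
        "Drives + fans":      40,
    }
    total = sum(parts.values())
    print(f"Estimated system load: ~{total} W")  # ~395 W, well inside a 550 W PSU
    ```

    On those assumptions a decent 550W unit still has plenty of headroom; the 700W figure only starts to make sense once a second card is added for Crossfire.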
  20. Quote:
    Rename the thread to:
    the "we love Nvidia, we get sexual pleasure when we bash ATI" thread.
    --
    Performance/watt is something that Steve Jobs invented when he forced Apple to switch from real computers (PPC) to that '70s technology, x86.

    Core Duo is as fast as a dual-core G5.
    Except that the G5 was 4 years earlier.

    So: how do you manipulate the masses? Performance/watt.


    IBM CPUs were way ahead of everything. My dual 1.8GHz G5 is almost as fast as my overclocked E6600 (which gets crushed in QuickTime). But there is no equivalent software to compare with: QuickTime is highly optimized for the Mac. But hey, to render a 70MB video to PAL DV at high quality takes 2:30 min, compared to the 9 min it takes on the 3.4GHz E6600... I need a 1000VA PSU for my Mac, though... :wink:
  21. Quote:
    Matrox is the best. Matrox 1, nVidia 0, ATI 0 :lol: :lol:

    Wrong!
    3dfx 100, Others (Matrox, nVIDIA, ATI) 0 8)
  22. Quote:

    Core Duo is as fast as a dual-core G5.
    Except that the G5 was 4 years earlier.


    What?? There is a CPU better than the Core 2 Duos?
  23. Quote:
    Core Duo is as fast as a dual-core G5.
    Except that the G5 was 4 years earlier.


    Got proof??

    A post without links
    Is a post that stinks

    And you can quote me on that :)
  24. It really is about bang for your buck, although I do think we haven't seen the true potential of either the 2900 or the 8800; but because the two are competing, we should see better drivers in the future!

    But price-wise I would have to go for the 2900: I could buy two, in a Crossfire setup, for the price of one 8800 Ultra, and I could fit them both in my case, which I can't with a GTX, let alone an Ultra. Now granted, one 2900 XT is about 90% of an 8800 Ultra performance-wise, BUT I can buy two 2900s for the same price!

    I think the litmus test for these cards will be Crysis, as NVIDIA and AMD/ATI will have had a while to mature the drivers for both cards by then.
  25. Quote:

    Performance/watt is something that Steve Jobs invented when he forced Apple to switch from real computers (PPC) to that '70s technology, x86.

    Core Duo is as fast as a dual-core G5.
    Except that the G5 was 4 years earlier.


    Maybe, maybe not. I do know that the C2Ds are a whole lot cheaper than those G5 chips, though. And if there is a stereotype of a capitalist, it would be Mr. Jobs.
  26. Quote:

    Core Duo is as fast as a dual-core G5.
    Except that the G5 was 4 years earlier.


    What?? There is a CPU better than the Core 2 Duos?

    Better in every aspect ($$, wattage, heat...)? No. But just in terms of raw performance, I believe the IBM PowerPC processors are superior in most apps.
  27. There is an old saying in the auto industry,

    There is NO replacement for displacement, except technology.

    That being said, when the G5 came out in 2002 it came in 3 varieties: 1.6 GHz, 1.8 GHz, and a dual-processor 2.0 GHz.


    Now, the G5 topped out at 2.5 GHz, in a configuration of a quad-core G5 with each core running at 2.5 GHz.

    This was 2 years ago; this option was readily available then.


    This is a "today" review of the G5 vs the C2D:
    http://www.barefeats.com/quad11.html


    Now, like I said before about the auto industry,

    there is NO replacement for displacement, except technology.


    Intel had over 4 years to match the G5, which they did. And today, drivers are what separate the 2-year-old G5 (the quad-core 2.5 GHz G5) from the new C2D.

    http://www.barefeats.com/quad16.html

    As you can see, the universal binary is now showing the age of the G5.

    Over 5 years since the birth of the G5, and now its legs are tired.
  28. Quote:

    But price-wise I would have to go for the 2900: I could buy two, in a Crossfire setup, for the price of one 8800 Ultra, and I could fit them both in my case, which I can't with a GTX, let alone an Ultra. Now granted, one 2900 XT is about 90% of an 8800 Ultra performance-wise, BUT I can buy two 2900s for the same price!


    Um....

    No, just no.

    I can just about buy two 8800 GTS 320MBs for the price of one 2900 XT.

    Doesn't mean much. One 2900 XT is most certainly NOT 90% of an 8800 Ultra, given that the Ultra shows about a 10% advantage over the GTX at high resolutions and the GTX has about the same over the 2900 XT (quick arithmetic at the end of this post).

    That's not to say I'd ever buy an 8800 Ultra; they ARE overpriced when there are cheaper overclocked 8800 GTXs about. But the 8800 Ultra is the undisputed king, and people with more money than sense will buy two in SLI just to have the fastest rig going.

    The x2900XT sits neatly, both in terms of price and performance, between the 8800GTS 640MB and the 8800GTX.

    Hopefully the x2950XTX will be out soon, then I can get excited :)
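    For what it's worth, here is the quick arithmetic behind that claim as a minimal Python sketch; the two 10% figures are the rough estimates from this post, not benchmark data:

    ```python
    # Chain the rough estimates: Ultra ~10% over GTX, GTX ~10% over the 2900 XT.
    ultra_over_gtx = 1.10  # assumed Ultra advantage over the GTX at high resolutions
    gtx_over_xt    = 1.10  # assumed GTX advantage over the 2900 XT

    xt_vs_ultra = 1 / (ultra_over_gtx * gtx_over_xt)
    print(f"2900 XT is roughly {xt_vs_ultra:.0%} of an 8800 Ultra")  # ~83%, not 90%
    ```

    So, on those assumed numbers, the 2900 XT lands closer to 83% of an Ultra than to 90%.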
  29. Quote:

    The x2900XT sits neatly, both in terms of price and performance, between the 8800GTS 640MB and the 8800GTX.


    That may be true in your country; it certainly isn't true here:
    http://www.azerty.nl/producten/product_detail/160/13188/geforce-8800gtx.html
    http://www.azerty.nl/producten/product_detail/160/12337/en8800gts.html
    http://www.azerty.nl/producten/product_detail/165/33162/radeon-hd2900xt.html

    As you can see, the GTX is about €200 more expensive here, which, given the performance difference, is a lot more.
    Even the GTS is more expensive.

    2900 XT, here I come :)
  30. Wow....

    And I'm in the UK, not the US.

    The x2900XT is definitely the better option there.
  31. Quote:
    There is NO replacement for displacement Except technology.



    Not to hijack the thread too much, but there are other tests on the "Bare Feats" web site you link to, that show the C2D about twice as fast as a G5: Link


    Can't find a link right at this moment, but I seem to recall Steve Jobs mentioning one reason Apple switched to Intel was to substantially increase the CPU performance.
  32. Quote:
    There is NO replacement for displacement Except technology.



    Not to hijack the thread too much, but there are other tests on the "Bare Feats" web site you link to, that show the C2D about twice as fast as a G5: Link


    Can't find a link right at this moment, but I seem to recall Steve Jobs mentioning one reason Apple switched to Intel was to substantially increase the CPU performance.

    But then, Steve Jobs is an arrogant prick who will say whatever he thinks will sell more units at the time, whether it contradicts what he has said before or not....

    Jobs is one big reason I'd never buy Apple.
  33. Quote:
    Jobs is one big reason I'd never buy Apple.


    There are many reasons not to buy Apple, but I agree Jobs is a big one. Also those ridiculous Mac vs PC ads. And their "customer service": Apple was sued over grainy laptop displays, with Apple telling complaining customers there's something wrong with their eyes and that a dithering 6-bit-per-pixel display on their expensive MacBook Pro is fine and dandy :)
  34. Clearly you are the one who does not understand power consumption and electricity bills. The card only draws huge amounts of power when actually gaming. So it might cost you - maybe - $20 a year extra to run the card (quick math sketched at the end of this post). I think you can afford that, if you can afford the card.

    You should start studying English, instead of making retarded posts here. Start with the spelling of "definitely".

    I presume you are enjoying your $9 per hour job? I hope so, as you'll be doing it for a long time to come.
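    A minimal back-of-envelope check of that "$20 a year" figure, in Python. The card's gaming draw, the daily play time, and the electricity rate below are all assumptions for illustration, not measured or quoted values:

    ```python
    # Back-of-envelope yearly electricity cost for the card's gaming hours only.
    card_load_watts = 200   # assumed draw of the card while gaming
    hours_per_day   = 3     # assumed gaming time per day
    price_per_kwh   = 0.10  # assumed electricity price in $/kWh

    kwh_per_year  = card_load_watts / 1000 * hours_per_day * 365
    cost_per_year = kwh_per_year * price_per_kwh
    print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")  # ~219 kWh -> ~$21.90
    ```

    So a figure in the region of $20 a year is plausible on those assumptions; the exact number scales with how much you game and what you pay per kWh.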
  35. Too bad Nvidia does not support tessellation, though.
    It's a cool feature that will be in the next version of DirectX.
    ATI 1, Nvidia 0
  36. Quote:
    Clearly you are the one who does not understand power consumption and electricity bills. The card only draws huge amounts of power when actually gaming. So it might cost you - maybe - $20 a year extra to run the card. I think you can afford that, if you can afford the card.

    You should start studying English, instead of making retarded posts here. Start with the spelling of "definitely".

    I presume you are enjoying your $9 per hour job? I hope so, as you'll be doing it for a long time to come.


    Emm,
    I'm also not very fond of these ATI-hate threads,
    but is it really necessary to be that harsh to other posters?
    Maybe you were in a bad mood when you posted this? :?
  37. Quote:
    Clearly you are the one who does not understand power consumption and electricity bills. The card only draws huge amounts of power when actually gaming. So it might cost you - maybe - $20 a year extra to run the card. I think you can afford that, if you can afford the card.

    You should start studying English, instead of making retarded posts here. Start with the spelling of "definitely".

    I presume you are enjoying your $9 per hour job? I hope so, as you'll be doing it for a long time to come.


    Emm,
    I'm also not very fond of these ATI-hate threads,
    but is it really necessary to be that harsh to other posters?
    Maybe you were in a bad mood when you posted this? :?

    Necessary? Probably not.
    Deserving? Maybe.
    Fun? Definitely.