Nvidia GeForce GTX 465 1 GB Review: Zotac Puts Fermi On A Diet

Power Consumption And Temperature

Power consumption is definitely an issue to consider when it comes to the GeForce GTX 480—416W of total system draw under load is significant compared to the Radeon HD 5870’s 295W. The GeForce GTX 470 isn’t in such bad shape, though the fact that it’s slower than the 5870 yet more power-hungry is at least worth pointing out.

Once you get down to the GeForce GTX 465, though, total power use is a little more reasonable. Two hundred eighty-three watts is a big number, but the Radeon HD 5830 it goes up against draws only 10W less. Consider power consumption on this card a non-issue for us. We're satisfied with its draw, the temperatures it generates under load, and the minimal noise it creates in the process.

What about heat? The GeForce GTX 465 idles at about the same temperature as all of the other cards (with the exception of the Radeon HD 5870; our sample idled notably cooler than the rest).

Under load, there’s no doubt it gets warmer than even AMD’s fastest single-GPU card. But it’s a fair bit cooler than the flagship GeForce GTX 480. Even more notably, the GTX 465 requires far less fan speed to maintain that temperature, while the GTX 480’s fan must spin much harder to keep it around 91 degrees Celsius.

Of course, we addressed the resulting noise of these cards’ fans in our GeForce GTX 480 Update: 3-Way SLI, 3D Vision, And Noise.

69 comments
  • Other Comments
  • Annisman
Dang, it looks like Nvidia has almost no real answers for the AMD/ATI lineup of cards. However, if this card can drop in price a little it may be competitive because of some of its Nvidia-only features. I mean, it runs cooler and uses a fair amount less power than the 470 and 480, maybe this will become the PhysX card to get? Especially if they could manage a single-slot version and drop the price. Anyways, no competition is bad for everyone and I hope Nvidia can get their act together asap.
    13
  • fatkid35
    i'll stick to my crossfire'd 5770s. same money and same power consumption.
    17
  • tacoslave
fatkid35: i'll stick to my crossfire'd 5770s. same money and same power consumption.

    Or a 5870 same thing less problems but thats just me. oh and that thing got pwnd by a 5830 and thats not saying much.
    4
  • welshmousepk
wow, the pricing of this thing is all wrong. given how well the 480 and 470 sit in the market, this just seems like a pointless card.
    4
  • liquidsnake718
    How many times do I have to say that this is nothing but a marketing gimmick for defective GTX480's and possibly 470's as well. Like the 5830 which was a cut/gimped/ or limited 5850
    -9
  • liquidsnake718
    sorry 5870 on the above comment
    -12
  • bombat1994
make it 60 cheaper and you might have a good card, but i would buy a 5850 over this thing every day of the week
    9
  • dco
retail is messed up; they charge you for the brand, not the product. What's worse is that people will buy it.
    4
  • rohitbaran
    The GTX4xx line is definitely not the way it is to be played and this latest crappy piece of hardware further proved it. Hot and expensive but poor on performance. The more cards they launch, the clearer ATI's victory becomes.
    15
  • km4m
    Fail, fail, fail...suitable words for Nvidia at this moment.
    9
  • Annisman
    Yeah well who cares who is the 'victor' of this round, remember the 2900XT ? It goes back and forth forever, pulling for one team over the other is silly considering that as consumers we have historically gotten the best prices when BOTH camps were churning out cards as good as the other one, and they were forced to do multiple price cuts.
    6
  • rohitbaran
    Well, I meant ATI's victory for this generation of cards.
    7
  • Kelavarus
liquidsnake718: How many times do I have to say that this is nothing but a marketing gimmick for defective GTX480's and possibly 470's as well. Like the 5830 which was a cut/gimped/ or limited 5850


    That's every card and CPU today that isn't the top of the line. If they have defective ones, they shut down what doesn't work, price it (hopefully) accordingly to performance, and everyone wins.
    6
  • gkay09
    The Conclusion is spot on...
    2
  • JeanLuc
Chris, I'm looking at the temps and the HD 5870 is 10°C cooler than the next nearest card in terms of cooling performance, which is a bit odd. Have you thought about using the delta-T method rather than using absolute values to evaluate cooling performance?
    0
  • randomizer
    Excluding the disabled GPC, it seems a bit odd to disable just one SM.
    -2
  • spidey180
overpriced, WTH?? can't Nvidia come up with a decent pricing scheme???
    5
  • dEAne
    too expensive, just looks good. Not a good one.
    2
  • shubham1401
    One word.... OverPriced!!
    8
  • evolve60
My 9800GTX@750MHz is still kicking, and with a 8500GT@600MHz doing the PhysX, I can still run most of today's games at 1920x1200 at and above 50FPS with high to ultra details and 2-4x AA.

    Sorry Nvidia, I just don't see myself upgrading GPUs anytime soon until you release something that was as industry blowing as the original G80 architecture.
    0