Here's a snippet from an article criticizing GF100 as "too little, too late, too hot, too expensive":
We ran benchmarks in a variety of current titles and, on the whole, the Fermi cards narrowly outperformed their ATI equivalents. In Crysis at 1,920 x 1,200 and Very High settings, the GTX 480 averaged 40fps to the HD 5870’s 38fps; the GTX 470 scored 33fps to the HD 5850’s 32fps. Higher settings saw similar margins. World in Conflict had the two Nvidia cards consistently ahead by just under 20%, and in Stalker: Call of Pripyat that margin was around 5%. Other games had ATI’s cards ahead by a whisker, and if we average all the results, Nvidia’s edge looks to be between 5% and 10%.
And that’s not all. First, these Fermi cards suck at the teat of your PSU ferociously, with a GTX 480-based test rig drawing upwards of 400W when stressed, compared to around 270W for ATI’s fastest single-GPU card. All this power causes a secondary problem: heat. There are reports that the GF100 GPU can hit 98°C/208°F. I have serious concerns about how long a GPU pushed to this sort of level can last. And while the GPU is working this hard, you have to put up with the annoying racket of the fan going flat out.