Mozilla released the hotly-anticipated Firefox 7 two days ago. Does it deliver on the promise of speed and memory improvements? Does Firefox 7 have what it takes to dethrone current Web Browser Grand Prix champion, Google Chrome? Read on to find out!

Although it's only been one month since Web Browser Grand Prix VI: Firefox 6, Chrome 13, And Mac OS X Lion, the browser wars show no signs of subsiding. The last 30 days were just as feverish as those that came before. But before we get down to business, let's get all caught up on the latest in this epic saga.
Recent Events
08/30/11: Opera updates from version 11.50 to 11.51
09/16/11: Google releases Chrome 14
09/27/11: Mozilla releases Firefox 7
09/29/11: Futuremark releases an open beta for the next version of Peacekeeper, announced exclusively here on Tom's Hardware.
Ongoing: Microsoft Internet Explorer's market share continues to plummet, while Google Chrome's market share continues its meteoric rise.
Recent Drama
05/04/11: Google releases "fixed" versions of Apple's SunSpider and Mozilla's Kraken JavaScript benchmarks. We missed this the first time around.
09/01/11: David Storey, emblematic Opera developer and evangelist, leaves Opera for a new gig at Motorola, which quickly gets eaten up by Opera's arch-rival Google. Doh. Good luck, Dave!
09/20/11: Yet another Mozilla developer incites fear and chaos by suggesting a five week (or shorter) Firefox release cycle.
09/21/11: This idea is quickly rejected.
09/22/11: Another camp inside Mozilla proposes Firefox ESR (Extended Support Release) for enterprise use. The ESR release cycle is to run five times slower than that of the standard Firefox releases.
09/29/11: Even more absurdity from Mozilla developers, this time floating the idea of banning Java to thwart security threats.
What's New In Web Browser Grand Prix 7?
We've added more composite scoring, brand new startup time tests, and retired the raw placing tables. Essentially, the benchmark suite receives yet another handful of refinements aimed at updating tests, enhancing accuracy, improving analysis, and, most noticeably of all, yielding faster results. Hey, Firefox 7 was just released the day before yesterday! With 40+ benchmarks, multiple iterations per benchmark, and five Web browsers, this is nothing short of a monumental effort.
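The composite scoring mentioned above can be sketched in a few lines. The approach below is only an illustration, not the article's actual formula: each browser's result is normalized against the best score in that benchmark, and the normalized values are combined with a geometric mean so that no single test dominates. The browser names, benchmark names, and numbers are hypothetical.

```python
# Hypothetical composite-scoring sketch: normalize per-benchmark results
# against the best score, then combine with a geometric mean.
from math import prod

def composite_score(results, higher_is_better):
    """results: {browser: {benchmark: score}} -> {browser: composite in (0, 1]}."""
    benchmarks = list(higher_is_better)
    # Best score per benchmark: max if higher is better, min otherwise.
    best = {
        b: (max if higher_is_better[b] else min)(results[br][b] for br in results)
        for b in benchmarks
    }
    composites = {}
    for browser, scores in results.items():
        normalized = [
            scores[b] / best[b] if higher_is_better[b] else best[b] / scores[b]
            for b in benchmarks
        ]
        # Geometric mean keeps one outlier benchmark from dominating.
        composites[browser] = prod(normalized) ** (1 / len(normalized))
    return composites

# Illustrative data only: lower is better for a JS time, higher for frame rate.
results = {
    "BrowserA": {"js_ms": 200, "fps": 60},
    "BrowserB": {"js_ms": 250, "fps": 45},
}
higher_is_better = {"js_ms": False, "fps": True}
print(composite_score(results, higher_is_better))
```

A browser that wins every benchmark scores exactly 1.0; everything else falls below it in proportion to how far behind it trails.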
- Web Browser Grand Prix 7
- The Top Five Web Browsers
- Test Setup And Methodology
- Benchmark Results: Startup Times
- Benchmark Results: Page Load Times
- Benchmark Results: JavaScript, CSS, And DOM
- Benchmark Results: HTML5 Performance
- Tom's Hardware Exclusive: Peacekeeper 2.0
- Benchmark Results: Flash Performance
- Benchmark Results: Java And Silverlight
- Benchmark Results: HTML5 Hardware Acceleration
- Benchmark Results: WebGL Performance
- Page Load Reliability
- Memory Efficiency
- Standards Conformance
- Benchmark Analysis
- The Crowning Of A Champion
lol, people still think they can feel the difference in speed in real-world use, and there are still people who pick a browser not for their needs and preferences, but just because they've seen some silly benchmark.
Ridiculous. I bet those are the same people that are always complaining in the forums about crashes, viruses and blue screens.
On the surface, everything looks good - the author sets out a methodology, clearly presents the results, and draws conclusions based on them. Unfortunately, in doing so he reveals his severely lacking knowledge of testing methodology, the browsers themselves, as well as how one interprets the results of benchmarks.
To aggregate across criteria such as "performance" and "standards compliance" (never mind the fact that HTML5 hasn't yet been finalized), using an arbitrary weighting system, and then conclude that one browser beats the others "overall" is nonsensical.
Nowhere has the author talked about relevance (this is critical) or statistical significance of his tests. I'm sure he put in a lot of effort into the article, and that it was written out of the best of intentions; however, this article remains a jumble of random tests clumsily grouped together. For example, can the author explain to the readers why the removal of SVG fonts in the ACID3 test is important? Should browsers have support for SVG fonts? Should one test for it? If he can't, he's just mechanically running benchmarks that he's found on the internet.
Obviously it's easier to criticise - but it's much more beneficial for people to actually try the browsers out for themselves (they are free, after all) than to read this kind of poorly conducted "showdown".
Now they could change their famous icon to a more minimalist/modern style and we're done. Speedy AND classy, just like a fire fox.
Indeed. I have been quite content with FF8 though.
Now that IE is good again, I can't fault anyone for using it in lieu of the others.
Firefox 7 shows significant improvement over version 6, moving up to third place. As a result, IE9 drops to fourth.
Firefox 8.0 Beta is now available.
But on a more serious note, I honestly thought Chrome had this one again. Looking at the charts, my impression was that Firefox never really won anything by a significant margin.
Also, I hope Internet Explorer 10 will arrive soon. My short experience with IE10 under Windows 8 was very pleasant, even better than that of IE9.