Our page load time testing was recently overhauled. We modified the timer script to render test pages at 1080p instead of the previous, netbook-friendly resolution.
The line-up is expanded from five webpages to nine. Facebook and MSN are gone. Google, YouTube, and Yahoo! all remain, now updated to reflect the most recent home page layouts. Added to the testing are Amazon, Wikipedia, eBay, Craigslist, The Huffington Post, and good old Tom's Hardware.
In order to better reflect real-world browsing, we're not using the home pages for Amazon, Wikipedia, eBay, or Craigslist. Instead, we're using the page for Computer Parts & Components at Amazon, the Wikipedia page for Tom's Hardware, an eBay Motors search for Cadillac DeVille, and the New York City page on Craigslist.
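The timer script itself isn't published, but a minimal sketch of this kind of timing loop might look like the following. The page identifiers and the `load_page` stub are assumptions for illustration; a real harness would drive an actual browser at 1080p and wait for the page's load event to fire.

```python
import time

# The nine test pages described above (deep links for Amazon, Wikipedia,
# eBay, and Craigslist rather than their home pages). Identifiers are
# illustrative, not the harness's actual URLs.
TEST_PAGES = [
    "google", "youtube", "yahoo",
    "amazon-computer-parts", "wikipedia-toms-hardware",
    "ebay-cadillac-deville", "craigslist-nyc",
    "huffingtonpost", "tomshardware",
]

def load_page(page):
    """Stand-in for driving a real browser and waiting for onload."""
    time.sleep(0.01)  # placeholder for the actual page load

def time_page_loads(pages, runs=3):
    """Return the average load time (in seconds) per page over several runs."""
    results = {}
    for page in pages:
        samples = []
        for _ in range(runs):
            start = time.perf_counter()
            load_page(page)
            samples.append(time.perf_counter() - start)
        results[page] = sum(samples) / len(samples)
    return results

times = time_page_loads(TEST_PAGES, runs=2)
print(len(times))  # one averaged timing per test page
```

Averaging several runs per page, as sketched here, smooths out one-off network or disk hiccups that would otherwise skew a single measurement.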
Like the conformance benchmarks and GUIMark2 tests, page load times in WBGP6 will be averaged into a composite score, although a detail view is still provided.
Page Load Time Detail: Windows 7
The chart below shows how each of the five Windows 7 Web browsers performs on each of the nine test webpages.

Page Load Time Detail: Mac OS X Lion
This chart contains the complete detail view of the four Web browsers in Mac OS X Lion.
Page Load Time Composite
The average time each Web browser takes to load all of the test pages on each platform is displayed in the chart below.
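A composite of this kind is simply the mean of the per-page load times. Sketched below with made-up numbers, not the article's measurements:

```python
def composite_score(page_times):
    """Average the per-page load times into one composite figure (seconds)."""
    return sum(page_times.values()) / len(page_times)

# Hypothetical per-page load times in seconds (illustrative only).
chrome_times = {"google": 0.8, "youtube": 1.4, "yahoo": 1.1}
print(round(composite_score(chrome_times), 2))  # → 1.1
```

Because it is a plain average, every page weighs equally in the composite; a browser that is slow on one heavy page can still win overall by being quick everywhere else.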

As you can see, Chrome 13 takes the top spot in both Windows 7 and OS X Lion. In fact, Chrome 13 on OS X 10.7 has the fastest average page load time overall. Whenever this occurs, we'll change the regularly green bar to red to highlight the performance advantage on OS X. Safari 5.1 finishes a very close second, performing similarly on both Microsoft's and Apple's operating systems. Internet Explorer 9 takes third place. Opera earns fourth place on Windows, but fifth on OS X. Conversely, Firefox finishes last in Windows 7, but takes fourth in Lion.
- Crowning A Web-Browsing King In Windows 7 And OS X
- The Contenders
- A Spotlight On Lion's Safari
- Hardware And Test Setup
- Performance Benchmarks: Startup Time
- Performance Benchmarks: Page Load Time
- Performance Benchmarks: JavaScript, DOM, And CSS
- Performance Benchmarks: Flash, Java, And Silverlight
- Performance Benchmarks: HTML5
- Performance Benchmarks: HTML5 Hardware Acceleration And WebGL
- Efficiency Benchmarks: Memory Usage
- Efficiency Benchmarks: Memory Management
- Reliability Benchmarks: Proper Page Loads
- Conformance Benchmarks: HTML5, CSS3, JavaScript, And DOM
- Placing Tables
- Analysis Tables
- Two Winners: One In Windows 7, One in OS X
thank you, workin' on it
Chrome 13 completely obliterates it.
And Firefox 8/9 is still a memory hog.
Not really surprised by the poor show of IE9. Most updates it gets are "security updates".
Yeah? And exactly what principle would that be?
Bring back the Google Dictionary, otherwise I will use Bing Search, Firefox and Facebook instead of Google Search, Chrome and G+.
According to the graphic on "Reliability Benchmarks: Proper Page Loads", on Mac OS Firefox is actually second, not third.
thank you, workin' on it
These "browser" GP are getting more and more complete, and they're always very interesting.
I have to say, I am a bit surprised to see FF being so close to Chrome now: kudos to Mozilla.
I have been using FF since 1.0 and only recently coupled it with Chrome (it is just convenient for me to have 2 completely different setups).
FF 7.0 should have a significant boost in memory efficiency: if everything else stays the same, we'll have a new champion ...
But if anything is clear from these reviews, it's that nothing stays the same for very long in the browser domain (well, except IE).
Looking forward to GP7, whenever that will be.
You should've put more emphasis on the actual scores and performance in each test, rather than counting the times certain browsers placed 1st. As it stands, a browser with a small advantage in many minor tests, but severe handicaps in fewer, more important ones, can seem better when technically it is not. Suggestion: tie all the candidates when the differences between them in a given test are less than a single-digit percent. Good article anyway.
And to think Apple hates Flash...
There are no points in the analysis tables. They simply list how each browser rates per category of testing. The 'Strong' part of the table was added a long time ago and it basically means that it's right up there with the winner in terms of performance. When we get a solid point-based scoring system figured out, 'Winner' will only receive a minor boost above 'Strong', whereas 'Strong' will receive a significant boost above 'Acceptable', and 'Acceptable' above 'Weak'. We're not there yet, but we're getting closer with every WBGP. The composite tests are a BIG step in that direction, and the new benchmark rankings further lay the groundwork for a fair scoring system which accurately reflects scale.
The analysis tables were created to balance the raw placing tables. The problem with what you're saying is that you would have to decide which categories are more important than others. Is JavaScript more important than CSS? Is HTML5 more important than Flash? This is going to depend on who you ask. People who only watch Netflix with an HTPC will put mega emphasis on Silverlight perf, whereas the chronic YouTuber will be more concerned with Flash, and devs are going to gravitate towards standards conformance. Ranking benchmarks based on the importance of what they test isn't a one-size-fits-all type of thing with Web browsers. As far as your other suggestion, dealing with practical ties, this is something we definitely want to look into moving forward. Thanks!