For the past year or so, one reader's request has gnawed at us: come up with a weighted numerical scoring system for determining the winner of the Web Browser Grand Prix. Today we're transitioning to such a system.
First, we need to modify the existing analysis table. We dropped the distinction between Performance, Efficiency, Reliability, and Standards Conformance, though the benchmarks are still arranged that way in the article. We also condensed the individual tests into major categories: Startup Time is no longer split between light and heavy, Page Load Time is no longer split between cached and uncached, all the memory tests now fall under Memory Efficiency, and the same applies to Standards Conformance.
As before, each category of testing has four columns: winner, strong, average (previously, acceptable), and weak. Winner is obviously the browser that achieves the highest scores in that category. The Strong column is for browsers exhibiting superior performance, but not achieving a first-place victory. Average is for browsers that perform adequately or in-line with a majority of their competitors. The Weak column is for browsers that perform poorly, or substantially lower than their competitors.
In order to reflect how each category of testing affects the average end-user Web browsing experience, we need to create brackets (or levels of importance) to place the different categories of testing into.
| Bracket | Categories |
| --- | --- |
| Essential | CSS, DOM, Page Load Reliability, Standards Conformance |
| Important | Flash, HTML5, Memory Efficiency, Page Load Time, Startup Time |
| Nonessential | Java, Silverlight |
| Unimportant | HTML5 Hardware Acceleration, WebGL |
We received some great feedback from readers on the brackets, and consequently made a few changes to our original grouping. Page Load Time was demoted from Essential to Important due to the minuscule scale of those tests. Memory Efficiency and Startup Time were promoted from Nonessential to Important due to the large number of folks who still browse the Web on much older hardware, such as our XP-based test system.
The Essential bracket contains those categories of testing that are indispensable to rendering the vast majority of Web pages online today. The Important bracket is for categories not quite essential to browsing the Web, yet still affect the user experience to a great degree. The Nonessential bracket contains the popular plug-ins Java and Silverlight. While these plug-in technologies are nowhere near as ubiquitous as Flash, certain applications like corporate intranet apps and Netflix simply will not work without them. Finally, the Unimportant bracket is for emerging technologies, such as HTML5 Hardware Acceleration and WebGL, that simply do not exist outside of testing/demo sites.
Now that the brackets are all sorted out, we can apply a numerical point system to the finishes of each bracket.
| Bracket | Winner | Strong | Average | Weak |
| --- | --- | --- | --- | --- |
As you can see, we decided to apply negative point values to the Weak finishes and start the Average performances at zero for the Unimportant bracket. The Winner has also been de-emphasized over strong finishes, with just a small tie-breaking bonus going to the winner.
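The mechanics of this bracket-weighted scoring can be sketched in a few lines of Python. Note that the point values below are purely illustrative placeholders, not the article's actual numbers; they just reproduce the properties described above (a small tie-breaking bonus for the Winner over Strong, Average at zero for the Unimportant bracket, and negative values for Weak finishes).

```python
# Hypothetical per-bracket point tables; the real values differ.
POINTS = {
    "Essential":    {"winner": 5, "strong": 4, "average": 2, "weak": -2},
    "Important":    {"winner": 4, "strong": 3, "average": 1, "weak": -1},
    "Nonessential": {"winner": 2, "strong": 1, "average": 0, "weak": -1},
    "Unimportant":  {"winner": 1, "strong": 0.5, "average": 0, "weak": -0.5},
}

def score(results):
    """Total a browser's points from a list of (bracket, finish) results."""
    return sum(POINTS[bracket][finish] for bracket, finish in results)

# e.g. a browser that wins an Essential category, finishes Strong in an
# Important one, and Weak in an Unimportant one:
print(score([("Essential", "winner"),
             ("Important", "strong"),
             ("Unimportant", "weak")]))  # 5 + 3 - 0.5 = 7.5
```

A browser's final standing is then just the sum of its per-category points, with Essential categories dominating and Unimportant ones serving as little more than tie-breakers.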
Web Browser Grand Prix XP Analysis Table
| Category | Winner | Strong | Average | Weak |
| --- | --- | --- | --- | --- |
| CSS | Chrome | Safari | Opera | Firefox, Internet Explorer 8 |
| DOM | Opera |  | Chrome, Firefox, Safari | Internet Explorer 8 |
| Page Load Reliability | Opera | Internet Explorer 8 | Firefox | Chrome, Safari |
| Standards Conformance | Chrome | Opera, Firefox | Safari | Internet Explorer 8 |
| Flash | Internet Explorer 8, Opera, Safari | Chrome, Firefox |  |  |
| HTML5 | Chrome | Opera, Safari | Firefox | Internet Explorer 8 |
| Memory Efficiency | Chrome | Firefox | Opera, Internet Explorer 8, Safari |  |
| Page Load Time | Opera | Chrome, Safari | Firefox, Internet Explorer 8 |  |
| Startup Time | Opera | Chrome, Firefox |  | Internet Explorer 8, Safari |
| Java | Chrome, Internet Explorer 8, Opera | Firefox | Safari |  |
| Silverlight | Chrome | Firefox, Internet Explorer 8 | Opera, Safari |  |
| HTML5 Hardware Acceleration | Chrome | Opera | Firefox, Safari | Internet Explorer 8 |
| WebGL | Chrome |  |  | Firefox, Internet Explorer 8, Opera, Safari |
Now, let's total up the points so we can see the final standings and crown a champion for Windows XP.
I do kinda feel the difference in Firefox's responsiveness going from my main modern desktop to my older laptop, which has been relegated to makeshift HTPC duty. I believe Firefox's XUL interface is the culprit; it was a big enough problem that Firefox mobile abandoned it in favor of a native Android GUI, but who knows at this point. I guess I might actually give Opera a chance.
For example, for each category you could subtract the lowest score from all scores, then normalize the range by dividing all adjusted scores by the highest adjusted score. This way the top performer always has a modified score of 1 and the worst performer always has 0 (you'd need to invert them for tests where lower is better, of course, e.g. subtract them from 1). Then apply your weights to these scores and you get the composite score. It's not a perfect transformation, but it certainly has more fairly distributed weight (pun intended) than what you have used here.
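The min-max normalization this commenter describes can be sketched as follows. The bracket weights applied at the end are whatever point values the scoring system assigns; the numbers in the usage example are made up.

```python
def normalize(scores, lower_is_better=False):
    """Min-max normalize raw benchmark scores: best -> 1.0, worst -> 0.0."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [1.0] * len(scores)  # all browsers tied
    norm = [(s - lo) / (hi - lo) for s in scores]
    if lower_is_better:             # e.g. page load time in seconds
        norm = [1.0 - n for n in norm]
    return norm

def composite(per_category, weights):
    """Weighted sum of normalized scores.

    per_category: {category: [normalized score for each browser]}
    weights:      {category: bracket weight}
    Returns one composite score per browser.
    """
    n = len(next(iter(per_category.values())))
    return [sum(weights[c] * per_category[c][i] for c in per_category)
            for i in range(n)]

# Three browsers' page load times in ms (lower is better, values made up):
print(normalize([90, 120, 150], lower_is_better=True))  # [1.0, 0.5, 0.0]
```

Because every category lands on the same 0-to-1 scale before weighting, no single benchmark's raw magnitude can dominate the composite, which is the commenter's point about fairly distributed weight.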
XP can't run IE9. You need to upgrade the OS in order to get a newer version of IE.
1. A lot of corporations still use IE7; maybe you should include that in your benchmarks too.
2. If you remove HTML5 (with and without H/W acceleration), I think Opera's victory margin will be quite huge.
3. Regarding smoothness, I believe FF is quite poor at this. But the developers know about it and are actively working on it. I think FF13 will be the release where smoothness improves; look at "Firefox Snappy".
4. I would like a subjective recommendation at the end of the article: something you subjectively felt was the best among all the browsers, even if it's trailing in the numbers.
Also, that would definitely disable the browsers' H/W acceleration.