For the past year or so, one reader's request has gnawed at us: come up with a weighted numerical scoring system for determining the winner of the Web Browser Grand Prix. Today we're transitioning to such a system.
First, we need to modify the existing analysis table. We dropped the distinction between Performance, Efficiency, Reliability, and Standards Conformance, though the benchmarks are still arranged that way in the article. We also consolidated the individual tests into broader categories: Startup Time is no longer split into light and heavy runs, Page Load Time is no longer split into cached and uncached results, all of the memory tests now fall under Memory Efficiency, and the same goes for Standards Conformance.
As before, each category of testing has four columns: winner, strong, average (previously, acceptable), and weak. Winner is obviously the browser that achieves the highest scores in that category. The Strong column is for browsers exhibiting superior performance, but not achieving a first-place victory. Average is for browsers that perform adequately or in-line with a majority of their competitors. The Weak column is for browsers that perform poorly, or substantially lower than their competitors.
To reflect how each category of testing affects the average end user's Web browsing experience, we created brackets (levels of importance) into which the different categories of testing are placed.
|Bracket|Categories|
|:--|:--|
|Essential|CSS, DOM, Page Load Reliability, Standards Conformance|
|Important|Flash, HTML5, Memory Efficiency, Page Load Time, Startup Time|
|Nonessential|Java, Silverlight|
|Unimportant|HTML5 Hardware Acceleration, WebGL|
We received some great feedback from readers on the brackets, and consequently made a few changes to our original grouping. Page Load Time was demoted from Essential to Important due to the minuscule scale of those tests. Memory Efficiency and Startup Time were promoted from Nonessential to Important due to the large number of folks who still browse the Web on much older hardware, such as our XP-based test system.
The Essential bracket contains the categories of testing that are indispensable to rendering the vast majority of Web pages online today. The Important bracket is for categories that are not quite essential to browsing the Web, but that still affect the user experience to a great degree. The Nonessential bracket contains the popular plug-ins Java and Silverlight. While these plug-in technologies are nowhere near as ubiquitous as Flash, certain applications, such as corporate intranet apps and Netflix, simply will not work without them. Finally, the Unimportant bracket is for emerging technologies, such as HTML5 Hardware Acceleration and WebGL, that simply do not exist outside of testing and demo sites.
Now that the brackets are all sorted out, we can apply a numerical point system to the finishes of each bracket.
We decided to apply negative point values to Weak finishes and to start Average performances at zero in the Unimportant bracket. The Winner column has also been de-emphasized relative to Strong finishes, with just a small tie-breaking bonus going to the winner.
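As a rough illustration, the structure of the scheme can be sketched in code. The point values below are hypothetical placeholders, not the actual numbers we use; they only preserve the stated properties: Weak finishes are negative, Average starts at zero in the Unimportant bracket, and the Winner gets just a small bonus over a Strong finish. The `POINTS` matrix and `score` helper are invented names for this sketch.

```python
# Hypothetical point matrix -- NOT the actual values used in our scoring.
# Rows are brackets, columns are finishes (Winner, Strong, Average, Weak).
# The placeholders only preserve the stated structure: Weak is negative,
# Average is zero in the Unimportant bracket, and Winner is only a small
# tie-breaking bonus over Strong.
POINTS = {
    #               Winner  Strong  Average  Weak
    "Essential":    (10,     8,      4,      -4),
    "Important":    ( 7,     6,      3,      -3),
    "Nonessential": ( 4,     3,      2,      -2),
    "Unimportant":  ( 2,     1,      0,      -1),
}

FINISHES = ("Winner", "Strong", "Average", "Weak")

def score(bracket: str, finish: str) -> int:
    """Look up the points a browser earns for a given finish in a bracket."""
    return POINTS[bracket][FINISHES.index(finish)]
```

Note how the gap between Winner and Strong is deliberately small in every bracket, while the gap between Average and Weak is large: a browser is punished more for falling behind the pack than it is rewarded for edging out a close rival.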
|Category|Winner|Strong|Average|Weak|
|:--|:--|:--|:--|:--|
|CSS|Chrome|Safari|Opera|Firefox, Internet Explorer 8|
|DOM|Opera|Chrome, Firefox, Safari|Internet Explorer 8| |
|Page Load Reliability|Opera|Internet Explorer 8|Firefox|Chrome, Safari|
|Standards Conformance|Chrome|Opera, Firefox|Safari|Internet Explorer 8|
|Flash|Internet Explorer 8, Opera, Safari|Chrome, Firefox| | |
|HTML5|Chrome|Opera, Safari|Firefox|Internet Explorer 8|
|Memory Efficiency|Chrome|Firefox|Opera, Internet Explorer 8, Safari| |
|Page Load Time|Opera|Chrome, Safari|Firefox, Internet Explorer 8| |
|Startup Time|Opera|Chrome, Firefox|Internet Explorer 8, Safari| |
|Java|Chrome, Internet Explorer 8, Opera|Firefox|Safari| |
|Silverlight|Chrome|Firefox, Internet Explorer 8|Opera, Safari| |
|HTML5 Hardware Acceleration|Chrome|Opera|Firefox, Safari|Internet Explorer 8|
|WebGL|Chrome|Firefox, Internet Explorer 8, Opera, Safari| | |
Now, let's total up the points so we can see the final standings and crown a champion for Windows XP.
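The totaling itself is simple arithmetic: each browser's finish in each category is mapped through its bracket's point values and summed. Here is a minimal sketch with hypothetical point values (our actual numbers differ) applied to just two of the categories above; placing CSS in the Essential bracket is an assumption of this sketch, and `POINTS`, `RESULTS`, and `champion` are invented names, not part of our methodology.

```python
from collections import defaultdict

# Hypothetical per-bracket point values -- NOT the actual numbers we use.
POINTS = {
    "Essential": {"Winner": 10, "Strong": 8, "Average": 4, "Weak": -4},
    "Important": {"Winner": 7, "Strong": 6, "Average": 3, "Weak": -3},
}

# Two categories taken from the results tables above, with their brackets
# (CSS assumed Essential; HTML5 is listed under Important).
RESULTS = [
    ("CSS", "Essential", {
        "Winner": ["Chrome"], "Strong": ["Safari"],
        "Average": ["Opera"], "Weak": ["Firefox", "Internet Explorer 8"],
    }),
    ("HTML5", "Important", {
        "Winner": ["Chrome"], "Strong": ["Opera", "Safari"],
        "Average": ["Firefox"], "Weak": ["Internet Explorer 8"],
    }),
]

# Sum each browser's points across every category it appears in.
totals = defaultdict(int)
for category, bracket, finishes in RESULTS:
    for finish, browsers in finishes.items():
        for browser in browsers:
            totals[browser] += POINTS[bracket][finish]

# The champion is simply the browser with the highest running total.
champion = max(totals, key=totals.get)
```

With these placeholder values and only two categories, Chrome would total 17 points (10 + 7) while Internet Explorer 8 would sit at -7; the real standings sum every category in the tables above.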