Help Us With the Web Browser Grand Prix Scoring System
Requests for a points-based scoring system have appeared in the comments section of the Web Browser Grand Prix articles for some time now. Today we're ready to implement one, but we need help. And who better to ask than you, the Tom's Hardware readers.

Over the past couple of years we've implemented several reader suggestions into the Web Browser Grand Prix, such as adding the analysis tables alongside raw placing, later dropping the placing tables entirely, and de-emphasizing the winner over other strong finishers. However, one of the most frequent requests has been to incorporate some kind of points-based scoring system. One which gives added weight to the more important categories of testing, and less weight to areas that have little or no bearing on everyday real-world Web browsing.
We've received numerous emails suggesting such a system, but so far they've all been too simplistic or far too complicated (think Dungeons & Dragons rule-set). With the tenth installment of the Web Browser Grand Prix just around the corner, we think it's about time to grant this request. So, we're seeking your help.
First, let's look at the current analysis table from which the champion is largely determined. Today the Web Browser Grand Prix has 48 individual tests which fall into the following 14 categories:
| Category | Winner | Strong | Average | Weak |
|---|---|---|---|---|
| Page Load Time | | | | |
| JavaScript | | | | |
| DOM | | | | |
| CSS | | | | |
| Page Load Reliability | | | | |
| Standards Conformance | | | | |
| Flash | | | | |
| HTML5 | | | | |
| Startup Time | | | | |
| Memory Efficiency | | | | |
| Java | | | | |
| Silverlight | | | | |
| HTML5 Hardware Acceleration | | | | |
| WebGL | | | | |
From here we need to rank these categories into brackets which reflect their importance to the average Web browsing experience. We've come up with the following four brackets:
| Bracket | Categories |
|---|---|
| Essential | Page Load Time, JavaScript, DOM, CSS, Page Load Reliability, Standards Conformance |
| Important | Flash, HTML5 |
| Nonessential | Startup Time, Memory Efficiency, Java, Silverlight |
| Unimportant | HTML5 Hardware Acceleration, WebGL |
The Essential bracket holds everything that makes up the core of what it is to browse the Web. The Important bracket includes the ubiquitous Flash plug-in and the rapidly-evolving HTML5 spec. The Nonessential bracket is for tests that could apply to any application (not just browsers) as well as the common, but lesser-used plug-ins. The Unimportant bracket is for upcoming technologies that simply aren't found in the wild, outside of testing and demo pages. While these brackets aren't set in stone and we're still open to feedback, the next step is where we really need your help.
This is where the points come in. We need to assign point values to the bracketed analysis table. There are a variety of ways to go about this. We could have a simple system where each type of finish (winner, strong, average, and weak) has a set score and a different modifier is applied to each bracket. Alternatively, we could have different point values assigned to each finish in each bracket.
Either way, there are more questions to be answered. Does an average finish rate any points at all? Should weak finishes be given negative points? Or should every type of finish in every bracket merit some points? How much of a bonus does the winner deserve over the strong finishers? Et cetera, et cetera.
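To make the first approach concrete, here is a minimal sketch of the "set score per finish, multiplier per bracket" scheme described above. All of the numbers are placeholder assumptions for illustration, not official values:

```python
# Hypothetical sketch: each finish type has a base score, and each
# importance bracket applies a multiplier. All values are assumptions.

FINISH_SCORES = {"winner": 4, "strong": 3, "average": 2, "weak": 0}
BRACKET_MULTIPLIER = {"essential": 4, "important": 3,
                      "nonessential": 2, "unimportant": 1}

def category_points(finish, bracket):
    """Points a browser earns for one category finish."""
    return FINISH_SCORES[finish] * BRACKET_MULTIPLIER[bracket]

def total_points(results):
    """Sum points over a browser's (bracket, finish) results."""
    return sum(category_points(finish, bracket) for bracket, finish in results)

# A browser that wins an essential category and finishes average in an
# unimportant one earns 4*4 + 2*1 = 18 points.
example = [("essential", "winner"), ("unimportant", "average")]
print(total_points(example))  # → 18
```

The second approach (a separate point value for every finish in every bracket) would simply replace the multiplication with a per-bracket lookup table.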
Testing for the tenth installment of the Web Browser Grand Prix is complete - this one has a twist, and it's not what you'd think. Give us your feedback on the scoring system in the comments below so we can declare a champion. The outcome of Web Browser Grand Prix 10 is up to you!
Page reliability and correctness first.
Speed always comes after that.
Features and works-in-progress are really about correctness too.
Maybe add a feature category divided into essential and experimental?
Essential: Page Load Time, CSS, Page Load Reliability, HTML5, Startup Time, Memory Efficiency
Important: Flash, JavaScript, DOM, Standards Conformance
Nonessential: Java, Silverlight
Unimportant: HTML5 Hardware Acceleration, WebGL
The importance system you had was based on what developers and power users are interested in. What is important is how the main user experiences the browser. The main user doesn't care about things like Standards Conformance and DOM. They want their webpage to load beautifully and fast.
I will not post a points recommendation until you make your importance scheme actually make sense. Your old scheme is far better than this new one you propose.
Robert
I thought this was a website for power users. I start my browser like once per reboot and leave it open forever. I don't care how long it takes to start up. My system has plenty of memory, so I'm not terribly concerned about efficiency either. What I care about is if using browser x will result in the page displaying quickly and correctly, which is why load time/standards conformance/dom/js belong at the top.
Don't get me wrong, this website is definitely for power users. But you do not represent power users and developers. Unfortunately we all have different features that we feel are more important, but these tests weren't created to show what will be best for developers - if so, Internet Explorer never would have won the Grand Prix.
I just think that the way Adam Overa (author of the Grand Prix) is taking these tests is the wrong way. We shouldn't be doing these tests for us - power users already know what's best for us. These tests are for the public, who may not be aware of speed improvements.
Robert
Anyways, I don't agree with the importance categories on just a few details:
Page Load Time is nonessential on modern web browsers, which already have very little difference between them. A few milliseconds, or even a one-second difference in page load times between two browsers, is simply not important. Web browsers are used on an event-like basis, not as a web spider collecting thousands of websites in serial fashion. I need to hit a button, a link, or a toolbar item to navigate to a page. Often the time I take to move my mouse there is longer than whatever benefit I may gain from a faster web browser. I never consider page load times.
Startup Time, however, is essential. Not everyone likes, or can afford, to keep a browser open all the time. It's nice to have a web browser that becomes available quickly. That said, the only reason I consider this Essential is because of the categories you established. Calling it Nonessential is a bit too much, since it does carry some weight in my decisions.
Flash, Java, and HTML5 are essential. It's irrelevant whether one likes Flash or not. It's still an important part of the modern web and it will remain so for many years, unfortunately. As for HTML5, it is an emerging technology, and not having it in a browser means not being able to properly use HTML5 web pages. Now, not being able to use ANY web page on the web is to me absolutely unacceptable. The same goes for Java. One cannot just choose to categorize web technologies as essential or nonessential when looking at them from the POV of a web browser. Any web browser that doesn't support a web technology with a real presence on the web is a crippled browser, and one that bars me from web content. AOL again.
I would also like to know how truthful the browsers are in private browsing, or "do not track." I know my writing skills need a lot of improvement but hopefully you understand what I am trying to say.
Thanx
I have to disagree on HTML5 simply because, how can it be more essential than Flash when soooo many more sites have flash content than HTML5 code? Ditto for JS, and I just don't get the DOM downgrade. And while the average user might not consciously care about standards conformance, they will when they can't properly load a page.
Maybe I should actually give the feedback Tom's is looking for....I think a weak should give 0 points for any category, but I would forgive a particularly horrendous showing getting negative points. Average 1 point for unimportant and nonessential, 2 points for important and essential. Strong should give 2 points for unimportant and nonessential, 3 points for important, and 4 for essential. Winner should be 3 for unimportant and nonessential, 4 for important, and 5 for essential. Did I treat unimportant and nonessential the same? Yes. Are the points distributions fairly arbitrary? So what if they are?
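The point values proposed above can be written out as a simple lookup table. This is just a sketch of that commenter's scheme for illustration; the bracket and finish names are assumed labels:

```python
# The commenter's proposed values: weak = 0 everywhere; unimportant and
# nonessential score identically; essential finishes are worth the most.

POINTS = {
    "essential":    {"winner": 5, "strong": 4, "average": 2, "weak": 0},
    "important":    {"winner": 4, "strong": 3, "average": 2, "weak": 0},
    "nonessential": {"winner": 3, "strong": 2, "average": 1, "weak": 0},
    "unimportant":  {"winner": 3, "strong": 2, "average": 1, "weak": 0},
}

def score(results):
    """Total a browser's points over (bracket, finish) pairs."""
    return sum(POINTS[bracket][finish] for bracket, finish in results)

# A browser winning two essential categories and finishing weak in an
# important one: 5 + 5 + 0 = 10 points.
print(score([("essential", "winner"),
             ("essential", "winner"),
             ("important", "weak")]))  # → 10
```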
I also see your point on page load times, but once you average the page load times of multiple sites, you're measuring differences in full seconds. True you only navigate to one page at a time, but think about a search session: Google a topic, click a link, not relevant, click back, choose another link... opening several pages in quick succession is relatively common.
Startup times... I guess we'd have to test on much older hardware to see how much of an impact that can have, but I think that's a demographic issue, and from my experience both power users and granny open the browser about once per boot, the former because they can spare the system resources to leave it open always, and the latter because that's the only app ever used.
Java is essential to you?
Anyone?
Open 40 tabs with different live Web sites in each, all at once. Unless you have a seriously high-end rig, it will give you a ton of borked pages, or crash.
I would like to comment on your number 2. When I was using my 1 GB RAM laptop, memory efficiency was very important to me. I couldn't use Chrome on that laptop because the memory ran out with just 5-8 tabs, which is why I used Aurora as my browser on it. And even my new 3 GB RAM laptop can still run out of memory if I use IE10 with 10 tabs. Why do I use IE10? Because I think it's the fastest browser, but it lacks features and compatibility.
Something I do every day.
Rhymes with Barn.
I believe that an absolute scoring system (like 5 for browser C, 4 for browser F, 3 for O, and 2 for S) is unfair, especially when you have categories with similar scores, like this example. Browser C seems much better than browser S, with more than twice the score, but it is not that much faster. This may end up favoring a browser that is marginally better than the others in some categories, but much worse in the others (like a browser that is a little faster with JavaScript, but way worse on memory use).
It would be more fair to give 10 to the best score, and proportional grades to the other browsers. Then the score would be more fair to a browser that does well (but not necessarily best) in every category.
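The proportional idea above can be sketched in a few lines. This assumes higher raw numbers are better (timings would need to be inverted first), and the browser names are just the placeholders from the comment:

```python
# Proportional scoring: the best raw result in a category gets 10 points,
# and every other browser gets points in proportion to the best.
# Assumes higher raw values are better (invert timings beforehand).

def proportional_scores(raw):
    """Map raw benchmark results to a 0-10 scale relative to the best."""
    best = max(raw.values())
    return {browser: 10 * value / best for browser, value in raw.items()}

# Example: browser C is only 25% faster than S, so it no longer earns
# more than twice S's score as it did under the absolute 5/4/3/2 scheme.
raw = {"C": 125.0, "F": 110.0, "O": 105.0, "S": 100.0}
print(proportional_scores(raw))  # C → 10.0, S → 8.0
```

Under the absolute scheme C beat S 5 to 2; proportionally, the gap shrinks to 10 versus 8, which better reflects the actual performance difference.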
I also agree with afmenez that there should be a score out of say 1000 instead of just rankings.
A missing point in the tests is add-ons (presence of pop-up blockers, ad removers, ...) and easy, efficient download features (suspend, resume, list of downloads, ...).