Help Us With the Web Browser Grand Prix Scoring System

Over the past couple of years we've incorporated several reader suggestions into the Web Browser Grand Prix, such as adding analysis tables alongside the raw placing, later dropping the placing tables entirely, and de-emphasizing the winner over other strong finishers. However, one of the most frequent requests has been to incorporate some kind of points-based scoring system: one that gives added weight to the more important categories of testing, and less weight to areas that have little or no bearing on everyday, real-world Web browsing.

We've received numerous emails suggesting such a system, but so far they've all been too simplistic or far too complicated (think Dungeons & Dragons rule-set). With the tenth installment of the Web Browser Grand Prix just around the corner, we think it's about time to grant this request. So, we're seeking your help.

First, let's look at the current analysis table from which the champion is largely determined. Today the Web Browser Grand Prix comprises 48 individual tests, which fall into the following 14 categories:

Each category is scored with one of four finish types (Winner, Strong, Average, Weak):

Page Load Time
JavaScript
DOM
CSS
Page Load Reliability
Standards Conformance
Flash
HTML5
Startup Time
Memory Efficiency
Java
Silverlight
HTML5 Hardware Acceleration
WebGL

From here we need to rank these categories into brackets which reflect their importance to the average Web browsing experience. We've come up with the following four brackets:

Essential: Page Load Time, JavaScript, DOM, CSS, Page Load Reliability, Standards Conformance
Important: Flash, HTML5
Nonessential: Startup Time, Memory Efficiency, Java, Silverlight
Unimportant: HTML5 Hardware Acceleration, WebGL

The Essential bracket holds everything that makes up the core of what it is to browse the Web. The Important bracket includes the ubiquitous Flash plug-in and the rapidly evolving HTML5 spec. The Nonessential bracket is for tests that could apply to any application (not just browsers), as well as the common but lesser-used plug-ins. The Unimportant bracket is for upcoming technologies that simply aren't found in the wild outside of testing and demo pages. While these brackets aren't set in stone and we're still open to feedback, the next step is where we really need your help.

This is where the points come in. We need to assign point values to the bracketed analysis table. There are a variety of ways to go about this. We could have a simple system where each type of finish (winner, strong, average, and weak) has a set score and a different modifier is applied to each bracket. Alternatively, we could have different point values assigned to each finish in each bracket.
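To make the first option concrete, here is a minimal sketch of the "set score per finish, modifier per bracket" approach. All of the numbers are purely illustrative placeholders, not a proposal:

```python
# Hypothetical scoring sketch: a base point value for each finish type,
# scaled by a per-bracket modifier. Every number here is illustrative only.
FINISH_POINTS = {"winner": 4, "strong": 3, "average": 1, "weak": 0}
BRACKET_MODIFIER = {"Essential": 4, "Important": 3, "Nonessential": 2, "Unimportant": 1}

def score(results):
    """results: list of (bracket, finish) pairs, one per category."""
    return sum(FINISH_POINTS[finish] * BRACKET_MODIFIER[bracket]
               for bracket, finish in results)

# A browser that wins an Essential category, finishes strong in an
# Important one, and finishes weak in an Unimportant one:
example = [("Essential", "winner"), ("Important", "strong"), ("Unimportant", "weak")]
print(score(example))  # 4*4 + 3*3 + 0*1 = 25
```

The second option (an independent point value for each finish in each bracket) would simply replace the multiplication with a per-bracket lookup table.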

Either way, there are more questions to be answered. Does an acceptable finish rate any points at all? Should weak be given negative points? Or should every type of finish in every bracket merit some points? How much of a bonus does the winner deserve over the strong finishers? Et cetera, et cetera.

Testing for the tenth installment of the Web Browser Grand Prix is complete; this one has a twist, and it's not what you'd think. Give us your feedback on the scoring system in the comments below so we can declare a champion. The outcome of Web Browser Grand Prix 10 is up to you!

  • Trewyy
    First of all, I think your importance system is terribly flawed and wrong. Here is the revised system I recommend:

    Essential: Page Load Time, CSS, Page Load Reliability, HTML5, Startup Time, Memory Efficiency
    Important: Flash, JavaScript, DOM, Standards Conformance
    Nonessential: Java, Silverlight
    Unimportant: HTML5 Hardware Acceleration, WebGL

The importance system you had was based on what developers and power users are interested in. What is important is how the mainstream user experiences the browser. The mainstream user doesn't care about things like Standards Conformance and the DOM. They want their webpage to load beautifully and fast.

    I will not post a points recommendation until you make your importance scheme actually make sense. Your old scheme is far better than this new one you propose.

    Robert
    Reply
  • annymmo
The most important thing, of course, is whether it does what it should do in the first place:
page reliability and correctness.
Speed always comes after that.

Features and works in progress are also a matter of correctness.
Maybe add a feature category divided into essential and experimental?
    Reply
  • MooseMuffin
(Quoting Trewyy's comment above.)
    I thought this was a website for power users. I start my browser like once per reboot and leave it open forever. I don't care how long it takes to start up. My system has plenty of memory, so I'm not terribly concerned about efficiency either. What I care about is if using browser x will result in the page displaying quickly and correctly, which is why load time/standards conformance/dom/js belong at the top.
    Reply
  • JamesSneed
Exactly my thoughts as well, annymmo. I would like to see a huge amount of weighting on the reliability/correctness of page loads. IE9 had issues with this at first, and it drove me crazy. I live in the real world, so test the top 100 most-visited websites and tell me: did it load the page properly, and how long did it take? When testing graphics cards' prowess in games, we don't test how fast a card can render a polygon; we test what the end user sees (image quality) and feels (FPS). The same abstraction from the tech should be done here. Thanks.
    Reply
  • Trewyy
(Quoting MooseMuffin's comment above.)
Don't get me wrong, this website is definitely for power users. But you do not represent power users and developers. Unfortunately, we all have different features that we feel are more important, but these tests weren't created to show what will be best for developers; if they were, Internet Explorer never would have won the Grand Prix.

I just think that the way Adam Overa (author of the Grand Prix) is taking these tests is the wrong way. We shouldn't be doing these tests for us; power users already know what's best for us. These tests are for the public, who may not be aware of speed improvements.

    Robert
    Reply
You'll get so many different opinions on this, I'm not sure how we can help, really. And the popularity of an opinion won't necessarily mean you'll get the right choices.

Anyway, I disagree with the importance categories on just a few details:

Page Load Time is nonessential on modern web browsers, which already have very little difference between them. A difference of milliseconds, or even a full second, in page load times between two browsers is absolutely not important. Web browsers are used on an event-driven basis, not as a web spider collecting thousands of websites in serial fashion. I need to hit a button, a link, or a toolbar item to navigate to a page. Often the time I take to move my mouse there is longer than whatever benefit I may gain from a faster web browser. I never consider page load times.

Startup Time, however, is essential. Not everyone likes, or can afford, to keep a browser open all the time. It's nice to have a web browser that becomes available quickly. That said, the only reason I consider this Essential is because of the categories you established. Calling it Nonessential is a bit too much, since it does have some weight in my decisions.

Flash, Java, and HTML5 are essential. It's irrelevant whether one likes Flash or not. It's still an important part of the modern web, and it will remain so for many years, unfortunately. As for HTML5, it is an emerging technology, and not having it in a browser means not being able to properly use HTML5 web pages. Not being able to use ANY web page on the web is, to me, absolutely unacceptable. The same goes for Java. One cannot just choose to categorize web technologies as essential or nonessential when looking at them from the POV of a web browser. Any web browser that doesn't support a web technology with a real presence on the web is a crippled browser, and one that bars me from web content. AOL again.
    Reply
  • Estix
    What it sounds like to me is that, rather than a single tell-all score, there should be multiple scoring systems; that is, a score for people who care about speed, a score for people who care about standards compliance, a score for what will run best on modest or old hardware, etc...
    Reply
  • pharoahhalfdead
I would like to see how secure each browser is, with and without add-ons. We all know FF is notorious for having lots of add-ons that make us feel safe, but do they work as advertised? Security could be tested with something along the lines of how easy it is to clickjack, or to automatically install malware once a site is visited, etc.

I would also like to know how truthful the browsers are about private browsing, or "do not track." I know my writing skills need a lot of improvement, but hopefully you understand what I am trying to say.

    Thanx
    Reply
  • adamovera
    trewyy: I put startup time and memory efficiency in level 3 for a few reasons: 1) Most people only start their browser once per boot. 2) There is no evidence that memory efficiency correlates to better performance. 3) Both are measurements that can apply to ANY locally-installed application, and not just Web browsers.
I have to disagree on HTML5 simply because, how can it be more essential than Flash when soooo many more sites have Flash content than HTML5 code? Ditto for JS, and I just don't get the DOM downgrade. And while the average user might not consciously care about standards conformance, they will when they can't properly load a page.
    Reply
  • of the way
There are two issues that I would love to see addressed (sorry for not answering the actual question). With the test system used, you push as much as you can on the browser so that we see what the browsers are truly capable of. That's good. But how different would certain things (mainly speed) be on less powerful systems? Some browsers might be affected more than others, and that seems important to me.

The second issue regards memory efficiency. Do we want low memory usage? Yes. Do we want our memory back after we close tabs? Yes. But not necessarily immediately. If I have memory to spare, and I reopen a recently closed tab, I wouldn't mind if the browser had held onto that memory. When there isn't memory to spare then it is an issue, and so once again, I feel a less powerful test system could be beneficial. The 'memory given back after X minutes' is nice, but I don't feel the prix ever explains memory efficiency well enough.

    Maybe I should actually give the feedback Tom's is looking for....I think a weak should give 0 points for any category, but I would forgive a particularly horrendous showing getting negative points. Average 1 point for unimportant and nonessential, 2 points for important and essential. Strong should give 2 points for unimportant and nonessential, 3 points for important, and 4 for essential. Winner should be 3 for unimportant and nonessential, 4 for important, and 5 for essential. Did I treat unimportant and nonessential the same? Yes. Are the points distributions fairly arbitrary? So what if they are?
    Reply
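The point table proposed in the last comment above can be written out as a quick sketch (numbers taken directly from that comment, with Unimportant and Nonessential treated identically, as the commenter does):

```python
# Point values per (bracket, finish), as proposed in the comment above.
# This encodes the commenter's suggestion, not an official scoring system.
POINTS = {
    "Essential":    {"winner": 5, "strong": 4, "average": 2, "weak": 0},
    "Important":    {"winner": 4, "strong": 3, "average": 2, "weak": 0},
    "Nonessential": {"winner": 3, "strong": 2, "average": 1, "weak": 0},
    "Unimportant":  {"winner": 3, "strong": 2, "average": 1, "weak": 0},
}

def score(results):
    """results: list of (bracket, finish) pairs, one per category."""
    return sum(POINTS[bracket][finish] for bracket, finish in results)

# E.g. a strong Essential finish plus an average Nonessential finish:
print(score([("Essential", "strong"), ("Nonessential", "average")]))  # 4 + 1 = 5
```

This per-bracket lookup table is the second of the two approaches the article describes (a distinct point value for each finish in each bracket), rather than the base-score-times-modifier variant.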