Composite Scoring
The JavaScript composite is the geometric mean of the four JavaScript performance test results (RIABench, Peacekeeper, Kraken, and SunSpider), multiplied by one thousand (to create nice, whole number scores).
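The scoring described above can be sketched in a few lines of Python. This is a minimal illustration of the stated formula (geometric mean of the four results, scaled by 1000); the input values are hypothetical normalized scores, not figures from our testing.

```python
from math import prod

def js_composite(normalized_scores):
    """Geometric mean of the normalized benchmark results, scaled by 1000."""
    n = len(normalized_scores)
    return prod(normalized_scores) ** (1 / n) * 1000

# Hypothetical normalized results for RIABench, Peacekeeper, Kraken, SunSpider
print(round(js_composite([1.25, 0.80, 1.10, 0.95])))  # 1011
```

A browser that scores exactly average on all four tests lands at a round 1000, which is why the ×1000 factor produces "nice, whole number" composites.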

Predictably, Chrome takes the lead in JavaScript performance, followed closely by Safari. Firefox grabs third place, while Opera comes in last.
Once again, the Windows 7 scores all dwarf the OS X scores. Chrome remains the leader on Windows, followed by Firefox. About 200 points behind Firefox is Opera in third place, followed by last-place finisher IE9.
Drill Down
The charts below contain the individual JavaScript benchmarks, Peacekeeper, Kraken, and SunSpider, followed by RIABench JavaScript for OS X and Windows 7.
Peacekeeper
Kraken
Google SunSpider
RIABench JavaScript - OS X
RIABench JavaScript - Windows 7
All of the JavaScript performance tests place Chrome in the lead, with the exception of Peacekeeper, where Safari wins by a hair. On OS X, Firefox has a poor showing in Run-length Encoding, while Opera is at a disadvantage in the Focus test. IE9 scores considerably lower than the other browsers in the Focus and MD5 Hashing tests.
Google Octane
About a week ago Google introduced its new JavaScript performance benchmark, dubbed Octane. This new benchmark contains the eight tests that make up the older V8 JavaScript benchmark, along with five new tests. Unfortunately, IE9 cannot run Octane, and the IE10 RTM build cannot finish the test. Therefore, Octane will not be included in the JavaScript composite until it functions with the current version of Internet Explorer.

Chrome takes the lead in both operating systems, followed by Safari on OS X and Firefox on Windows 7. Firefox snatches third place on OS X, while Opera takes third on Windows and fourth on OS X.
While Chrome dominates the Windows 7 results, the OS X results are a lot more in line with the other JavaScript performance benchmarks.
- The Top Four Browsers, Tested And Ranked
- Chrome, Firefox, IE9, Opera, Safari
- Test System Specs And Software Setup
- Test Suite And Methodology
- Start Time
- Page Load Time
- JavaScript Performance
- DOM And CSS Performance
- HTML5 Performance
- Hardware Acceleration Performance
- Plug-In Performance: Flash, Java, Silverlight
- Memory Efficiency
- Reliability, Responsiveness, And Security
- Standards Conformance
- Test Analysis
- OS X And Windows 7 Winners' Circle
(The nice popular ones like ABP, Lazarus, Greasemonkey all have equivalents; some lesser-used plugins like Rikaichan also have ports by now. Only a matter of time!)
as always, a great read.
All versions of Chrome hold up incredibly well cross-platform, if you look back at the two Linux WBGPs, it won there, too. Thanks for reading!
Absolutely, a Windows 8-based WBGP is already in the cards for October.
When we have more [official] stable 64-bit browsers, I'll definitely do a 64-bit WBGP - including versus their 32-bit counterparts.
Testing these browsers at stock doesn't reveal even an eighth of the picture.
btw great work adamovera keep it up man
Interesting idea, so basically a tweaked-out edition of the WBGP, where we use all the tools available to each browser for performance gains... That could work, but I gotta warn you that the next three WBGPs are already decided, so it would probably be real late in the year, or even next year before I could get to it.
Nearly every performance benchmark there is points in that direction. This probably has a lot to do with how much time developers spend optimizing for Windows - after all, Windows holds 90+% of the desktop user base. However, it is interesting that the rift between Windows and OS X is far greater than between Windows and Linux for the core stuff like JS, CSS, DOM, page loads, etc. Plug-ins are another story, they're always much better on Windows than the other two platforms.
The big problem with including the dev channel browsers is the amount of time it takes to produce the article (testing/charts/writing/editing/translating), combined with the tendency of the dev channel to constantly update. Before testing is even completed it's certain that something will update. TBH, the stable channels of Chrome and Firefox are a handful as it is. For example, for this article I had to test 8 browsers (4 on each OS), but I ended up testing 18+ due to OS X, Chrome, Firefox, Opera, Flash, and Java updates. Sorry, but I'm just not sure it's even doable in this format. Thanks for reading!
Well, I wanted to include that in my comment myself, but I forgot. What I meant to say was: if the timing allows.
My computer is fast enough that it does not really matter what browser I choose.
In my case, ease of use means that I can see what is going on.
I decry the trend towards dumbing down the UI on every program I use.
(I also refuse to call software 'Apps', to me an app is a mini-program on a phone.)
I always turn on all menus, buttons, and labels in Waterfox.
BTW: Good point.
Why don't you include Waterfox in your testing?
It is the 64-bit version of Firefox, and I am sure it may do a little better in your speed tests.