The Great Browser JavaScript Showdown

In The Day Performance Didn't Matter Any More, I found that the performance of JavaScript improved a hundredfold between 1996 and 2006. If Web 2.0 is built on a backbone of JavaScript, it's largely possible only because of those crucial Moore's Law performance improvements.

This is a companion discussion topic for the original blog entry at:

Nice work, you’ve presented the data well.

I work on a JavaScript-heavy application and would definitely like to see performance improve. My applications deal with a ton of data and struggle a bit on the typical computer at my workplace.

The IE string result is curious. Any idea why it’s happening?


Can you better define what each of the benchmarks means? For example is ‘access’ proper I/O or parsing the DOM tree?

If ‘string’ means ‘parsing any stringified data’, then it’s probably involved in building the DOM tree from the [X]HTML, which would make it one of the more important aspects of the benchmark.

What does ‘3d’ mean? z-positions? Proper 3d rendering? Making buttons look roundy??

The point is that these tests appear to cover a significant part of the JavaScript libraries but fail to identify which ones are used more often on the web. I would hope that most browsers’ implementations of JavaScript are optimized for the more common web use-cases.

A benchmark set up to measure the performance of javascript “as it is commonly used on the web” might show us more useful results. Properly identifying the benchmark terms would be a good step at the minimum.

Now we know why code refactoring was one of the primary concerns for Firefox 3.

How come you didn’t mention Tamarin? It’s been making huge news lately.

FYI, Adobe donated a JIT compiler for ActionScript (a very close relative of JavaScript) to Mozilla. It’ll be a major feature of Mozilla 2 and JavaScript 2 support.

@James Justin Harrell: At this point Tamarin isn’t relevant, as it cannot run the majority of the JavaScript available on the Internet. It may be interesting in the future, but it’s worth waiting until it is a viable technology before making noise about it.

@Freiheit: The benchmarks are purely about JavaScript. There is no DOM access, rendering or network access.

Freiheit, I too wish there was more documentation and explanation of each test.

Here’s a complete list of the tests:

var tests = [ "3d-cube", "3d-morph", "3d-raytrace", "access-binary-trees", "access-fannkuch", "access-nbody", "access-nsieve", "bitops-3bit-bits-in-byte", "bitops-bits-in-byte", "bitops-bitwise-and", "bitops-nsieve-bits", "controlflow-recursive", "crypto-aes", "crypto-md5", "crypto-sha1", "date-format-tofte", "date-format-xparb", "math-cordic", "math-partial-sums", "math-spectral-norm", "regexp-dna", "string-base64", "string-fasta", "string-tagcloud", "string-unpack-code", "string-validate-input" ];

To load each one, add it to the URL like so:

Then simply view source; each test is contained in an embedded script tag in the page.
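For anyone curious what timing a test like this involves, here’s a minimal sketch of a SunSpider-style timing loop. This is not the actual harness, and the workload below is a hypothetical stand-in, not one of the real tests:

```javascript
// Minimal sketch of a benchmark timing loop (not the real SunSpider harness).
// Each real test lives in its own embedded <script> block; here a stand-in
// function takes its place so the sketch is self-contained.
var tests = {
  "string-base64": function () {
    // hypothetical stand-in workload, NOT the real string-base64 test
    var s = "";
    for (var i = 0; i < 1000; i++) s += String.fromCharCode(65 + (i % 26));
    return s.length;
  }
};

function runAll(tests) {
  var results = {};
  for (var name in tests) {
    var start = new Date().getTime();
    tests[name]();
    results[name] = new Date().getTime() - start; // elapsed milliseconds
  }
  return results;
}

var results = runAll(tests);
```

In practice the harness also repeats each test several times and reports a confidence interval, since a single run in millisecond resolution is quite noisy.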

I’d love it if you could run the code on Firefox 3 beta 2 and add that to your graphics.

Anyway, does performance in such a small (yet important) part of the browser matter? In the end, what matters is how many seconds it takes from the moment I press Enter to actually rendering the stuff on the screen. In that respect, I believe the rendering process is what needs more attention.

Why measure just JavaScript? Measure everything it takes to put a webpage on my monitor. You are seeing with the eyes of a developer; as you said, programmers should see their software as the user would see it.

@Mark Rowe:
If I follow you right, this is disconnected from how JavaScript is used in the browser. For example, if I have JavaScript change the style on a particular element, it is just fiddling with some variable that the browser provided for it. That kind of access is no different from setting some arbitrary value; JavaScript is just doing some processing and sticking the result somewhere.

Thanks for the clarification, I abhor JavaScript but I’m kind of interested in this discussion because it shows how better coding of the JS engine can make a browser better.

Hoffmann, performance of JS matters because the whole of Web 2.0 is built on huge JS libraries, which are still too slow to be used as widely as they could be.

The results more or less bear out my own experiences. I’ve found Opera to be the fastest gun in the West with regard to both rendering and script processing. Sadly, Firefox is the worst. Hitting sites like the ExtJS library or script.aculo.us’s site, Firefox on both Linux and Windows stutters and struggles. Even with plain HTML/CSS it is terrible. I have a page that uses a UL/LI vertical menu with a little opacity applied and position:fixed. I get “tearing” when the page is scrolled on a machine with 4GB of RAM and a quad-core CPU! Other browsers are smooth as butter.

I tested Firefox 3 beta 1 the other week and it was noticeably faster but still stuttered and teared. I hope they can improve. As it stands, I use FF for tools like Firebug but Opera for browsing.

I second the other David’s request to have FF3b2 benchmarked, so far it seems much faster for me.

I second the other David’s request to have FF3b2 benchmarked

OK, I’m running some benchmarks here on my home PC, so why not. It’s Vista x64, 3.2 GHz Core 2 Duo.

IE7 32-bit – 17100 ms
IE7 64-bit – 15909 ms
Firefox 2.0.11 – 10768 ms
Firefox 3 b2 – 8260 ms

The specific improvements from FF2 to FF3 in each area:

3d – 12% faster
access – 21% slower (!)
bitops – 47% faster
controlflow – 11% faster
crypto – 7% faster
date – 65% faster
math – 5% faster
regexp – 14% faster
string – 6% faster

I will never understand why Firefox doesn’t offer a 64-bit version. I’m sure it’d be even speedier!

Great article, one question though, what did you use to generate those beautiful graphs?

Jeff: If you’re taking requests, I’d be curious to see how the latest WebKit nightly build stacks up against the other browsers on Windows.

Hoffmann: This benchmark focusses on one part of the functionality of the web browser. There are other benchmarks that cover DOM access, page loading and rendering. A benchmark that covers every piece of functionality in a web browser provides information that is incredibly complex to analyse and thus very difficult to use in a practical fashion to optimise the browser.

Freiheit: Setting a variable and setting a property of a DOM object appear syntactically similar but are vastly different in terms of implementation. Setting a property on a DOM object typically results in the rendering engine being required to update the layout, repaint a portion of the screen, etc. The performance of these operations is obviously of great interest to developers, both of web applications and browsers, but are not the focus of SunSpider. There are other benchmarks that cover these areas.
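To make that distinction concrete, here’s a sketch of why a DOM property write is not just a variable write: the binding can run arbitrary engine work inside the property setter. The fakeElement below is a hypothetical stand-in, not a real DOM object:

```javascript
// Sketch: "element.style.width = ..." vs. a plain variable write.
// A DOM binding can hide engine work (layout invalidation, repaint
// scheduling) behind the setter; this counter stands in for that work.
var relayouts = 0;

var fakeElement = {};
Object.defineProperty(fakeElement, "width", {
  set: function (v) {
    this._width = v;
    relayouts++; // a real engine would invalidate layout and repaint here
  },
  get: function () { return this._width; }
});

var plain = {};
plain.width = "100px";       // just stores a value, nothing else happens
fakeElement.width = "100px"; // also triggers the simulated "layout" work
```

This is why SunSpider deliberately avoids the DOM: it wants to measure the language engine alone, not the rendering work hiding behind those setters.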

“I will never understand why Firefox doesn’t offer a 64-bit version. I’m sure it’d be even speedier!”

Well, it is open source…

Just like Fred, I have seen some visually appealing graphs in your posts. Can you please share what you use to generate these graphs? Is it a charting tool from a spreadsheet, perhaps Excel?

Actually anomalous is exactly the correct word for it: anomalous – inconsistent with or deviating from what is usual, normal, or expected.

Looking at the other 8 sections, all the browsers are in the same range of time. The string test was inconsistent with all of the other tests.

In my testing of Firefox 2.x and IE 6.x on the AJAX applications I have developed for my client, I have found that Firefox is 2 to 5 times faster for real world operations that load XML data, transform it via XSLT, and render lots of DOM elements (DIVs, IMGs, TABLEs, etc.) as a result. My client uses IE only but I develop using Firefox and when I test under both environments I really notice the difference in performance. IE 6 sucks. I have not tested under IE 7 since my client has yet to upgrade to it.