Hacker News

> I was experimenting a couple of days ago with high-duty, extreme DOM-node crunching, and FF's SpiderMonkey blew Chrome's V8 out of the water, with a 7-9x gain in performance measured in time elapsed to complete the operations.

This likely has nothing to do with the JS engines themselves and everything to do with the browsers they were running in. To actually benchmark something like that you'd need to simulate the DOM with something like https://github.com/tmpvar/jsdom.



Are you suggesting that the remarkable disparity in performance was DOM-specific?

Strange, because I used a very common method, appendChild(), and I was under the impression that both browsers had long since optimized their respective inner workings to the point where we should not see such a divergence in performance.
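A sketch of the kind of microbenchmark being described. FakeNode is a hypothetical stand-in for a real DOM node so the harness runs outside a browser; in a real test you would use document.createElement('div') and let the nodes attach to the live document:

```javascript
// FakeNode is a stub standing in for a real DOM element (assumption:
// in a browser you'd create nodes with document.createElement instead).
class FakeNode {
  constructor() { this.children = []; }
  appendChild(child) { this.children.push(child); return child; }
}

// Time a burst of appendChild() calls against a single parent node.
function benchAppend(makeNode, iterations) {
  const root = makeNode();
  const start = Date.now();
  for (let i = 0; i < iterations; i++) {
    root.appendChild(makeNode());
  }
  return { elapsedMs: Date.now() - start, childCount: root.children.length };
}

const result = benchAppend(() => new FakeNode(), 100000);
console.log(`appended ${result.childCount} nodes in ${result.elapsedMs} ms`);
```

Run in two browsers against real nodes, a harness like this measures the whole stack (engine, bindings, layout engine), which is exactly why the disparity need not come from the JS engine.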


Yes. DOM manipulation in all major browsers is implemented in C/C++. The JS engine is just a wrapper; any noticeable performance difference in DOM manipulation is almost certainly due to differences in the underlying layout engine and not in the JS engine.


Well, there's one way in which the JS engine affects it: how efficiently one can call into C++ from JS. Mozilla have done a lot of work to reduce the cost of that in SpiderMonkey.


Sure, though in the grand scheme of things that penalty is pretty small compared to the DOM operation itself, e.g. doing an appendChild() on an attached element and causing a reflow.


It depends a lot on what you're doing: if you're hitting fast paths (especially if you're dealing with out-of-tree nodes), it's entirely possible to end up with the JS/C++ trampoline being a significant part of the bottleneck, for much the same reasons that Array.prototype.reduce can be in several implementations.
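The reduce analogy can be made concrete: reduce pays a function-call boundary crossing per element (into a JS callback, or historically into a C++ builtin), much as each DOM fast-path call pays a JS/C++ crossing. A rough, runnable illustration; the absolute timings are indicative only and vary by engine:

```javascript
// Sum the same array via reduce (one callback invocation per element)
// and via a plain loop (accumulation stays inline). The gap between the
// two is dominated by per-element call overhead, analogous to the
// per-call JS/C++ trampoline cost discussed above.
const data = Array.from({ length: 1_000_000 }, (_, i) => i);

function timed(label, fn) {
  const start = Date.now();
  const result = fn();
  console.log(`${label}: ${Date.now() - start} ms`);
  return result;
}

const viaReduce = timed('reduce', () => data.reduce((acc, x) => acc + x, 0));
const viaLoop = timed('loop', () => {
  let acc = 0;
  for (const x of data) acc += x;
  return acc;
});

console.log(viaReduce === viaLoop); // same sum either way
```

When the per-element (or per-DOM-call) work is tiny, that boundary cost stops being noise and becomes the bottleneck.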



