
When you write Chrome itself in JS, you can talk to me about performance.


Not sure if you're serious, but Firefox is a long-standing example of this. It has always been a mixed C++/JS codebase. (Since before it was even called Firefox, that is, though nowadays there's Rust in the mix as well.) I routinely point this out in response to complaints about the slowness attributed to e.g. "Electron". JS programs were plenty fast enough even on sub-GHz machines before JS was ever JITted. It's almost never the case that a program having been written in JS is the problem; it's the crummy code in that program. When people experience Electron's slowness, what they're actually experiencing is the generally low quality of the corpus that's available through NPM.

Arguably, the real problem is that GCC et al are enablers of poorly written programs: no matter how mediocre a program you feed them, they tend to do a good job of making it feel like it's been performance-tuned. Today's trendier technology stacks don't let you get away with nearly as much. Squirting hundreds or thousands of mediocre transitive dependencies (that are probably simultaneously over- and under-engineered) through V8 only works up to a point, and eventually it catches up with you.

Besides, there's no such thing as a fast or slow language, only fast and slow language implementations.


AFAIK all of the major browsers (and other JS runtimes) have implemented some performance-sensitive APIs in JS itself, specifically because that performs better than crossing the JS<->native boundary. Granted, that's mostly about the performance of JS-facing APIs, but that's a lot of where performance matters in a JS host environment.
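
To make that concrete, here's a rough sketch of what a "self-hosted" builtin can look like: the engine ships something like Array.prototype.filter written in JS, so the per-element callback stays inside the JS world instead of crossing into native code on every iteration. (SpiderMonkey does something along these lines; the code below is purely illustrative and not any engine's actual source.)

    // Minimal sketch of a self-hosted builtin, loosely modeled on
    // SpiderMonkey-style self-hosting. Names and details are illustrative.
    function SelfHostedFilter(array, callback, thisArg) {
      // A real engine would use unobservable internal primitives here;
      // plain JS stands in for them in this sketch.
      var length = array.length >>> 0;
      var result = [];
      for (var i = 0; i < length; i++) {
        if (i in array) {
          var value = array[i];
          // The callback invocation never leaves JS, so the JIT can
          // inline it instead of bouncing across the native boundary.
          if (callback.call(thisArg, value, i, array)) {
            result.push(value);
          }
        }
      }
      return result;
    }

    // Behaves like Array.prototype.filter for ordinary arrays:
    SelfHostedFilter([1, 2, 3, 4], function (x) { return x % 2 === 0; }); // [2, 4]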



