
> People hate on electron, but it's a pretty amazing piece of tech if you ask me.

It really does come down to how people build on top of it.

I tried opening a 2 GB log file in Atom and it took like 10 minutes before crashing.

VS Code, on the other hand, while a bit slow, opened it, and I could search quickly and scroll without any issue at all.

That was what caused me to switch.



I used VS Code beta and it was slow… I remember an issue in the beginning where the cursor was refreshing at 60fps, causing huge delays. The software evolved really well over the years.


I didn't try during beta because I was convinced VSCode would never take off. Boy was I wrong :D


Your specific example highlights the performance difference between VS Code and Atom very nicely, and VS Code is an absolute gem of engineering.

But, for large files, I am not sure it has much to do with Electron in this specific case. Notepad is about as native as they come, and famously struggled with large files - anything over a few dozen KB - for a few decades (maybe still today?). Basically, if you're going to do seriously stupid stuff in an app's code, like loading the entire file upfront and applying word-wrapping, it doesn't really matter if you're doing it 2-5x faster, it will still take too long.


Loading large files need not have anything to do with Electron. The file can live in a separate child process, either fully in memory or swapped to the filesystem. When the user scrolls, you load only the required contents into the view (Electron). For searching, you can build an index, or even run the search against the swap file and show results accordingly. It's complex, but it can be done with Electron (as VS Code proves).


Wow man.

Your comment brought back memories. This is exactly what made me switch over too, back then.

My pre-VS Code setup was Atom for most cases, but for large files or quickly editing something I used Brackets (it was unbelievably swift).

VS Code comes in and says: Don't worry. I got you!


Opening large files is a pretty niche case anyway. Even CLI tools like less and grep are slow at things like searching multi-GB files.


> Even CLI tools like less and grep are slow at things like searching multi-GB files.

In what universe?

I routinely use "less -n" to open terabyte-sized text files.


In what context do you have terabyte-sized text files?


Here, you forgot this: /s



