While the web platform is catching up thanks to the steady supply of abstractions from modern browsers, the moment you have to deviate from those abstractions you quickly find yourself implementing something that is far less efficient than a native implementation.
I wrote about developing my own block editor from scratch[1] using C++ and QML after finding that Notion (and so many other web apps) are extremely slow and inefficient - in terms of CPU/RAM/battery life.
I detailed a comparison between native and web block editors, and the difference is huge. The fastest web app (MarkText) is 60x slower at loading text and uses 3x more RAM than my native app. Also, none of the web apps could handle loading a very large text file (they all hung).
Modern computers are blazing fast and efficient; there's no reason a text editor couldn't load large files. This is why, in my view, web apps aren't really the progress people make them out to be. We're going backward, not forward, with web apps. This needs to change.
First, the needs of the user base should trump those of the dev.
And secondly, the kinds of apps being referred to here are not the type that need massive efficiency or some complex feature - when inconvenienced by yet another single-use car park payment app, I've never once thought how marvellous it was that the text loaded so much faster than on the many web sites I regularly use, mainly because that responsiveness is blown away by the need to faff about installing the app (not to mention the effort needed to avoid handing out unnecessary phone access!)
Obsidian is an Electron app (I don't know if it belongs to the block editor category). It loads just as fast as your app. I tried copying and pasting the text of War and Peace (66,035 lines) from Notepad into both apps and, interestingly, Obsidian is slightly faster. Scrolling through this large chunk of text is also slightly faster in Obsidian. Obsidian's memory consumption (4 processes) is 172 MB and Daino Notes' consumption (1 process) is 352.7 MB. Tested on a Windows 11 PC.
Obsidian is not a block editor. Can you put a Kanban board or any other complex block in the middle of a document? From my understanding, you can't. Here's how to think of it: a block editor is basically a virtualized list with dynamic loading, so it can load any arbitrary component *while* allowing the user to interact with the list as if it were a single document - so you get text selection across these discrete blocks, editing, etc., like you would in a regular text editor.
Again, from my understanding, Obsidian is not that. If I remember correctly, it is based on CodeMirror, which is designed to handle only (EDIT: rich) text.
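For what it's worth, the core mechanism is small enough to sketch in plain C++ (a generic illustration, not the actual Daino Notes code): given the per-block heights, the scroll offset, and the viewport height, only the blocks that intersect the viewport get instantiated; everything else stays unloaded until you scroll to it.

```cpp
#include <cstddef>
#include <cstdio>
#include <utility>
#include <vector>

// Return the [first, last] indices of the blocks that intersect the viewport.
// Blocks outside this window are never instantiated or rendered.
std::pair<std::size_t, std::size_t> visibleRange(const std::vector<int> &blockHeights,
                                                 int scrollOffset, int viewportHeight) {
    std::size_t first = 0;
    int y = 0;
    // Skip blocks that lie entirely above the viewport.
    while (first < blockHeights.size() && y + blockHeights[first] <= scrollOffset)
        y += blockHeights[first++];
    // Collect blocks until we pass the bottom of the viewport.
    std::size_t last = first;
    while (last < blockHeights.size() && y < scrollOffset + viewportHeight)
        y += blockHeights[last++];
    return {first, last > first ? last - 1 : first};
}

int main() {
    std::vector<int> heights(100000, 40);                      // 100k blocks, 40 px each
    auto [first, last] = visibleRange(heights, 1200000, 800);  // scrolled deep into the list
    std::printf("instantiate blocks %zu..%zu only\n", first, last);
}
```

The editing/selection layer then only has to deal with the handful of live blocks, which is why a document with an enormous number of blocks can still feel like one continuous text area.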
Edit (addendum): BTW, I'm not sure your Obsidian RAM reading is correct; an empty instance of Obsidian with one note uses 285 MB (all 4 processes together) on my machine (M1).
Here is the screenshot showing the memory consumption of Obsidian (I did wait 30 seconds for memory to settle down after the initial spike, which was 240 MB): https://pasteboard.co/uW2lPNSbL7f7.png
EDIT: Btw, I do have plans to cut RAM usage significantly in Daino Notes (I focused more on load time and responsiveness). But getting back to my point - I can do these optimizations because those RAM inefficiencies are a result of my code, not some abstractions I can't change.
RAM is cheap, my time is not. VSCode is the best game in town (for me), and my 32GB computer has no problem with its RAM requirements. Even 8GB would be enough for VSCode depending on what else your toolchain requires.
Apple RAM is expensive. Every other kind of RAM is pretty cheap. 32GB DDR4 can be had for under $30, and 16GB DDR4 can be had for about $25. I'm not sure who you think has a computer, is developing software, and can't afford that. Maybe someone in India, I guess. Too bad if that's you, but "top 1%" is a laughable claim when RAM is so cheap. 16GB of RAM is nowhere near "top of the line". You're just trolling here, "hnthrowaway2376".
Let's say I spent $50 on 32GB of RAM. Over the lifetime of the computer (roughly seven years) that upgrade would cost ~$0.02 per day. Two pennies a day. And that's at US prices; it can be less expensive elsewhere.
I've used VSCode on a computer with 2GB of RAM, and it worked. I expected everything to run slower - and it did run slower, but it ran. And I developed, and contributed to the project I was working on while away from my workstation. This was a cheap $70 Windows 10 tablet. YMMV.
> Apple RAM is expensive. Every other kind of RAM is pretty cheap. 32GB DDR4 can be had for under $30, and 16GB DDR4 can be had for about $25.
I'm sure that's pretty cheap for you, yes. Taxes and other fees tend to increase those prices outside the US, by the way.
> I'm not sure who you think has a computer, is developing software, and can't afford that.
There is a market for lightweight code editors, isn't there?
> Too bad if that's you, but "top 1%" is a laughable claim when RAM is so cheap.
That was a bit of hyperbole on my part, but let's not forget that just being an employed SWE in the US easily places you in the top 1% globally.
> I've used VSCode on a computer with 2GB of RAM, and it worked. I expected everything to run slower - and it did run slower, but it ran. And I developed, and contributed to the project I was working on while away from my workstation. This was a cheap $70 Windows 10 tablet. YMMV.
Fair enough. VSCode is hardly the worst offender though - it actually runs quite well for an Electron app.
> but let's not forget that just being an employed SWE in the US easily places you in the top 1% globally.
And not being able to afford $30 as a developer for a decent amount of RAM puts you in the bottom 1% of developers globally. Yes, I made that up, just as you are making up your own numbers. But as I explained, you don't need 128GB of RAM, you don't need 64GB of RAM, you don't even need 8GB of RAM - you can still develop with VSCode on 2GB of RAM. Nobody is handing out free RAM, so if you need more, save your rupees, or pennies, or euros, or whatever. The daily cost of it spread over time is minuscule for anyone on the planet, and you will get back the investment in saved time.
Not everybody can upgrade their RAM, due to a warranty seal or lack of slots, or they simply don't know how to do it. Software should use as few resources as it can.
7. The previous version of Daino Notes, called Notes, is FOSS (free and open-source software) available at https://www.notes-foss.com/, and the source code is available at https://github.com/nuttyartist/notes. I decided to make Daino Notes closed source due to difficulties in monetizing FOSS. In order to comply with Notes' MPL license, all files common to Notes and Daino Notes are published at https://github.com/nuttyartist/daino-notes-public
Tldr: The FOSS version earned a stable revenue through Google Ads placed on the website, since the website ranked high in Google searches. Two years ago that changed, since the website got de-ranked, so I created a different, proprietary version of the app based on the FOSS version but with a totally revamped block editor that I wrote from scratch and worked on full-time for a whole year.
A web app that cannot handle a text file bigger than X bytes doesn't become useless, in the same way that a native app isn't useless even though it, too, has a limit on the maximum file size it can handle.
Any text editor that struggles to load a large text file on a modern computer is, simply put, inefficient. If 20 years ago they managed to write programs that could handle such cases and today many (web) apps fail at this task, that means we're going backward.
My point is that it's much harder to write efficient code in the web ecosystem because you're bound to specific abstractions from the browser. Once you deviate from said abstractions, it's not trivial to write efficient code.
Inefficiency also compounds. If you're sending too much data over an unreliable connection using a bloated protocol (say), you have three multipliers. Now start daisy-chaining these things together, host them on bloated images on pods in underpacked nodes in k8s (not a potshot at k8s, which I like quite a bit, just... another plausible source of inefficiency). Write all the servers in Python (or worse, some Ruby on Rails backed by MySQL or something comically underperformant).
We could keep going, but it maths out to mind-blowing amounts of waste just copying bytes around between buffers with no value add.
(Old man editorializing at clouds: "and all so we can employ people who don't know how computers work to satisfy corporate product pipelines by shoveling digital shit onto people that they neither want nor need")
The old metaphor of shipping bananas by packing the entire jungle surrounding the ape that’s holding the banana does very well to illustrate the truly egregious level of inefficiency at play here, especially when one considers how there’s tens or hundreds of thousands of these jungles involved in any given product…
How is Discord not loading fast enough the result of a “third world mindset”? Is this one of those “this software is bad because it was made in China / India / outsourced to one of those countries” arguments (which I don’t even think applies to this topic???)
I’ve routinely needed to open a 5GB text file on my computer before (previous job), and only some “apps” can do it. If we even call them apps lol. It’s just bloated web browser junk packaged to look like a native app.
notepad++ solved this problem 20+ years ago.
I agree it’s an uncommon use case but it’s kind of sad when an app struggles to open a file like that on a modern machine in 2024. Just sad.
I've recently been summarizing entire directories into a single chunk of text for use with Gemini; the other day I overshot and ended up pasting 28 million characters into VSCode. It handled it pretty well.
I don’t know that I completely agree. It depends on the functionality offered, right? Vim, for example, can struggle with very large files if you ask it to do syntax highlighting all the way from the beginning (or it can give you syntax highlighting that is just wrong if you don’t). I don’t think vim is very inefficient (could be wrong there, though), and I don’t see any way to do syntax highlighting in general without looking at the whole file (although, of course, in practice there are often shortcuts for specific languages…)
As you have pointed out, QML is buggy. Chromium's rendering engine is probably the most stable and polished GUI toolkit there is, not to mention a cross-platform one too. Throughout the last 10 years I've only had to deal with 2 Chromium bugs, and they were very minor. Well-written JavaScript is fast, and the machines are getting faster every year. It does not take much real-time computation to provide a UI for a desktop app; it's not a video game. And many of those things that are real-time, like the caret in a text editor or hover states, are implemented in native code by the web browser, with no JS interaction. I agree, though, that a block editor is a little more real-time than the average UI.
The key phrase is well-written JavaScript. What is the most popular state management framework? Redux, possibly. What is the most inefficient state management framework? Also Redux. With Redux, if you have an app that displays a timer that updates every second, every subscription to any piece of the state throughout the entire app will trigger. I'm not sure if the app used Redux, but I used to use a time tracker app that would use 30% of my CPU when idle (I have since moved to a CLI C++ solution and it is so much faster, but that does not mean a decent time tracker could not be built with web technologies). So if Redux is the most popular framework, you can see just how little the average web dev cares about writing apps that are not slow resource hogs.
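To make the pattern concrete, here's a rough C++ sketch of the design being described (not Redux itself, which is a JavaScript library - just the same shape): a single store that notifies every subscriber on every state change, so the once-a-second timer tick wakes up every view in the app, including the ones whose slice of state didn't change.

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

struct AppState {
    int timerSeconds = 0;
    std::string documentTitle = "untitled";
};

// Naive single-store design: any update notifies *every* subscriber.
class Store {
public:
    void subscribe(std::function<void(const AppState &)> cb) {
        subscribers_.push_back(std::move(cb));
    }
    void update(const std::function<void(AppState &)> &mutate) {
        mutate(state_);
        for (auto &cb : subscribers_) cb(state_);  // all of them, every time
    }
private:
    AppState state_;
    std::vector<std::function<void(const AppState &)>> subscribers_;
};

int main() {
    Store store;
    store.subscribe([](const AppState &s) { std::printf("timer view: %d s\n", s.timerSeconds); });
    store.subscribe([](const AppState &s) { std::printf("title view: %s\n", s.documentTitle.c_str()); });
    // The per-second tick triggers both subscribers, even though the title
    // view doesn't care about the timer at all.
    store.update([](AppState &s) { s.timerSeconds++; });
}
```

Whether that turns into real wasted CPU then depends entirely on how cheap each subscriber's callback is, which is exactly the part most apps get wrong.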
> Also, none of the web apps could handle loading a very large text file (they all hung).
Could it be that Qt has some optimisation technique to not render all those lines out of view? I.e., if you have a huge file that can still be loaded into RAM, C++ won't sweat it, but is it actually all getting rendered at the same time in a savvy implementation, whether at the level of the app or the framework? Probably not. On the other hand, the textarea element or a contentEditable div just was not made for something like this. It could still be done by implementing a custom element / component that loads the text dynamically while scrolling. If it's too much for JS to hold, it could use WASM or another process and pass it over IPC. It is definitely possible to write an Electron-based text editor that can open a 1 GB text file efficiently; it's just not an out-of-the-box experience, and most people don't think there's a need for such a use case.
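As a rough sketch of the "load the text dynamically while scrolling" idea (generic C++, not tied to Electron or any particular app, and "huge.txt" is just a placeholder path): keep the file on disk and read only the byte window the viewport currently needs, so a 1 GB file never has to live in memory at once.

```cpp
#include <fstream>
#include <iostream>
#include <string>

// Read only the requested byte range of a (possibly huge) file.
std::string readChunk(const std::string &path, std::streamoff offset, std::size_t length) {
    std::ifstream in(path, std::ios::binary);
    if (!in) return {};
    in.seekg(offset);
    std::string buf(length, '\0');
    in.read(&buf[0], static_cast<std::streamsize>(length));
    buf.resize(static_cast<std::size_t>(in.gcount()));  // keep only the bytes actually read
    return buf;
}

int main() {
    // Hypothetical usage: map the scroll position to a byte offset and fetch
    // just that window of the file.
    std::string window = readChunk("huge.txt", 1000000, 64 * 1024);
    std::cout << "loaded " << window.size() << " bytes\n";
}
```

The same windowing can be driven from JS via a native module, a worker process, or WASM - the point is only that nothing forces you to pull the whole file into a contentEditable at once.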
Hi there! Indeed, QML is very buggy. But there's also a large discrepancy between Chromium's budget (Google) and The Qt Company's. Also, The Qt Company tends to prioritize advancement in the embedded world (where it probably gets most of its cash) rather than regular applications. So many bugs get fixed by open-source contributors (KDE, individuals, etc.), and that might be a big reason why non-critical bugs don't get prioritized enough.
Like with anything, we're dealing with abstractions. Qt and QML are also abstractions. But I'd argue they are better abstractions than the web for dynamic, semi-complex to complex applications (for static sites/simple applications, the web is fine). The main reasons Qt and QML are great abstractions:
1. Native modules/APIs - you can always plug native modules into your app as needed. For example, I use native Objective-C APIs to draw the window on macOS for my app. It just looks better than what you get with Qt alone.
2. Performance - Almost all QML-based components (called Qt Quick) are written in C++, a fast, compiled language, and if needed you can create your own components in C++ and expose them to QML (see the sketch after this list).
3. There are many more reasons; one of them is that I think QML is the best declarative UI language I've seen, and it plays very nicely with the Qt style of C++ (signals and slots, etc.).
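As a rough illustration of point 2 (a minimal sketch; the class name BlockItem and the module name Example are made up for the example, not from my app): a C++ QObject with properties and signals can be registered with qmlRegisterType and then instantiated directly from QML.

```cpp
// blockitem.h - a hypothetical C++ backend type exposed to QML
#include <QObject>
#include <QString>

class BlockItem : public QObject {
    Q_OBJECT
    // Readable/writable from QML, with a change notification signal
    Q_PROPERTY(QString text READ text WRITE setText NOTIFY textChanged)
public:
    explicit BlockItem(QObject *parent = nullptr) : QObject(parent) {}
    QString text() const { return m_text; }
    void setText(const QString &t) {
        if (t == m_text) return;
        m_text = t;
        emit textChanged();
    }
signals:
    void textChanged();
private:
    QString m_text;
};

// main.cpp - register the type so QML can use it:
//   import Example 1.0
//   BlockItem { text: "hello" }
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QUrl>
#include <QtQml/qqml.h>
#include "blockitem.h"

int main(int argc, char *argv[]) {
    QGuiApplication app(argc, argv);
    qmlRegisterType<BlockItem>("Example", 1, 0, "BlockItem");
    QQmlApplicationEngine engine;
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));
    return app.exec();
}
```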
> Could it be that Qt has some optimisation technique to not render all those lines out of view?
Well, I detailed my technique in my blog post - it's not really novel - you can build virtualized lists in many languages, including JavaScript. You can look into the source code of the many web apps that have built the same type of block editor I implemented. MarkText[1] seems to be the most efficient one from my testing. My point is that building upon the abstractions of the web makes it very hard to write truly efficient code that is well-optimized for your computer's resources. You might be an amazing programmer, but you're limited by a certain upper bound of performance, at the mercy of the web standards bodies and the browser engines' implementations of those standards.
> But there's also a large discrepancy between Chromium's budget (Google) and The Qt Company's. Also, The Qt Company tends to prioritize advancement in the embedded world (where it probably gets most of its cash) rather than regular applications
Yep. That's precisely the point: you get all this stuff from a billion-dollar project for free.
I really would not mind writing some C++ instead, even if it was more difficult. If anything, it would only be better because of the higher moat of the project as well as my own skills. I agree 100% on the principle that native is better and faster, and JS is an unnecessary layer of abstraction slowing things down.
However, if I compare two timelines - one where I am using QML for a project, and another where I am using Electron - and think about the time spent working around bugs, reporting bugs, and fielding users' complaints about crashes in the former, versus having none of that at the trade-off of something slightly slower, to me it's a no-brainer.
In the context of what you wrote in the article:
> One of the most frustrating aspects of developing a Qt application is the slew of Qt bugs you encounter along the way. During ten months of development, I reported seven bugs, three of which were assigned 'critical' priority—two of which resulted in crashes. I also came across many bugs already reported by others that remain unfixed.
I would rather have an app that is slightly slower than one that can crash unexpectedly. Even if they are quick to fix bugs, new bugs may be introduced in new releases. Your intent was to promote Qt in your blog post, but unfortunately it has only confirmed to me that QML on the desktop is not production-ready.
That's just the unfortunate state of the industry we're in. Hopefully it changes one day. Maybe Chromium could be forked into a C++ GUI toolkit where the DOM could be manipulated directly from C++. Has anyone ever considered that?
> I would rather have an app that is slightly slower than one that can crash unexpectedly. Even if they are quick to fix bugs, new bugs may be introduced in new releases. Your intent was to promote Qt in your blog post, but unfortunately it has only confirmed to me that QML on the desktop is not production-ready.
Haha, that's interesting. But to be honest, it's really not as bad as it seems. Again, crash reports tend to be highly prioritized, and most of the time you can find your way around them until they get fixed. It's indeed a frustrating experience when non-crash bugs aren't prioritized, but then again, like I explain in the blog, I could use a different library, probably an open-source solution like the QBasicHtmlExporter[1] I described, since QTextDocument's toHtml produces weird inline HTML (and has some other bugs).
The thing is, with experience I've started to build up my own boilerplate of battle-tested components/tools/libraries. I made the following client for Ollama[2][3] while not working on it full-time (still WIP) in around a month. It is already better than many web apps I tried, which kept hanging while the model was generating a response. Also, try to copy text from a code block in a web app while a model is still generating a response - it's almost always impossible, since most web apps keep re-rendering everything on each completion, while I (like the native macOS ChatGPT app) do incremental parsing, which is much more efficient. The binary is 28MB (and can be even smaller), the app is fast, and it can handle very large amounts of data. So I can build QML apps really, really fast these days due to the experience I've gained and am still gaining. I'm also wondering if I should open source my components as AGPL and then have some commercial licensing for them... Not to mention, I rarely do heap allocations myself - I try to put as much as I can either on the stack or in QML, so Qt handles all the heap allocation itself. While I'm relying on Qt to do an appropriate job, it seems to be very, very stable for now.
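A rough sketch of the incremental idea (generic, not literally my implementation or the ChatGPT app's): append each streamed chunk to a buffer, treat everything before the last completed block as final, and re-parse/re-render only the still-open tail instead of the whole response.

```cpp
#include <iostream>
#include <string>

// Accumulates a streamed response; only the text after the last completed
// block (here crudely detected as a blank line) is re-rendered per chunk.
class IncrementalRenderer {
public:
    void appendChunk(const std::string &chunk) {
        buffer_ += chunk;
        std::size_t boundary = buffer_.rfind("\n\n");
        std::size_t tailStart = (boundary == std::string::npos) ? 0 : boundary + 2;
        if (tailStart > finalized_) {
            // Blocks before tailStart are final; render them once and never again.
            std::cout << "[finalize up to byte " << tailStart << "]\n";
            finalized_ = tailStart;
        }
        // Only the open tail is re-parsed on each streamed chunk.
        std::cout << "[re-render tail: " << buffer_.substr(finalized_) << "]\n";
    }
private:
    std::string buffer_;
    std::size_t finalized_ = 0;  // byte offset where the still-open block starts
};

int main() {
    IncrementalRenderer r;
    r.appendChunk("First paragraph of the answer.\n\n");
    r.appendChunk("Second paragraph, still ");
    r.appendChunk("being streamed...");
}
```

Because the finalized part of the document is never rebuilt, selecting or copying text from an earlier block keeps working while the tail is still streaming in.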
I am so happy that you made this, legit. I’m absolutely going to try it whenever I get a chance.
I’ve wanted to do the same thing you did but with coding notebooks (e.g. Jupyter) for a while now. It frustrates me to no end that the only native software for notebooks is JetBrains IDEA (and even that’s only an “I think it’s native” lol). Hopefully I can take what you learned and documented and apply it to my app ^-^
Very cool! I hope my blog post can be of help. Let me know if you need any help - my socials are in my HN profile. And let me know what you think of my app; I'd love to hear any feedback.
It's funny that part of the reason computer hardware has gotten faster and more efficient is heavy-usage workloads, even things like web apps.
So while you think web apps are going "backwards", they've likely helped contribute to the modern computing hardware that speeds up your native programs!
Is that true, or is it the reverse? That web apps became a feasible thing only after consumer hardware, especially phones, became performant enough to handle loads like that (which led to less and less offloading to servers)?
I'd say that web apps became a thing because Google really wanted them to be. They first tried with their browser plugin. That worked, but adoption wasn't good enough. So they ditched Google Gears and started developing a browser with sufficient performance for web-native apps. They succeeded quite well.
So in my view, browsers became capable, but then plenty of "heavy" web apps appeared, which required more beef in the machine.
That's also the typical way it goes: current hardware being okay-ish but not great is one of the strongest drivers for better hardware, whether it is gaming (PCs), cameras (smartphones), or the web bloating (both).
Native apps also give users a type of control that web apps can’t, by existing as fully independent executables on storage in the user’s possession.
Web apps can just up and disappear, spontaneously grow paywalls, or slowly enshittify over time, and unless both the user is technically savvy and the web app is fully open source, there’s nothing the user can do about it.
In contrast, when a new release of a native app is worse or its company goes under, the user retains a useful product (the old binary) that can be run borderline indefinitely one way or another (hacks, emulation, etc).
Yes, but native apps have ads, which at least I don't know how to block. On Firefox mobile I can just use uBlock Origin to watch YouTube videos without ads, as one example...
That applies mostly to online-service apps, which are in a bit of a different category, as their usefulness without a connection is extremely limited. Most apps don’t need to fall into that bucket.
Let’s normalize offline web apps, then. Your examples at the end go back to needing technical chops, and in the end, everything gets bitrot.
The biggest problem is with apps that show you content. Web sites give the user more control over what content to save, better exposure to scrapers and APIs, and standard navigation to every other web site.
Offline web apps are better but still not great because unless their dev has gone out of their way to wrap it in Electron, they don’t come in nice self-contained units like native apps do… for instance, if you’re upgrading your computer and want to copy over a previously installed but now defunct offline PWA, where do you go looking? The wrapper binary built by your browser doesn’t actually contain it, all the inner workings are squirreled away in some obscure directory with an inscrutable name.
Websites can give more control but that’s hardly a rule these days and depends on how the site/webapp in question was built. Something built with a canvas-based UI (as is sometimes necessary for displaying high volumes of information without performance degradation) for example isn’t going to give the user any better control than a native app would, and in some cases less.
[1] https://rubymamistvalove.com/block-editor
[2] https://rubymamistvalove.com/block-editor#8-performance