
Personally I find that Casey presents simple hard facts that rile people up, but at that point they're doing... what? Arguing with facts?

He had a particularly pointed rant about how some ancient developer tools on a 40 MHz computer could keep up with debug single-stepping (repeatedly pressing F10 or whatever), but Visual Studio 2019 on a multi-GHz, multi-core monster of a machine can't. It lags behind and is unable to update the screen at a mere 60 Hz! [1]

I have had similar experiences, such as the "new Notepad" taking nearly a minute to open a file that's just 200 MB in size. That's not "a small cost that's worth paying for some huge gain in productivity". No, that's absurdly bad. Hilariously, stupidly, clownshoes bad. But that's the best that Microsoft could do with a year or more of development effort using the "abstractions" of their current-gen GUI toolkits such as WinUI 3.

This is not progress.

[1] The whole video is worth watching end to end, but this moment is just... sad: https://youtu.be/GC-0tCy4P1U?t=1759



If I only spoke Arabic or Russian or Chinese, could I write words in my language on those ancient developer tools? Or would I be limited to ASCII characters?

If I were blind, could the computer read the interface out to me as I navigated around? If I had motor issues, could I use an assistive device to move my cursor?

I'm not saying this excuses everything, but it's easy to point at complexity and say "look how much better things used to be". But a lot of today's complexity is for things that are crucial to some subset of users.


> If I only spoke Arabic or Russian or Chinese, could I write words in my language on those ancient developer tools?

For relevant cases, YES!

NT4 was fully Unicode and supported multiple languages simultaneously, including in the dev tools. Windows and all associated Microsoft software (including VS) have had surprisingly good support for this since at least the late 1990s, possibly earlier. I remember seeing, in 2001, an Active Directory domain with mixed English, German, and Japanese identifiers for security groups, OU names, user names, file shares, etc. Similarly, back in 2002 I saw a Windows codebase written by Europeans that used at least two non-English languages for comments.

Note that programming languages in general were ASCII only for "reasons", but the GUI designers had good i18n support. Even the command-line error messages were translated!

Linux, on the other hand, was far behind on this front until relatively recently, again for "reasons". You may be thinking back to your experience of Linux's limitations, but other operating systems and platforms of the era supported Unicode, including MacOS and Solaris.

None of this matters. UTF-8 is not the reason the GUI is slow. Even the UTF-16 encoding used by Windows is just 2x as slow, and only for the text, not any other aspect such as filling pixels, manipulating some GUI object model, or responding synchronously vs asynchronously.

Look at it this way: for a screen full of text, ASCII vs Unicode is a difference of 10 KB vs 20 KB in the volume of data stored. The fonts are bigger, sure, but each character takes the same amount of data to render irrespective of "where" in a larger font the glyph comes from!

> If I were blind, could the computer read the interface out to me as I navigated around?

Text-to-speech is an unrelated issue that has no bearing on debugger single-stepping speed. Windows has had accessibility APIs since forever, including voice; they were just bad for reasons to do with hardware limitations, not a lack of abstractions.

> If I had motor issues, could I use an assistive device to move my cursor?

That's hardware.

> But a lot of today's complexity is for things that are crucial to some subset of users.

And almost all of it was there, you just may not have been aware of it because you were not disabled and did things in English.

Don't confuse a "lack of hardware capacity" or a "lack of developer budget" with the impact that overusing abstractions has caused.

These things were not a question of insufficient abstractions, but insufficient RAM capacity. A modern Unicode library is very similar to a 20-year-old one, except the lookup tables are bigger. The fonts alone are tens of megabytes, far more than the disk capacity of my first four computers... combined.

Today I have a 4.6 GHz 8-core laptop with 64 GB of memory, and I have to wait for every app to open. All of them. For minutes sometimes. MINUTES!

None of this has anything to do with accessibility or multi-lingual text rendering.


Your comment indicates a pretty poor understanding of why multilingual text rendering is a lot harder than it was in the early 90s. Back then, to display some text on the screen, the steps were: for each character (=byte), use the index to select a small bitmap from a dense array, copy that bitmap to the appropriate position on the screen, advance.
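In code, that entire early-90s pipeline was roughly the following (a sketch only; names like font8x16, framebuffer, and screen_pitch are made up for illustration):

    /* Early-90s text rendering: one byte in, one bitmap blit out. */
    #include <stdint.h>

    #define GLYPH_W 8
    #define GLYPH_H 16

    extern const uint8_t font8x16[256][GLYPH_H];  /* one byte of pixels per glyph scanline */
    extern uint8_t *framebuffer;                  /* 1 byte per pixel, for simplicity */
    extern int screen_pitch;                      /* bytes per framebuffer row */

    void draw_text(int x, int y, const char *s, uint8_t color)
    {
        for (; *s; s++, x += GLYPH_W) {
            const uint8_t *glyph = font8x16[(uint8_t)*s];    /* the character code IS the index */
            for (int row = 0; row < GLYPH_H; row++) {
                for (int col = 0; col < GLYPH_W; col++) {
                    if (glyph[row] & (0x80u >> col))
                        framebuffer[(y + row) * screen_pitch + (x + col)] = color;
                }
            }
        }
    }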

But modern font rendering is: for a span of bytes, first look up which font is going to provide those characters (since fonts aren't expected to contain all of them!). Then run a small program to convert those bytes into a glyph index. The glyph index points to a vector image that needs to be rendered. Next, run a few more small programs to adjust the rendered glyph for its surrounding text (which influences things like kerning spacing) or the characteristics of its display. And then move on to the next character, where you get to repeat the same process again. And if you've got rich text, all of this stuff can get screwy in the middle of the process: https://faultlore.com/blah/text-hates-you/
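Just the shaping step of that modern pipeline, sketched with HarfBuzz (the font path is a placeholder, and font fallback, rasterization, and hinting are further stages that aren't shown):

    /* Shaping only: characters go in, glyph IDs and positions come out. */
    #include <stdio.h>
    #include <hb.h>

    int main(void)
    {
        hb_blob_t *blob = hb_blob_create_from_file("NotoSans-Regular.ttf");  /* placeholder font */
        hb_face_t *face = hb_face_create(blob, 0);
        hb_font_t *font = hb_font_create(face);

        hb_buffer_t *buf = hb_buffer_create();
        hb_buffer_add_utf8(buf, "مرحبا", -1, 0, -1);      /* Arabic text: shaping actually matters */
        hb_buffer_guess_segment_properties(buf);          /* detect script, language, direction */

        hb_shape(font, buf, NULL, 0);   /* runs the font's substitution/positioning programs */

        unsigned int n;
        hb_glyph_info_t     *info = hb_buffer_get_glyph_infos(buf, &n);
        hb_glyph_position_t *pos  = hb_buffer_get_glyph_positions(buf, &n);

        /* What comes out is glyph IDs plus positions, not characters. Each glyph
           still has to be rasterized and blitted, which is yet another stage. */
        for (unsigned int i = 0; i < n; i++)
            printf("glyph %u, x_advance %d\n", info[i].codepoint, pos[i].x_advance);

        hb_buffer_destroy(buf);
        hb_font_destroy(font);
        hb_face_destroy(face);
        hb_blob_destroy(blob);
        return 0;
    }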

Indic text rendering, for example, never worked until the mid-2000s, and there are even more complex text rendering scenarios that still aren't fully supported (hello, Egyptian hieroglyphs!).


Having personally implemented a Unicode text renderer in both DirectX and OpenGL back in 2002 for the Japanese game market, I dare say I know a heck of a lot more than you do about high-performance multilingual text rendering. Don't presume.

You just made the exact same argument that Casey Muratori absolutely demolished very publicly.

The Windows Console dev team swore up and down that it was somehow "impossible" to render a fixed-width text console at more than 2 fps on a modern GPU when "many colors" are used! This is so absurd on its face that it's difficult to even argue against, because you have to do the equivalent of establishing a common scientific language with someone who thinks the Earth is flat.

Back in the 1980s, fully four decades ago, I saw (and implemented!) colorful ANSI text art animations with a higher framerate on a 33 MHz 486 PC! That is such a minuscule amount of computing power that my iPhone charger cable has more than that built into the plug to help negotiate charging wattage. It's an order of magnitude less than the computing power a single CPU core can gain (or lose) simply from a 1 °C change in the temperature of my room!

You can't believe how absurdly out-of-whack it is to state that a modern high-end PC would somehow "struggle" with text rendering of any sort!

Here are the developers at Microsoft stating that 2 fps is normal, and that it's "too hard" to fix: "I believe what you’re doing is describing something that might be considered an entire doctoral research project in performant terminal emulation as “extremely simple” somewhat combatively", from: https://github.com/microsoft/terminal/issues/10362#issuecomm...

Here is Casey, over two weekends, banging out a more Unicode-compliant terminal that looks more correct, sinks text at over 1 GB/s, and renders at 9,000 fps: https://www.youtube.com/watch?v=99dKzubvpKE

PS: This has come up here on HN several times before, and there is always an endless parade of flabbergasted developers -- who have likely never in their lives seriously thought about the performance of their code -- confidently stating that Casey is some kind of game developer wizard who applied some "black art of low-level optimisation". He used a cache. That's it. A glyph cache. Just... don't render the vector art more than you have to. That's the dark art he applied. Now you know the secret too. Use it wisely.
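For the curious, the whole "secret" is roughly the following sketch. All names are made up, and rasterize_glyph() stands in for the expensive vector rasterizer (FreeType, DirectWrite, whatever); the point is that it runs once per glyph instead of once per frame:

    /* A minimal direct-mapped glyph cache. */
    #include <stdint.h>
    #include <stdlib.h>

    typedef struct {
        uint32_t glyph_id;
        int      w, h;
        uint8_t *pixels;     /* 8-bit coverage bitmap, ready to blit */
    } CachedGlyph;

    #define CACHE_SLOTS 4096
    static CachedGlyph cache[CACHE_SLOTS];

    extern uint8_t *rasterize_glyph(uint32_t glyph_id, int *w, int *h);   /* the slow part */

    const CachedGlyph *get_glyph(uint32_t glyph_id)
    {
        CachedGlyph *slot = &cache[glyph_id % CACHE_SLOTS];
        if (slot->glyph_id != glyph_id || slot->pixels == NULL) {  /* miss: rasterize once */
            free(slot->pixels);
            slot->glyph_id = glyph_id;
            slot->pixels   = rasterize_glyph(glyph_id, &slot->w, &slot->h);
        }
        return slot;                                               /* hit: just a lookup */
    }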


> You can't believe how absurdly out-of-whack it is to state that a modern high-end PC would somehow "struggle" with text rendering of any sort!

I never said that.

What I said is that modern text rendering--rendering vector fonts instead of bitmap fonts, dealing with modern Unicode features and other newer font features, etc.--is a more difficult task than it was in the days when text rendering meant bitmap fonts and largely ASCII or precomposed characters (even East Asian multibyte fonts, which still largely omit the fun of ligatures).

My intention was to criticize this part of your comment:

> None of this matters. UTF-8 is not the reason the GUI is slow. Even the UTF-16 encoding used by Windows is just 2x as slow, and only for the text, not any other aspect such as filling pixels, manipulating some GUI object model, or responding synchronously vs asynchronously.

> Look at it this way: for a screen full of text, ASCII vs Unicode is a difference of 10 KB vs 20 KB in the volume of data stored. The fonts are bigger, sure, but each character takes the same amount of data to render irrespective of "where" in a larger font the glyph comes from!

This implies to me that you believed that the primary reason Unicode is slower than ASCII is that it takes twice as much space, which I hope you agree is an absurdly out-of-whack statement, no?


> This implies to me that you believed that the primary reason Unicode is slower than ASCII is that it takes twice as much space, which I hope you agree is an absurdly out-of-whack statement, no?

Actually, this precise argument comes up a lot in the debate of UTF-16 vs UTF-8 encodings, and is genuinely a valid point of discussion. When you're transferring gigabytes per second out of a web server farm, through a console pipeline, or into a kernel API call, a factor of two is... a factor of two. Conversely, for some East Asian locales, UTF-16 is more efficient, and for them using UTF-8 is measurably worse.
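If you want to sanity-check the raw numbers, here's a rough sketch (assuming a C11 compiler that stores u8"" literals as UTF-8 and u"" literals as UTF-16 code units):

    /* Back-of-the-envelope size comparison of the same text in both encodings. */
    #include <stdio.h>
    #include <uchar.h>

    int main(void)
    {
        /* "Hello": 5 bytes as UTF-8 vs 10 bytes as UTF-16 (UTF-8 wins 2x). */
        printf("ASCII text: %zu vs %zu bytes\n",
               sizeof(u8"Hello") - 1, sizeof(u"Hello") - sizeof(char16_t));

        /* Five hiragana: 15 bytes as UTF-8 (3 each) vs 10 as UTF-16 (UTF-16 wins 1.5x). */
        printf("Kana text:  %zu vs %zu bytes\n",
               sizeof(u8"こんにちは") - 1, sizeof(u"こんにちは") - sizeof(char16_t));
        return 0;
    }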

The point is that any decent text renderer will cache glyphs into buffers / textures, so once a character has appeared on a screen, additional frames with the same outline present will render almost exactly as fast as in the good old days with bitmap fonts.
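Which means the steady-state frame loop ends up looking a lot like the bitmap-font days. A sketch, with made-up names (get_glyph() for the cache lookup, blit() for the memcpy-class copy):

    /* Per character it's one lookup and one blit; shaping and rasterization
       already happened the first time the glyph appeared. */
    typedef struct { int w, h; const unsigned char *pixels; } Glyph;

    extern const Glyph *get_glyph(unsigned int glyph_id);  /* cache hit after the first appearance */
    extern void blit(const Glyph *g, int x, int y);        /* copy cached pixels to the screen */

    void draw_run(const unsigned int *glyph_ids, const int *advances, int n, int x, int y)
    {
        for (int i = 0; i < n; i++) {
            blit(get_glyph(glyph_ids[i]), x, y);   /* no shaping, no vector rasterization here */
            x += advances[i];                      /* advances came out of the shaping pass */
        }
    }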

Not to mention that Microsoft introduced TrueType in 1992 and OpenType in 1997! These are Windows 3.1 era capabilities, not some magic new thing that has only existed for the last few years.

None of this matters in the grand scheme of things. A factor of two here or there is actually not that big a deal these days. Similarly, rendering a 10x20 pixel representation of maybe a few dozen vector lines is also a largely solved problem and can be done at ludicrous resolutions and framerates by even low-end GPUs.

The fact of the matter is that Casey got about three orders of magnitude more performance than the Windows Console team in two orders of magnitude less time.

Arguing about font capabilities or Unicode or whatever is beside the point, because his implementation was more Unicode- and i18n-compliant than the slow one!

This always happens with anything to do with performance. Excuses, excuses, and more excuses to try and explain away why a 1,000x faster piece of hardware runs at 1/10th of the effective speed for software that isn't actually that much more capable. Sometimes less.

PS: Back around 2010, my standard test of remote virtual desktop technology (e.g.: Citrix or Microsoft RDP) was to ctrl-scroll a Wikipedia page to zoom the text smoothly through various sizes. This forces a glyph cache invalidation, and Wikipedia especially has superscripts and mixed colours, making it even harder. I would typically get about 5-10 fps over a WAN link that at the time was typically 2 Mbps. That's rendered on the CPU only(!) on a server, compressed, sent down TCP, decompressed, and then displayed by a fanless thin terminal that cost $200. The Microsoft devs are arguing that it's impossible to do 1/5th of this performance on a gaming GPU more than a decade later. Wat?


I continue to be mystified as to why you think I am commenting in any way on the quality (or lack thereof) of the Windows Console.


Single-stepping is moderately complicated. I'm even less impressed with modern editors that can't keep up with typing. My first computer could do that. It had a 2 MHz 6502.


I've had the same thing happen with a drawing app. I have an older Android tablet with a sketch app, and it works great. I got a new, high-powered tablet because the battery on the old one wouldn't keep a charge. The new one has a sketch app that lags: draw a line... wait. Draw a line... wait. It's unusable. It has a newer processor and more memory, but I can't draw.



