I don't believe this is the root cause: computers got faster, and software just slowed down until it merely "runs good enough". I'm calling Wirth's law on it.
"Clean code" is indeed often a bad idea, but you are overestimating the impact. Even software written by people who care deeply about performance consumes way more than it theoretically should.
Plus, if it were that simple, people would already have rewritten all the bad software.
Your message is exactly the reason why I do not like Casey, he is brainwashing everyone into thinking this is a culture problem. Meanwhile nobody tries to solve it technically.
The free market is preventing technical solutions. People generally buy based on features first and everything else second. That allows for a precarious situation in the software market: the company producing the most bloat the fastest wins the biggest market share and sees no need to invest in proper fixes. Anyone who cares too much about software quality gets outcompeted almost immediately.
And since software can be rebuilt and replicated at virtually zero cost, there is no intrinsic pressure to keep unit costs down the way there is in manufacturing, where that pressure tends to keep physical products simple.
It doesn't have to come from the free market; FOSS is hardly exempt from awfully slow/unstable software. Nobody has figured out yet how to make writing good software the default/path-of-least-resistance.
Wirth's law doesn't bolster your point. He observed that software is getting slower more rapidly than computers are getting faster, which is the whole point. We write increasingly slow, increasingly shitty code each year. I read and hear this attitude all the time, the one that's basically "if you optimize something, you're bad at your job, only juniors try to do that". That's a culture problem.
It's frankly insulting that you think Casey brainwashed me into this stance, when it's been obvious to me since long before I'd ever heard of him. IDGAF if code is clean or not. I care that Jira can display a new ticket in less than 15 seconds. I care that vscode actually keeps up with the characters I type. None of this software is remotely close to "runs good enough".
What I am saying is that this is a natural phenomenon absent a technical solution. People will tend to optimize their software based on the performance of their hardware.
I completely agree that many apps are horrendously slow, but given how slow alternatives are to appear, I can only conclude they are considered "good enough" for our current tech level.
The difficulty of rewriting modern apps is one of the reasons I would give for slow software. You can't really complain about the number of independent web browsers when you look at the size of the spec. Ensuring the software we use can be reimplemented by one or a few developers in a few days would go a long way toward improving performance.
Another reason would be the constant need to rewrite working code: to run on new platforms, to support some trendy new framework, and so on. You cannot properly optimize without some sort of stability.
Jira runs good enough to get idiot managers to pay for it, which is what it's designed for. And yes, microoptimising (or micropessimising, who knows, since the changes are usually just made on vibes anyway) random bits of code that probably aren't even on any kind of hot path, while compromising maintainability, is something only juniors and people who are bad at their job do. It's easy to forget how common security flaws and outright crashes were in the "good old days". Frankly, even today the industry is right not to prioritise performance, given how much we still struggle with correctness.
A lot of code is slow and could be faster, often much faster. More often than not, that's because of people who thought they should bypass the abstractions and clean code to do something clever and low-level, not the opposite. Cases where you actually gain performance on a realistic-sized codebase by doing that are essentially nonexistent. The problem isn't too many abstractions; it's using the wrong algorithm or the wrong data structure (which, sure, sometimes happens in a library or OS layer, but the answer to that isn't to bypass the abstraction, it's to fix it), and that's easier to spot and fix when the code is clean.
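To make the data-structure point concrete, here's a minimal illustrative sketch (a made-up example, not from anyone's actual codebase): the quadratic slowdown lives entirely inside the function, behind the same interface, so fixing it requires no bypassing of abstractions and no changes to callers.

```python
def dedupe_slow(items):
    """O(n^2): membership test on a list scans the whole list each time."""
    seen = []
    out = []
    for x in items:
        if x not in seen:  # linear scan per element
            seen.append(x)
            out.append(x)
    return out

def dedupe_fast(items):
    """O(n): same interface, same result; set membership is O(1) on average."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:  # hash lookup per element
            seen.add(x)
            out.append(x)
    return out

# Both preserve first-seen order and return identical results.
assert dedupe_slow([3, 1, 3, 2, 1]) == dedupe_fast([3, 1, 3, 2, 1]) == [3, 1, 2]
```

On a few thousand elements the difference is already dramatic, and nothing about the fix made the code less clean.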