Heh, there's nothing legacy about Unreal. It's the premier game engine in the world today and actively maintained by hundreds of developers. They're now pushing into film and ship new releases regularly, often breaking backwards compatibility when they do. Where is this move away from OOP and inheritance? Even newly developed features like Nanite use it:

https://dev.epicgames.com/documentation/en-us/unreal-engine/...
But more importantly, is this take falsifiable? What does "old" or "legacy" mean? People have been pushing this line for at least 15 years here on HN, yet what we see in the most well-funded and actively maintained codebases is lots and lots of inheritance, with no effort to remove it. Not just Unreal but also Chrome, MS Office, iOS, Android, Java, and more all use this technique with no ill effects as far as anyone can tell. When the maintainers talk about the issues they face and where they're putting refactoring effort, inheritance and OOP never seem to be on the list. In the Java case it's actually the opposite: they like to complain about people violating OOP encapsulation and want to make enforcement stricter. Meanwhile, heavily hyped successors that lacked it, like Haskell, have vanished without a trace, leaving not even one widely used program in their wake.
What would it take to falsify the claim that inheritance is a legacy technique? Because I see no real evidence of it. Every codebase I've worked on has used it without anyone remarking on that fact, and it didn't seem to cause issues more often than other design patterns.
> Meanwhile heavily hyped successors that lacked it, like Haskell, have vanished without a trace, leaving not even one widely used program in their wake.
Haskell is still a thing, and Pandoc and shellcheck are widely used; if you aren't using them, you're missing out.
And Haskell impressed Tim Sweeney enough that a whole bunch of Haskell people are working on Epic's Verse language.
Epic is a very successful software company, they're familiar with FP/Haskell, and their new language has classes with inheritance. There is no evidence here of any shift away from the concept, let alone Unreal being legacy! Apple is the same. Swift deviates from other languages in many ways, but it has inheritance:
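(a minimal sketch; the class names are invented, not from Apple's docs)

    class Character {
        var health = 100

        func takeDamage(_ amount: Int) {
            health -= amount
        }
    }

    // Plain Swift single inheritance with an override.
    class Enemy: Character {
        override func takeDamage(_ amount: Int) {
            // Toy rule: enemies take double damage.
            super.takeDamage(amount * 2)
        }
    }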
> Meanwhile heavily hyped successors that lacked it, like Haskell, have vanished without a trace, leaving not even one widely used program in their wake.
Do you think that inheritance had something to do with that?
A programming style isn't automatically "good" just because it's financially and technically unfeasible to migrate to something else overnight.
The reason Unreal uses inheritance is that this is what people did in 1998. The reason it can't stop using it is that it's too late to change. There's nothing more to it.
> What would it take to falsify the claim that inheritance is a legacy technique?
A demonstration that inheritance is a good technique. Which it is not. It hurts locality of behaviour, and that's bad because it increases the rate of mistakes. There have been studies about this.
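To make the locality point concrete, here's a toy sketch (Swift, but the language doesn't matter; the names are invented). To know what describe() prints at a call site, you have to go read every subclass:

    class Shape {
        func area() -> Double { 0 }

        func describe() -> String {
            // The actual behaviour depends on whichever subclass overrides area().
            "area = \(area())"
        }
    }

    class Circle: Shape {
        let radius: Double
        init(radius: Double) { self.radius = radius }
        override func area() -> Double { Double.pi * radius * radius }
    }

    class Square: Shape {
        let side: Double
        init(side: Double) { self.side = side }
        override func area() -> Double { side * side }
    }

    // Reading describe() alone tells you nothing here; the behaviour
    // lives in whichever subclass you happen to hold.
    let shapes: [Shape] = [Circle(radius: 1), Square(side: 2)]
    for s in shapes { print(s.describe()) }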
Legacy and "good" are different things. I believe I falsified the idea that it's a legacy technique, and I actually think it's also a good technique. I also think a lot of other devs agree; they just can't be bothered arguing about it on HN, hence the fact that large, high-budget codebases usually seem to use it extensively and aren't switching to some other design.
As for studies, I'd be interested to read those, as most studies of developer productivity I've seen aren't that good.
By "legacy", I mean "old stuff we thought were good, but is not useful for new projects". Now nothing prevents people from using a useless technique for new projects.
Anyway, I don’t care about what we should mean by this or that label. What I do care about is whether inheritance is useful or not. And so far, I have seen no evidence that it is anything more than highly situational: for almost everything it does, there is a better alternative.
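A sketch of the protocol-and-composition style I'd reach for instead (Swift again, names invented); each conformance is self-contained, so the behaviour at a call site is defined in exactly one place:

    protocol Shape {
        func area() -> Double
    }

    struct Circle: Shape {
        let radius: Double
        func area() -> Double { Double.pi * radius * radius }
    }

    struct Square: Shape {
        let side: Double
        func area() -> Double { side * side }
    }

    // No overrides anywhere: describe() depends only on the protocol.
    func describe(_ shape: any Shape) -> String {
        "area = \(shape.area())"
    }

    let shapes: [any Shape] = [Circle(radius: 1), Square(side: 2)]
    for s in shapes { print(describe(s)) }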
The study I recall didn’t measure productivity; it counted bugs.
Anyway, I spent enough time on the subject to close the case. Until I stumble upon cogent evidence to the contrary, inheritance is not worth my time. I won’t use it, I will steer my colleagues away from it, and I will quit gigs that use it too heavily.
Broader question (much broader): Do you know of any place that has a good collection of studies that answer questions like this (questions about CS, language design, and such)?