They almost did, and I mostly blame Java. Java's type system is so poor that you went through all this ceremony of declaring types (and remember that Java didn't even always have `var` and `<>` inference shortcuts), and you still inevitably had a bunch of runtime type bugs because of its obscene handling of null and its type-erased generics (and maybe even its intentionally-incorrect array type variance).
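To make that last point concrete (a minimal sketch of my own, not anything from the original thread): because Java treats arrays as covariant, the compiler happily accepts code that can only fail at runtime with an `ArrayStoreException` — exactly the kind of "statically typed, still crashes" situation I mean.

```java
// Sketch: Java's covariant arrays compile cleanly but fail at runtime.
public class ArrayVariance {
    public static void main(String[] args) {
        // Legal at compile time: String[] is treated as a subtype of Object[].
        Object[] objs = new String[1];
        try {
            // Also legal at compile time — but the runtime array check throws.
            objs[0] = Integer.valueOf(42);
            System.out.println("stored without error");
        } catch (ArrayStoreException e) {
            System.out.println("ArrayStoreException at runtime");
        }
    }
}
```

Generics, added later, got invariance right — `List<String>` is not a `List<Object>` — but by then erasure had introduced its own class of runtime surprises.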
Its type system was so lame that it gave rise to most of the "Gang of Four" patterns, most of which are just workarounds for Java- not some first-order principles of programming.
So you had these awkward and verbose "patterns" and polymorphism and naming a bazillion new types that were almost the same as each other, and STILL had runtime type errors and crashes. What's the point?
It's no wonder at all that every desktop app I installed on my Linux machine in 2008 was written in Python.
It's no wonder someone thought it was actually a good idea to run *JavaScript*, of all things, on the backend *and* the desktop. Holy crap- how bad do statically-typed languages have to be to make people want JavaScript?! (I'll tell you: they have to be Java and C++98 bad- that's how bad.)
Thank goodness for Swift, Rust, Go, etc. in the post-2010 world.
For the last few years, people have been looking down their noses at dynamically typed languages, but I think statically-typed languages almost died thanks to Java and C++.
> Its type system was so lame that it gave rise to most of the "Gang of Four" patterns, most of which are just workarounds for Java- not some first order principles of programming.
Java wasn't released until after the first edition of the Gang of Four book. C++ is probably what you should be pointing the finger at.
Totally right, I apologize. I think my point still stands if we replace Java with C++ (and I still think Java was worse than C++ for static typing's reputation).