Here's the core problem with next-generation languages. Languages that come out of academia focus too much on syntax and computer-science-level functionality, and it's extremely rare for a language of that sort to make it in the real world. The languages we use today either come from big companies with the resources to promote a language long enough that anything at least "good" gets traction, or they come from the "streets": from extremely small teams who create deeply flawed but eminently practical languages that go on to rule the world. Perl, Ruby, JavaScript, PHP.
It's "worse is better" again in spades. Ivory tower language designers try to come up with perfection when what we really need is to improve on the basics.
The next big language is probably not going to be something like Haskell (as nice as all that functional purity is); it'll be something that builds profiling, unit testing, and better source control support right into the language, compiler, and tools.
Edit: if you look at where the average developer is spending most of their time, and especially where the majority of the pain is, it's typically in things like testing, debugging, performance profiling and optimization, and deployments. And if you look out there in the field you'll see lots and lots of awesome tools and systems helping people tackle those problems. But it's exceedingly rare to see a new language that approaches those problems or tries to codify those tools into first-class language features.
Go was created in part by Ken Thompson, and supported by Google Inc. In other words, completely the opposite of what you are talking about.
Further, anyone who uses the phrase "ivory tower" immediately identifies themselves as incredibly biased and political. There are politics in science, to be sure, but overall it's frowned upon.
Understand first principles. Understand your needs and the needs of others. Build something based on that. If you continually follow what is popular you will continue to be below average. By definition, popular is average.
Languages that come out of academia don't really focus on syntax much (unless you count research into macros for Racket/Template Haskell/etc). If anything, the lack of attention paid to syntax tends to hurt research languages' adoption.
I like to reply to Rust comments, for obvious reasons. Rust usually gets brought up around here in Go threads, unfortunately. I try not to do language advocacy -- like I said, Go is a great language, and it's in a different space from Rust and the languages are not competing -- but if I've crossed the line somewhere I apologize.
To avoid derailing, I'll try to refrain from commenting on anything not immediately related to Rust in other languages' threads in the future.
> Here's the core problem with next-generation languages. Languages that come out of academia focus too much on syntax and computer-science-level functionality
Is that really a problem, or do they simply have a different goal to the languages you want to use?
Plenty of language concepts that enter industrial programming were born in academia and went through the mill in so-called academic languages long before they found their way into mainstream tools. Just look at all the ideas from the functional programming world that have become almost universal in recent years.
That doesn’t necessarily mean that the academic languages where these ideas matured are themselves good tools for industrial applications. To be successful in industry, a language needs a lot more going for it: a good set of developer tools and a critical mass of users, for a start. This often creates a chicken-and-egg situation that can sink a new language regardless of its potential or technical merit, particularly if the approach to programming is very different to what most practitioners are used to at the time.
Anyway, I think you’re being rather unfair with the “ivory tower” characterisation. If you look up Simon Peyton-Jones’s comments on “programming language nirvana”, for example, it’s pretty obvious that he understands these issues and the roles of different kinds of language just fine.
The main problem with academic languages is that they improve one or two aspects and neglect the rest. Real-world languages must improve one or two aspects without hurting the rest too badly.
"The rest" means, for example: debugging, IDE support, multi-platform support, the standard library, performance, and deployment.
My own language research focuses on debugging, IDEs, and standard libraries; I don't bother with multi-platform support, performance, or deployment, but I know others who do. Really, we are just individual people; we are not out to create the NBL, we are out to push things forward and create well-thought-out ideas that could be included in the next NBL, and we realize that most of our ideas will fail to make it big time. But perhaps some of them will survive and have an impact (such is the depressing life of an academic programming language design researcher).
What would be really cool is a language where I just write the tests and the compiler writes the actual code. That would be awesome.
Something like what Critticall attempted ten years ago: http://www.critticall.com/ The idea there was that an evolutionary algorithm wrote the core code, you just had to give it an environment and some way of knowing the results were still okay.
As someone who has tried Genetic Programming, I can tell you that we're quite far from this becoming a reality. However, maybe there is something to the concept: I can imagine that it's possible to generate a good part of the usual boilerplate (interfaces, function parameters, data initialization, library initialization) just from the tests. The programmer would still have to choose the names of inner functions and variables and then do the actual implementation. You do not want to leave the implementation itself to an AI, at least not any AI we know of right now.
Although, maybe a Watson-like AI fed with knowledge from stack overflow and official code examples might change that in the future ;).
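For what it's worth, here's a toy Go sketch of the "tests as fitness function" idea behind that kind of tool -- not what Critticall actually does, just the general shape: candidate "programs" are plain coefficient vectors, the "test suite" scores how far each candidate is from the desired outputs, and a hill-climbing mutation loop keeps whatever scores better. All the names here (Candidate, fitness, evolve) are invented for illustration.

```go
// Toy sketch: evolve f(x) = a*x + b so that it satisfies a tiny "test suite".
package main

import (
	"fmt"
	"math"
	"math/rand"
)

// Candidate is a trivial stand-in for an evolved program: f(x) = a*x + b.
type Candidate struct{ a, b float64 }

func (c Candidate) run(x float64) float64 { return c.a*x + c.b }

// fitness plays the role of the test suite: lower total error means
// the candidate is closer to passing all the "tests".
func fitness(c Candidate, tests map[float64]float64) float64 {
	var err float64
	for in, want := range tests {
		err += math.Abs(c.run(in) - want)
	}
	return err
}

// evolve does simple hill climbing: mutate the best candidate and keep
// the mutant only if it scores better against the tests.
func evolve(tests map[float64]float64, generations int) Candidate {
	best := Candidate{rand.Float64(), rand.Float64()}
	for i := 0; i < generations; i++ {
		next := Candidate{
			a: best.a + rand.NormFloat64()*0.1,
			b: best.b + rand.NormFloat64()*0.1,
		}
		if fitness(next, tests) < fitness(best, tests) {
			best = next
		}
	}
	return best
}

func main() {
	// The "tests": we want something that behaves like f(x) = 2x + 1.
	tests := map[float64]float64{0: 1, 1: 3, 2: 5, 10: 21}
	best := evolve(tests, 20000)
	fmt.Printf("evolved f(x) = %.2f*x + %.2f\n", best.a, best.b)
}
```

Even in this trivial form you can see the problem: the "tests" fully determine the behavior only for toy problems, which is exactly why real implementations still need a human.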
> it'll be something that builds profiling and unit testing and better source control support right into the language, compiler, and tools.
I think Go does an excellent job at this. The go tool comes with an excellent profiler, is version-control aware, and works with the `testing` package to make unit testing and benchmarking pretty easy.
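To make that concrete, here's roughly what it looks like -- a single file you could drop into an existing package as, say, sum_test.go (Sum is a placeholder function, not something from the thread). `go test` runs the test, `go test -bench=.` runs the benchmark, and adding `-cpuprofile=cpu.out` produces a profile that `go tool pprof` can read, all without any third-party framework.

```go
package demo

import "testing"

// Sum is a stand-in function so the example is self-contained.
func Sum(xs []int) int {
	total := 0
	for _, x := range xs {
		total += x
	}
	return total
}

// Unit test: discovered and run automatically by `go test`.
func TestSum(t *testing.T) {
	if got := Sum([]int{1, 2, 3}); got != 6 {
		t.Fatalf("Sum = %d, want 6", got)
	}
}

// Benchmark: picked up by `go test -bench=.`, timed over b.N iterations.
func BenchmarkSum(b *testing.B) {
	xs := make([]int, 1024)
	for i := 0; i < b.N; i++ {
		Sum(xs)
	}
}
```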
There is still something to be said for languages that try to reduce the amount of time you spend testing, debugging, profiling, and optimizing -- languages whose goal is to help you get as much of the concept to the computer intact with as little effort as possible. The end goal, of course, is AI.
D has profiling and unit testing built in (also coverage analysis and documentation generation), but I'm curious how you'd see source control built in.
It's "worse is better" again in spades. Ivory tower language designers try to come up with perfection when what we really need is to improve on the basics.
The next big language is probably not going to be something like Haskell (as nice as all that functional purity is) it'll be something that builds profiling and unit testing and better source control support right into the language, compiler, and tools.
Edit: if you look at where the average developer is spending most of their time and especially where the majority of the pain is it's typically in things like testing, debugging, performance profiling and optimization, and deployments. And if you look out there in the field you'll see lots and lots of awesome tools and systems helping peoplee tackle those problems. But it's exceedingly rare to see a new language which approaches those problems or tries to codify those tools into first class language features.