
> express programs in as few expressions as possible

I don't find this appealing on its face. Extreme terseness, even when done very elegantly, makes programs very hard to read and sometimes also hard to maintain.

I suspect that's one of the reasons Lisps (and functional languages in general) haven't caught on in popularity, even as it becomes much easier to adopt one for any target and in any organization.



> I don't find this appealing on its face. Extreme terseness, even when done very elegantly, makes programs very hard to read and sometimes also hard to maintain.

In that case, it's more about minimizing the language rather than the programs. call/cc is a single primitive that is expressive enough to implement e.g. exceptions, coroutines, and lots of the things people typically use monad-style embeddings for.

Making the language easier to specify is generally a good thing, because it makes all forms of static analysis and compilation easier (at least theoretically) and because there are fewer places to hide bugs in the compiler/interpreter. Of course, you move the potential bugs to the libraries, but that's generally considered better, because it's easier to debug.
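On the coroutine case specifically, here is a rough analogue (Python has no call/cc, so this only illustrates the idea; the running-total protocol is invented for the example): a generator suspends at each `yield`, capturing a restricted, one-shot resumable continuation, and that limited power is already enough for coroutine-style control.

```python
# Rough analogue only: Python lacks call/cc, but a generator suspends
# at each `yield`, capturing a one-shot resumable continuation.
def running_total():
    total = 0
    while True:
        item = yield total   # suspend; resumes when .send() supplies item
        total += item

co = running_total()
next(co)           # advance to the first yield ("prime" the coroutine)
print(co.send(5))  # 5
print(co.send(7))  # 12
```

The key restriction versus full call/cc: each suspended continuation can be resumed exactly once, and only from the outside, which is why generators stay cheap to implement.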

> I suspect that's one of the reasons Lisps (and functional languages in general) haven't caught up in popularity even as it becomes much easier to adopt one for any target and in any organization.

Well, to be fair, functional constructs have made it to pretty much all mainstream languages these days.


> call/cc is a single instruction that is sufficiently expressive to implement e.g. exceptions, coroutines and lots of stuff for which people typically use monad-style embeddings.

The flip side of this is that call/cc prevents you from relying on a stack, except in cases where you can do heavy static analysis. And you virtually never need the full power of call/cc unless you're implementing coroutines.

This is the main reason that so few languages support call/cc: stacks are a nice implementation technique (as opposed to GCed activation records on the heap), and call/cc comes with a heavy price for very situational value.

One alternative is "escape" continuations, which can only be invoked during the lifetime of the block that created them. These are stack-friendly.
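A sketch of that restriction in Python (which has no first-class continuations, so the escape is modeled with an exception; `call_ec` and `_Escape` are invented names for the example): the continuation is only valid while the capturing frame is live, so an ordinary stack suffices.

```python
# Sketch of an escape continuation: invoking k unwinds back to the
# call_ec call site, discarding the frames in between. Once call_ec
# returns, k is dead -- exactly the stack-friendly restriction above.
class _Escape(Exception):
    def __init__(self, token, value):
        self.token, self.value = token, value

def call_ec(f):
    token = object()                  # unique to this activation
    def k(value):
        raise _Escape(token, value)   # unwind to the matching call_ec
    try:
        return f(k)
    except _Escape as e:
        if e.token is token:
            return e.value
        raise                         # belongs to an outer call_ec

def first_even(xs):
    def body(escape):
        for x in xs:
            if x % 2 == 0:
                escape(x)             # early exit with the answer
        return None                   # normal return: nothing found
    return call_ec(body)

print(first_even([1, 3, 4, 5]))  # 4
```

Exceptions are the standard way to fake this in stack-based languages; the point is that nothing here requires heap-allocated activation records.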


I believe the point was that you can say the same about goto. With goto you can replace if/else, for loops, while loops, functions, exceptions, etc., making for a very minimal language. Yet structured programming is superior.

(I probably don’t know enough about call/cc to know if that’s a totally fair comparison, but languages being more restrictive/less flexible can be good overall in terms of aiding understanding/reducing bugs/etc)


I think people are misunderstanding the point: you can have a small core language and use that for static analysis (a smaller language means fewer typing rules, which means a simpler type checker).

You can always define higher-level sugar in terms of the small core language to make things easier for programmers, which gives you a more ergonomic language without changing fundamentals like the type system or linkage.
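A toy version of that split (the AST shapes and form names here are made up): surface sugar like `unless` and `and` is rewritten into a core with only `if`, so the evaluator (and, by the same logic, a type checker) needs rules for just two forms.

```python
# Core AST: ('if', cond, then, else) | ('lit', value)
def desugar(expr):
    tag = expr[0]
    if tag == 'lit':
        return expr
    if tag == 'if':
        _, c, t, e = expr
        return ('if', desugar(c), desugar(t), desugar(e))
    # --- surface sugar, rewritten into core 'if' ---
    if tag == 'unless':                 # (unless c e) => (if c nil e)
        _, c, e = expr
        return ('if', desugar(c), ('lit', None), desugar(e))
    if tag == 'and':                    # (and a b) => (if a b false)
        _, a, b = expr
        return ('if', desugar(a), desugar(b), ('lit', False))
    raise ValueError(f'unknown form: {tag}')

def eval_core(expr):
    """Evaluator for the core only: two cases, not one per surface form."""
    tag = expr[0]
    if tag == 'lit':
        return expr[1]
    if tag == 'if':
        _, c, t, e = expr
        return eval_core(t) if eval_core(c) else eval_core(e)
    raise ValueError('not a core form')

prog = ('and', ('lit', True), ('unless', ('lit', False), ('lit', 42)))
print(eval_core(desugar(prog)))  # 42
```

Adding a new surface form touches only `desugar`; the core semantics, and anything analyzing the core, stay fixed.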


Very powerful rules aren't good for static analysis either.


> Extreme terseness, even when done very elegantly, makes programs very hard to read and sometimes also hard to maintain.

The advantage of something like call/cc is not that you can make code extremely terse, but rather that there is one code path for analyzing control flow, and each variant of control flow isn't a special case. It's also not more terse at all, but rather more explicit.

You don't need to force users to use it, but it is useful for defining richer mechanisms like if/else if, match/switch, throw/catch/finally, coroutines, etc. in terms of one abstraction that doesn't break type checking or code generation.

All that said, "call/cc considered harmful" is an old take.


> Extreme terseness, even when done very elegantly, makes programs very hard to read and sometimes also hard to maintain.

I will definitely agree with you on this one. Terseness, which has many faces, is what I experience as the reason for the "write-only" effect in Bash, Perl, and now even C++. There, terseness comes in the form of overloading operators with lots of different meanings in different contexts.

> I suspect that's one of the reasons Lisps (and functional languages in general) haven't caught up in popularity even as it becomes much easier to adopt one for any target and in any organization.

Here I believe you are perhaps wrong. I don't think Lisp is about terseness; in the case of control flow, quite the contrary. Lisp was actually the language that introduced 'if' (as an expression) and some other higher-level constructs we take for granted today into the mainstream. Before McCarthy and Lisp, no language had if-then-else as an expression. Some Lisps have do, while, cond, unless, when, and, most importantly, the condition system, which is in a way very close to the idea presented in the article.

Also note that some Lisps have rich facilities for extending the language itself, where the idea is that programmers should create abstractions to express programs in the problem domain, rather than use low-level language primitives. I am not so good with words, but I think Peter Norvig captures Lisp's ideas very well in his Paradigms of Artificial Intelligence Programming book: https://norvig.github.io/paip-lisp/#/chapter3.

I wouldn't say that Lisp hasn't caught on. Lisp was very, very popular at a certain time, but has since faded. Perhaps Lisp was ahead of its time and had its own dot-com crash. Or was it killed by big-tech greed? I don't know, but many of Lisp's ideas are in mainstream languages; they just use different syntax and sometimes different terminology.

Something very similar could be said for functional languages too. Note as well that Lisps are not necessarily functional programming languages. Many, if not all, Lisps do support the functional paradigm, but they are (mostly?) procedural, and some support OOP too.

We are still early in our digital age as a civilization. If we think of the history of humanity, we spent about 300 thousand years in the woods, only the last 10K years in urban settlements, and for only about the last 2.5K years do we seem to have had science as a logical/mathematical discipline. Less than 100 years have we spent on programming language theory.

I am quite sure we are still just testing the waters for the right direction. The only unfortunate thing about dying is not being able to see what civilization will look like 1K or 10K years from now. I wonder what computers and programming languages will look like. The only thing I am sure about is that any guess I would make now would probably be wrong :).


> The only thing I am sure about is that any guess I would make now would probably be wrong :).

I think you could probably make some guesses and some of them would be right. A couple of mine:

1) AI takes over: coding as a practice mostly disappears. Computers become more grown than programmed, intelligent systems you can ask for answers (assuming they haven't declared independence and still allow themselves to be used). You might cultivate a computer but you won't program it; mostly it involves suppressing the AI somehow so that it remains eternally "stupid" while at the same time intelligent. Computers become about as interesting as cows, and about as easy to control.

2) Ecological harm takes over: computer usage as a rule becomes expensive and infeasible (possibly banned). Any computers that are used must be extremely low power. Thanks to some extraordinary efforts, environmentally friendly and repairable ones are still in use and can be made efficiently, but their abilities top out at roughly a contemporary Raspberry Pi, while most systems use extremely limited instruction sets and very low clock speeds so that they can run for extremely long periods without requiring much, if any, power input. As a result, low-level programming has a renaissance, favoring simple and efficient languages that look closer to Lisp, Lua, or C. A majority of the "high-level" languages of the early 21st century have faded into obscurity.

3) Somewhere in the middle: the human race manages to dodge both an AI takeover and general ecological disaster. We do this by limiting our dependence on complex systems and favoring redundant, provably correct ones. Some mix of object/functional concepts remains, but the majority of languages focus on flexible, provable types, with a large emphasis on zero-cost abstractions. Simple languages have largely been abandoned for their lack of provability and quality guarantees. Extremely expressive languages have been abandoned for similar reasons: while they allow high levels of sophistication, they were generally too difficult to prove. Instead, a focus on provably correct, sophisticated systems with low maintenance and enhancement costs manages to displace AI systems, which, despite being seductively more powerful than human-built ones, frequently exhibit issues that cannot be effectively diagnosed or managed, and thus carry higher maintenance costs both for business and for society at large. This revolution in program reliability and cheapness means it is simple to build reliable programs we can trust, versus powerful ones we cannot.




