Another thing to mention: compatibility is not only about the software side (scripts), which to me is easy to maintain (just keep the old tools alongside the new compatible ones and you're done).
The most important thing is compatibility with humans. The main reason I tend to use a pretty standard setup is that I know these tools are the standard: when I need to ssh into a server to resolve some issue, I have a system that I'm familiar with.
If, for example, I had fancy tools on my system, I would stop using the classic UNIX tools, and then every time I had to connect to another system (quite often) I would type the wrong commands, because the tools there behave differently or are missing entirely. Installing the fancy tools on the remote system is not always an option either: on an embedded device with 8 MB of flash, anything beyond busybox is out of the question.
To me the GNU tools have the same problem: they get you used to non-POSIX behavior (like putting flags after positional arguments) that bites you when you have to work on systems without them. And yes, systems without the GNU tools still exist: macOS, for example, or embedded devices, some old UNIX server that is still running, etc.
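To make the flag-placement trap concrete, here is a minimal sketch (assuming GNU coreutils on the local box; the `POSIXLY_CORRECT` trick is GNU-specific):

```shell
# Portable POSIX form: options come before operands; works everywhere.
ls -l /tmp

# GNU-only habit: argument permutation also accepts options *after* operands.
#   ls /tmp -l        # works with GNU ls, fails on macOS/BSD ls and busybox
# On GNU systems you can disable the permutation to train portable habits:
#   POSIXLY_CORRECT=1 ls /tmp -l   # '-l' is now treated as an operand, like on BSD
```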
Last thing, looking at the future... better to change everything. Let's be honest: if I could choose to switch entirely to a new shell and toolset, and magically have it installed on every system I use, it would probably be PowerShell. For the simple reason that we are no longer in the '80s, when everything was a stream of text, and a shell that can manipulate complex objects is fantastic.
> Last thing, looking at the future... better to change everything.
Absolutely. We have had decades to work with a fairly stable set of tools, and they are not going anywhere. Whoever needs them, they are there and will likely remain for several more decades.
I am gradually migrating all my everyday DevOps workflows (I am a senior programmer, and being good with tooling is practically mandatory for my work) to various Rust tools: ls => exa, find => fd, grep => rg, and more join each month. I am very happy about it! They are usually faster, work more predictably, and (AFAICT) have no hidden surprises depending on the OS you use them on.
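One low-risk way to run such a migration (a sketch for a POSIX-ish rc file; it assumes nothing about which tools are installed) is to alias the classics only when the replacement actually exists, so the same config still works on machines that lack it:

```shell
# Alias each classic tool to its Rust replacement only if that
# replacement is on PATH; otherwise the classic tool stays in place.
if command -v exa >/dev/null 2>&1; then alias ls='exa'; fi
if command -v fd  >/dev/null 2>&1; then alias find='fd'; fi
if command -v rg  >/dev/null 2>&1; then alias grep='rg'; fi
```

This doesn't remove the muscle-memory problem the parent comment describes, but at least the rc file can be copied to any box without breaking anything.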
> we are no longer in the '80s, when everything was a stream of text, and a shell that can manipulate complex objects is fantastic.
Absolutely (again). We need a modern cross-platform shell that does that. There are a few interesting projects out there, but so far it seems the community is unwilling to adopt them.
I am personally also guilty of this: I like zsh fine even though the scripting language is veeeeery far from what I'd want to use in a shell.
Not sure how that particular innovation will take off, but IMO something has to give at some point. Pretending everything is text and wasting trillions of CPU hours constantly parsing stuff is just irresponsible in so many ways (the ecological one included).
I sometimes think that the original Unix shell, when run on a PDP-11, was a paragon of waste, with the CPU so slow and RAM so scarce. It was like running Ruby on an ESP32. Still, it was too useful to ignore.
I suspect that the endless parsing may even be economical compared to the complex dance needed, e.g., to call a Python method. The former is at least cache-friendly and can more easily be pipelined.
Though it's OK for a tool to be slow when it mostly sees interactive use and scripting-glue use. Flexibility and ergonomics trump the considerations of computational efficiency. So I expect that the next shell will burn more cycles in a typical script, but it will tax the human less with the need to inventively apply grep, cut, head, tail, etc., with their options and escaping/quoting rules.
> I sometimes think that the original Unix shell, when run on a PDP-11, was a paragon of waste, with the CPU so slow and RAM so scarce. It was like running Ruby on an ESP32. Still, it was too useful to ignore.
Sure, but it was a different time. People back then were down for anything that actually worked and improved the situation: the first assembler written in machine code, then the first C compiler written in assembly, etc. People needed to bootstrap the stack somehow.
Nowadays we have dozens, maybe even thousands of potential entry points, yet we stubbornly hold on to the same inefficient old stuff. How many of us REALLY need complete POSIX compliance for our everyday work? Yeah, the DevOps team might need it. But do most devs? Hell no. So why isn't everyone trying things like the `nu` or `oil` shells? They are actually a pleasure to work with. Answer: network effects, of course. Is that the final verdict? "Whatever worked in the 70s shall be used forever, with all of its imperfections, no matter how unproductive it makes the dev teams."
Is that the best humanity can do? I think not... yet we are scarcely moving forward.
> I suspect that the endless parsing may even be economical compared to the complex dance needed, e.g., to call a Python method. The former is at least cache-friendly and can more easily be pipelined.
50/50. You have a point, but a lot of modern tools written in Rust demonstrate how awfully inefficient some of the old tools are. `fd` in particular is several times faster than `find`. And those aren't even CPU-bound operations; it's mostly parallel I/O.
Another example closer to yours: I know people who replaced Python with OCaml and are extremely happy with their choice. Neither language makes any (big) pretense of doing parallel work well, so nothing much is lost by migrating away from Python [for various quick scripting needs]. OCaml, however, is strongly typed and MUCH MORE TERSE than Python; plus it compiles lightning-fast and runs faster than Golang (a bit slower than Rust, though not by much).
> Though it's OK for a tool to be slow when it mostly sees interactive use and scripting-glue use.
Maybe I am an idealist, but I say: why not be both fast and interactive? `fzf` is a good demonstration that the two can coexist.
> Flexibility and ergonomics trump the considerations of computational efficiency.
Agreed! The way I see it nowadays, though, many tools are BOTH computationally inefficient AND non-ergonomic.
But there are also reverse examples, like Git: very computationally efficient, yet still a huge hole of WTFs for most devs (me included).
Many would argue that the modern Rust tools are computationally less efficient because they spawn at least N threads (N == CPU threads), but the productivity gained from that more aggressive use of machine resources is IMO worth it. (A close analogue is the edit -> save -> recompile -> test -> edit... development cycle: the faster it is, the better the chance the dev follows their train of thought to the end and gets the job done quicker.)
---
So TL;DR:
- We can do better
- We already are doing better but the new tools remain niche
- Old network effects are too strong and we must shake them off somehow
- We are holding on to old paradigms for reasons that have scarcely anything to do with the programming job itself
> For the simple reason that we are no longer in the '80s, when everything was a stream of text, and a shell that can manipulate complex objects is fantastic.
A stream of bytes is the only sane thing to do. Nothing keeps you from adding a flag to your CLI programs to choose the input/output format. In fact, many programs already have this, and JSON seems pretty popular.
Having standard serialization is just going to be boilerplate, and unnecessary for many programs. Letting the user choose how to interpret the input/output is the best way.
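A toy illustration of "the consumer chooses the interpretation" (hypothetical file and data; plain POSIX tools only): one producer emits a single byte stream, and each consumer decides whether to treat it as lines of text or as structured records.

```shell
# Producer: one byte stream of tab-separated "name<TAB>size" records.
printf 'report.txt\t1024\nnotes.md\t2048\n' > /tmp/files.tsv

# Consumer 1 interprets it as plain text and counts lines:
wc -l < /tmp/files.tsv                                    # prints 2

# Consumer 2 interprets it as records and sums the size column:
awk -F'\t' '{ s += $2 } END { print s }' /tmp/files.tsv   # prints 3072
```

The stream itself carries no schema; each consumer imposes its own, which is exactly the flexibility (and the fragility) being debated here.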
> Having standard serialization is just going to be boilerplate, and unnecessary for many programs.
And for a lot of programs, having to keep generating and parsing strings is just a bunch of unnecessary boilerplate.
> A stream of bytes is the only sane thing to do. Nothing keeps you from adding a flag to your CLI programs to choose the input/output format. In fact, many programs already have this, and JSON seems pretty popular.
Having an API like this is a great idea... as a thin wrapper on top of a tool with a standardized serde protocol or binary file format. The different APIs can be exposed as separate tools, or as separate parts of a single tool's API from a user's POV.
Furthermore: JSON is not just a stream of bytes or plain text, and neither is CSV nor any other text format. Handling text as properly typed objects makes a lot of sense.
I have little experience with shells that can work with objects, like PowerShell. But a while ago I saw screenshots of a new object-based shell written in Rust; it was still early days, so I didn't actually try it, but compared to today's text-stream shells it looked downright great in terms of ergonomics and capabilities.