It's because system design is a lot less theoretically clean than something like FP, zero-cost abstractions, GC-less coding, containerization, etc., and it forces programmers to confront essential complexity head-on. Lots of engineers think that theoretically complex/messy/hacky solutions are, by their nature, lesser solutions. Networking is actually a great example.
Real-life networking is really complicated and there are tons of edge cases: connections dropping due to lost ACKs, duplicated packets, misconfigured MTU limits causing dropped packets, latency on overloaded middleboxes resulting in jitter, NAT tables getting overloaded, the list goes on. However, most programmers view all of these things through a "clean" abstraction, and most TCP abstractions let you pretend you just get an incoming stream of bytes. In web frameworks we abstract that even further and let the "web framework" handle the underlying complexities of HTTP.
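Even the "stream of bytes" abstraction leaks in practice. A minimal Python sketch (the `recv_exactly` helper is hypothetical, just for illustration): `recv()` may return fewer bytes than you asked for, and the peer can vanish mid-message, so callers still have to loop and handle truncation themselves.

```python
import socket

def recv_exactly(sock, n):
    """Read exactly n bytes, or raise. Illustrates that the 'clean'
    byte-stream abstraction still forces you to handle partial reads
    and mid-stream failures yourself."""
    buf = b""
    while len(buf) < n:
        # recv() can legally return any number of bytes from 1..n,
        # regardless of how the sender chunked its writes.
        chunk = sock.recv(n - len(buf))
        if not chunk:  # peer closed: a FIN, a reset, an expired NAT entry...
            raise ConnectionError(
                f"connection closed after {len(buf)} of {n} bytes")
        buf += chunk
    return buf

# Demo with a local socket pair: the sender writes in small pieces,
# so no single recv() is guaranteed to see the whole message.
a, b = socket.socketpair()
for piece in (b"hel", b"lo ", b"world"):
    a.sendall(piece)
a.close()
msg = recv_exactly(b, 11)
b.close()
```

Errors like `ConnectionResetError` or a timeout surface at exactly this layer too; the abstraction doesn't make the failure modes go away, it just changes where you catch them.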
Lots of programmers see a complicated system like a network, conclude that a system with so many varied failure modes must be badly designed, and go looking for that one true abstraction to simplify it. You see this a lot especially with strongly-typed FP people, who view FP as the clean theoretical framework that captures any potential failure in a myriad of layered types. At the end of the day, though, systems like IP networks have an amount of essential complexity in them, and shoving them into monad transformer soup just pushes that complexity elsewhere in the stack. The real world is messy, as much as programmers want to think it's not.
> The real world is messy, as much as programmers want to think it's not.
You hit the nail on the head with the whole comment and that line in particular.
I'll add that one of the most effective ways to deal with some of the messiness/complexity is simply to avoid it. That's easier said than done these days, because complexity is often introduced through a dependency. Or the benefits of adopting some popular architecture (e.g. containerization) end up hiding the complexity within it.
> It's because system design is a lot less theoretically clean
Yea this is a major problem. It's sort of a dark art.