It's a disaster. The proper way to do it is distro packages, but for some reason every language, framework, ecosystem and individual developer wants to reinvent this particular wheel. I really don't understand why.
It's because individual tools usually don't want to be tied to assumptions made by one particular distro. I actively avoid using distro packages for third-party development libraries and the like, especially when a good tool for accessing upstream sources (e.g. pip) is available.
I use distro packages for certain tools and platforms, and for libraries if I feel the library is really something I want to be a standard part of the system environment. For example, I am more likely to use the distro package of a python library (if available) if I'm planning to use it for a system administration task than if I'm planning to use it for application development. I'm also likely to use distro packages for things like apache, nginx, or postfix, unless I have some case-specific reason not to.
One technical reason is that I might use two different versions of the same library in different projects, and apt-get only allows me to have one at a time. I think npm and gem are brilliant in this regard.
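The same per-project isolation exists on the pip side of things too; here is a minimal sketch using Python virtual environments (the project names and the pinned library versions are made up for illustration):

```shell
# Hypothetical sketch: two projects, each with its own isolated environment,
# so each can pin its own version of the same library.
python3 -m venv project-a/.venv
python3 -m venv project-b/.venv

# Each venv then installs independently, e.g. (requires network, so shown
# as comments only):
#   project-a/.venv/bin/pip install 'requests==2.25.1'
#   project-b/.venv/bin/pip install 'requests==2.31.0'

# By contrast, apt-get install python3-requests gives you exactly one
# system-wide version for every project on the machine.
ls project-a/.venv/bin/python project-b/.venv/bin/python
```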
Best of both worlds: docker. I consider docker an application packager.
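A minimal sketch of what "docker as an application packager" looks like in practice: the image carries the app together with its own pinned dependencies, independent of what the host distro ships. The base image, file names, and entry point here are all illustrative, not from any particular project:

```shell
# Hypothetical sketch: write a minimal Dockerfile for a Python app.
cat > Dockerfile <<'EOF'
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "main.py"]
EOF

# Building and running requires a Docker daemon, so shown as comments:
#   docker build -t myapp .
#   docker run myapp
```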
You know, it is still simpler to make your own deb or rpm than to adopt an entirely different package system.
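To give a sense of how little a hand-rolled deb actually requires, here is a bare-bones sketch; the package name, version, and installed file are invented for the example:

```shell
# Hypothetical sketch: the minimal layout for rolling your own .deb.
mkdir -p mytool_1.0/DEBIAN mytool_1.0/usr/local/bin
printf '#!/bin/sh\necho hello\n' > mytool_1.0/usr/local/bin/mytool

# The control file is the only required metadata.
cat > mytool_1.0/DEBIAN/control <<'EOF'
Package: mytool
Version: 1.0
Architecture: all
Maintainer: You <you@example.com>
Description: A tiny example package
EOF

# Then build it (requires dpkg-deb, so shown as a comment):
#   dpkg-deb --build mytool_1.0   # produces mytool_1.0.deb
```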
It is more the case that these package systems were introduced on platforms that lack a native one. Then, through a combination of laziness (not wanting to build yet another package) and recycling the already-built binaries, they gained traction on linux systems too.
And there is a reason why distro packages move slower: people who have them deployed in production do not like breaking changes. If you want bleeding-edge packages, use bleeding-edge repos.