
Having grown up in a statically linked world, where dynamic linking was a thing of big-iron computers that we dreamt of one day having on home computers, this trend of statically linked binaries feels like a tragedy of sorts.

How far have we fallen, that it is now trendy to return to the days of statically compiled binaries and process IPC to achieve any sort of dynamism?



I think dynamic linking is one of those ideas that look neater on paper, and therefore feel better, but just fail in many ways in the real world (at least for statically compiled languages like C and C++).

Personally, at least half the times I really tried to use Linux I wanted a shiny new version of some program at some point, and this usually devolved into a more or less broken system due to dependencies.

I do understand the people who really love their package managers and knowing that their updated system will be secure, but many people are even more into shiny new things than I am, and if anything has historically held back Linux adoption on the desktop, I'd point to this, second only to drivers and configuration troubles.

Am I alone in this? I doubt it, if we consider the existence of Docker, Snap packages, and similar things.


Containers fulfil another purpose, and have existed in some form on mainframes and big UNIXes, because UNIX permissions alone are not enough to actually secure a process.

They just happen to have been misused to work around deficiencies in Linux software distributions.

The only area where I see dynamic linking fail is security, given the possible exploits of the host process.
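
A concrete instance of that failure mode is LD_PRELOAD on Linux: anyone who controls the environment can interpose their own code into a dynamically linked process. A minimal sketch (the intercepted function and the message are just for illustration):

    /* evil.c: classic LD_PRELOAD interposition.
     * Build and run (Linux/glibc):
     *   gcc -shared -fPIC -o evil.so evil.c
     *   LD_PRELOAD=./evil.so ./some_dynamically_linked_program
     */
    #include <stdio.h>

    /* Overrides libc's puts for every call the host process makes;
     * a real attacker would forward to the original via
     * dlsym(RTLD_NEXT, "puts") and tamper along the way. */
    int puts(const char *s)
    {
        return fprintf(stderr, "intercepted: %s\n", s);
    }

A statically linked binary resolves puts at link time, so this particular interposition simply doesn't apply to it.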

Even IPC turns into IPC hell if not everyone is speaking the same version.
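
To make the versioning problem concrete, here is a hypothetical sketch (the struct, macros, and rule are invented for illustration, not taken from any real protocol) of the check every IPC peer ends up carrying:

    /* Hypothetical IPC framing: every message advertises the protocol
     * version the sender was built against. */
    #include <stdint.h>

    #define PROTO_MAJOR 2u  /* bumped on breaking changes */
    #define PROTO_MINOR 1u  /* bumped on compatible additions */

    struct msg_header {
        uint16_t major;   /* must match ours exactly */
        uint16_t minor;   /* must not exceed what we understand */
        uint32_t length;  /* payload size in bytes */
    };

    static int msg_accept(const struct msg_header *h)
    {
        if (h->major != PROTO_MAJOR)
            return 0;                    /* breaking change: reject */
        return h->minor <= PROTO_MINOR;  /* newer features we can't parse */
    }

If even one peer skips or mis-implements this check, you get the same breakage dynamic linking was blamed for, just moved across a process boundary.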


Sorry for the late reply, but my point was more about user and developer experience (the end result of the system in place) versus the traditional sysadmin/power-user view of packages among Linux users/distros, where the security and interop details of the system matter more but are somewhat in conflict with that experience (as a loose reference, think of the conflicting requirements that gave rise to shadow IT).

As a user, I don't really need to care whether my Chrome browser uses a newer version of the V8 JS engine or libpng for security than, for example, my VS Code IDE does (because that IDE instance isn't out on the hostile internet visiting "random" webpages).

Sure, the browser will be more secure, and if it were dynamically linked I'd get the benefit of having both upgraded at the same time. However, bringing all the dependencies of a cutting-edge program like Chrome/Chromium forward would also require breaking changes, potentially breaking programs like VS Code that aren't updated as frequently but share more or less the same dependencies. (In practice, Ubuntu even ships a dummy Chromium package that just forwards the user to a snap package.)

The way forward, I think, would be to bring some variant of semver to the C/C++ world together with a unified strategy for package management, but I have no hope of that ever happening.
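
For what such a semver check could look like, here is a minimal sketch in C (foolib and all FOOLIB_* names are hypothetical; ELF SONAMEs like libfoo.so.2 already approximate the "major" half of this, but nothing standard covers the rest):

    /* Hypothetical semver compatibility check, in the spirit of what a
     * unified C/C++ packaging scheme could enforce. */
    #include <stdio.h>

    /* Version the consumer was compiled against. */
    #define FOOLIB_WANT_MAJOR 1
    #define FOOLIB_WANT_MINOR 4

    /* In a real scheme the library would export its own version,
     * e.g. via a foolib_version() call; hardcoded here. */
    static const int foolib_major = 1;
    static const int foolib_minor = 6;

    /* Semver rule: same major (no breaking changes), and the library
     * must provide at least the minor the consumer was built against. */
    static int foolib_compatible(void)
    {
        return foolib_major == FOOLIB_WANT_MAJOR
            && foolib_minor >= FOOLIB_WANT_MINOR;
    }

    int main(void)
    {
        if (!foolib_compatible()) {
            fprintf(stderr, "need foolib %d.%d+, found %d.%d\n",
                    FOOLIB_WANT_MAJOR, FOOLIB_WANT_MINOR,
                    foolib_major, foolib_minor);
            return 1;
        }
        puts("foolib is semver-compatible");
        return 0;
    }

The hard part isn't the check itself, it's getting every C/C++ library to agree on what counts as a breaking change.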



