Well, all of the 50+ people I know who use Linux every day, myself included (most of them as their main OS), have run into this type of issue, where you lose huge amounts of time trying to get something rather trivial to work. Your claim that "it's between chair and keyboard", i.e. "that's because they don't know how to do it", is of course perfectly valid, but the reason the problem exists in the first place is usually elsewhere.
That being said, my last Debian install was pretty much effortless. But I seem to have a machine with which most Linux distros are quite happy :]
There are trivialities that waste huge amounts of time on every OS, but installing Linux and setting up a dev environment is not one of them.
Certainly not compared to Windows, where you must consider the VS/msvcrt version if you are using third-party SDKs. It gets into "really not funny" territory when those SDKs have conflicting requirements.
Compared to that, a Linux development environment is a piece of cake.
Sort of agree: the benefits of most Linux distros' package management systems, combined with lots of open source and gcc binaries being compatible quite a few versions back, are obvious.
But that doesn't mean these problems don't partly exist on Windows too. "Where you must consider VS/msvcrt version, if you are using third party SDKs" is a bit too narrow: that is (afaik) only the case if you use binary distributions of third-party libs, only for particular languages like C/C++, and only if those libs make use of the standard libraries and force that upon the user by exporting instances of objects from them. More often than not I've encountered all sorts of situations where that was not the case:
- it's OK to have e.g. a C lib linking against msvcrt X and use it in an application linked against msvcrt Y, as long as the API is written properly and doesn't do insane things like trying to free what has been malloc'd elsewhere
- if you have the source, you just build everything with the same VS version and that's it
- for C# etc. there's a rather decent package manager, so that's usually sorted as well
It may or may not be easy to rebuild everything with the same VS version, especially open-source packages. Many maintainers do not care about VS at all (e.g. xz/liblzma: use mingw, VS is not C99 anyway... but mingw links against msvcrt.dll, which is a no-no) or have some crazy build system that is supposed to work cross-platform but really works only on Linux and maybe OSX (I'm looking at you, py2cairo). Some of them do care, and building is a piece of cake (curl). Or they are somewhere in the middle and roll their own build system that defies all your expectations and does its own thing, ignoring your vcvars (Boost).
When we get out of the C/C++ realm, it gets easier everywhere, whether C# or Python.
Oddly enough, I had a lot of trouble getting my new laptop to boot, work and sleep properly. Not for lack of experience: I first installed linux in the summer of 1992 using hlu's floppies and have used linux for daily work for about 20 of the 22½ years since then.
I can easily believe that someone used to windows with its mishmash of installers might have an unpleasant experience. Apt-get works wonderfully on a pristine system, but not so well if the system is halfway upgefucked with tarballs and rm -rf.
And particularly if systemd on ubuntu 14.04 supports his hardware as poorly as it supports my new laptop.
I'm with you - been running Linux since those days too, and I also think that the problem is entirely political in this case - he's already admitted that it's not any problem with the Linux host for C4 in particular, just that he's a frustrated Linux desktop user and wants nothing to do with Linux. But the Linux support is there and the C4 engine works on it .. so this is more of a political rant than anything else.
In that case I expressed myself badly (possibly due to systemd-induced gritting of teeth). Sorry.
I think it doesn't have to be political. The tiniest bit of ill will towards Linux, a bit of Windows-like behaviour (tar xf, rm -rf), and a bit of badly-supported hardware is quite enough to end up with an unworkable system and difficult-to-diagnose problems.
apt-get also only works extremely well for FOSS releases. If something is closed source (reality check), the developer will need to maintain a build for each distro+release. The user will also need to add that apt repo and click through the distro's warnings about it.
> What a pity they've decided to take a political position rather than a technological one.
Ironically, it's the political (rather: religious) position of Linux that is getting in the way here. If you're not looking through the rose-colored RMS glasses, it's very clear that the dev has taken the practical position.
Don't get me wrong, I wish it were different. Linux has amazing potential, but it is unrealistically hostile toward these guys. Maybe SteamOS will bring some uniformity to the situation.
If you build your binaries against Ubuntu 12.04 (= glibc 2.15 symbols), chances are they will be extremely compatible with many distributions for years to come. Just make sure you know what API you are really linking against.
The resulting package can install the repo by itself - i.e. the user downloads the package, installs it, and suddenly has a repo for updates set up. Google Chrome does it, for example.
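Concretely, the Chrome .deb's maintainer scripts drop a one-line apt source roughly like this, after which a normal apt-get upgrade picks up new Chrome releases:

```
# /etc/apt/sources.list.d/google-chrome.list, created by the package
deb [arch=amd64] https://dl.google.com/linux/chrome/deb/ stable main
```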
Interesting! I assume you would have to statically link all the other libs? Distros often swap out e.g. libfoo for foolib, and from my own experience that usually ends up being a rabbit hole of "make".
You need to be able to make a call on which libraries you are going to link dynamically. LSB libs, X11 and other parts of the infrastructure with staying power are safe; some random foo library is probably not.
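For what it's worth, GNU ld lets you make that call per library on a single link line. A hedged sketch (object files and the "foo" library are illustrative):

```
# Link the risky foo library statically, keep the stable system
# libraries (libX11, libm, libc) dynamic.
gcc -o mygame main.o \
    -Wl,-Bstatic -lfoo \
    -Wl,-Bdynamic -lX11 -lm
```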
The basic problem with systemd is that its "dependencies"-based design results in all manner of nondeterministic behavior.
And I suspect this has little to do with the original complaint. I think that is more in terms of "lib version X-1 does not support the feature I need, X+1 breaks a different feature in a subtle way, and version X can't be found in any distro repo out there".
Sadly the above is not a problem of Linux or libraries, but of distro package formats balking at having multiple lib versions installed at once.
You've described the problem that a lot of developers I know have. Linux just isn't easy to understand, even for programmers, if they don't have a *NIX background to start with.
Actually, Eric Lengyel is a well-respected guy and programmer. If an OS takes him a long time (and by long I mean more than 10 minutes) to install and configure, that doesn't speak all too well for that OS.
Admittedly it's not as bad as 5 years, and yes, Ubuntu is ridiculously easy to install if everything goes well. That being said, I think obscure bugs are one of FOSS's biggest enemies. For example, a small and easily solvable bug in the NVIDIA driver can scare off a lot of users.
But say I propose to stefantalpalaru that he write an engine supporting Linux from scratch. And if that's too hard, wouldn't it be fair to say that it is also a problem between the chair and the keyboard?
If I ever complain about needing weeks to do something as simple as installing a Linux distro and creating a development environment, feel free to say it.
But seriously, let's take a quick look [here](http://www.terathon.com/architecture.php) and think about what might be a clear pain point in Linux, even if the "needed libraries" worked perfectly with all the hardware and all distros, and the blobs were perfect.
Did I hear someone say "Linux sound support"? Yes, that is a correct answer. Sound on Linux is a known clusterfuck.
What else might not be so trivial? Did someone say support for multiple graphics cards? Yes, even with OpenGL it can be an issue.
Not to mention all the GUI tools.
So yeah, all those things would obviously never have anything less than fully functioning dependencies on Linux...
Use PortAudio for sound (MIT license), Qt for the GUI and various OS dependent abstractions (threads, mutexes, etc.) - LGPL. If you don't like the OpenGL wrapper from Qt, go with FreeGLUT (MIT) or SDL (zlib license).
No, you're just spreading FUD about Linux. There is no audio clusterfuck, just incompetent developers like those working on Skype that only support PulseAudio instead of using a library that would give them ALSA, JACK and OSS support.
Seriously, buddy, I gave you like a ton of ways to deescalate the douche in your response, but you went right through the wall.
C4 isn't the only case where developers are frustrated with Linux. And if Windows and OS X fare better at attracting developers (which they do), even though Linux is essentially free (and the competition isn't), that's by definition a sign of a shit product.
So you can, of course, think that all of this is FUD and that everybody in the world is incompetent, while more and more people drop support for a system that's too duct-taped together for its own good.
Actually, in this case, glossing over the "needed libraries" is appropriate. Eric has some serious NIH. Choosing to write his own video codec, for example.
I'm not sure I get your point. Although writing your own video codec seems a bit overkill.
The spec here isn't that trivial, I mean the thing has clearly strict performance and portability constraints across systems and compilers, not to mention maintainability and quality consistency across the releases. Finding libraries that fit well with those constraints, especially for a large project like this, isn't easy. So I can imagine scenarios where NIH would be justified.
I was mostly making the point that painting someone as incompetent in such broad strokes, while assuming all the required libraries and tools are in perfect condition and work perfectly, is dumb, and Stefi's hissy fit here is just self-indulgent ego-stroking.
Yes, sorry, I should have explained better. In this case, C4 is capable of building with nearly zero libraries; afaik it only requires OS libs and OpenGL. In that case, I think we can expect the libraries to be in perfect working condition.
I think we found the problem. It's between chair and keyboard.