Linux on the Mac – state of the union (lwn.net)
208 points by ah- on Dec 12, 2016 | 138 comments


>>> "the realization that Apple no longer caters equally to casual and professional customers as it had in the past [YouTube video]. Instead, the company appears to be following an iOS-focused, margin-driven strategy that essentially relegates professionals to a fringe group."

Ouch


Except this is largely based on mythology. People seem to equate "pro user" with "software developer" (and no other roles/occupations at all), and invent a fictitious history in which the Macbook Pro was jumping up and down on stage shouting "DEVELOPERS! DEVELOPERS! DEVELOPERS!"

Except that never happened. The perceived dev-friendliness of the Macbook Pro was a historical accident: it turns out that when you build something for use cases like audio/video engineering (which overlap a bit in their hardware needs with use cases like compiling software), and happen, due to quirks of your company's history, to be shipping a Unix-y operating system, developers will like it. But aside from providing tools to build applications for Apple's own platforms, the MBP and other Mac hardware were never deliberately aimed at developers.

Meanwhile, the audio/video-engineer types still seem to like the MBP; the reviews I've read from them are positive about the touch bar and accepting of the fact that USB C is the future. It's just the developers -- and largely developers who hated Apple's products anyway and likely will never interact with the new MBP -- who say that this is proof that Apple has completely abandoned "pro" users.


>>> "Meanwhile, the audio/video-engineer types still seem to like the MBP"

I can't speak to the professional video world, but that's absolutely false in the professional audio world.

My experience is in the world of composition and sound design; the attitude towards using Mac for that shifted a while back, due to the lackluster and non-existent upgrades to the Mac Pro.

No one in the professional audio world that I know, or anything that I have read, has suggested they are liking the new laptop (which wouldn't get much use from them anyway).


>I can't speak to the professional video world, but that's absolutely false in the professional audio world.

Depends on what you mean by "professional audio world". If you mean heavyweight old-style studios, the kind Led Zeppelin might have recorded in back in the day, yes, but those are on their way out (profits- and usage-wise) anyway, as the industry shifts.

Musicians, producers, DJs, etc. mostly carry laptops and have home studios based around them, and most use MacBook Pros for their stuff (as is evident in all kinds of interviews and live scenarios).


I was referring to the composition and post production (sound design, engineering) world. Mostly because that is the industry I worked in for many years and still continue to work in (much more sporadically these days).

That's a world that was dominated by Mac Pros around 2010. Around 2013 I started seeing a shift, myself included, to custom built PC workstations and that trend is just increasing now. The initial switch, I believe, started with the lackluster cylinder Mac Pro, but continued due to the obvious failings in the Mac desktop market.

You speak of home-based studios using MacBook Pros, but anyone doing that is obviously not a professional. I will give you the fact that many DJs are using MacBooks for their mobile rigs, but at home, anyone actually doing professional audio work is likely using a massively powerful workstation or a number of PCs (master/slaves).

This notion that MacBooks (or even laptops in general) are super popular in the professional audio world is fiction made for advertising.


>You speak of home based studios using MacBook pros, but anyone doing that is obviously not a professional.

Tons of musicians/producers/etc have those, while making more money than professional studios from their productions -- and not just in EDM.

Lots of the work that studios did for even superstar musicians (pop, etc), nowadays happens in the box, and not just demos and early sketches.

>This notion that MacBooks (or even laptops in general) are super popular in the professional audio world is fiction made for advertising.

Rather the professional audio world is not what it used to be.

I'd consider million-making Bjork or whatever working on their laptop, as equally (or more) professional than some struggling studio or post-processing facility.


That's more a case of the MacBook Pro being "good enough" and Apple, and PC manufacturers too for that matter, have successfully segmented the market (with things like ports). Even Macworld will tell you these days that "whether you choose a Mac or PC for music production is largely down to the platform you prefer and who you’re collaborating with. There’s little inherent advantage to using Macs, beyond familiarity with the system, and the general robustness of the hardware" [1].

[1] http://www.macworld.co.uk/feature/mac-software/best-mac-maki...


Well, if we're sharing anecdata: literally everyone I know who works in audio would give up their Mac only on death.

That covers several DJs, Producers, VO Artists, Audio Editors, and quite a few Sound Designers...


If we're sharing anecdata: almost every programmer I know owns a Mac and loves it, and prefers working on them.

Except me. My employer doesn't let me use my own computer and refuses to buy me one.


> My employer doesn't let me use my own computer and refuses to buy me one.

? Then what are you using?


Refuses to buy them a Mac obviously. As in "doesn't let me use my own mac and refuses to buy me a mac of his own".

So, parent is using the computer the employer doesn't refuse to buy them, a PC.


Now it makes sense. I really wondered.


Here in my little corner of Europe, I hardly know anyone, other than management and colleagues doing iOS/App Store projects, who has one.

At my company they belong to a pool for iOS projects.


What would be the alternative anyway? Imagine a DJ playing a set and suddenly having Windows force an update on them. Linux during some periods barely played audio for regular use cases.

If you want good hardware and a reliable operating system, you go with a Mac. A few annoyances with adapters it might require won't change that at all.


Anecdotally the surprising answer might be rather old BeOS based systems. At least that's what I saw once and apparently some people still use some old tools that are BeOS only for audio (from talking to the DJ) :D


I'm sure you'll find 2 or even 20. I doubt you'll find 100 of those people.


I recall senior players (i.e. people who work at Air 1, Abbey Road etc.) in the audio world saying this 5 years ago in Sound On Sound.


I got the strong impression that those self-proclaimed "professionals" are more or less exactly the opposite: people who spend an inordinate amount of time plugging in dozens of different peripherals with random port requirements, fiddling around with the hardware, and running benchmarks.

For what it's worth, a Mac today is still as useful (at least for web development) as it has always been. Much more so actually because homebrew has evolved to be completely stable.

And while CPU speeds have been stagnating (mostly because they've reached "good enough" and then some), people are ignoring two major shifts in hardware that have significantly improved the dev experience: Retina screens and SSDs.


The absence of an SD card slot and the general lack of peripheral ports is universally lamented by AV professionals. There are also reports of poor battery life, although I suspect that is a side effect of a brighter backlight.


No, that is not mythology; Apple specifically targeted developers in the past, such as in this 2002 commercial featuring a stereotypical grey-bearded UNIX guru:

https://www.youtube.com/watch?v=O4AgFhIm1qs

Quote: "It's not just for artists. It's not just for pointy-headed intellectuals from California. It's for us, it's for programmers, it's for developers."


So where are the pro applications supposed to come from, then? And beyond that, where are their iOS applications supposed to come from?


That was then. But now that Apple does know that the MBP is perceived as dev-friendly, based on the type of people in that market segment that buy them, why would they let that slice of the market go?


Who said that slice of the market will go? Complainers are always amplified on the internet. Check back in 2 years.


True. But they do seem to have created a niche in the market for someone else to fill, due also to their high markups. Most "developers" don't actually need Macs, many of them just haven't realized it yet.


The perception, as displayed in this thread, is that they are. Plus the overwhelming negativity from developers regarding the new MBP, as seen in numerous HN threads and other tech news stories.

I don't really have any stock in whether the perception is a reality or not, so I'm willing to be persuaded. Hopefully in the next iteration the MBP will have at least 32GB RAM, and they will make a move that will please the software developer market.


>I don't really have any stock in whether the perception is a reality or not, so I'm willing to be persuaded. Hopefully in the next iteration the MBP will have at least 32GB RAM,

The people lamenting the missing 32GB RAM option as devastating for pros never had it in the first place, as the previous generation didn't support 32GB either (and yet, those machines still worked just fine).

Second, the limitation to 16GB is from Intel, not Apple. Intel didn't have 32GB compatible modules with low energy consumption ready -- and without those, Apple using the energy sucking 32GB option would reduce battery life 20% or more.

Not to mention that the overwhelming majority of PC laptops developers buy don't have 32GB at this point in time either, including the most famous models (XPS etc).


Well, people who are lamenting the lack of 32GB RAM bought a 16GB machine a while ago and are looking to upgrade.

Apple could have kept the previous form factor and not made the battery 20% smaller which would have allowed them to have an energy sucking 32GB option. Plus the new form factor is useless anyway, since we end up having to carry adapters which take as much space and weight as what was saved.

Anyway, I've developed a wait-and-see approach and reserve judgement until next year, when the next iteration of machines comes out. Maybe I'll be lucky and Apple will even decide to bring back the 17 inch MacBook Pro.


>Apple could have kept the previous form factor and not made the battery 20% smaller which would have allowed them to have an energy sucking 32GB option.

More people like thinness though, than 32GB RAM (which would have been a built-to-order option that few would have clicked). Remember, Apple has those numbers too.

To quote a programmer on thinness:

"I’m have to admit being a bit baffled by how nobody else seems to have done what Apple did with the Macbook Air – even several years after the first release, the other notebook vendors continue to push those ugly and clunky things. Yes, there are vendors that have tried to emulate it, but usually pretty badly. I don’t think I’m unusual in preferring my laptop to be thin and light. (...) I’m personally just hoping that I’m ahead of the curve in my strict requirement for “small and silent”. It’s not just laptops, btw – Intel sometimes gives me pre-release hardware, and the people inside Intel I work with have learnt that being whisper-quiet is one of my primary requirements for desktops too. I am sometimes surprised at what leaf-blowers some people seem to put up with under their desks. (...) I want my office to be quiet. The loudest thing in the room – by far – should be the occasional purring of the cat. And when I travel, I want to travel light. A notebook that weighs more than a kilo is simply not a good thing (yeah, I’m using the smaller 11″ macbook air, and I think weight could still be improved on, but at least it’s very close to the magical 1kg limit)"

Linus Torvalds, APRIL 24, 2012


Iirc, Intel shipped custom CPUs to Apple for the Macbook Air. Once those CPUs became more generally available other brands followed suit quite quickly.

This has been a repeating pattern with Apple since the iPod success fattened their wallet. They would routinely single out parts they wanted, and then basically order the factory capacity for that year in one go.

Palm/HP execs lamented this fact when they unveiled the Pre 3 and Touchpad, as they often wanted to use the same parts but found Apple had gotten there before them.


A rev of the original MacBook Air (the 2008-2010 one) got a custom package of an Intel CPU before it was available commercially.

The original MBA was expensive and underpowered, a niche product at the time. PC vendors responded with equally niche designs (remember the Dell Adamo?)

The Late 2010 MacBook Air introduced the current case design at a reasonable price, used 2+ year old internals (Core 2 Duo + 330m graphics), and set the world on fire.


> set the world on fire.

Yeah i can smell the bonfires from here...


Well, considering it went top-selling, made huge profits, was mimicked by almost every other manufacturer, and Intel even gave money to OEMs to make a competitive PC laptop, I don't see the sarcasm as warranted.


>Palm/HP execs lamented this fact when they unveiled the Pre 3 and Touchpad, as they often wanted to use the same parts but found Apple had gotten there before them.

Isn't that how economies of scale and bulk ordering are supposed to work?


> I don't really have any stock in whether the perception is a reality or not, so I'm willing to be persuaded.

Apple said it was their best laptop launch by sales volume ever. (Although that is partially due to such a long wait from their previous model.) Honestly I'm not that surprised - despite all the hate online I know 2-3 developers who bought them on or near launch day and several others who started salivating when mine arrived. The public internet opinion seems shockingly disconnected from that of everyone I've spoken to in person about it. Although, doing mostly web development more than 16 gigs just isn't important for me or my coworkers. My last laptop had 8 and it was more than fine.

Most people are much more curious about the touchbar (it's a bit gimmicky), the screen (gorgeous, but no touch-based scroll in browsers sucks) and the price (yowch!). I get a few comments both ways about the new charger (oh cool, no more proprietary connectors!) (whaaat, no MagSafe? Boo!).

Oh, and being a Gen1 Apple product, there are all sorts of dumb stability problems - which is a much bigger problem than the RAM! My touchbar has been glitching out sometimes, and the whole system has hard-crashed half a dozen times since I got it, from connecting and disconnecting my 4k monitor. I'll be very disappointed if the software doesn't improve over the next few months.


The constant crashing, touchbar bugs (a few times it didn't even turn on!), graphical corruption, obvious cheapskating (no extension cable, non-data charging USB cable, bad keyboard backlight, no more cable holders on power brick and more) are the larger disappointments with this machine :/


Given those issues, I'm wondering if returning the laptop for a new one might be in order? For what it's worth, I've been using the new MacBook Pro with the touch bar for a few weeks now, and have only experienced one hard freeze.

Contrast that with the steady stream of issues I experienced with a similarly specced high-end Dell XPS 15"--power cord whine, wobbly and noisy fan on the graphics card, severe overheating to the point that the BIOS triggered shutdowns, repeated keys (an ongoing issue affecting multiple Dell models based on support forums), suspend/resume failures, laggy touchpad, and numerous issues with WiFi & sound across several distributions of Linux--the new MacBook Pro has been a delight. To Dell's credit, they've released BIOS updates on a regular basis, but the laptop is still plagued with annoying issues, even on Windows.


Honestly I'm fed up of how video editors act like they're the whole damn "pro" market and nothing else matters.


What I don't get from the article: if I'm pissed about the hardware change, why am I trying to run Linux on it/my old one?

A) You keep your old laptop and continue to run macOS for as long as possible

B) You're shopping for a new laptop and don't buy Apple, then run Linux

If you keep your old hardware, why run Linux on it? I've been looking at who else out there is making decent laptops that run Linux well (open to suggestions, especially if they have more than 16GB RAM), as I have a feeling my 2015 Retina will be my last MBP.

The hard part for me is that I've gotten used to a certain look and feel with the MBP. That nice solid, rigid body, and the display hinge that just stays. When I pick up a ThinkPad and it creaks and flexes when I open the display or casually pick it up by the corner it makes my skin crawl.


Have you taken a look at the Razer Blade? It is basically equivalent to the MacBook Pro's hardware in every way, except that it's black, has an i7, a touchscreen and an NVIDIA GeForce GTX 1060, and is a bit less expensive. I just got one. I haven't put Linux on it yet, but I'm pretty blown away by the hardware.


Wow, that looks like an MBPro clone. I've never heard of this company. I wonder if its machines work well with Linux?


Razer's basically a "gamer gear" company. Think stylized light-up keyboards, mice, headsets, and so forth. Problem is, their quality control is really all over the place and they are ridiculously expensive. This weird QC is part of the reason I never bothered with their laptops despite the well specced hardware.


There was an article last week stating they don't support Linux out of the box (or at least not Ubuntu), but I don't remember details regarding how much work is required to get it running.


The Arch wiki has a whole load of details on getting one up and running[1]. These instructions will usually carry over to any other (up-to-date) distro with a bit of adaptation.

1. https://wiki.archlinux.org/index.php/razer#2016_version_.28R...


Their older models from 2015 or so with 970M GPUs won't boot consistently due to the hybrid (Optimus) graphics. The situation may have improved with the newer models, though, and their Stealth line only has Intel graphics, which should work fine.


See Dell's XPS line. The 15" variants go up to 32GB ram.


I cannot recommend Lenovo X1 Carbons enough. I use it at work. Runs Linux buttery smooth. (Tested Ubuntu and Arch, both work perfectly, Arch needs a few minor tweaks)


Apart from being wrong, that narrative doesn't even make sense as an intro to this article: the criticism was directed at the new MBP hardware – so why would these super-professional uber-users buy it and then replace the OS?

(Not to mention that, if I understood it correctly, Linux currently doesn't even support the keyboard, and is otherwise probably in its usual desktop mode of "WLAN, Sound, or functioning sleep – choose any two".)


Author here. As explained in the intro, folks like antirez are considering moving back to Linux and in his linked HN comment he specifically talks about trying Linux on the MacBook he's currently using to see where he's headed. The article seeks to give these folks in particular a helping hand in understanding what works and what doesn't:

https://news.ycombinator.com/item?id=12820490

That said, the kernel developers working on Mac hardware support aren't Apple haters. We're fans. I also don't fully share the "not for professionals" criticism that was widely directed at the new MacBook Pro: e.g. as stated in the article, the four Thunderbolt ports are interesting for HPC applications, as you can build a portable compute cluster with up to five fully-meshed high-speed networked nodes. If that's not professional, I don't know what is.


Really liked the article.

Quickly jotted down my own progress as a user http://williamedwardscoder.tumblr.com/post/154376389528/linu...

Really appreciate people making this stuff work, and hope I can use my retina MBP in the long run.


Thanks! The FileVault2 driver for Linux that you couldn't find is probably this one:

https://github.com/libyal/libfvde/wiki

It's experimental though. Better back up your data.

I'd recommend creating a partition for ZFS and moving data there that you want to use on both OSes, or outright installing Linux on a zpool and mounting that on macOS. This has worked remarkably well for me, even with lz4 compression and all the other great features. Layer ZFS on TrueCrypt if you need encryption; I wrote a howto a while ago but never had the time to update it:

https://github.com/zfsonlinux/pkg-zfs/wiki/Dual-booting-OS-X...


Maybe, but not all pros are computer science pros; photographers and videographers, for example, are pretty pissed that the SD card reader was removed, as far as I know... but then again, that's solvable, while the RAM limit is far less so (3D artists might want more?)


That's right. But it seemed for a while that the MacBook Pro had the feature set, hardware configurations and broad appeal that allowed both groups to be aligned on their hardware choices (i.e. a MacBook Pro).

It is perhaps too early to wave goodbye once and for all to that coalition, but it does seem like some of the design directions that Apple has taken have drifted away from that user base.

That said, this war on the professional and creative class of users began some time ago - I think the introduction of Final Cut Pro X was the first salvo. While the platform has significantly evolved since the first couple of problematic releases, I personally know some who considered the product a betrayal and abandoned ship for Premiere Pro or Avid.

I sense Microsoft is attempting to reassemble that coalition with their offerings, namely the Surface line (especially that gorgeous looking Surface Studio) and the introduction of Bash on Windows, which, while still quite limited in its functionality, can be seen as a step.

I really like my MBP 2013 (NVIDIA GPU) and I guess I am going to keep it a while longer.


There are computer science pros that only care about Swift and Objective-C programming, not UNIX.


> E.g. as stated in the article, the four Thunderbolt ports are interesting for HPC applications as you can build a portable compute cluster with up to five fully-meshed high-speed networked nodes. If that's not professional I don't know what is.

Can you elaborate on what you mean by that? Are you implying a MacBook cluster is an effective HPC platform?


Let's say you're a penetration tester, you're on-site trying to break into a network, have no Internet connectivity (e.g. because you're in a shielded data center or don't want to raise suspicion) and need to crack some passwords. 5x Skylake CPU + 5x Polaris GPU would be a sufficiently beefy platform for Hashcat, and it would fit into your backpack.

Or let's say you want to quickly set up a Hadoop cluster to crunch some data.

Five machines is certainly small as HPC clusters go, and in reality you'll probably have to cap this to four machines because you need one port to attach to a power socket if you need the cluster for more than 2-3 hours. Nevertheless the 4x 40 GBit/s Thunderbolt ports offer some really interesting possibilities, and I'm not aware of any other portable machine that has this right now.
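
As a rough sanity check (my own back-of-envelope sketch in Python, not from the article): a full mesh needs a direct link between every pair of nodes, so p ports per machine cap you at p + 1 machines.

    from itertools import combinations

    ports_per_node = 4                                # Thunderbolt 3 ports on the new MacBook Pro
    nodes = ports_per_node + 1                        # each node links directly to every other node
    cables = len(list(combinations(range(nodes), 2)))
    print(nodes, "nodes,", cables, "cables")          # -> 5 nodes, 10 cables
    # Reserving one port per machine for power, as noted above, leaves 4 nodes and 6 cables.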


There are four 40Gbit/s ports, but their total bandwidth is not 160Gbit/s. IIRC, the ports on the right side of the machine have less total bandwidth because of how they're connected, for example.


That is correct. The Thunderbolt controller has two DisplayPort sink ports which are wired to the GPU, in addition to the PCIe 3.0 4x interface which goes to a root port in the CPU. So to actually max out the 40 Gbit/s, you need to drive external displays and also max out the PCIe 3.0 4x interface. Which is another way to say that 8 Gbit/s are reserved for DisplayPort.


Can you run network stack at 40 Gbps on those ports?


Yes, but as stated in the article only on macOS or Linux on non-Macs. Not with Linux on Macs unfortunately. Someone needs to scrutinize the Intel-developed code for non-Macs, figure out how to port it to Andreas Noever's driver, and fill in the missing parts by reversing the macOS driver.

Also, the Alpine Ridge controllers are attached with PCIe 3.0 4x lanes, that's 31.504 GBit/s, so you can't exhaust the available bandwidth fully.
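
For reference, a rough Python sketch of where that figure comes from (my own arithmetic; PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, and a Thunderbolt 3 link is nominally 40 Gbit/s):

    lanes = 4
    pcie3_payload_gbps = 8 * lanes * 128 / 130            # usable bandwidth of a PCIe 3.0 x4 link
    tb3_link_gbps = 40                                     # nominal Thunderbolt 3 link rate
    print(round(pcie3_payload_gbps, 3))                    # ~31.5 Gbit/s
    print(round(tb3_link_gbps - pcie3_payload_gbps, 1))    # ~8.5 Gbit/s the PCIe back-end can never fill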


The article talked about how Linux supports several models, not just the latest one. If I owned an older Macbook but now regarded the product line to be doomed, I'd want to know how painful it would be to start migrating to a different platform on my current hardware.


Although I have experienced that choice on Apple hardware recently (2015 MBP Retina), lately Linux seems to, let's say, 95% Just Work on non-Apple x86 hardware for me - WLAN, sound, and suspend/resume included.

[edit] and this is with any distro, perhaps with more or less setup work


I find sleep/resume is successful when the proprietary driver is installed; nouveau isn't doing it for me on Ubuntu-derived 16.x distros (this month I have tried KDE Neon and Bodhi). The problem, though, is that installing the drivers can be a pain. Last time I tried, I soft-bricked the PC and ended up doing a full reinstall to save time. As someone further down this thread said, what advantage is there these days in Nvidia not open-sourcing the drivers? They are still going to be selling the display cards.

The other problem with Linux I have encountered recently that burned through my free time was updating the BIOS: there are no BIOS updates with Linux installers, which is annoying because the BIOS update itself is OS agnostic. I had to create a bootable USB drive from a Windows program (luckily I have a Windows PC for work) and boot from that. But the how-to guide from HP was lacking in details, and the program doesn't allow me to create the drive on one PC for use on another, so no BIOS update there.

The problems I have with Linux never seem to be to do with Linux, it is almost always hardware manufacturers being obstinate and refusing to share their toys with other adults.


Yeah, in the future I will certainly be buying hardware that is designed to run desktop/server Linux (and supports features you mention, like easy BIOS upgrades) or is otherwise known to support it well. No more random COTS buys for me without research first. I'm done burning hours on this sort of weird crap vendors want to give us. So if a vendor wants to sell me a PC or enterprise anything, it needs to support Linux well.

I also don't plan to purchase any brand new Apple hardware. I will only adopt Apple hardware late (and used) and will only run OS X if absolutely necessary for some reason.


> what advantage is there these days in Nvidia not open-sourcing the drivers?

AMD have demonstrated that trying to do the right thing will get you shit from the maintainers and abuse from bystanders, who will then go and buy Nvidia because it gets a better frame rate.


AMD's been in the news for trying to half-ass the right thing, and asking the community to lower their standards because AMD is too poor to do the job right.


I never understand this sentiment: AMD is still putting in effort for a token market (Linux is something like 0.3% of computers if you don't count servers/embedded). Would you rather have them do nothing and leave the cards poorly supported?


They're putting in token effort and trying to create an impediment for Linux kernel development even on other hardware. And they were warned months ago that it wasn't going to fly.

Having a driver is good, but putting it in-kernel is bad if the kernel devs can't read it and fix it.


>95% Just Work on non-apple x86 hardware for me

That has been the case for a decade or so now. 95% is not enough (for non-tinkerers)


That 100% support includes doing research before purchase. You just cannot purchase random hardware and hope for the best.

For me, major distributions work out of the box. For example, with my T430s, everything works, including the fingerprint reader and WWAN.


Agreed. It's still frustrating -- see my other comment in thread. Even for non-Apple x86 stuff, I'm researching well going forward.


This is a really great article, I should add, and I do wish them luck in their endeavors to support Apple hardware. The developers' tone is quite inspirational.


I wouldn't necessarily blame the "WLAN, sound, or sleep" thing on linux. I'd blame that on cheap parts manufacturers that refuse to distribute open source drivers for fear of releasing company secrets and lack of profit margins to operate on.


>I wouldn't necessarily blame the "WLAN, sound, or sleep" thing on linux. I'd blame that on cheap parts manufacturers that refuse to distribute open source drivers for fear of releasing company secrets and lack of profit margins to operate on.

I don't understand why anyone would not open-source their drivers.

What's NVIDIA's secret sauce? Being able to build great silicon.

What's Intel's secret sauce? Being able to build great silicon.

What's Samsung's secret sauce? Being able to put together great silicon.

So let's say tomorrow Nvidia open-sources all their drivers?

So now you know which registers you use to shade what. So now what's AMD going to do? Copy the interface?

It's already copied. Most users use standard OpenGL/DirectX to communicate with their board.

I mean look at Intel. You practically know what's what there. You can make an OS on (at least older) Intel chips without drivers.

Did it help anyone compete?


>What's NVIDIA's secret sauce? Being able to build great silicon.

NVidia's defining characteristic has been designing good drivers.

Even when they've been behind ATI/AMD in hardware, they've been ahead in driver stability and performance. Especially on Linux where they largely own the CG/post/design market (which mostly uses Linux workstations) and they are the only GPU vendor that has somehow squeezed Windows-competitive performance out of Xorg.


> NVidia's defining characteristic has been designing good drivers.

This. When shopping for discrete video cards, I no longer look at AMD's offerings because I'm not willing to fight with their driver, fret over whether or not I should apply updates, etc.


What is the NVIDIA and AMD secret sauce?

First, read this: http://www.gamedev.net/topic/666419-what-are-your-opinions-o...

So basically, most 3D applications and games are broken to a certain degree. Both NVIDIA and AMD invested quite a lot of resources into making those applications run at an acceptable level and not look broken.

Now, when you figure out how to un-break something and put it on the fast path, your competitor can easily lift that into their driver. Neither of these players wants that.

The good news is that DX12/Vulkan/Metal take quite a different approach: they don't have a complicated state machine and validation in the driver, making the driver simpler and these workarounds unnecessary. If everything goes well, we may see drivers for these stacks.


As I understand it, Nvidia and AMD can't open-source their graphics drivers, as parts of them are under third-party NDA.


>As I understand it, Nvidia and AMD can't open-source their graphics drivers, as parts of them are under third-party NDA.

OK, so the question is upstream. Why would Qualcomm, ARM, or any upstream care about people finding out the interface to their hardware?

What secrets are there in the drivers?


Modern graphics card drivers contain full compilers for the GPU's shading language. It seems reasonable that you can extract performance with better compiler technology.

On the other hand it also seems reasonable that a compiler for e.g. NVIDIA's hardware might be hard to re-target to another vendor's hardware while retaining the same advantage.


One argument I've heard is that they fear patent lawsuits because it might give hints as to how their hardware works.


How would it?

As I understand things (and I'm clearly not an expert at all), a CPU opcode is like an ABI:

put this value into register A, this value into register B, and a magic opcode into register C,

and poof, the register D contains the value of A+B.

A driver will say that you put int X into memory location A, Y into memory location A+1, and call function C with an argument of A, and D will contain the multiplied number.

So how does #2 protect IP over #1?


Let's say your driver exports an API that computes f(x,y,z) where the math turns out such that g(h(x,y),i(y,z)) is an easier and/or numerically better way to compute it.

Let's also say your hardware implements g, h and i, but doesn't wire them together yet. So, your driver's code calls the three functions.
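
A toy numeric illustration of that kind of decomposition (made-up math of my own, not from any real driver): computing f(x, y, z) = (x+y)^2 - (y+z)^2 directly needs two multiplications, while the algebraically equivalent g(h(x,y), i(y,z)) with g(a,b) = (a-b)*(a+b) needs only one.

    def f_naive(x, y, z):
        return (x + y) ** 2 - (y + z) ** 2   # two multiplications

    def h(x, y): return x + y
    def i(y, z): return y + z
    def g(a, b): return (a - b) * (a + b)    # difference of squares: one multiplication

    def f_decomposed(x, y, z):
        return g(h(x, y), i(y, z))

    assert f_naive(2, 3, 4) == f_decomposed(2, 3, 4) == -24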

Unknown to you, the trick of using that decomposition is patented. If you release your driver's source code, you make it easier for the patent holder to figure out that you are violating their patent (your programmers might even have made it patently obvious by mentioning the paper describing the trick in a comment).

Unlikely? Maybe, but the way patents are written, who knows? For an almost random example, I searched for "driver code and hardware patents". The first link I clicked was https://www.google.ch/patents/US20090006831. Reading that, I wouldn't know what driver would _not_ infringe on it.

Also, there are many patents on ways to move data around efficiently. Avoiding all of them while still writing a performant driver may not be possible.


Mostly because open source would make them easier targets for patent trolling.


"Blame" might be the wrong word since nobody involved with Linux has committed any moral failing here. It can certainly be analyzed as a drawback of the open-source model.

Microsoft has a lot of resources to incentivize hardware manufacturers to go to great lengths to make sure their drivers work well with Windows. No organization connected with "Desktop Linux" has that.


Or it could be that Apple just really doesn't care and most Apple customers don't either.


And don't want to reveal the number of ugly hacks they have implemented to get things working in the first place.

The reason Linux drivers are a "mess" is that the hardware they are trying to support is a mess, and said mess is papered over in Windows by the OEM drivers.


If you just realized this year that they are entertainment-focused and devs/pros are a secondary concern, you haven't been paying attention. Since 10.6, when dual full screen was removed, I say the shark has been jumped.


I wish to thank the dedicated people who make this happen. I'm forced to use Apple hardware at work, but thankfully can run whatever OS I choose. Here's to the crazy ones.


> Here's to the crazy ones.

That one hit home.. Based on my lack of options (Windows sucks, macOS is OK but the hardware sucks, Linux lacks a desktop ecosystem) I feel just like a crazy one, a misfit, the ones Apple addressed in that legendary ad.


>Linux lacks a desktop ecosystem

Granted, I'm more of a terminal power user, but what's missing (honest question)?

GNOME and KDE are both very refined and mature, and there are plenty more stable, if a bit less full-featured DEs.

There's the LibreOffice suite, which is stable, well-maintained, and feature-rich.

VLC is arguably an extremely stable and full-featured media player with Linux support.

Firefox and Chromium are both widely available on Linux.

Thunderbird and Slack ship for Linux, too.

I can scarcely think of other software you might need to run on Linux.


I was thinking more in terms of an ecosystem for desktop app developers.

Let's say you want to make a game for Linux. How do you package it? Do you have to make RPMs, debs and/or something else? Do you rely on dependencies available on the system, or do you build a binary with all dependencies statically linked? Do you care about inclusion in the package managers for the various distros? What do you use for audio? You might want to use Pulse, but what if the user does not have Pulse installed? What if your game includes a launcher with a login form etc., do you choose Qt or GTK? Do you have to test your game on the top 3 popular distros? Top 5? And so on.


It actually does sound kinda crazy. If you're forced to use a Mac at work, why mess around with some buggy, craptastic Linux-on-the-Mac when you can use the perfectly good Unix already on the Mac?


What kind of office lets you install a new OS on their machines?


I work for a major enterprise software company. We can install whatever we want on our computers as long as we use full disk encryption.


Small companies usually. It helps being somewhere that runs a cross platform stack like Java and/or targets Linux servers anyway.

Once I'd been here a few months, they trusted me not to brick the laptop, they're fine with me installing Linux on a Mac if I say it works better for me and I can leave OS X on there as a dual boot.

Big companies are usually less willing, either due to some OS specific software or policies that enforce some specific encryption/security/monitoring.


Why not?


IT burden.


Weird, I would never imagine locking down development workstations -- how would you ever get anything done? Just one example would be Docker, having access is equivalent to root so why beat around the bush?

Our office basically treats development boxes the same as BYOD except the company buys the hardware. IT will do best effort diagnostics and recovery on those machines but otherwise you're on your own.


Eh I don't even necessarily mean locking stuff down. But having standardized images to roll out makes things simpler, keeping patches up to date, etc.


If Linux could utilise the battery as well as macOS or OS X, I'd switch in a heartbeat. Obviously Apple has some proprietary battery magic, as neither Linux, BSD nor Windows seems to get even 50% of the battery life out of MacBook Pros, and battery life is a significant cost factor on a laptop - otherwise dual booting will remain the best compromise for me.


First of all, macOS power manages devices too aggressively: I've noticed that if I ssh from another computer into my MacBook Pro (booted into macOS and connected to wired Ethernet), the connection becomes unresponsive after a few seconds of inactivity. I have to ping from the MacBook Pro to a machine on the LAN to make the ssh connection responsive again. It looks like macOS suspends the BCM 57765 Ethernet controller and doesn't resume it upon reception of packets from the LAN. Apple seems to consider macOS purely a client OS from which one ssh's to the outside world but not the other way round.

Second, there's no proprietary battery magic in macOS. As stated in the article, I already achieve 10.5W idle power consumption on my Ivy Bridge MacBook Pro with discrete GPU, Thunderbolt and AirPort suspended, whereas macOS achieves 7W. I've noticed a GPE in the ACPI tables for the FireWire controller which presumably signals hotplug when the controller is asleep. We're not using that on Linux yet; we keep the FireWire controller active all the time. Adding that would probably save another 1W.

So we're inching ever closer to macOS levels of battery life, and I think eventually we may be able to surpass it because of the sophisticated CPU pstate management that Intel developers continue to tweak with every new release.
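
To put those numbers in perspective, a rough calculation (assuming a ~95 Wh battery, roughly the 15" Retina figure; adjust for your model):

    battery_wh = 95.0                                    # assumed capacity, 15" Retina MacBook Pro
    for label, watts in [("Linux, tuned", 10.5), ("macOS", 7.0)]:
        print(f"{label}: {battery_wh / watts:.1f} h idle")
    # Linux, tuned: 9.0 h idle
    # macOS: 13.6 h idle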


> Apple seems to consider macOS purely a client OS from which one ssh's to the outside world but not the other way round.

I'm not knowledgeable enough, but this seems a pretty reasonable assumption for 90% of use cases. And if this assumption holds, is it fair to call it too aggressive? I think what you're trying to say is "First of all, macOS power manages devices too aggressively for me".

> It looks like macOS suspends the BCM 57765 Ethernet controller and doesn't resume it upon reception of packets from the LAN.

What are the trade-offs of this decision? Could apple not do this and not change the battery life for anyone? Or would it benefit this case and adversely affect other cases?


For a Linux driver it wouldn't be acceptable to compromise functionality like this as it's expected to work equally well on big iron as on laptops. So we have to tweak a lot more to achieve the same level of battery life.


Just to understand this, why wouldn't that be an acceptable compromise? As someone who uses linux on both a server and a laptop, I can tell you that when my laptop is unplugged, I'd much rather have that battery life than inbound ssh connection stability. I would imagine this is much the same for the vast majority of linux users. Could the lack of compromise be part of the problem?


If nothing's plugged in then many devices can indeed be suspended in some way, either by putting them in PCI power state D3hot or by cutting power if the platform supports it. Plug events are then signaled e.g. by an ACPI GPE (general purpose event, an interrupt sent by the platform to the OS).

The case I was referring to was with the Ethernet cable still plugged in. They seem to suspend the Ethernet controller but don't wake it when packets come in, presumably because the controller doesn't support that. The machine appears dead from the outside after a few seconds of inactivity.
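
For the curious, here's a small Python sketch for inspecting the standard Linux runtime-PM state of PCI devices via sysfs (this is the generic kernel interface, not anything specific to the Mac drivers discussed here):

    from pathlib import Path

    # power/control is "auto" (runtime suspend allowed) or "on" (kept awake);
    # power/runtime_status is typically "active" or "suspended".
    for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
        control = (dev / "power" / "control").read_text().strip()
        status = (dev / "power" / "runtime_status").read_text().strip()
        print(f"{dev.name}: control={control}, status={status}")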


> They seem to suspend the Ethernet controller but don't wake it when packets come in, presumably because the controller doesn't support that. The machine appears dead from the outside after a few seconds of inactivity.

It seems like that would affect more than ssh connections, e.g. you wouldn't be able to receive email notifications.


Thank you for clarifying the case.



50 percent might be an exaggeration. I use linux on the 2013 MBP and get about 8 hours using Arch Linux and around 9 using macOS Sierra


I want to extend thanks to the diligent developers and engineers doing work on this. I'm running Arch on a 2012 Retina MBP with a GNOME desktop, and ignoring the shortened battery life -- which in my situation is generally possible -- this is an amazingly streamlined and refined experience.

I'd gladly contribute to the work being done if I had the background, (which I'm currently working on)!



Sadly, I haven't heard/read anything good about running Linux on the iMac 5K, which I'm tempted to buy when/if it is updated in 2017. I don't understand Apple's vision at all, and it wouldn't surprise me if they ruined macOS in the next five years. Being able to install Fedora in that case would really buy me some peace of mind.


There was a thread on dri-devel a year ago:

https://lists.freedesktop.org/archives/dri-devel/2015-Octobe...

Back then the probed panel resolution (EDID retrieved by radeon) was 3840x2160, which doesn't make any sense because the panel actually has 5120x2880. It's unclear if this was because of missing/broken MST support in radeon (Multistream Transport) or if Apple used some proprietary mechanism to squeeze the panel resolution in a single stream transport. You may want to contact the person who started that thread (Andreas Tunek) and ask if the situation has improved with contemporary kernels.


If memory serves Apple didn't actually use MST for the 5k iMac and instead developed a custom timing controller which allowed them to get the 5k resolution working over a single stream.

It's the same reason why, for example, Windows sees it as a 3840x2160 panel as well.
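
Rough arithmetic on why a single DisplayPort 1.2 stream couldn't carry the 5K panel (my own numbers, active pixels only, ignoring blanking):

    w, h, hz, bpp = 5120, 2880, 60, 24
    needed_gbps = w * h * hz * bpp / 1e9     # ~21.2 Gbit/s for the raw pixel stream
    dp12_payload_gbps = 4 * 5.4 * 8 / 10     # 4 lanes of HBR2 with 8b/10b coding -> 17.28 Gbit/s
    print(round(needed_gbps, 1), round(dp12_payload_gbps, 2))
    # 21.2 17.28 -> a single DP 1.2 link falls short, hence MST or a custom timing controller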


Oh, I wasn't aware that Windows was affected too. Apparently it took Apple some time to bring 5K support to Boot Camp for 2014 iMac model [1].

I guess I'll wait for Apple to actually announce new desktops before I seriously look into the driver situation...

[1] http://forums.macrumors.com/threads/bootcamp-update-5k-windo...


Serious question. What advantage is there to running Linux on a Mac? OS X is already certified Unix. Is there any open source software that you can run on Linux that you can't run on OS X?


For really old Intel Macs there is the hack of hardcoded framebuffer addresses, which they did instead of supporting UGA.


Why not adopt the Windows Subsystem for Linux (i.e., Ubuntu and bash.exe) strategy?


It needs a lot more time before it's usable beyond demo use cases.

If you're interested I wrote all about it[0] when trying to change my development environment recently:

[0]: https://nickjanetakis.com/blog/i-almost-rage-bought-a-macboo...


I bought a Thinkpad recently and was intending to use that for day-to-day development. It was pretty good but important parts of tmux just didn't work, so I just gave up and installed Linux on the machine (which I'm enjoying a lot).


Personally, WSL's existence tempts me to touch Windows again in the future if there's a particularly attractive machine.

But, well, Windows with WSL is still Windows. Much like Linux with a hypothetical flawless version of WINE would still be Linux. The desktop experience is still… poor.


The basis of this article makes no sense. What developers love about Macs is the OS and the nice integration between OS and hardware. Even if you hate the new hardware, it makes sense to continue juicing your current one and see what comes along next year; it makes zero sense to install Linux on fringe hardware (a Mac).


I guess it depends on what kind of developers you think of. Unless you're developing macOS or iOS software, why bother with an inferior software ecosystem? Brew is a hot mess. MacPorts is slow as fuck.


Personally I've been using fink for >12 years now, I enjoy being able to use the familiar apt-get and dpkg to manage packages on macOS:

http://finkproject.org/

I even backported the then-current version to MacOS X 10.4 (Tiger) last year. :-)

https://github.com/l1k/fink/commits/branch_0_38_tenfour


I develop backend components that run on Linux servers. The editors, the terminal, the debuggers on Mac OS are pretty good for me. Brew works just fine for my line of work.


Well, it turns out there are folks who like Apple's form language but can't get (all of) their work done on macOS or don't want to use it for some reason. So there's a target audience for Linux on the Mac. I'm not saying it's a huge crowd.


There are lots of developers happily running Linux on Apple laptops. It's traditionally been a good choice if IT dept makes you pick between a standardized random business laptop and a Mac.


I'm not saying there aren't; in my experience, Linux on a Mac didn't go smoothly at all. Connecting the Retina display and a 1080p monitor at the same time, specifically, was a huge pain.


That has hopefully changed now with Wayland.


If I wanted Linux, I would prefer a Windows laptop with a flippable touch screen. The Lenovo Yoga would have been great if only they'd support Linux: http://venturebeat.com/2016/09/21/lenovo-confirms-that-linux...


Apparently you can now install Linux on yogas: https://forums.lenovo.com/t5/Lenovo-Yoga-Series-Notebooks/Yo...


Macs have a good hw/sw-integration, but there are other features as well; I bought my first MBP for its trackpad, which is still unparalleled, and (as l1k puts it) its form language. I'm running Linux, which doesn't offer the same hw/sw-integration quality, but it's pretty good!


The trackpad and also the retina display. If you work with text, how could you not love a high-dpi display? After you get used to it, moving back to a standard 1080p screen feels like you are reading a laser printed page through a screen door.


>how could you not love a high-dpi display?

When the software doesn't handle it well. I don't particularly like any implementation of high DPI scaling. OS X's scaling caused performance issues for me, and programs would get confused moving between monitors. Linux doesn't have any way that I know of to cope with monitors with significantly different resolutions - eg, I can make the window manager scale everything 2x so it looks good on the 'retina' display, but I still have a 1080p monitor attached too, so then everything is huge on that.
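
To make that concrete (rough arithmetic, panel resolutions assumed):

    scale = 2
    panels = {"Retina MBP (2880x1800)": (2880, 1800), "external 1080p": (1920, 1080)}
    for name, (w, h) in panels.items():
        print(f"{name}: {w // scale}x{h // scale} logical pixels")
    # Retina MBP (2880x1800): 1440x900 logical pixels
    # external 1080p: 960x540 logical pixels  <- why everything looks huge there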

Even without multi monitor issues, scaling tends to be picky with some programs on most OSs. OS X seems to be the most consistent, Linux is a roulette wheel depending on your GPU/window manager/program/day of the week. Windows works sometimes but looks blurry.

I have Linux installed on my work Mac and just run the 'retina' screen at 1920x1200.


A couple of guys in my office did the same, but came back to OS X after weeks of suffering problems with high-DPI support, multiple screens, Bluetooth etc.


High DPI on Linux is indeed a mess, but I find running the screen at a lower resolution preferable to Apple's implementation of DPI scaling anyway.

Bluetooth is very broken too. Not sure if that's an issue with the drivers or the Linux bluetooth stack in general. Fortunately I only use it for my wireless headphones, which I can live without.

WiFi used to be a big problem when I tried Linux on a Mac at a previous job, thanks to terrible Broadcom drivers, but that all seems fine now.

Been running Linux on a Mac for 2 and a half months now, for me personally the improvements are significant and the issues minor.


can't even put linux on a pc, how gonna do a mac?


Seriously?



