The reasons for not open-sourcing the drivers don't make sense to me. Sure, the community might not have the knowledge necessary to improve your drivers, but then no harm done. Drivers might be taken in directions you don't want them to go? These companies release new chips multiple times per year. With them come new drivers. Who cares what directions the old drivers take?
And furthermore, chip companies might just get free driver debugging. That makes their chips look better. It makes them look better. More sales.
Is there any possibility that the real reason for the lack of open-source drivers is that in many cases "different" graphics chips are essentially the same, and what you're really paying for with the more expensive ones is a driver that enables high-end functionality? Open-sourcing would wreck that.
I agree with you that GPU drivers should be open sourced. In fact, all driver source code should be released for all operating systems. They also should ship with instruction manuals and data sheets with circuit diagrams.
I write GPU driver software (in userspace) for a living. I'd be much happier if I could push my code to GitHub for everyone to see.
However, I do understand that keeping the drivers secret can be a competitive advantage. GPUs are devices unlike any other; they're all different from vendor to vendor and from model to model. The driver source code reveals crucial parts of the hardware design, and letting your competitors see what's inside your chip puts you at a disadvantage. Some reverse engineering probably takes place anyway, but that's at a whole different level from actually reading the source code.
OpenGL is also an issue. It's a nasty legacy API from Silicon Graphics, dating back to 1992 or so. It's not only a crappy API for the programmer, it's also hell to implement. If you don't believe me, try looking up the texture completeness rules in the GL spec (the problem doesn't exist in Direct3D, because it doesn't have backward compatibility or legacy APIs to deal with).
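To give a taste of why those rules bite: the classic trap is that a freshly created GL texture defaults to a mipmapping min filter, so uploading just the base level leaves the texture "incomplete" and it silently samples as black. Here's a rough sketch of one slice of the mipmap-completeness rule in plain Python (the function names and simplifications are mine, not from the spec or any real driver):

```python
# Sketch of OpenGL's "mipmap completeness" rule for a 2D texture.
# Simplified: ignores cube maps, texture arrays, BASE_LEVEL/MAX_LEVEL
# clamping, format consistency between levels, and much more.

def mip_levels(width, height):
    """Levels a full mipmap chain needs: dimensions halve (rounding
    down, clamped to 1) each level until both reach 1."""
    levels = 1
    while width > 1 or height > 1:
        width, height = max(width // 2, 1), max(height // 2, 1)
        levels += 1
    return levels

# Min filters that sample from mipmap levels beyond the base.
MIPMAP_MIN_FILTERS = {
    "NEAREST_MIPMAP_NEAREST", "NEAREST_MIPMAP_LINEAR",
    "LINEAR_MIPMAP_NEAREST", "LINEAR_MIPMAP_LINEAR",
}

def is_mipmap_complete(base_w, base_h, provided_levels, min_filter):
    """With a mipmapping min filter, the texture is only complete if
    every level of the chain was supplied; otherwise sampling is
    undefined (typically black). Non-mipmap filters need only level 0."""
    if min_filter not in MIPMAP_MIN_FILTERS:
        return provided_levels >= 1
    return provided_levels >= mip_levels(base_w, base_h)

# The newbie trap: one level uploaded, default filter kept.
print(is_mipmap_complete(64, 64, 1, "NEAREST_MIPMAP_LINEAR"))  # False
print(is_mipmap_complete(64, 64, 7, "NEAREST_MIPMAP_LINEAR"))  # True
print(is_mipmap_complete(64, 64, 1, "LINEAR"))                 # True
```

A real driver has to evaluate rules like this (and dozens more interacting ones) on every draw call, for every legacy combination the spec still allows.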
So, having a working OpenGL implementation that passes the 30,000+ conformance test cases is also a competitive advantage you don't want to give away. It's worth several years of programmer effort.
However, I still think that we and everyone else should open-source our GPU drivers and OpenGL implementations and compete on who makes the most badass silicon chips.
"The reasons for not open-sourcing the drivers don't make sense to me."
It makes total sense to me:
1. Every couple of years they can sell you a new card, because the one you have doesn't work with the new OS.
2. Competitors cannot read the code, understand it, and copy it, so technology leaders like NVIDIA aren't caught by followers without significant investment.
3. Software patent holders (patent trolls) cannot read the code and demand exorbitant fees because, say, using a buffer to paint on a computer screen is patented by them. This is very common in the USA, and it's the main reason companies asked me for an NDA before I saw their driver code.
4. People cannot unlock features of cheap cards to do the same things expensive cards do (it is very common to manufacture only one chip to get mass-production costs, and then use some cheap hardware or software switch to deactivate features on the cheap cards).
5. Video decoding and other functionality can be outsourced to specialist companies whose contracts demand that the code stay secret.
>> 4. People cannot unlock features of cheap cards to do the same things expensive cards do (it is very common to manufacture only one chip to get mass-production costs, and then use some cheap hardware or software switch to deactivate features on the cheap cards).
This is also sometimes done with slightly defective chips. E.g., AMD has sold some quad-cores as triple-core CPUs when one of the cores failed a quality-control check. Sometimes the disabled core can be re-enabled and works fine, but there may be a risk involved.
I know this is sometimes also done for marketing reasons. Some years ago, IBM was selling additional Java or DB "accelerator" chips for high-end servers. They were in fact the same kind of processor the server shipped with, but crippled with microcode so they could only run the JVM or a DB2 server or something. I very much dislike this practice; talk about wasted engineering effort.
These companies release new chips multiple times per year. With them come new drivers.
Not really. Because different GPU generations tend to be similar, vendors tend to ship "unified" drivers that support many GPU models. This leads to problems if the community has diverged from your last code drop: merging becomes really expensive, and you get blamed if you don't do it (see Android).
Then there are additional licensing problems: vendors want to use the same driver core on Windows and Linux, but they definitely don't want to GPL the Windows version, so they can't accept any GPL patches from Linux people, etc. (see Broadcom).
they definitely don't want to GPL the Windows version
Why not? Do the drivers provide them with a competitive advantage over anybody else? In the case of a GPU, I imagine they might. In the case of a wireless card, modem or something of that sort it seems pretty unlikely.