What do you mean by your data being protected by vast amounts of encryption? Can you verify those claims beyond trusting what Apple tells you? Isn’t the commenter above insinuating that a targeted individual can be compromised anyway?


The amount of money you can get for an iPhone 0day confirms it.

If they were playing fast and loose with cryptography and encryption, we'd have a lot more exploits in the open.


What do you mean by that? I don't think this follows at all.


If iPhones had flaws in the encryption or security, they WOULD be exploited and monetised.

A zero day remote attack on an iOS device is worth so much money that you have to be _really_ ethical as a hacker not to sell it and report it to Apple for a small reward instead.

The last time one was deployed "publicly" was against Jeff Bezos (or his wife) - one of the top 10 richest people in the world anyway. And then it was patched for everyone.


Apple owns the code and all its visibility - you're not allowed to see it.

They don't need a 0 day to compromise your device, they can just choose to do it at any point. You're simply trusting they don't.


And you think not having the source available hinders security researchers?

It’s kind of what they do.


If the attack is targeted, it doesn't matter.

Again, you're just trusting Apple not to do that. Please bear in mind that if the government asks, it's not like they have a choice.


If the government "asks", EVERY company will fold.

Apple is the only one building stuff so that they can't fold, even if they wanted to.

You can turn on the extra protections and encryption yourself at the cost of user experience.


Apple is the only one that effectively knows the Real Name of all their users, because you cannot do anything on an iOS device without signing up for an Apple Account first.

It's virtually impossible to sideload anything on an iOS device without extensive developer know-how; but for Apple itself, a targeted attack would be a trivial task.

Android is privacy heaven by comparison.

It's relatively trivial to get started with F-Droid and Aurora Store, and then you can install whichever apps you need, without providing any identifying information system-wide, without needing anything beyond the Android device itself.

No PCs, no mandatory 0days, no exploits, no specific software/hardware requirements, no warez, no copyright infringement, just pure free software and a few warning dialogues from Google about the dangers of installing third-party apps, before you can do whatever you wish with the hardware you paid for, on any Android device of any vendor.


The difference is that open source software is auditable - Apple's necessarily isn't. It's not the same.

It's not a user interface problem either; that's just a lame excuse. iMessage is end to end encrypted and is arguably one of the most pleasant to use messengers.


Apple is very explicitly and deliberately building their systems to forcibly collect massive amounts of user data that they can and do provide to the federal government.

While it is true that close to all companies will comply with lawful orders (but not EVERY company, FWIW: Lavabit famously shut down instead of handing over SSL keys to feds), it is possible to design systems in such a way as to protect FAR more user data privacy than Apple does. Case in point: review the contents of Signal's subpoena response a few years ago:

- https://signal.org/bigbrother/cd-california-grand-jury/

This isn't a sham privacy claim like the kind made by Apple that requires you to trust the provider, either. Signal's clients are famously open source - something Apple does not do for pretty much any part of iOS or Mac OS:

- https://github.com/signalapp/Signal-Android

- https://github.com/signalapp/Signal-iOS

- https://github.com/signalapp/Signal-Desktop

Additionally, most of the Signal server's source code (minus the anti-spam components) is open source, as well as the libsignal library used across the clients and server alike:

- https://github.com/signalapp/Signal-Server

- https://github.com/signalapp/libsignal

Apple could be this transparent if they wanted to. They choose not to be, because the truth is, they do not actually care about user privacy. They constantly collect massive amounts of telemetry, user data, and user metadata from every single device they make, and the Snowden leaks showed that they share this data extensively with the federal government - in spite of the few actions they take publicly to maintain the marketing illusion of being a company that cares about user privacy, such as in the wake of the San Bernardino shooting.


Why is Apple taking the harder route then? Like having Maps go through proxies and get the route in small bits so that Apple's servers don't know who is going where, for example?
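
Conceptually, the fragmentation idea looks something like this toy sketch (not Apple's actual implementation - the proxy object and its fetch_directions call are hypothetical, made up only to illustrate the point):

    import uuid

    def split_route(origin, destination, waypoints=()):
        # Break a trip into consecutive legs so each request only reveals a fragment.
        points = [origin, *waypoints, destination]
        return list(zip(points, points[1:]))

    def request_leg(proxy, leg):
        # Each leg carries a fresh random request ID instead of an account ID,
        # so the mapping service cannot link legs back to one user or one trip.
        return proxy.fetch_directions(leg, request_id=uuid.uuid4().hex)  # hypothetical proxy API

    def plan_trip(proxies, origin, destination, waypoints=()):
        legs = split_route(origin, destination, waypoints)
        # Rotate across relays so no single proxy observes the whole route either.
        return [request_leg(proxies[i % len(proxies)], leg) for i, leg in enumerate(legs)]

The point is that neither the relays nor the routing backend ever see an account identifier together with a complete origin-to-destination pair.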

Meanwhile Google is giving you notifications about "would you like to review <this exact tiny shop you were just in>", because they are the good guys?


For the same reason the TSA exists: theater.

The TSA performs security theater, where they take the harder route, yet fail to even detect, let alone stop, 95%+ of yesterday's threats, to say nothing of today's or tomorrow's threats:

- https://www.theverge.com/2015/6/1/8701741/tsa-screenings-hom...

Apple performs privacy theater, where they take the harder route, yet extensively log user data and share it with federal intelligence agencies:

- https://www.theguardian.com/world/2013/jun/06/us-tech-giants...

As for the deeper why: it's more important to the US government for passengers to feel safe than it is for passengers to actually be truly safe.

Likewise, it's more profitable for Apple to make its customers feel their data is private than it is for Apple to make its customers' data actually be truly private.

Apple is not a privacy-preserving company.

Apple is marketed as a privacy-preserving company.


>The amount of money you can get for an iPhone 0day confirms it.

Less than for an Android 0day then. Yes, it's quite telling.


You got it wrong.

An iOS 0day would be far more valuable, as confirmed by the rest of the thread above.


No, an iOS 0day _is_ less valuable. Every exploit acquisition program pays out more for an Android RCE than it does for an iOS RCE. And it's not surprising: give iMessage a mean look and a .png that looks funny and it breaks under the pressure.

The rest of the thread above is merely the delusions of an Apple fanboy, followed by dozens of people listing out reasons why an iPhone is more vulnerable to attacks, both from external actors _and_ from Apple collecting massive amounts of data and having total remote control of "your" device.



