Are tightly closed environments like this where we want personal computing to go next?
There's no point even asking questions like this until you address what to do about the millions of idiots who click on every attachment they get and answer 'Yes' to every dialog without reading it. Since social efforts have failed to fix this problem, it has apparently become a technical one, and the technical solution is to make it difficult for all but expert users to treat new mass-market devices as wide-open platforms.
What does security have to do with a tightly closed environment?
You're conflating two separate issues: when you come to the (obviously correct) conclusion that users are generally stupid and shouldn't be able to easily break their computers, you assume that it has something to do with Apple deciding who gets to build applications and how one uses the machine. Apple does not, in general, provide better security in Mac OS X or on the iPhone than Open Source operating systems do. As far as I know, Android has at least as good a security history as the iPhone for the time it's been in existence.
Sucks, but what else can they do?
They can do the right thing and not act as gatekeepers. Or they can do what they've always done: maximize profits, minimize consumer and developer freedoms, and charge a lot for it.
> What does security have to do with a tightly closed environment?
It's the (false) assumption that because someone controls what software goes onto the machine, it's impossible for 'unapproved code' to run on it (or that 'approved code' has been thoroughly vetted against all possible inadvertent or malicious security risks).
> They can do the right thing and not act as gatekeepers.
I wonder what people will think once the first App Store app gets approved by Apple but turns out to be a piece of malware. I'm sure Apple has shielded themselves from liability in such a case, but there are plenty of people who feel it is an impossible scenario.
> I wonder what people will think once the first App Store app gets approved by Apple but turns out to be a piece of malware. I'm sure Apple has shielded themselves from liability in such a case
Apple has stated that they have a kill switch they can use in a situation like that. It would be extremely irresponsible not to have one on a device like the iPhone, imho.
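For context, here's a rough sketch of what that kind of kill switch could look like on the client side: the device periodically pulls a blacklist of revoked app identifiers and disables anything installed that matches. To be clear, the endpoint URL, the payload shape, and the revoke() hook below are all assumptions for illustration; Apple hasn't published how their actual mechanism works.

    import Foundation

    // Hypothetical sketch of a client-side kill-switch check. Everything
    // here (endpoint, payload shape, revoke hook) is made up for illustration.
    struct Blacklist: Codable {
        let revokedBundleIDs: [String]  // e.g. ["com.example.badapp"]
    }

    func checkKillSwitch(installedBundleIDs: [String]) {
        // Hypothetical endpoint; the real one (if any) is not public.
        let url = URL(string: "https://example.com/unauthorized-apps.json")!

        URLSession.shared.dataTask(with: url) { data, _, error in
            guard let data = data, error == nil,
                  let blacklist = try? JSONDecoder().decode(Blacklist.self, from: data)
            else { return }  // fail open: no network, nothing gets revoked

            // Disable any installed app that appears on the blacklist.
            for bundleID in installedBundleIDs
            where blacklist.revokedBundleIDs.contains(bundleID) {
                revoke(bundleID)
            }
        }.resume()
    }

    func revoke(_ bundleID: String) {
        // Placeholder: a real implementation would live in the OS and need
        // privileges no third-party app has.
        print("revoking \(bundleID)")
    }

Note that this sketch fails open (no network, nothing gets disabled), which is probably the right failure mode for a phone; the trade-off is that a compromised device can simply block the check.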
So is an Android phone. You're conflating two different concepts, as I've explained. Openness leads to good security; possibly better security than being closed. We have many years of evidence of that in desktop and server systems (where Apple does not have the best security record, by a long stretch).
The iPhone can have good security and be a walled garden, but the two are not closely related, and it's disingenuous, or at least misinformed about computer security history, to suggest that they are. Open systems can have good security while being completely open. They are orthogonal issues, and I'm surprised that people here would make the mistake of believing they are the same thing.
And you're conflating two different definitions of security. We're not talking about remote root holes or buffer overflows when discussing phone security; we're talking about trojans and other maliciously designed apps which are downloaded and run voluntarily by the user. A walled garden does provide better security in this regard, as demonstrated by the iPhone having a perfect track record almost three years in, and Android already having malware in its app store despite the platform still being in its infancy.