After Google disclosed the vulnerability, more than two thousand affected apps were removed from the Google Play Store, and that list will probably keep growing as more apps are found to ship the vulnerable code.
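Apps typically "ship" a vulnerability like this by bundling an affected library. As a rough illustration (the SDK name, version cutoff, and app data below are all invented for this sketch, not taken from Google's disclosure), flagging affected apps can amount to comparing each app's bundled SDK version against a known-vulnerable range:

```python
# Hypothetical sketch: flag apps that bundle a vulnerable SDK version.
# "example-ads-sdk", the 2.5.0 cutoff, and the app entries are invented.

VULNERABLE_SDK = "example-ads-sdk"
FIXED_IN = (2, 5, 0)  # assume versions below 2.5.0 are affected


def parse_version(v: str) -> tuple:
    """Turn '2.1.3' into (2, 1, 3) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))


def is_vulnerable(dependencies: dict) -> bool:
    """Return True if the app bundles an affected version of the SDK."""
    version = dependencies.get(VULNERABLE_SDK)
    if version is None:
        return False  # app doesn't bundle the SDK at all
    return parse_version(version) < FIXED_IN


apps = {
    "com.example.flashlight": {"example-ads-sdk": "2.1.3"},
    "com.example.notes": {"example-ads-sdk": "2.6.0"},
    "com.example.weather": {"other-sdk": "1.0.0"},
}

flagged = [name for name, deps in apps.items() if is_vulnerable(deps)]
print(flagged)  # → ['com.example.flashlight']
```

The point of the sketch is that the store operator never needs the app's source: knowing which library versions are affected is enough to sweep the catalogue, which is why the removal list keeps growing as scanning continues.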
All of that is good news for developers; unfortunately, the same cannot be said for users. I have had many conversations online with people (mostly from the Coding Horror Facebook group) who felt that the news of Google's security vulnerability had caused most of us to jump to conclusions over a mere technicality. I understand their frustration, but the same fears they expressed about Google could just as easily have been raised about Apple, and in many ways with more dire results.
And perhaps there's some truth to that (I wouldn't be surprised if there is), but at some level it's clear that the number of bugs the world's biggest companies suffer from is not exactly low. It was almost 2,000 when I was writing this post, and I believe the 2016 figure was around half of that. It's actually pretty remarkable that Google, Facebook, and Apple have all managed to accumulate so many bugs in their respective apps, many of which could have been exploited by the same party without any of these companies putting up much of a "spine" in defense.
This trend raises the question of what percentage of developers' apps ship with vulnerabilities, and I'd like to offer my personal experience here. A couple of years back I was working with people at a software company who were fixing their own apps. When a bug did not stay fixed, it would kick off an entire cycle: fix the "bug", watch it resurface, fix the "bug" again, and reintroduce it into a new build. This would happen several times until a fix finally landed in a release pulled from their own App Store (and thus had no negative impact on the app), which put an end to the fix-and-reinject cycle. These days, most apps no longer get a "fix" as soon as they are released, and the only plausible reason they keep getting fixes at all is to get them into the hands of the "inferior" customers who are too lazy to do what it takes to break the code themselves.
At this point, it comes down to the question of whether users can be kept safe. At the moment, every single "fix" or "fix-of-a-fix" that Apple ships for its iOS apps is aimed at those "inferior" customers. This raises the question: can Apple really be trusted? One only has to look at the "back door" security flaw in the new iPhone (and in Safari by extension) to get a sense of the issue. Apple is also notorious for its secrecy about what sort of security it builds into new OS X or iOS apps (features that may end up in a future product); its "secret sauce" is actually incredibly simple, but the general public may not get its hands on it until the time is right.
My experience of iOS 6 security has been more mixed. There are two things I like about the new "big four" iOS devices that appear to be on track for release soon: the iPhone 5 and the iPad mini. These are great devices at a very reasonable price. On a more serious note, though, for hardware meant to serve as an entertainment or business device, they have been plagued by many of the same issues. I think it's fair to say that Apple has largely neglected security in its phones. As such, it's hard not to feel uneasy about an operating system to which they hand over 95% of their software, especially while the rest of the industry is doing so well.
There is a more positive takeaway here with regard to iOS 6 security. Yes, the company has clearly been neglecting security, but I believe that's simply because it has more resources than its competitors and hasn't invested the comparatively little that is needed to fix the problems.
So I don't think it's necessary for us to take sides between Apple and Google. It's very difficult to predict which side will come out the winner. Both companies have the tools in place, and the same risks will probably hit one or the other, so we should not assume they're equally responsible. But the truth is (and this shouldn't be taken as some kind of official condemnation of Apple either) that security isn't going to come cheaply, and the risk of the "security hole" turning out to be "dirtier" than expected is a serious one. The real question is whether security is a product, and whether we are willing to accept that we're buying into one of the worst business models in the history of business.