There’s been a lot of ink spilled over Apple’s refusal to help the FBI unlock the iPhone of San Bernardino terrorist Syed Farook, a refusal Apple justifies on privacy grounds. Many are hailing the company as a hero standing up to an intrusive, “Big Brother” government that wants to force Apple to build a back door into everybody’s phones. Others side with the government, saying this is a clear-cut case of a valid search warrant for the phone of a known terrorist (not merely a suspect). At The Federalist, you can read the argument both for Apple and against Apple. Apple’s motives behind this move, in my opinion, are much less altruistic than they seem. First, a little background.
The FBI is trying to access the contents of the iPhone used by terrorist Syed Farook in order to find any information that could help identify other individuals involved in the attack, or reveal whether other attacks are planned. The iPhone (which is actually owned by the San Bernardino County Dept. of Public Health) had not backed itself up to iCloud since October, and unfortunately (or coincidentally), the iCloud password was reset by a county IT worker just hours after the phone entered government custody. Because the iCloud password was reset, the phone can no longer be synced to the cloud simply by bringing it within range of a Wi-Fi network it recognizes. That leaves the FBI with only one option: attempting a brute-force attack on the phone’s passcode.
In a brute-force attack, the phone is connected to a computer that runs through thousands of possible passcode combinations per second until it hits the correct one. Unfortunately for the FBI, the operating system on this version of the iPhone has three safeguards against exactly this. First, the passcode must be entered by hand on the phone’s touch screen (you can’t feed guesses in from a computer). Second, there is a built-in delay after each wrong entry. Third, and most important, there is an option to erase the phone after ten unsuccessful attempts.
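To see why those three safeguards matter, here is a rough sketch in Python. The delay schedule and exact behavior are my own illustrative assumptions, not Apple’s published values; the point is just the arithmetic of ten allowed guesses against ten thousand possible 4-digit passcodes.

```python
import time

WIPE_AFTER = 10  # the phone erases itself after ten bad guesses
# Escalating lockout after repeated failures (illustrative schedule, in
# seconds; the real iOS delays may differ).
DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}

def brute_force(check_passcode):
    """Try every 4-digit passcode, honoring the three safeguards."""
    for attempt, n in enumerate(range(10_000), start=1):
        if attempt > WIPE_AFTER:
            raise RuntimeError("Device wiped after 10 failed attempts")
        guess = f"{n:04d}"
        if check_passcode(guess):  # in reality, typed by hand on screen
            return guess
        time.sleep(DELAYS.get(attempt, 1))  # forced delay between tries
    return None
```

Under those rules, an attacker gets at most ten guesses out of ten thousand possibilities before the phone erases itself, which is exactly what the FBI wants switched off.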
Apple can disable these features by sending a software update to the phone. This would let the government attempt a brute-force attack without the risk of accidentally deleting all the information. Apple is refusing, however, arguing that doing so would create a “back door” that could then be used against anyone who owns the new iPhone. There’s a problem with this argument, though. An iPhone will only accept an update to its operating system if the update carries a valid digital signature from Apple, and that signing key never leaves Apple’s control. So Apple could update this specific phone, and the government would still be unable to use the same technique to hack into any iPhone it wanted.
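The signing mechanism is ordinary public-key cryptography. Here is a minimal sketch using the third-party Python cryptography package; the key size and message are illustrative stand-ins, not Apple’s actual scheme:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Apple holds the private key; each phone ships knowing only the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

update_image = b"firmware update targeted at this one device"
signature = private_key.sign(
    update_image,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# The phone verifies the signature before installing; verify() raises
# InvalidSignature if the image or signature was altered in any way.
public_key.verify(
    signature, update_image,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
```

Only the holder of the private key can produce a signature that passes that check, which is why a one-off signed update for this phone would not hand the government a skeleton key for every iPhone.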
So why has Apple decided to refuse a valid court order in this case, especially when they have helped the FBI in the past, and have a dubious record of protecting their users’ privacy in, say, China?
The answer is simple: marketing.
Apple’s Snub of the FBI Is Brilliant Marketing
The attention this issue is bringing to Apple and its new generation of iPhone is nothing short of a public relations bonanza. In the post-Snowden era of NSA snooping and heightened concern over digital privacy, Apple’s refusal to comply with this court order is doing two things:
1.) It is highlighting the security available in the new iPhone. “You like security?” Apple asks. “Our new iPhone is so secure not even the government can hack it!!”
2.) It is casting Apple in the role of looking out for its customers. “We’ll go to bat for you,” Apple says. “We won’t let the government go snooping through your data!”
Apple is a business, plain and simple, and that business is built on selling lots of iPhones, not just to new customers, but also by getting current customers to upgrade to the latest and greatest product. In the cutthroat world of smartphone competition, earned media like this is gold. Every time a new story about this issue hits the airwaves, Apple probably sells a few thousand extra iPhones.
I fully expect this to go on until Apple has milked the story for all the new market share it can get, or until the government gets serious about making Apple comply with the court order, whichever comes first. Then Apple will announce it has reached an agreement with the FBI to unlock this specific phone and no others (which is all the FBI is asking for in the first place), and Apple will be hailed as a hero of digital privacy.
Well played, Apple. Well played.
Comments
It’s been beautiful watching @davhenry55 roast @mickstaton multiple times in these comments.
This is a puzzling post. It doesn’t really take a position on the issues, although it seems to put Apple down by ignoring the real legal and policy issues and focusing on the marketing. If you’re going to talk solely about the marketing angle, talk about the marketing on both sides. The FBI chose this case to push the issue because of the San Bernardino facts & optics. The government chose to file the case publicly (after Apple had asked them to do it under seal) to pressure Apple. The government has recruited relatives of San Bernardino victims to file a brief in the case (although they have little to nothing to offer that’s legally relevant). There’s marketing all around here.
I am wondering if Apple even has the ability to build this back door. I wonder if they are resisting because they simply do not know how to do it. Looking at their product development since the beginning, they market well, but their products are not as fantastic as their commercials. Perhaps they are not as good technologically as they could be.
They can do it. In fact, they probably already have the software update written to do it. No software company would produce a product they could not later get access to if they needed to.
What you fail to mention, Mick, is that Apple does not have possession of the phone. It was a county government phone. How is it Apple’s responsibility to spend the labor, invention, and time – for free – to build a back door for that phone? Forced labor is ok now? And you’re wrong about software companies creating back doors. They don’t. They’re not supposed to create them. And most government contracts actually state they can’t have them on their devices or software.
The FBI has already offered to give the phone to Apple, let Apple push the software update to it, and then attempt the brute-force hack itself. Furthermore, the fact that the phone belongs to the county government should bolster the government’s side, not Apple’s. I am quite certain that San Bernardino County wants this phone unlocked, and as the owner, it should be able to make that request.
We’re not talking about creating a back door to the phone. We are talking about removing safeguards so the phone can be brute-forced, and I can guarantee you that Apple can already do this if they so choose. Apple owns the operating system on their phones, and you have to agree to their terms of service. If they want to enable or disable a feature of their operating system, they can do it.
Either way, the point of this post is not whether they should or shouldn’t unlock the phone. They will eventually. The point is they are milking this situation to sell as many new phones as possible.
Why are you not addressing the fact the government is demanding Apple do free labor? You seem to imply Apple already has this back door invented and it’s just a matter of them installing it on the phone. Apple has said quite clearly they do not have it invented yet. Or do you think they lie? You assume too much if you think software vendors create back doors for their software. None that I have worked for have done that for government clients.
Yes, I believe Apple is lying. I work as a software developer, and I have some experience with these kinds of things. Turning off these features on the phone is most likely as simple as changing a setting in a file from “true” to “false” (see the sketch below for the kind of thing I mean).
Again, this is not creating a back door. It is about disabling a feature. You are correct that most software vendors do not build “back doors” into their products that would allow them to access your systems, but I guarantee you that they could read the contents of the data files their software uses. That is where your information is stored, and that is the information the FBI is looking for.
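To illustrate the kind of feature flag I have in mind, here is a purely hypothetical sketch in Python; this is a guess at the general pattern, not Apple’s actual code or file format:

```python
import json
import time

# Hypothetical settings file of the sort described above; NOT Apple's format.
FLAGS = json.loads(
    '{"erase_after_10_failures": true, "escalating_entry_delay": true}'
)

def on_failed_attempt(failed_count: int) -> None:
    """What an OS might do after a wrong passcode, gated on the flags."""
    if FLAGS["erase_after_10_failures"] and failed_count >= 10:
        print("wiping device")                    # stand-in for a real erase
    elif FLAGS["escalating_entry_delay"]:
        time.sleep(min(2 ** failed_count, 3600))  # escalating lockout
```

If the safeguards really are gated on settings like these, the update the FBI wants simply flips them to false on this one phone; whether iOS actually works that way is, of course, exactly what’s disputed in this thread.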
Mick, your vast software development experience notwithstanding, there is nothing at all to support your belief. These aren’t user-configurable options. These are security features built into the operating system. (And this isn’t Android, which people can readily tinker with; this is a company that jealously guards its closed OS.) Even the FBI doesn’t claim what you do, and you can be sure that if it were as simple as you claim, either (i) the FBI would already have done it, or (ii) the FBI would be trumpeting it all over the place.
Read the post again. I didn’t say anyone could do it. I said Apple could do it, and they could do it quite easily. They have already admitted publicly they can do it.
Interesting. The FBI messed this phone up themselves when they reset the phone password. Now they want Apple to spend their time and money to fix their mistake. This guy has some other really good points: http://reason.com/blog/2016/02/22/does-anybody-believe-the-fbi-isnt-out-to
The FBI didn’t reset the password. A county IT worker did (which I mentioned above). He was not asked by the FBI to do that, and if you ask me, that person needs to be looked into for possible ties to Farook.
Where are you getting that, Mick? The county IT worker did it in the course of working with the FBI, and it seems very possible that the FBI did it knowing that they’d be pursuing a different strategy with Apple (i.e., eliminating another option to get data aside from the one the FBI wants Apple to use).
http://www.techinsider.io/fbi-confirms-shooters-icloud-password-reset-2016-2
Well, that is a different story than the one I linked to above in the post. In that article it said,
“Federal investigators only found out about the reset after it had occurred and that the county employee acted on his own, not on the orders of federal authorities, the source said.”
That’s simply not true. Producing a product that only the user (and those he/she permits) can access is the entire purpose of encryption.