Apple and privacy.

If you somehow managed to miss it yesterday, the internet is abuzz with news that Apple is refusing to comply with a court order that it write the code necessary to allow the FBI to break the encryption on the iPhone used by one of the terror suspects in the San Bernardino shooting. One side is applauding Apple for standing up and trying to protect the privacy of its customers, who, after all, use the encryption option to prevent their data from being stolen. The other side points out that the government isn’t asking Apple to put a backdoor in iOS that would permanently allow the government to grab your data, and goes on to say that this is an issue of national security. The line is drawn in the sand, and Apple is promising to appeal the order.

My layman’s understanding of the problem is that you are only allowed so many passcode attempts before the phone’s security feature kicks in and wipes the data. The FBI techs, who you would assume are some of the best in the country, if not the world, should be able to get into the system without triggering the wipe. The concern the government has is that there is information on the suspect’s phone that could 1) lead to others who mean to do this nation ill and 2) point to other terrorist activities in this nation or elsewhere in the world.
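To make that mechanism concrete, here is a minimal sketch of how a retry-limit wipe policy works in principle. This is purely illustrative: the limit of ten attempts, the class name, and everything else here are my assumptions, not Apple’s actual implementation.

```python
# Illustrative sketch of a retry-limit wipe policy. NOT Apple's code;
# the limit of 10 attempts and all names here are assumptions.

class PasscodeGate:
    MAX_ATTEMPTS = 10  # hypothetical failure limit before the device wipes itself

    def __init__(self, correct_passcode: str):
        self._correct = correct_passcode
        self._failed = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        """Return True on success; wipe the (simulated) data after too many failures."""
        if self.wiped:
            return False  # data is already gone; no guess can succeed
        if guess == self._correct:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= self.MAX_ATTEMPTS:
            self.wiped = True  # the wipe triggers; further guesses are pointless
        return False

# Brute-forcing a 4-digit passcode trips the wipe long before exhausting
# the 10,000 possibilities:
gate = PasscodeGate("7294")
for guess in (f"{n:04d}" for n in range(10000)):
    if gate.try_passcode(guess):
        break
print(gate.wiped)  # → True
```

This is why the FBI can’t simply guess its way in: the gate, not the encryption itself, is what stops a brute-force attempt, and the order asks Apple to write firmware that disables that gate.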

I’ll admit that I’m torn on the issue. I hate the very thought that the government might have a backdoor into my phone or computer or any other aspect of my life. But I also know that, in this day and age of instant communication and digital everything, privacy is an illusion. Yes, there are ways to make it more difficult for the average Joe on the street to steal your data but that gets more and more difficult with every day that passes.

I don’t believe the government should have a hand in my personal life. It shouldn’t tell me what I can do or can’t do in the privacy of my own home as long as no one else is harmed — and I do mean harmed. I don’t mean it hurts someone’s feelings or makes someone feel marginalized or uncomfortable. I most certainly don’t believe the government should monitor my communications or economic transactions without a court order that requires them to prove probable cause to do so.

However, the government has one duty that we should all keep in mind: to protect the citizens of this nation. Whether we like it or not, whether we want to admit it or not, we are at war. There are people out there who have made it their goal, both personally and as a nation, to destroy the United States. It appears that the San Bernardino shooters were at the very least sympathetic to those enemies of our nation. It doesn’t seem a stretch to believe the woman was one of those who have sworn to destroy us and that she radicalized her husband.

So it isn’t hard to understand why the government wants to see what might be on the woman’s phone. Hell, to be honest, I want to see what’s on it if it means potentially saving lives.

Still, is it asking too much to order Apple to find a way into their own iOS for this reason?

That was the question the local news discussed this morning. The analogy one of the anchors came up with stretched the point some, but I understood where he was coming from. He compared the iPhone in this instance to a safe deposit box. In the latter case, the government has enough probable cause for a judge to issue a search warrant for a single safe deposit box at a certain bank branch. That doesn’t mean the government gets to open all the safe deposit boxes at that branch, much less at every branch of that bank. No, it means that one particular box will be opened.

That is a simplistic approach to the issue, even if a valid point of view. The government is saying that it isn’t asking for a backdoor to be put into all iPhones. It is instead asking Apple to write code that will let it get into this one particular phone without wiping the data it contains. A single target, so limited impact.

However, as Apple points out, once the code is written, it can be stolen and/or exploited. The only way to prevent that is to not write the code.

On the other hand (and how many hands does that give me now?), there is the argument that Apple could destroy the code after it is used in this particular case.

I’m not sure where I land on this issue. No, I do know. Unless and until Apple says it will write a one-time-only code and then destroy it, I don’t want to see the code written. There are too many possibilities of it being misused, either by the government or by someone else. Still, there is that nagging voice in the back of my brain reminding me there might be important information on the phone that we need to know about.

What this really points out is the issue we are going to face more and more often in this digital age. When does the interest of the government in protecting our nation outweigh our right to privacy?

(On the writing front, I have finally figured out why the first two chapters of Honor from Ashes (Honor and Duty Book 3) have been bothering me and, more importantly, figured out how to fix it. So that is what I’m going to be doing today. Then I will return to the edits.)


  1. The FBI, and the rest of the alphabet agencies, want the *precedent* a million-billion times more than the data.

    So for me – not just no, but HELL NO! – this is exactly the sort of government over-reach – i.e. seeking the ability to legally compel the *creation* of something that does not exist – that should rightfully be stomped on HARD. Then set on fire, and everyone associated with pushing it should be terminated from government service AND permanently barred from it at any level.

    Because once they have the precedent, it can and absolutely will be used, then it will be over-used, then it will be horribly *abused* as the various security apparati seek to make their job easier and easier, with utterly zero regard for any kind of Constitutional rights.

    How do I *know* this?
    First off, it is the historical pattern of our government. It’s what they do unless checked HARD and repeatedly.
    Secondly, for a recent example: can you say “Stingray”? How about “Asset Forfeiture”, which clearly fits the pattern…

  2. I am unsure it is even truly possible to write that code. If it’s something that utterly needs a truly secure passcode to decode, there should be no reasonable (unreasonable translates as: takes centuries if not millennia) way to break into that phone – it’s already been encrypted. If that’s not the case, then it’s already not really secure; the vulnerability is just, as far as we know, not yet exploited and “merely” difficult to exploit.

    And it’s a painful quandary, but history seems to indicate that no matter how well-meaning initially, the way to guarantee liberty is to allow governments as little power as possible. When in doubt, err on not giving a government more power. Some results will still be unpleasant, alas. The other option, as appealing as it might be in the short term, leads to the greater disaster.

  3. I agree with your points. I am also torn by this. The protection against unreasonable search and seizure has been extended so far now, in large part, due to technological advances. Do I have a reasonable expectation of privacy in “the cloud”? Certainly, yes, I do. If I didn’t, then no one would use the cloud. Does the gov’t have a compelling national interest to target specific accounts in the case of security or criminal activity? Maybe, almost certainly yes, but limited to only those instances. But how is it limited to those instances? Traditionally we would ask for a warrant that protects the rights of the individual being searched. But now we have a technology that makes it impossible for a warrant to be served. In this manner, is Apple (or Google, Microsoft, Amazon, etc.) allowing criminals or terrorists to operate? Could they be liable if a criminal is able to continue their activities because there is no way to get around that encryption? Libertarians and those who support privacy rights rarely talk about the price that may be paid for that privacy. Oh, but the same could be said for gun rights supporters. Like I said, this is a good meaty debate, and one that we should enter with our eyes open and our minds fully engaged.

  4. If the code is written, there’s no way it will be used once and not again. Making a copy of the code would be child’s play. I don’t know what the answer is here, but once that code is written, it won’t just be used once, even if it’s “destroyed”.
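The second commenter’s claim that properly encrypted data takes “centuries if not millennia” to break is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes a 256-bit key and an attacker testing a trillion keys per second; both numbers are my assumptions, chosen to be generous to the attacker.

```python
# Back-of-the-envelope: time to exhaust a 256-bit keyspace by brute force.
# The guess rate of 1e12 keys/second is an assumed, very generous attacker.
keyspace = 2 ** 256                       # number of possible keys
guesses_per_second = 1e12                 # assumed attacker speed
seconds_per_year = 365.25 * 24 * 3600
years = keyspace / guesses_per_second / seconds_per_year
print(f"{years:.2e} years")  # → 3.67e+57 years
```

That is astronomically longer than “centuries”, which is the commenter’s point: if the cryptography itself is sound, the only practical ways in are the passcode retry limit the court order targets, or a pre-existing vulnerability.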
