This isn’t exactly a dad topic, but I can’t stop thinking about it.
If you were following the news yesterday, you know that Apple is in a major fight with the FBI over a court-ordered request to unlock an iPhone 5c used by one of the terrorists in the recent San Bernardino attack, in which 14 innocent U.S. civilians were murdered. Apple said no, and posted the following public letter for the entire world to read, explaining why it doesn’t plan to comply with the order. This is going to be one hell of a fight given the stakes, and it may end up on the desk of the yet-to-be-named new Supreme Court Justice.
I’m no digital expert, national security expert, or privacy expert — really no expert of any kind. Except I do think I’m an expert at being a law-abiding citizen, just like you. There will be many other articles — you can read a few here, here, here, here — that characterize the legal arguments on both sides of this issue better than I can. But that won’t stop me from trying.
So what exactly is happening? My simple reading of the Apple vs. FBI throw-down is that the FBI wants Apple to create a backdoor to its iPhone operating system so that the FBI can unlock the terrorist’s cell phone and find out what’s on it. Perhaps information about known associates, or maybe even upcoming attacks, is stored on that phone. I really have no idea. You see, every iPhone has an optional security setting that automatically deletes the phone’s content after 10 unsuccessful passcode attempts. And this isn’t the movie Swordfish, so unless you know someone’s passcode, it’s pretty hard to guess it in fewer than 10 tries. Too bad the terrorists didn’t have an older child, since no parent with a toddler uses that feature, for fear of losing their data as little Johnnie tries to unlock the phone to watch Paw Patrol episodes on YouTube.
Anyway, the FBI wants Apple to create new software that bypasses this “rule of 10” so agents can “brute force” a winning passcode — I imagine they’d hook the phone up to some giant computer that runs millions of passcodes through it really quickly until they find a winner. And holy crap, I just figured out how to win PowerBall!
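To see why that “rule of 10” matters, here’s a toy sketch in Python of what brute-forcing a 4-digit PIN looks like. This is purely illustrative — the `check_pin` function is a made-up stand-in for the phone’s real passcode test, and nothing here resembles Apple’s actual unlock mechanism; on a real device with the erase setting on, this loop would be stopped after 10 tries.

```python
from itertools import product

def brute_force_pin(check_pin, digits=4):
    """Try every possible PIN of the given length until one works.

    check_pin is a stand-in for the device's passcode test. A 4-digit
    PIN has only 10,000 possibilities, so without an attempt limit a
    computer exhausts them almost instantly.
    """
    for attempt, combo in enumerate(product("0123456789", repeat=digits), start=1):
        guess = "".join(combo)
        if check_pin(guess):
            return guess, attempt
    return None, 10 ** digits  # no PIN matched

# Toy example: pretend the secret PIN is 7294.
pin, tries = brute_force_pin(lambda guess: guess == "7294")
print(pin, tries)  # finds "7294" on attempt 7295
```

The whole point of the 10-attempt erase setting is to make this loop useless: the attacker gets 10 guesses out of 10,000 possibilities before the data is gone.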
But seriously… best I can tell, Apple is saying no because if it creates this magic key, there is no way to keep it out of the hands of the bad guys. The government’s position sounds like something my five-year-old would say: “But…Daddy…I promise…just this one time…I’ll only do this once…seriously…I promise, you can trust me…I just want to see what will happen, just this once.” I don’t believe my son, and I don’t believe my government, because sometimes it’s just too damn tempting to step in that big puddle to see how deep it is.
And I’m in no way saying the FBI and the government aren’t well intentioned. I bet the people who have been working day and night on this case truly believe this phone holds some important clues — and I want them to want that data so badly. They’re dedicating all of their time to keeping us safe — I believe that.
But here’s my fear, and it seems to be the same one Apple has. I don’t believe this magic key will only be used for good, and I strongly doubt it will stop a terrorist attack — remember, they are looking at the phones of people after an attack has happened. Someone with bad intentions will get this magic key and use it. Maybe it will be to steal celebrity photos, but maybe it will be to steal my identity, or yours, or everyone’s. Didn’t we once give lots of military equipment to the good guys in Afghanistan that was later used by the bad guys? The Force is used by both the Jedi AND the Sith. Guns are used by police and soldiers to protect us, but also by mass murderers (and three-year-olds accidentally shooting their siblings, but that’s for another day). If something is important enough for the good guys, you can bet the bad guys will want it as well. That’s fact.
Of course, Apple has its own interests at heart, just like any other corporation. And as one of my friends on Facebook pointed out, it’s not exactly a bad thing for Apple that everyone is reading about how the FBI can’t hack one of its phones — but I trust Apple. Just like I trust Mark Zuckerberg. Maybe shame on me, but I do. Companies answer to shareholders, they answer to us, and they get hurt when they do wrong — walk into a Chipotle today and tell me the recent E. coli disaster hasn’t affected their business.
So what’s going to happen? What should happen? Damned if I know. I spend my day driving two boys around and buying them Doritos (my older son calls them carrots now, so even he understands the game). Perhaps there’s a place to meet somewhere in the middle. They could create some kind of Level 5 bunker with top-secret clearance on the Apple campus where the phones can be unlocked without any code ever leaving Apple.
I really, really want the government to have all the tools it needs to catch terrorists. I’d be sick to my stomach over the prospect that intercepting iPhone communications could have stopped the Paris attacks, for example. But how do they know who the terrorists are before an attack without invading the privacy of people who may not be terrorists? And what happens when a case is important, but not one of national security? It seems like a slippery slope to me, and that’s one reason why this demand scares me more than the others.
I think I would start here.
If the FBI or any governmental agency wants to put the entire world’s personal privacy at risk — our photos, our bank information, our secrets (yes, I will admit now for everyone to see that I follow the Kardashians on Instagram), then perhaps they should go first: not the people doing the work, but all the government leaders involved in this case or talking about it publicly.
If they have to open up their phones for the entire world to see, perhaps they will better understand why the rest of us don’t want to.