Apple vs. the FBI: Critical Thinking in a First-Year Seminar

Earlier this week a federal judge ordered Apple to comply with an FBI request to assist in unlocking an iPhone owned by one of the shooters in the December 2015 San Bernardino terror attack. Apple said no. This conflict and the resulting national debates about encryption, privacy, and security are squarely on-topic for the cryptography seminar I teach, so I thought I would weigh in with a few thoughts here on the blog.

First, some background. If you have an iPhone, you’re probably used to entering a four-digit PIN to unlock your phone. If you’re slightly more paranoid than the average iPhone user, you may have enabled the option to use a PIN consisting of both letters and numbers, with more than four characters. If someone (say, the FBI) wanted to unlock your phone without your permission, they could, in theory, try every possible PIN, eventually hitting on the one you chose. This is called a “brute force” attack, since it’s not particularly clever; it’s just extended guess-and-check. For a long PIN, it could take an interloper quite a while before they hit upon it, even if they’re able to enter guesses very quickly.
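To make that guess-and-check arithmetic concrete, here’s a quick back-of-the-envelope calculation in Python. The passcode formats and the guess rate are illustrative assumptions on my part, not measurements of any real device, but they show how quickly the search space grows with passcode length.

```python
# A rough back-of-the-envelope sketch (my own illustration, not Apple's code):
# count the possible passcodes for a few formats and estimate the worst-case
# time to try them all at an assumed guess rate.

RATE = 12.5  # guesses per second; an assumed figure for illustration only

def worst_case_hours(alphabet_size, length, guesses_per_second):
    """Worst-case hours to try every passcode of the given length."""
    total_codes = alphabet_size ** length
    return total_codes / guesses_per_second / 3600

for label, alphabet, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("8-character letters+digits", 36, 8),
]:
    hours = worst_case_hours(alphabet, length, RATE)
    print(f"{label}: up to {hours:,.1f} hours at {RATE} guesses/sec")
```

A four-digit PIN falls in minutes at that rate; a long alphanumeric passcode takes lifetimes. That’s the whole argument for longer passcodes in one loop.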

The FBI’s problem is that they can’t enter guesses on an iPhone very quickly. They have to enter guesses by hand, and the operating system makes them wait longer and longer between incorrect guesses. (You get four guesses more or less for free; after that, you’ll have to wait fifteen minutes or more between guesses.) The dealbreaker, however, is that after too many incorrect guesses, the phone’s operating system deletes all the data on the phone. As a result, there’s no way a brute force attack by the FBI would work on this or any other iPhone.
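To see why those two features together defeat a brute force attack, here’s a toy model in Python. The attempt limits, delay, and wipe threshold are assumptions loosely based on my description above, not iOS’s actual policy.

```python
# A toy model (my own sketch, not iOS's real policy) of rate-limited guessing:
# a few free attempts, then a long forced delay between guesses, and a wipe
# after too many failures. All of the specific numbers are assumptions.

FREE_ATTEMPTS = 4     # guesses allowed with no delay (assumed)
DELAY_MINUTES = 15    # forced wait before each later guess (assumed)
WIPE_AFTER = 10       # failed attempts before the phone erases its data (assumed)

def budget_before_wipe():
    """How many guesses an attacker gets, and how long they take, before erasure."""
    total_minutes = 0
    for attempt in range(1, WIPE_AFTER + 1):
        if attempt > FREE_ATTEMPTS:
            total_minutes += DELAY_MINUTES
        # ...one hand-entered guess happens here...
    return WIPE_AFTER, total_minutes

guesses, minutes = budget_before_wipe()
print(f"At most {guesses} guesses, spread over {minutes}+ minutes, "
      f"out of 10,000 possible four-digit PINs; then the data is gone.")
```

The numbers make the point: an attacker gets on the order of ten guesses out of ten thousand possibilities before the data disappears.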

So the FBI asked Apple to design a version of its iOS operating system, to be installed on that particular phone, that would remove these restrictions. Specifically, the FBI is asking for the ability to enter PINs through some connected device (instead of by hand), the removal of the delay between PIN attempts, and the removal of the feature that auto-deletes phone data after too many failed attempts. The FBI has to ask Apple to design this hack for them because Apple designed its iPhones not to accept operating system updates from anyone other than Apple. That is, Apple has the digital keys that would allow this new “FBiOS” (as some have called it) to be installed on the San Bernardino iPhone.
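That last point about “digital keys” is code signing. Here’s a minimal conceptual sketch in Python, using the third-party cryptography package and a made-up update payload; it is not Apple’s actual signing scheme, but it shows why an operating system image produced by anyone who doesn’t hold the vendor’s private key won’t be accepted by the device.

```python
# A conceptual sketch of code signing. This is NOT Apple's actual mechanism;
# it's a minimal illustration using an Ed25519 signature from the
# 'cryptography' package (pip install cryptography).

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The vendor holds the private key; every device ships with the public key.
vendor_private_key = Ed25519PrivateKey.generate()
device_trusted_key = vendor_private_key.public_key()

def device_will_install(update_bytes: bytes, signature: bytes) -> bool:
    """The device installs an update only if its signature checks out."""
    try:
        device_trusted_key.verify(signature, update_bytes)
        return True
    except InvalidSignature:
        return False

official_update = b"official OS build"
official_sig = vendor_private_key.sign(official_update)
print(device_will_install(official_update, official_sig))   # True

# A third party can write its own update, but without the vendor's private
# key it cannot produce a signature the device will accept.
rogue_update = b"an OS build without the guess limits"
forged_sig = Ed25519PrivateKey.generate().sign(rogue_update)
print(device_will_install(rogue_update, forged_sig))         # False
```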

Since the first time I taught my cryptography course back in the summer of 2009, my students and I have discussed the role of encryption in modern conversations about security, privacy, and surveillance. I try to encourage my students to consider incidents like this week’s FBI-Apple conflict with something of a cost-benefit mindset. The benefit to complying with the FBI’s request is that data on that iPhone might turn out to be critical to stopping future terrorist attacks. The cost? I see two main concerns:

  • It’s possible that this new operating system, the one that allows third parties to circumvent iPhone security, could fall into the hands of criminals, foreign powers, or others who would use it to bad ends once it’s developed. This is the reason a dozen leading cybersecurity experts have argued that companies like Apple shouldn’t build “back doors” into their products so that U.S. government officials can get around device encryption. This week’s FBI request isn’t really for a “back door,” though. That is, the FBI isn’t asking Apple to modify its operating system to allow the FBI to get around encryption on all iPhones, just this one particular iPhone. It’s still possible that FBiOS could get “into the wild” and be used by others with evil intent, but that’s relatively unlikely.
  • Much more likely is that the current situation would create a precedent by which law enforcement agencies could regularly or even routinely compel tech companies to circumvent their own security and privacy mechanisms. This is the argument that most critics of the FBI move are making this week. Once there’s a legal precedent for this kind of thing, it’s hard to know where it will stop. Might the FBI or the NSA compel a company like Apple or Google to design a real back door into its product? Might other governments do the same, given the U.S. precedent? This concern is why the Electronic Frontier Foundation, among others, is supporting Apple.

My blog is ostensibly about teaching, so I’ll mention that I work with my students to bring nuance and complexity to their arguments. We practice this, for instance, through in-class commenting on peers’ blog posts in ways that add complexity, through debates and role plays, and through collaboratively created visual mappings of complex debates. Out of class, this kind of critical thinking takes place in the argumentative essays that the students write. There’s a category on my grading rubric that calls for complexity, and I regularly provide feedback on student rough drafts prompting them to consider possible objections to their arguments. I can’t say that I’ve figured this out as a teacher, but building these kinds of critical thinking skills is certainly one of my goals in the course, and I’ve developed activities that explicitly target this goal.

Here are two points that I think my former students (especially my fall 2015 students) would make regarding the FBI-Apple conflict:

  1. On the benefit side of this cost-benefit analysis, it’s not a given that the iPhone owned by the San Bernardino shooter has any actionable information on it. Back in 2014, FBI Director James Comey gave a speech in which he argued for “back doors” by citing four cases where data from cell phones was critical to an investigation. Journalists at The Intercept were able to explore three of those cases, and they concluded that, in fact, cell phone data was not critical. Those three cases could have been prosecuted just as successfully without unlocking cell phones. This, I think, is an important point to consider in a cost-benefit analysis; the benefits are, at this point, only hypothetical and perhaps not even that likely. I was proud to see that several of my students examined this thread of the debate in their final papers last fall. (This is also an example of why I think statistics should be taught to all students, since this kind of thinking about probabilities is important to decision making, both of the national security and everyday varieties.)
  2. For the first few offerings of my crypto course, I would shorthand these debates as “security vs. privacy.” Thanks to a Twitter conversation I had with author and activist Cory Doctorow, who wrote the novel Little Brother that we read in my crypto seminar, I’ve moved away from that language. It’s misleading to pit privacy versus security in a situation like this. Sure, Apple is trying to protect its users’ privacy, and the FBI is trying to enhance national security, but what of the average iPhone owner’s personal security? If Apple is indeed compelled to build a back door into its operating system, that iPhone owner will be at much greater risk of identity theft, bank fraud, ransomware, and more. When you hear “security” concerns invoked in these kinds of debates, it’s important to ask whose security and from whom.

Finally, I’ll note that this cryptography seminar is the only math course I teach where we regularly deal with current events. It pained me that I didn’t teach the course back in 2013 when Edward Snowden went public, and, this week, I’m once again wishing I were actively teaching my crypto course!
