Susan Landau is a former distinguished engineer at Sun Microsystems and now a professor at Tufts University. I had an opportunity to check in with her at a recent event in Washington, D.C.
She has a new book, Listening In: Cybersecurity in an Insecure Age, which is well worth reading for both long-time and new followers of computer security and encryption. As I've written previously, the debate has shifted (see The US government bids adieu to Clipper Chip and Encryption back doors: Is there more to this debate?).
In this interview, Susan explains how and why that shift has occurred.
Mark Bohannon: Susan, please tell us about yourself.
Susan Landau: My training was originally in theoretical computer science; I worked on fast algorithms for algebraic problems. But in the 1990s, I got caught up in the Crypto Wars. First, I got involved in writing Codes, Keys, and Conflicts: Issues in US Crypto Policy, one of the earliest examinations of US encryption policy, produced by the Association for Computing Machinery (ACM). From there, Whit Diffie [of Diffie-Hellman fame] and I wrote Privacy on the Line: The Politics of Wiretapping and Encryption.
I began working at Sun Microsystems doing a combination of technical and policy work. I slowly became more and more of a policy wonk during this time. I worked on privacy and security issues in federated identity, helped establish Sun's policy on digital rights management, and briefed Congress and the European Commission. I also wrote my second book, Surveillance or Security? The Risks Posed by New Wiretapping Technologies, which examined the risks of building wiretapping into communications infrastructure (think of the Communications Assistance for Law Enforcement Act).
When Sun folded in 2010, I spent a couple of years at Harvard (one as a Radcliffe Fellow), a brief time at Google as a senior staff privacy analyst, and several years as a professor of cybersecurity policy at Worcester Polytechnic Institute. Now I'm a bridge professor in cybersecurity and policy at Tufts University, appointed in both the Fletcher School of Law & Diplomacy and the School of Engineering's Department of Computer Science, where I'm establishing courses and programs in cybersecurity and cybersecurity policy. And I've published my new book, Listening In: Cybersecurity in an Insecure Age.
Mark: You've seen the encryption debate evolve over the decades. How far have we come from those early days?
Susan: In the 1990s, both the NSA and the FBI strongly argued against the public's use of end-to-end encryption. In 2000, the US government loosened export controls on computer and communications devices with strong encryption, understanding that this would result in the public having greater access to end-to-end encryption. The NSA took that change in stride; in large part, the agency understood that encryption was coming and moved its focus to computer network exploitation. But the FBI never really accepted the change. It has continued to fight the encryption battles since then.
But there's been a real change during this time. No one is seriously arguing anymore for controls on end-to-end encryption. (Indeed, many former senior intelligence leaders have come out in support of its use in the civilian sector.) So while the FBI will talk about the threat posed by end-to-end encryption, the fight is really about locked devices, as in the dispute over the locked iPhone in the San Bernardino case.
Mark: In your new book, you see the discussion moving from "privacy vs. security" to "security vs. security." Can you elaborate on that?
Susan: Let me start with a story about smartphones. When the iPhone was introduced, it turned out to be a great target for street theft: a small, very expensive device that could be easily taken. So Apple developed protections to secure the phone, including Find My iPhone and Activation Lock. These limited the usefulness of a stolen phone, and thefts dropped. But hackers in China found another use for smartphones: the data on the device. In the late 2000s, criminals in China were hacking lost and stolen phones and selling the data. Then they began selling the hacks.
Consider the wealth of data on the phones: account information, email, etc. That data makes identity theft trivial. So Apple began protecting the phone data. That, plus the fact that we all carry our phones with us all the time and would quickly notice if they went missing, made smartphones a perfect device for two-factor authentication. That's a pretty big deal.
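To make the mechanism concrete, here is a minimal sketch of the time-based one-time password (TOTP) scheme that many phone authenticator apps implement, following RFC 6238. This is an illustration only: the shared secret shown is a hypothetical placeholder, and real deployments provision the secret securely (often via a QR code) and accept a small window of adjacent codes on the server side.

import hmac, hashlib, struct, time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    # Count 30-second steps since the Unix epoch (RFC 6238).
    counter = int(time.time()) // interval
    # HMAC the big-endian counter with the shared secret (the RFC 4226 HOTP core).
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: take 4 bytes at an offset given by the last nibble.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

shared_secret = b"hypothetical-shared-secret"  # provisioned to both phone and server
print(totp(shared_secret))  # both sides derive the same short-lived code

Because the code depends on a secret stored only on the phone and the server, a stolen password alone is no longer enough; an attacker also needs the device.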
Now let me get back to your question. The FBI presents its conflict with Apple over locked phones as a case of privacy versus security. Yes, smartphones carry a lot of personal data—photos, texts, email, and the like. But they also carry business and account information; keeping that secure is really important. The problem is that if you make it easier for law enforcement to access a locked device, you also make it easier for a bad actor—a criminal, a hacker, a determined nation-state—to do so as well. And that's why this is a security vs. security issue.
One of the important things to factor in here is Russia's attacks on democracies (there's an interesting minority Senate Foreign Relations Committee report that discusses just how Russia has weaponized information against European democracies). The January 2017 Intelligence Community report noted that the "Russian intelligence services collected against the US primary campaigns, think tanks, and lobbying groups they viewed as likely to shape future US policies."
Civil-society organizations are really critical to the functioning of democracies, and many of them lack big budgets or the ability to do strong security. So, to make them secure, systems that provide strong security, like end-to-end encryption and second-factor authentication devices, should be easily available and easy to use. That's where locked phones come in. If John Podesta had been using two-factor authentication, his email would have been much harder to compromise. Had the DNC been using Signal for messaging—or any other system with forward secrecy—there wouldn't have been old communications to steal (yes, sometimes you need to keep old communications, but not nearly as much as we do at present). The point is that readily available consumer products with good security protect all of us, and thus protect society at large. That's important given the threats we now face, from the Russian efforts to undermine democracies to the [intellectual property] data thefts that undermine US industry.
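For readers wondering what forward secrecy looks like in practice, here is a minimal sketch of ephemeral Diffie-Hellman key agreement using X25519 from the Python cryptography package. It illustrates the principle only; Signal's actual protocol (the Double Ratchet) builds considerably more machinery on top of this idea.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a fresh (ephemeral) key pair for this session only.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Only public keys cross the wire; each side computes the same shared secret.
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared

# Derive a symmetric session key, then discard the ephemeral private keys.
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"session").derive(alice_shared)

# Once the ephemeral keys are deleted, recorded ciphertext from this session
# can't be decrypted later, even if a long-term identity key is stolen.

Because each session's keys are generated and discarded on the fly, an adversary who records traffic today and steals a key tomorrow still learns nothing about yesterday's messages.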
Mark: I found your chapter on investigations in the age of encryption very thoughtful. Is our problem still the risk of "backdoors," or have the challenges facing law enforcement evolved beyond that?
Susan: I'm glad you asked that. Let's go back and think about the world the NSA faced in the late 1990s. The volume, variety, and velocity of communications had changed, creating many challenges for the signals-intelligence agency, and many nations were encrypting communications using hard-to-break cryptography. Indeed, in 1999, the investigative reporter Sy Hersh described this as an "intelligence gap"; there was talk that the NSA was in danger of "going deaf." But the NSA adapted. The agency developed techniques for getting into networks and for milking information from communications metadata.
The FBI did not take to the new technology with the same alacrity. While the FBI has some very technically sophisticated investigators, it doesn't have nearly enough—by at least an order of magnitude. But there are even more problems than that.
State and local police departments are overwhelmed by the complexities of the technologies they're facing. In 2011, when I testified at congressional hearings on encryption, the president of the International Association of Chiefs of Police described law enforcement's difficulty with new communications technologies. The problem wasn't encryption; it was simply the myriad different types of phones and technologies that state and local law enforcement had to handle. And this was in the days before locked iPhones.
One aspect of modern investigations is using alternative forms of investigatory techniques when a communication is encrypted or data is on a locked device. Sometimes the same data can be found elsewhere (e.g., in the cloud). Sometimes alternative sources of data—communications metadata and the data supplied by IoT devices (which increasingly provide evidence in criminal cases), automated license plate readers, transit passes, and the like—can provide investigators with what they need. There's also "lawful hacking," using vulnerabilities in systems to break into devices and access data. It's important here to acknowledge that getting a conviction in court (law enforcement's job) is different from developing insight into the conduct of a foreign adversary (the NSA's role). That doesn't mean that law enforcement can't take advantage of the wealth of data that all of us, including criminals, are littering all over.
To do so means increasing [the] capabilities of law enforcement. As I've said, the FBI needs to vastly grow its technical capability. And law enforcement needs to develop a much better way of sharing technical information with state and local police, who will never have the resources to develop adequate depth in their own departments. I'm not talking here about the feds taking over investigations, but rather about providing the technical resources that will enable state and local police to run those investigations themselves. It also means vastly increased funding; the fact is, many crimes now have a digital aspect. We need to provide the resources, and develop the capabilities, for police to conduct such investigations. (I've described some of this in a recent Lawfare blog post.)
Mark: You have worked in and around the open source community for much of your professional life. What have you seen that has changed over the years?
Susan: Open source was a new and radical thing in the 1990s; now it's the accepted way of doing business. That's the biggest change. Many of the companies that in the 1990s couldn't imagine participating in open source are now among its biggest users—and contributors. In the 1990s, we bought hardware and software. We still buy hardware, and we still buy some software. But we get a lot of other software as services, with subscriptions (and sometimes advertising) paying the cost of development. Attracting users to your community and platform—and thus attracting developers to write applications—is the game now. Open source is clearly a winning way to attract those developers.
There are still issues about which open source license to use. But on the other issues, the bottom line is that the business models changed—and so did the model of software.
After I spoke with Susan for this article, the Washington Post published a Q&A about the dispute between the FBI and the cryptographic community that readers might find interesting.