Even as the US government bids adieu to the Clipper Chip, an infamous episode that shaped the cryptography debate for years, there is renewed determination in a number of quarters that it should not be repeated.
The most recent evidence comes from a new report from the United Nations' Office of the High Commissioner for Human Rights (OHCHR). A Special Rapporteur, David Kaye, was appointed to look into the use of encryption and anonymity in digital communications. In preparing the report—which will be presented to the U.N. Human Rights Council later this month—he drew on research into international and national norms and jurisprudence, and received input from governments and civil society.
Mr. Kaye concludes:
"compromised encryption cannot be kept secret from those with the skill to find and exploit the weak points, whether State or non-State, legitimate or criminal. It is a seemingly universal position among technologists that there is no special access that can be made available only to government authorities, even ones that, in principle, have the public interest in mind."
The report acknowledges the frustration voiced by law enforcement and counter-terrorism officials, but notes that "State authorities have not generally identified situations—even in general terms, given the potential need for confidentiality—where a restriction has been necessary to achieve a legitimate goal." Instead, it finds that "efforts to restrict encryption ... also tend to be quick reactions to terrorism, even when the attackers themselves are not alleged to have used encryption or anonymity to plan or carry out an attack."
The report from the OHCHR follows recent oversight by the U.S. Congress in April. Rep. Jason Chaffetz, Chairman of the House Oversight and Government Reform (HOGR) Committee, put it succinctly in his opening statement:
"First, it’s impossible to build a backdoor only for the good guys ... you think, just the good guys can get this? ... strong encryption prevents crime and is a part of the economy.”
The Chairman was responding to reports that law enforcement agencies were seeking a 'regulatory or legislative fix' that would compel companies to build their devices, especially mobile devices, in such a way that law enforcement could access data on them with a warrant or court order.
At the hearing, Amy Hess, Executive Assistant Director, Federal Bureau of Investigation (FBI), outlined her agency's view: "To be clear, we obtain the proper legal authority to intercept and access communications and information, but we increasingly lack the technical ability to do so. This problem is broader and more extensive than just encryption."
She elaborated in her testimony: "Any lawful intercept or access solution should be designed to minimize its impact upon the overall security. But without a solution that enables law enforcement to access critical evidence, many investigations could be at a dead end. The same is true for cyber security investigations; if there is no way to access encrypted systems and data, we may not be able to identify those who seek to steal our technology, our state secrets, our intellectual property, and our trade secrets."
Noted cryptographer and security expert Matt Blaze pointed out that law enforcement agencies have "for at least two decades been warning that wiretaps and other forms of electronic evidence gathering are on the cusp of 'going dark'." Blaze is well known for his public critique of the Clipper Chip shortly after it was adopted by the US government. Writing while he was at AT&T Labs, he documented how a lack of transparency and openness permeated the process from the beginning and contributed to its failure.
In his testimony in April, Blaze reiterated why this approach is destined to fail:
"At first blush, a 'lawful access only' mechanism that could be incorporated into the communications systems used by criminal suspects might seem like an ideal technical solution to a difficult policy problem. Unfortunately, harsh technical realities make such an ideal solution effectively impossible, and attempts to mandate one would do enormous harm to the security and reliability of our nation’s infrastructure, the future of our innovation economy, and our national security."
Overall, the Committee members who participated in the hearing agreed. Chairman Chaffetz went further, noting that some have called this the "golden age of surveillance" for law enforcement agencies, which "have never had more tools at their disposal to help detect, prevent, and prosecute crime."
The Committee includes one member, Rep. Ted Lieu (D-Calif.), who has a bachelor's degree in computer science from Stanford University. At the hearing, he made the point bluntly: "It is clear to me that creating a pathway for decryption only for good guys is technologically stupid. You just can't do that."
The frustration has shown up in recent legislation. The US House considered an appropriations bill for the Departments of Commerce, Justice, Science, and various agencies in the first week of June—just as the President signed the Freedom Act into law. An amendment put forward by Rep. Thomas Massie (R-KY)—with the support of other lawmakers such as Rep. Zoe Lofgren (D-CA)—sought to limit the National Institute of Standards and Technology (NIST) to using funds for consultation with the National Security Agency (NSA) only to improve security measures. It passed by an overwhelming majority of 383-43, and the bill now heads to the Senate.
In the wake of the Clipper Chip episode, when NIST announced its global challenge to replace the outdated Data Encryption Standard (DES) with a new Advanced Encryption Standard (AES), it designed a process that was open, transparent, and global. The process—and result—received notable praise from the cryptographic and open source communities.
While AES remains a viable algorithm against practical attacks, it will soon be time to consider the next generation. It will be essential that the process again be open, transparent, and global, and that the open source community participate actively.