Chris Riley is Head of Public Policy at Mozilla, where he works on open Internet policy initiatives and developments. I had a chance to catch up with Chris about a new effort launched at Mozilla called the Secure Open Source Fund. The goal of the Fund is to support security audits and remediation for open source software projects.
Chris, tell us about yourself and Mozilla.
Thanks Mark. Mozilla is a unique institution—it's both a nonprofit mission-driven organization and a technology industry corporation. We build open source software (most notably the Firefox Web browser) and we are champions for the open Internet in technical and political fora. We've been a global leader on well-known policy issues like privacy and net neutrality, and we're also very active on most of today's big topics including copyright reform, encryption, and software vulnerabilities.
I have the distinct privilege of running Mozilla's public policy team. My career has been built on Internet policy from a law and technology background. (I'm one of a rare few holding both a Computer Science PhD and a JD, in fact.) I sometimes like to say that I'm a disruptive Internet advocate, and you can draw the internal parentheticals grouping that however you like.
I understand that Mozilla has recently launched something called the Secure Open Source Fund. Tell us about it.
The Secure Open Source Fund, or SOS Fund, is a new effort at Mozilla to support security audits and remediation for open source software projects. It's part of our broader Mozilla Open Source Support (MOSS) program, which is focused on supporting the open source and free software movement. Open source software is at the core of the Internet, and much of it is developed without institutional support. In the context of security, bug bounty programs aren't enough. They work well for Mozilla and other large organizations, but many open source projects have no resources to pay bounties, or to fix the problems that arise.
So, building on our history as one of the first institutions to create a bug bounty program, we created the SOS Fund to help fill that gap. Our approach is to work with an external auditing firm to conduct a point-in-time evaluation of the project, and then offer support to the project maintainer to manage remediation. Finally, we work with the original auditor to verify the changes that were made.
Defense is harder than offense when it comes to software security, yet our economy depends on secure networks and systems. We all need to step up and contribute to securing the Internet for the future, and with the SOS Fund, we've contributed some resources of our own. We encourage others to join us in this movement. For our part, we are continuing to provide support to organizations, and learning lessons along the way to optimize our practices and processes.
You can learn more about the program on the Mozilla blog.
What has been your biggest success so far?
We've learned a lot from the first handful of projects we've supported. I think the biggest success is that the program identified critical and high-rated vulnerabilities in its first two iterations, and the projects then fixed them. But the best story comes from the audit of a JPEG image processing library. Our audit identified two issues that were ultimately determined to stem from the JPEG standard itself. In other words, any JPEG image processing library that complies with the reference standard, whether open source or not, and regardless of source or context, is subject to the same (low-rated) vulnerabilities. Specifically, certain kinds of legitimate images trigger CPU or memory overconsumption, potentially leading to system crashes.
It's important, and interesting, that these vulnerabilities were previously unknown. So, in the spirit of openness that pervades our work and this project, we published a writeup so that others could fix their implementations as well.
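To make this class of issue concrete for developers who consume JPEG decoders in their own applications, here is a minimal sketch of one common application-level guard against decode-time resource exhaustion. It is illustrative only: it uses Python's Pillow library, the pixel budget and the helper name load_jpeg_safely are assumptions made for this sketch, and it is not the specific fix described in the writeup.

```python
# Illustrative sketch (not from the Mozilla writeup): bound the work a
# JPEG decoder is allowed to do before fully decoding untrusted input.
from PIL import Image  # pip install Pillow

MAX_PIXELS = 64_000_000  # ~64 megapixels; an assumed budget, tune as needed

def load_jpeg_safely(path):
    """Reject images whose declared dimensions would force an outsized
    allocation or decode effort, then decode within the budget."""
    with Image.open(path) as img:      # parses the header; decoding is lazy
        width, height = img.size
        if width * height > MAX_PIXELS:
            raise ValueError(f"refusing to decode {width}x{height} image")
        img.load()                     # full decode happens only after the check
        return img.copy()              # detach the decoded image from the file
```

For what it's worth, Pillow ships a comparable built-in safeguard (Image.MAX_IMAGE_PIXELS) that warns on suspiciously large images by default; the point of the sketch is simply that callers, not just library authors, can bound the resources a standards-compliant decoder is allowed to consume.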
Mozilla is at the forefront of open source software leadership. Share with us some future issues catching your attention.
Thank you for that! We like to think we're built on, and for, openness quite broadly: through our mission and our policy work, in addition to, and through, open source software.
Building the open future of the Internet continues to be crucial for us, and we're seeing "good" trends like the global adoption of strong net neutrality protections, with the United States and Europe following on earlier victories in Latin America. On the other hand, we're seeing efforts like the European Commission considering adoption of "ancillary copyright law" and other changes that would put normal technology practices like hyperlinking in jeopardy.
One area of focus for us now is the Vulnerabilities Equities Process (VEP), the U.S. government's process for reviewing and coordinating the disclosure of vulnerabilities that it learns about or creates. A couple of years ago, White House Cybersecurity Coordinator Michael Daniel wrote in a blog post that the Obama Administration has a presumption in favor of disclosing vulnerabilities, and he listed some criteria he personally considers when engaging in the process. But policy by blog post is not particularly binding on the government, and as Daniel himself admits, "there are no hard and fast rules" to govern the VEP.
Since that post, we haven't seen improvement in the way that vulnerability disclosure is being handled. One recent example is the alleged hack of the NSA by the Shadow Brokers, which resulted in the public release of NSA "cyberweapons," including vulnerabilities that the government knew about and apparently had been exploiting for years. Cybersecurity is a shared responsibility, and that means we all must do our part: technology companies, users, and governments. The U.S. government could go a long way in doing its part by putting transparent and accountable policies in place to ensure it is handling vulnerabilities appropriately and disclosing them to affected companies. We aren't seeing this happen today. Yet, with some reforms, the VEP can be a strong mechanism for ensuring the government is striking the right balance.
More specifically, we recommend five important reforms to the VEP:
- All security vulnerabilities should go through the VEP and there should be public timelines for reviewing decisions to delay disclosure.
- All relevant federal agencies involved in the VEP must work together to evaluate a standard set of criteria to ensure all relevant risks and interests are considered.
- Independent oversight and transparency into the processes and procedures of the VEP must be created.
- The VEP Executive Secretariat should live within the Department of Homeland Security, which has built up significant expertise, infrastructure, and trust through existing coordinated vulnerability disclosure programs (for example, US-CERT).
- The VEP should be codified in law to ensure compliance and permanence.
This is an area we'll be focusing on in the months to come, as these changes would improve the state of cybersecurity today.