The federal government’s march to the cloud has, at times, looked more like a cautious ballet than a march. While cloud-based projects are slowly rolling out, much of the agency emphasis is on private or community clouds rather than public providers. Security and data-handling concerns play a role in this "tiptoeing," but another reason is far more insidious: the fear of lock-in.
The Federal Shared Services Implementation Guide, the agencies’ blueprint to the cloud, makes it very clear that government entities engaging in cloud computing need a clear “exit strategy” for any as-a-service offering. It might seem ridiculous to plan a migration away from a technology before it is even implemented, but when it comes to the cloud, being able to get your data out is just as important as getting it in. It's about choice and control.
The best and brightest solutions... for now
Obviously, failing to have a clear cloud exit strategy leads agencies right into the same trap they have long tried to avoid. Being locked into a single proprietary cloud provider means the best vendor at the best price may no longer be an option, because the time and effort required to exit the incumbent's service is so high.
The "gravity" of data also is a problem. If an agency’s data now lives off-premise and can only be accessed through that environment, the agency's options have suddenly narrowed considerably. The provider now has a functional monopoly on that agency data, and new workloads requiring that data will naturally accrete to the incumbent. It is now significantly harder to move to some incredible new service or more efficient vendor.
Agencies that ignore these lock-in issues are betting that the cloud provider they choose right now is the right provider for the next five to 10 years. When it comes to any technology, but especially emerging technologies like cloud computing, this will always be a loser’s bet.
Because the choices made now effectively set an IT agenda for a decade, agencies should treat cloud computing not as a tactical choice but as a major architectural decision. That means having an exit strategy in place before they even think about putting federal data in a cloud.
Getting out: more important than getting in
When it comes to cloud computing (and any emerging technology, really), agencies need to focus far less on entry costs and make a defined exit strategy their guiding buying principle. In a perfect world, leaving a cloud would be as simple as entering it, but this is far from the case, especially in proprietary public clouds.
Not only can leaving a particular cloud provider be incredibly expensive, it can also jeopardize sensitive data. What if an agency’s data is stored in a proprietary format and extracting it will not be 100 percent lossless? What if extraction isn’t even possible? These problems aren't specific to the cloud, of course, but cloud environments make them significantly more complicated.
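One way to make the "lossless extraction" question concrete is a round-trip check: push a sample of records into the provider, pull them back out, and confirm nothing changed. The sketch below is a minimal illustration in Python; the `push` and `pull` callables are hypothetical stand-ins for whatever ingest and export tooling a given provider actually offers.

```python
import hashlib
import json

def checksum(record: dict) -> str:
    """Stable hash of a record, independent of key ordering."""
    canonical = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

def round_trip_is_lossless(records, push, pull) -> bool:
    """Push each record into the provider, pull it back out,
    and confirm the recovered copy matches the original."""
    for record in records:
        handle = push(record)     # provider-specific ingest (hypothetical)
        recovered = pull(handle)  # provider-specific export (hypothetical)
        if checksum(record) != checksum(recovered):
            return False
    return True
```

A check like this, run before any contract is signed, turns "can we get our data back?" from a negotiating talking point into a testable requirement.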
These issues need to be considered before jumping headfirst into the cloud. As such, agency IT decision-makers should use the following as a checklist to ensure that they have a clear exit strategy from the cloud:
- A standard operating environment, which should be in place well before cloud computing is considered. This gives consistency across operations and provides much more portable “interchangeable parts” for cloud and on-premise applications.
- Consistent management and monitoring strategies that allow agency IT teams to watch what’s happening in the cloud as easily as they watch on-premise activity. No matter where an application or its data lives, agencies need to be able to manage it consistently. This makes it much easier to add and swap services in the agency's portfolio, lowering the barrier to entry for new efficiencies and capabilities.
- Control over agency data, however clichéd the phrase, remains essential. Budgets should cover both putting data into a cloud and pulling it back out; a rough cost sketch follows this list.
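To make the budgeting point concrete, here is a minimal exit-cost sketch. The per-gigabyte egress rate, dataset size, and labor figures are hypothetical placeholders; real rates vary by provider and contract.

```python
def exit_cost_estimate(data_gb: float,
                       egress_rate_per_gb: float,
                       migration_hours: float,
                       hourly_labor_rate: float) -> float:
    """Rough cost of pulling a dataset out of a cloud:
    data egress charges plus the staff time to migrate it."""
    egress = data_gb * egress_rate_per_gb
    labor = migration_hours * hourly_labor_rate
    return egress + labor

# Hypothetical numbers, for illustration only:
# 50 TB of data at $0.09/GB egress, 400 staff hours at $85/hour.
print(exit_cost_estimate(50_000, 0.09, 400, 85.0))  # -> 38500.0
```

Even a back-of-the-envelope number like this, computed at procurement time rather than exit time, keeps the cost of leaving visible in every budget cycle.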
Despite these challenges, public clouds are useful for agencies, offering vast amounts of processing power and storage at a fraction of the on-premise cost. The key, however, lies in approaching the cloud in a way that gives agencies the power of public clouds and the relative safety of private clouds. That is why hybrid should be the way forward for federal cloud computing.
Go open, go hybrid
Hybrid clouds promise the best of both public and private cloud computing in a single package, coupling the power and elasticity of public clouds with the security and control of private ones. Creating a solution that links the two together through standardized operations, however, requires a level of transparency and flexibility that’s simply not available through proprietary technologies.
Enter open source, often touted as the building block for cloud computing and the Information Age in general. Open clouds allow the free movement of work from cloud to cloud by enabling unprecedented integration between offerings; rather than relying on a single vendor to deliver a monetized solution, open cloud technology is fueled by the wisdom and experience of a skilled community, one that exists at the bleeding edge of technological innovation.
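In practice, "free movement of work from cloud to cloud" comes down to coding against open interfaces rather than provider-specific APIs. A minimal sketch, assuming hypothetical on-premise and public-cloud backends:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Provider-neutral interface that agency applications code against."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class OnPremiseStore(ObjectStore):
    """Private-cloud backend (illustrative in-memory stand-in)."""
    def __init__(self):
        self._blobs = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

class PublicCloudStore(ObjectStore):
    """Public-cloud backend; a real one would wrap an open,
    standards-based API rather than a proprietary SDK."""
    def __init__(self, client):
        self._client = client  # any client exposing put/get (hypothetical)
    def put(self, key: str, data: bytes) -> None:
        self._client.put(key, data)
    def get(self, key: str) -> bytes:
        return self._client.get(key)

# Because applications depend only on ObjectStore, swapping providers
# (or running hybrid, with data in both) never touches application code.
```

The design choice here is the point: when the interface is open and the backends are interchangeable parts, the exit strategy is built into the architecture rather than bolted on afterward.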
Nearly every major web-scale service, from Google to Facebook, is built on open source, and to the extent that agencies use those same open-source tools, they can take advantage of all the engineering and research those organizations bring to the challenges of cloud computing. If federal agencies truly want to set themselves up for success in the cloud age, they need to build on open solutions as well.
In the end, "cloud" is just the stepping-stone to the next great technological breakthrough. By adopting an open, hybridized approach to the cloud, federal agencies can ensure that they’re ready for whatever that innovation is, whenever it might occur.
Originally posted on the GCN blog. Reposted with permission.