How the NSA is preventing another Snowden (and why you should do the same)

On December 11th, the director of the National Security Agency (NSA), Gen. Keith Alexander, stood before the Senate Judiciary Committee to discuss actions being taken in the wake of Edward Snowden’s disclosures.

Gen. Alexander didn’t disclose all of the preventative measures the NSA is taking to address the Snowden breach. However, the two measures he did highlight are ones your organization should also consider, particularly if you have a highly virtualized environment or are entrusting your data to a third party or public cloud provider.

1. Implement the ‘Two-Person Rule’ for sensitive operations

The U.S. military has long used this model for its most consequential operations, such as launching nuclear missiles (remember The Hunt for Red October and WarGames?). Under this rule, two separate individuals must turn their ‘keys’ to complete an operation, which ideally prevents a catastrophe if someone hits the wrong button after a 3:00 a.m. shift, or suffers a genuine break with reality.

National security is no longer defined by missiles. It is defined by data.

Edward Snowden was a systems administrator. In that capacity, he had broad access to networks, passwords and, of course, data. The lesson is that a single administrator with unchecked access is a single point of failure, and it is one most organizations need to take more seriously.

As we look at virtualization and the cloud, the lesson becomes even more important. Where traditional datacenters were built from air-gapped servers with separate hardware, applications, and administrators, the cloud collapses hundreds or even thousands of applications into one centrally managed infrastructure. This gives those who manage that environment, quite literally, the keys to the kingdom.

The what-ifs are nearly endless. What if an administrator decides to earn an extra bonus by selling source code or other proprietary company information? Would you even know the data had been copied? Adobe didn’t.

What if an administrator accidentally suspends or deletes a virtual machine running the company’s payment processing application? The cost and downtime implications alone are serious.

What if a malicious attacker used phishing to gain the credentials of your vSphere administrator?

Instituting policies such as the Two-Person Rule, or secondary approval, for sensitive administrative operations just makes sense. Even better, find tools that can help you automate the process. Ideally, you want to prevent damaging actions from being taken in the first place, but you also need to know when they are being attempted. This level of visibility will become the new norm in virtualized environments.
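
To make the idea concrete, here is a minimal sketch of a secondary-approval gate in Python. The operation names, approvers, and execute callback are hypothetical placeholders rather than any particular product’s API; in practice this logic would live in, or in front of, your virtualization management layer.

```python
# Sketch of a "two-person rule" gate: sensitive operations need a second,
# distinct approver before they run, and every attempt is logged.

SENSITIVE_OPERATIONS = {"delete_vm", "suspend_vm", "export_vm", "change_admin_role"}

def request_operation(operation, requester, approver, execute):
    """Run a sensitive operation only when a second, distinct person approves it."""
    if operation not in SENSITIVE_OPERATIONS:
        return execute()  # routine actions pass straight through
    # Log every attempt, so you know when sensitive actions are tried.
    print(f"AUDIT: {requester} requested {operation}, approver={approver}")
    if approver is None:
        raise PermissionError(f"{operation} requires secondary approval")
    if approver == requester:
        raise PermissionError("approver must be a different person than the requester")
    return execute()

# Hypothetical usage:
request_operation("delete_vm", requester="alice", approver="bob",
                  execute=lambda: print("VM deleted"))
```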

2. Use encryption

This time, it’s not just me advocating encryption! During his Judiciary Committee testimony, Gen. Alexander indicated that encryption is one of the technologies the agency will employ to add further security to the NSA’s infrastructure.

When you look at your cloud initiatives, especially if they involve use of the public cloud, make sure you are building in the appropriate security measures. Encryption, as long as it is properly implemented and includes a scalable, secure key management system that you control (not your cloud service provider), is one of the best methods to ensure data privacy.

As I have discussed in previous blogs, one of the many advantages of encryption is that it essentially makes the default state of your data secure. This means that you can build a security program from the inside out, rather than from the outside in, starting with what you are trying to protect in the first place: your data.
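
As a rough illustration of what “a key you control” means in practice, here is a minimal sketch using the open-source Python cryptography package. Key storage, rotation, and scalable key management are deliberately left out; the point is simply that the ciphertext can live with the provider while the key never does.

```python
# Client-side encryption sketch: data is encrypted before it ever reaches the
# cloud provider, and the key stays in a key management system you control.
# Requires the open-source `cryptography` package (pip install cryptography).

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this in YOUR key manager, not the provider's
cipher = Fernet(key)

record = b"cardholder: 4111-xxxx-xxxx-xxxx"
ciphertext = cipher.encrypt(record)   # this is all the cloud provider ever stores

# Without the key, a provider admin (or an attacker with their credentials)
# sees only ciphertext; with the key, you can always recover the data.
assert cipher.decrypt(ciphertext) == record
```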

Best practices for preventing insider threats in the cloud

Using administrative control and data encryption in tandem can go a long way toward preventing the kinds of threats we’ve witnessed with Snowden and other ‘insider’ incidents.

Consider defining a group of administrators who will be responsible for your virtualized infrastructure, and make sure you have appropriate two-factor authentication, access controls, policies and monitoring to define and enforce acceptable behavior. The administrators who manage your encryption policies should ideally be a different set of people. For example, the owners of the databases where sensitive information is held can use encryption to prevent any VM administrator from ever being able to access the data.
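
A simple way to keep yourself honest about that separation is to check it programmatically. The sketch below assumes you can export your VM-admin and key-admin group memberships (the names here are made up) and simply verifies that the two sets do not overlap.

```python
# Separation-of-duties check: the people who administer the virtual
# infrastructure must not be the same people who manage encryption keys.

def verify_separation_of_duties(vm_admins, key_admins):
    overlap = vm_admins & key_admins
    if overlap:
        raise ValueError(f"Separation of duties violated by: {sorted(overlap)}")
    return True

# Hypothetical group memberships pulled from your directory service.
vm_admins = {"alice", "bob"}
key_admins = {"carol", "dave"}

verify_separation_of_duties(vm_admins, key_admins)  # passes: no shared members
```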

Once you have locked down your admin control and your data, the next step on your path to a secure cloud may involve building a root of trust down to the hardware level. For example, today it’s possible to define known-good hosts using Intel’s Trusted Execution Technology (TXT). Imagine if you could tag your sensitive VMs and ensure that they run only on specified hosts, and within a specific geo-location.
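
Here is one way such a placement policy could be expressed, purely as an illustration. The host attributes are hypothetical; in a real deployment the “attested” flag would come from an attestation service backed by Intel TXT, and the geography from your inventory system.

```python
# Placement policy sketch: a VM tagged as sensitive may only start on a host
# that has passed hardware attestation and sits in an allowed geography.

ALLOWED_GEOS = {"us-east", "us-west"}

def can_place(vm, host):
    if not vm.get("sensitive"):
        return True  # untagged VMs carry no placement restriction
    return host.get("attested", False) and host.get("geo") in ALLOWED_GEOS

vm = {"name": "payments-db", "sensitive": True}
trusted_host = {"name": "esx-07", "attested": True, "geo": "us-east"}
untrusted_host = {"name": "esx-99", "attested": False, "geo": "eu-central"}

assert can_place(vm, trusted_host)
assert not can_place(vm, untrusted_host)
```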

I believe this level of control and visibility will make all cloud infrastructures dramatically more secure, paving the way to even greater adoption. What steps will you take in 2014 to secure your cloud?