Got sensitive data in the public cloud? Encryption is essential, but there’s more to it.
Using the public cloud has become the de facto standard for many organizations, many of which use two or more public cloud service providers alongside on-premises data centers and private clouds. IT environments are now more distributed than ever before, with no clear perimeter whatsoever. Combined with regulations growing stricter, the importance of data security in general, and encryption in particular, has risen significantly.
And indeed, the major public cloud service providers (CSPs) have made encryption simple to adopt. With many native cloud services, data is encrypted either by default or with the click of a button. The challenge of key management is mostly abstracted away and handled by the CSP's native key management service.
Making encryption and key management easier and expanding the use of encryption is always welcome. However, the baseline provided by CSPs may not be sufficient for the confidential, regulated, or sensitive data that many organizations store (think banks, healthcare providers, government agencies, etc.).
There are three primary challenges associated with this:
First, from a data security point of view, a fundamental best practice is to store encryption keys segregated from the data they protect. Today, however, the CSP is by design a single entity holding both the data AND the encryption keys. While CSPs put great emphasis on secure operations and maintain tight controls over key management, such a setup may be problematic for highly sensitive data, as circumstances such as government warrants and subpoenas may force CSPs to disclose that data to government authorities. Over the course of the last year alone, the major US CSPs were served thousands of such warrants.
Secondly, compliance with data protection and data privacy regulations such as GDPR may require tight control of the data, and therefore ultimate control of the encryption keys to that data. Data shredding is a good illustration: suppose an organization uploads highly sensitive data to the cloud and is obliged to delete it after a certain period of time. How can it ensure the data was completely eliminated from every instance, backup, and server in all relevant regions? This is a difficult undertaking; some would argue it is impossible to guarantee. However, holding the master encryption key separately from the cloud makes it a whole lot easier: just delete the master key, and the encrypted data in the cloud is instantly shredded and cannot be recovered.
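The crypto-shredding idea rests on envelope encryption: a per-object data key encrypts the data, and a master key held outside the cloud wraps the data key. Destroy the master key and every wrapped data key, and thus every ciphertext, becomes unrecoverable. The sketch below illustrates this with a toy SHA-256 counter-mode keystream; it is illustration only, as a real deployment would use AES-GCM through a KMS or HSM, and all names here are hypothetical.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustration only --
    # production systems use vetted ciphers such as AES-GCM via a KMS/HSM.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_encrypt(key: bytes, plaintext: bytes):
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def xor_decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Envelope encryption: the data key protects the data,
# the master key (kept outside the cloud) wraps the data key.
master_key = secrets.token_bytes(32)   # held on-premises / in an external KMS
data_key = secrets.token_bytes(32)     # generated per object, stored in the cloud

record = b"highly sensitive customer record"
nonce, ciphertext = xor_encrypt(data_key, record)
wrap_nonce, wrapped_key = xor_encrypt(master_key, data_key)  # stored next to the data

# Normal access: unwrap the data key with the master key, then decrypt.
recovered_key = xor_decrypt(master_key, wrap_nonce, wrapped_key)
assert xor_decrypt(recovered_key, nonce, ciphertext) == record

# Crypto-shredding: destroy the master key. The wrapped data key -- and with it
# every ciphertext it protects, in every backup and region -- is unrecoverable.
master_key = None
```

The point of the design is that deletion happens at a single, controllable location (the master key) rather than across every copy of the data the cloud may hold.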
Lastly, from a data governance point of view, organizations with mature key management processes want to apply the same set of policies and rules to their sensitive data no matter where it resides: multiple public clouds, the private cloud, or on-premises. In that sense, the default option of using keys managed separately in each CSP's key management service creates inherently inconsistent data governance, a gap that grows as more and more sensitive data moves to the public cloud, making key management an even more complicated task.
Google, for example, is aware of this problem and recently announced that keys can be managed outside of GCP with Cloud External Key Manager (EKM). Cloud EKM lets you encrypt data in BigQuery and Compute Engine with encryption keys that are stored and managed in a third-party key management* system deployed outside Google's infrastructure, maintaining separation between data at rest and encryption keys.
*Unbound is the only third-party provider that lets you manage encryption keys outside of Google Cloud without the need for dedicated hardware. Learn more about how to control your own keys in GCP while still leveraging the power of cloud for compute and analytics.
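To make the EKM flow concrete, the sketch below shows roughly how an externally managed key is provisioned in Cloud KMS: create a key with EXTERNAL protection level, then point a key version at the key hosted in the external manager. The key ring, key names, region, and the external key URI are all hypothetical, and flag names should be checked against the current gcloud reference before use.

```shell
# Create a key ring and a key with EXTERNAL protection level in Cloud KMS.
gcloud kms keyrings create ekm-demo-ring --location=us-east1

gcloud kms keys create ekm-demo-key \
    --location=us-east1 --keyring=ekm-demo-ring \
    --purpose=encryption \
    --protection-level=external \
    --default-algorithm=external-symmetric-encryption \
    --skip-initial-version-creation

# Bind a key version to the key material held in the external key manager.
# Google never holds this key; it calls out to the external URI per operation.
gcloud kms keys versions create \
    --location=us-east1 --keyring=ekm-demo-ring --key=ekm-demo-key \
    --external-key-uri="https://ekm.example.com/v0/keys/my-master-key" \
    --primary
```

Services such as BigQuery can then reference this key as a customer-managed encryption key, while the actual key material remains outside Google's infrastructure.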