The Very Sad State of Data Privacy

Over the last week, two major events pushed the topic of data privacy aggressively into the headlines. On both sides of the Atlantic, Facebook is facing harsh criticism due to a massive leak of 50 million users’ private data to Cambridge Analytica, a UK-based political consulting firm that was involved in Donald Trump’s campaign. At roughly the same time, Telegram was ordered by Russia to hand over its users’ encryption keys to the Federal Security Service (FSB) after its appeal to the Russian supreme court was rejected.

While completely unrelated, these two events highlight the very grim state of data privacy today (or the lack thereof). Using Internet services for free in return for comprehensive collection of private information that enables targeted advertising has come to seem natural. However, Facebook’s mishandling of user data is a stark demonstration of the real dangers to our society posed by the massive amount of personal data collected, combined with low standards for protecting it and its usage by third parties for illegitimate purposes. The nature and volume of the data collected is very worrying, and even includes call and text logs for Android users. This scandal shows no signs of dying down and may eventually be a turning point in how users perceive and value their data privacy.

The Telegram issue raises the significant challenge that individuals face in controlling their data in a globally interconnected world where governments use their power to obtain private data from digital services. While in some cases access to data for national security purposes may be necessary, there is not enough transparency in the process, and decisions regarding the balance between privacy and security are typically made in the shadows, to the extent that users are not even aware of subpoenas issued or government interference.

In light of these high-exposure events, there is a good chance that the General Data Protection Regulation (GDPR), slated to come into effect in Europe in May 2018, will be the harbinger of active discourse about data privacy and its value to users. GDPR is changing the game in that regard, with strict data protection rights for European citizens that are enforced with unprecedented fines of up to 4% of global annual turnover. A very important part of the GDPR is about “returning the control to the individual”, with requirements such as the following:

  • Require explicit user consent for the collection and usage of personal data.
  • Complete transparency and disclosure of data collected – upon a user’s request, she will be eligible to receive full disclosure of all of her personal data held by the service provider.
  • Allow the “right to be forgotten”, providing users with the assurance their data is completely removed from the service upon request.
  • Strict restrictions on the transfer of EU citizens’ data outside of the EU jurisdiction.

The GDPR is indeed a big step forward in protecting privacy. However, while it specifies these much-needed requirements on the part of enterprises, it doesn’t specify how to implement them. Implementation is far from trivial, especially for highly data-driven services that collect rich private information from end users and apply complicated processing. One approach to achieving compliance is granular, app-level encryption – where each user is associated with an encryption key. Accessing the user’s data would be impossible without using the key.
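To make the per-user key idea concrete, here is a toy sketch (not production cryptography – the hand-rolled stream cipher and the user name are purely illustrative). The key point is that deleting a user’s key renders all of her encrypted records unreadable, a technique often called crypto-shredding, which gives a practical handle on the “right to be forgotten”:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy SHA-256-based keystream, for illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# One key per user: every record of a user is encrypted under her own key.
user_keys = {"alice": secrets.token_bytes(32)}
record = encrypt(user_keys["alice"], b"alice's profile data")
assert decrypt(user_keys["alice"], record) == b"alice's profile data"

# "Right to be forgotten": destroying the key makes the data unrecoverable.
del user_keys["alice"]
```

A real deployment would use a vetted cipher (e.g. AES-GCM) and a proper key-management system, but the shape of the scheme is the same.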

However, even when using a tool like encryption, one may still question what guarantees exist for her data once it is collected by a service provider (SP), even one that claims to follow strict data privacy policies or a regulation like GDPR. Since the SP holds both the key and the data, it has all that is required to access the data at will, and the key itself can be compromised, resulting in a loss of control over the data. On its side, the SP would like very high assurance that compliance is met and that its users’ data is safe, even in light of threats such as data breaches or malicious insiders.

What could a better model look like? Today, when using a service like Facebook, there are only two parties engaged in a transaction: the SP and the user. The user gets a service for free, and “pays” in today’s ultimate currency – her data.

Facebook Data Transaction

Now, we add the privacy escrow, creating a three-party relationship. The privacy escrow must be a separate and independent entity that stands between the SP and the user, is integrated with the SP and has a direct interface to the end user. It is imperative to note that this is an escrow that is first and foremost trusted by the user (it is NOT a key escrow, i.e. an escrow that authorizes a third party to gain access to keys). Its role is to assure the user that data privacy is adequately protected according to a set of predefined policies, e.g. along the lines of the requirements mentioned above. Such an entity would need to provide assurances about its trust model and ideally be based in a neutral jurisdiction that is considered “privacy-safe”. The user will have absolute freedom of choice in selecting the escrow entity they want to use for their services and its jurisdiction.

Privacy escrow

Today, based on recent breakthroughs in cryptography, the technology to achieve such assurances exists and is within reach. Using advanced techniques such as multiparty computation (MPC), it is possible to efficiently share an encryption key between two or more parties without a complete key ever existing in the clear, creating a scheme in which none of the parties alone has the ability to obtain the key or use it. Applying this principle to the privacy escrow concept, the decryption key to the user’s data can be shared between the SP and the privacy escrow. Whenever the SP needs to use the data, it will have to use the decryption key, which will require communication with the privacy escrow and thus immediately trigger an audit event, including all the details of the transaction to the level of detail desired (e.g. what data was used, for what purpose). Neither the privacy escrow nor the SP ever holds the complete key, so assuming they are located in different jurisdictions, no force – including a single government subpoena – could lift the decryption keys or attempt to use the key unnoticed.
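The simplest building block behind such schemes is additive (XOR) secret sharing, sketched below as an illustration. Note that a real MPC protocol would perform the cryptographic operation on the shares without ever reconstructing the key in one place; this toy recombines it only to show that both shares are required, and that every use necessarily passes through the escrow, producing an audit trail. All names and the audit format are hypothetical:

```python
import secrets

def split_key(key: bytes):
    # Additive (XOR) sharing: each share alone is uniformly random
    # and reveals nothing about the key.
    share_sp = secrets.token_bytes(len(key))
    share_escrow = bytes(a ^ b for a, b in zip(key, share_sp))
    return share_sp, share_escrow

def combine(share_sp: bytes, share_escrow: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_sp, share_escrow))

audit_log = []

def escrow_assist(share_escrow: bytes, purpose: str) -> bytes:
    # The escrow releases its contribution only while recording
    # an audit event describing the access.
    audit_log.append({"purpose": purpose})
    return share_escrow

key = secrets.token_bytes(32)
share_sp, share_escrow = split_key(key)
assert share_sp != key and share_escrow != key  # neither share is the key

# The SP cannot use the key without involving the escrow,
# which logs the access.
recovered = combine(share_sp, escrow_assist(share_escrow, "ad targeting"))
assert recovered == key and len(audit_log) == 1
```

In a production system the "combine" step would be replaced by a threshold-cryptography protocol so the full key never materializes anywhere, but the trust property is the same: no single party can act alone or unnoticed.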

Trust model of a privacy escrow

While this constitutes a solid foundation for the trust model of a privacy escrow, there are of course additional challenges in establishing trust in such an escrow, and in the means for validating it, that remain to be discussed.

Now, while this concept offers clear advantages and a significant improvement in the assurance of data privacy, a great shift in our thinking needs to take place for things to change. Even following last week’s severe events, it may very well be that most end users will not be pushed far enough out of their comfort zone to stop paying for free services with their personal data and start paying money for services that keep their data private. Still, a public call for such privacy-conscious services may spawn a new generation of solutions that enable this, such as the privacy escrow. Alternatively, change may come from the SP direction, in response to regulations like GDPR that define very heavy fines for non-compliance, affecting EU and global companies alike. While the future is not ours to see, it will surely be very interesting to watch how it all plays out.

Oz Mishli

Oz is a cybersecurity expert, specializing in malware research and fraud prevention. He’s held both business and tech roles in the industry, and served in an elite intelligence unit of the IDF.