International Institute of Communications

Shaping the policy agenda: TELECOMMUNICATIONS • MEDIA • TECHNOLOGY

Dark Clouds?

It is clear that cloud computing offers society many potential benefits. However, its take-up is still being held back by much fear, uncertainty and doubt among not just potential cloud users, but also some policymakers and regulators. The position is exacerbated by the fact that current laws are not technology neutral. Indeed, arguably, European Union (EU) laws are being applied so as to discriminate against cloud computing, in part perhaps because of fears regarding US technology companies’ dominance in the cloud market and/or their excessive collection of EU residents’ personal data.
This article gives some illustrative examples, and argues that the situation needs reconsideration. While the focus is on EU laws, the ways in which they have been applied have broader relevance to technology neutrality generally.

Cloud – no one size fits all

Essentially, cloud computing involves the self-service use of IT resources over a network, scalable up and down with demand/need. Based on the US National Institute of Standards and Technology (NIST) service models, where the IT resource used is a software application, such as email, word processing, social networking, photo sharing or a customer relationship management application, the type of service is termed SaaS (software as a service). Where the IT resources used over a network comprise ‘raw’ computing resources, ie. computing infrastructure that may be used for storage, computation and/or networking functions, the type of service is termed IaaS (infrastructure as a service). Where the IT resources comprise a ‘platform’ for the development and deployment/hosting of a software application of the cloud customer’s own choice, the type of service is PaaS (platform as a service).
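The three models can be contrasted by who manages which layer of the stack. The sketch below is a simplification commonly used for exposition – the exact layer split is my assumption for illustration, not the formal NIST text:

```python
# Illustrative sketch only: which layers of the stack each party manages
# under the three NIST service models. The layer split here is a common
# simplification for exposition, not part of the formal NIST definitions.
SERVICE_MODELS = {
    # SaaS: a finished application, e.g. email or file sharing
    "SaaS": {"customer": ["data"],
             "provider": ["application", "platform", "infrastructure"]},
    # PaaS: a platform on which the customer deploys its own application
    "PaaS": {"customer": ["data", "application"],
             "provider": ["platform", "infrastructure"]},
    # IaaS: 'raw' compute, storage and networking
    "IaaS": {"customer": ["data", "application", "platform"],
             "provider": ["infrastructure"]},
}

def provider_manages(model: str) -> list[str]:
    """Return the layers the provider is responsible for under a model."""
    return SERVICE_MODELS[model]["provider"]

print(provider_manages("PaaS"))  # → ['platform', 'infrastructure']
```

The further down the stack the customer's responsibility extends, the less the provider can know about the data being processed – a point that matters when considering whether such providers should be treated as 'processors'.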

These resources are provided ‘as a service’ – customers need not be concerned with exactly how hardware/software infrastructure resources are marshalled behind the scenes to provide them with the requested service. Typically, public cloud involves the shared use, by separate customers simultaneously, of standardised commodity hardware or even software. The efficiencies and economies of scale, and resultant cost savings, that typify public cloud are enabled by this shared use (and the ability to redeploy underlying hardware/software for use by other customers, when one customer’s usage ceases).
So it can be seen that the term ‘cloud computing’ encompasses a huge variety of different services.

This means that a one-size-fits-all approach should not be taken to cloud. Although these services share some common characteristics, reflecting the cloud service delivery model, each type of service often merits separate consideration, particularly when it comes to regulation, because their differences may be as significant as their similarities, and policymakers and regulators need to take those differences into account in order to regulate them appropriately. Nowadays, in an attempt to future-proof laws against subsequent technological developments, policymakers and regulators often aspire towards technology neutrality. However, a core problem with many existing laws and regulations is that they are far from technology neutral. As data protection law issues often arise in the cloud context, examples from that field will serve well to illustrate many of the problems caused by laws that are not technology neutral – in this case the Data Protection Directive 95/46/EC, together with national implementing laws under the directive and regulators’ and courts’ interpretations of such laws.

Treated as ‘processors’

The first example is regulators’ insistence that many cloud providers must be treated as ‘processors’. Recall that, under the directive, data protection obligations (and liability) are imposed on the controller, the person who controls the ‘purposes and means’ of processing personal data. A controller may engage a processor to process personal data on its behalf, but the controller remains primarily liable, including for its processor’s actions or omissions in processing the data. Recall also that, under the directive, ‘processing’ is very broad, and includes merely storing personal data passively, or transmitting personal data mechanically. This means that, strictly, the regulators’ approach is correct: if a cloud service is used by a controller for processing any personal data, eg. file storage or sharing where the file contains personal data, then the provider is a ‘processor’, because it is at least storing personal data, even if only passively.

This also means that the directive’s rules governing the use of processors apply when a controller uses a cloud service to store or otherwise process personal data, including a requirement that the controller must ensure it has a contract with the processor whereby the processor agrees to comply with the controller’s ‘instructions’ in processing the personal data.
However, the processor provisions of the directive are based on 1970s outsourcing models. Then, and indeed in the 1960s, controllers used computer service bureaux, which were handed personal data (stored on magnetic tape or even punched cards) to process actively for the controller in accordance with the controller’s instructions, typically for payroll or accounts receivable processing. This is a far cry from a cloud provider, or indeed a non-cloud hosting provider, passively storing personal data which the controller uploads, operates on and retrieves in self-service fashion, using the provider’s software made available as part of the provider’s service, without any active involvement on the provider’s part. The analogy I suggest is that of cooking. If we liken the processing of personal data to the cooking of food, data protection laws assume that either you cook food yourself in your own kitchen (controller), or else you hire a caterer (processor) to cook food for you as per your instructions. But using IaaS, PaaS and certain SaaS cloud services is much more like renting a kitchen in which you then cook food yourself, or getting take-out or a ready meal which you then microwave yourself in your own kitchen. It seems obvious that laws intended to regulate the use of caterers would be difficult or impossible to apply to kitchen rentals or microwaving – they were not designed for the latter. So too with data protection laws’ processor provisions and cloud computing.

In particular, the contractual ‘instructions’ requirement makes no sense in self-service public cloud, which involves the use of standardised commodity resources that could not realistically be tailored to different customers’ individual instructions. If we look behind the instructions rule, its legislative objective was in fact to prevent unauthorised disclosure or unauthorised use by the processor. So, the policy aim of that rule could have been met, without needing to refer to any ‘instructions’, by requiring a contractual term prohibiting, more generally, any unauthorised use/disclosure by the processor (or by imposing a similar statutory prohibition). However, because that rule was based on outdated assumptions regarding outsourcing models/processes, cloud customers and providers are in the difficult position of either agreeing a meaningless contractual term, or breaching data protection laws.

Another unspoken assumption underlying the instructions rule is this: the rule assumes that processors must always have access to personal data in intelligible form. Again, that was certainly true in the days of computer service bureaux, which needed access to intelligible data to perform the functions for which they had been engaged, such as payroll processing. However, this assumption does not necessarily hold true in cloud computing, because with many types of cloud application, such as file storage, customers are able (if they so choose) to encrypt their data before upload to the cloud, such that the cloud provider has no access to the decryption keys. In such cases, it seems pointless to require the provider to follow the controller’s instructions regarding such data, or even to prohibit the provider from using or disclosing such data, because it cannot access intelligible data – no privacy risks arise from a provider that has no access to intelligible data, as it can’t disclose or misuse data that it can’t understand.
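The point can be made concrete with a minimal sketch. The toy cipher below is a one-time pad (XOR with a random key of equal length), chosen only because it needs no third-party library; a real deployment would use an authenticated cipher such as AES-GCM via a maintained cryptography library. What matters for the argument is that the key never leaves the controller, so the provider stores only unintelligible ciphertext:

```python
import os

def encrypt_otp(plaintext: bytes) -> tuple[bytes, bytes]:
    # Toy one-time pad: XOR each byte with a fresh random key byte.
    # Illustrative only; real deployments would use an authenticated
    # cipher such as AES-GCM from a maintained library.
    key = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt_otp(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# The controller encrypts locally, before upload; the key never
# leaves the controller's systems.
record = b"employee: A. Example; payroll no. 12345"
ciphertext, key = encrypt_otp(record)
# upload(ciphertext)  <- the provider stores only unintelligible bytes
assert decrypt_otp(ciphertext, key) == record
```

Whatever 'instructions' the provider were contractually required to follow about this data, it could not act on them in any meaningful sense: all it holds is random-looking bytes.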

Some might argue that a cloud provider should be legally obliged to follow any instructions given by the controller to back up the controller’s data. I suggest that this argument is misguided, particularly with encrypted data. Suppose that a controller of personal data decides to encrypt that data and then upload that encrypted data to a file storage service (cloud-based or not) offered by a service provider. The controller knows the nature and content of that data, and took the decision to use the storage service. The provider does not; it simply makes a storage service available for self-service use by customers, perhaps even as a free service. In terms of logic and fairness, who ought to be legally obliged to look after that data, such as by ensuring that the data are backed up to another service or to the controller’s own facilities, or even by paying the provider extra fees for a contractual commitment from the provider to backup that data elsewhere? Surely it should be the controller that is legally obliged to protect that data, not the provider, which has no idea what data are being stored by its customers using its service.

Forcing all cloud providers always to back up all their customers’ data at all times in all cases would be too blunt a requirement. It would interfere with freedom of contract and controller choice (as to exactly how it wants its data to be backed up and at what price), raise costs generally, and even be detrimental for data protection, as ideally data ought to be backed up with a different provider at a different (and preferably far distant) geographical location in case of a provider’s failure or insolvency or a natural disaster affecting the primary location. Although the EU’s proposed General Data Protection Regulation (GDPR) looks set to impose certain obligations and liabilities directly on processors, it seems unfair to do so in situations where the processor is unaware that the data are personal, because the data are encrypted and the processor has no access to the key.

Indeed, it’s arguable that, even if personal data are uploaded to the cloud in unencrypted form, with many types of cloud services (which I term ‘infrastructure cloud’), notably IaaS, PaaS and pure storage SaaS services, the provider would still be ignorant of the nature or content of data uploaded to its services – unless and until it ‘looks’. In most cases, it will not bother to look. An infrastructure cloud provider is most like a computer rental company. If you rent a computer from a rental company, then what you use the computer for, what type of data you process using that computer and how, is entirely your own business. No one would seriously suggest that the rental company must be treated as a processor for data protection law purposes, should you choose to process personal data using its computer. The same logic ought to apply to infrastructure cloud.

True, a rented computer is legally owned by the rental company, not by you, and the rental company could well plant spyware on the computer to monitor you and even read the data you process using its computer. But if it does so (as happened with Aaron’s, a computer rental chain in the US) then it would become a controller in its own right, and liable as such. However, the potential for a computer rental company to install spyware on its rental computers does not mean that all computer rental companies should automatically be treated as ‘processors’. And surely the same argument should apply to infrastructure cloud.
Going further, I argue that obligations should be imposed only on those with access to intelligible data, unless the policy decision is made to impose strict liability of some kind, such as for security measures. However, any such policy decision should be taken only after full consideration of the implications, including open discussion with all relevant stakeholders. 

Currently, infrastructure cloud services, as substitutes for buying/renting computing resources and provisioning/deploying app hosting services in-house, have a very important role to play in enabling innovation. A European technology startup seeking to become the next Facebook or Google, or some novel type of service we may not even have considered yet, is very likely to want to use IaaS or PaaS to service its end users, because infrastructure cloud services offer speed to market, low upfront costs, and high flexibility and agility. Many mobile apps are built on top of IaaS or PaaS services; for example, Finnish company Rovio’s popular Angry Birds game uses Amazon Web Services. Some cloud providers may well be processors in the active sense, depending on the type of service. But constraining the use of computing resources (in the form of infrastructure cloud services) by deeming cloud providers to be processors, even with infrastructure cloud services or when data are encrypted pre-upload, seems unnecessary and counter-productive. Tarring all cloud providers with the same ‘processor’ brush could even deter innovation.

Furthermore, the use of encryption by cloud customers (and indeed cloud providers, to prevent intelligible access by their sub-providers) should be encouraged by legally recognising that encryption may render data unreadable to unauthorised persons. Suppose you find a USB flash drive in the street but it contains encrypted personal data, so you can’t read it. Now, you do control the purposes and means of processing the data on that drive. Should you be treated as the controller of that personal data (that you don’t even know is personal data), with corresponding obligations and liabilities? If you give that drive to someone else to look after, should they become your processor? I argue not: surely legal obligations should only be imposed on those who can access intelligible data. Similarly, treating cloud providers as processors if they hold encrypted personal data, where they have no access to decryption keys, makes little sense.




Yet the proposed GDPR would impose obligations and liabilities on such providers as processors. Many non-technologists seem to mistrust encryption. Yet former US National Security Agency (NSA) contractor Edward Snowden, who revealed mass digital surveillance by the NSA and other intelligence/security agencies, has noted that encryption, used properly, could withstand “brute force attacks” even from powerful spy agencies: “Properly implemented algorithms backed up by truly random keys of significant length… all require more energy to decrypt than exists in the universe”.
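Snowden's claim can be given a rough physical footing via the Landauer limit; the figures below are my back-of-the-envelope estimates, not taken from his remarks. Even a thermodynamically ideal computer at room temperature must dissipate at least $kT\ln 2$ per elementary bit operation, so merely cycling a counter through the keyspace of a 256-bit key would require on the order of

```latex
E \;\gtrsim\; 2^{256}\, kT \ln 2
  \;\approx\; (1.2\times10^{77})(1.38\times10^{-23}\,\mathrm{J\,K^{-1}})(300\,\mathrm{K})(0.693)
  \;\approx\; 3\times10^{56}\ \mathrm{J},
```

vastly more than the Sun's entire lifetime energy output (of the order of $10^{44}$ J) – and that is before performing even a single trial decryption.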
Security experts such as Bruce Schneier believe that encryption, if adopted en masse by internet users for storage and transmissions, should not only help to protect data against theft or loss, but also make wholesale state surveillance of internet users more difficult and expensive. Policymakers need to recognise the critical role that technical measures such as encryption could play in protecting personal data, and encourage its use more widely. So where are the incentives for controllers (and processors) to apply encryption?

Use of sub-providers

As another example of non-technologically neutral laws adversely impacting cloud, consider data protection regulators’ approach to the use of sub-providers. If a SaaS service is provided using underlying IaaS or PaaS infrastructure, the IaaS/PaaS provider is treated as a sub-provider. Regulators want all sub-contracts between cloud providers and their IaaS/PaaS sub-providers to mirror the controller-processor contracts, complete with the (meaningless in cloud) ‘instructions’ requirement. Again, however, cloud is completely different from traditional outsourcing.
Suppose you outsource data processing to a third party service provider, which buys or rents computing resources (servers, storage appliances, networking equipment) to provide you with that processing service. Regulators would not require ‘mirror’ contracts from the vendors of such hardware infrastructure in that situation, so why do they require them from infrastructure cloud sub-providers? Requiring mirror contracts where computing resources are sourced from cloud infrastructure service providers, but not when those resources are purchased or rented for exclusive use in the classic equipment rental sense, discriminates against the cloud model.

If the justification for the different approach to cloud is that infrastructure cloud providers could access data processed using their services, the same could be said of computer rental companies, and equally hardware manufacturers/vendors could also install ‘backdoors’ in their equipment to access data processed using that equipment. Indeed, reportedly the NSA intercepted routers in transit to targeted destination companies, to install such backdoors.

So why is it that regulators don’t require controllers who use routers and other equipment for their own internal processing to check for possible backdoors? Why don’t they require controllers, when using non-cloud processors, to compel the processors to check the hardware they use (eg. servers in the processor’s own data centres) to process personal data on controllers’ behalf?
The EU directive and related laws do require controllers to take appropriate security measures, and also require that controller-processor contracts must oblige processors to take certain security measures. It could be said that those general security requirements would, or could, implicitly extend to such checks, so no explicit requirement is necessary.
However, if those general security requirements are considered sufficient to address the risks of backdoors in hardware used by controllers and non-cloud processors, why aren’t they also considered good enough to address the risks of cloud provider/sub-provider access? Why should mirror contracts be required from cloud sub-providers in addition? Couldn’t technical measures such as encryption, and specific contractual provisions (narrower than full mirror contracts), suffice to protect against such risks?

As mentioned, technology startups and other SMEs may wish to use IaaS/PaaS from cloud sub-providers for speed to market and cost-efficiency. However, SMEs rarely have the bargaining power to force large cloud sub-providers to enter into such mirror contracts, and the same is true of European cloud providers that base their offerings on the services of large sub-providers. Large cloud providers, which have more control over their supply chain, are far more likely to be able to obtain mirror contracts from their sub-providers, and are therefore better placed to offer law-compliant data protection processing. So, while regulators of course have the protection of data subjects in mind when insisting on mirror contracts, the (unintended) consequence is to favour large providers, most of which are not European. Has the impact of this approach on competitiveness and innovation been considered, as well as its effectiveness in achieving the underlying policy objective?




A related issue is the legal uncertainty regarding whether a data centre provider is or is not a cloud sub-provider, from which a mirror contract would also be required. Only the largest providers can afford to build their own data centres. Most providers, particularly SMEs, rent space/servers from third party data centre operators, many of which are large global organisations. If mirror contracts are required from such data centre operators, again SMEs are unlikely to be able to secure such obligations. Yet again, this approach seems to discriminate against cloud computing.

Similarly, suppose that, in a traditional outsourcing model, a controller engages as its processor a service provider that happens to use a third party data centre. Would a mirror contract be required from that data centre operator, and if not, why should one be required when the service provider uses the cloud model? The data centre operator’s position and rights/liabilities in relation to the service provider are unlikely to differ according to whether the provider’s service involves cloud or not. And are telecoms operators that provide connectivity to data centres to be considered sub-providers that must also sign mirror contracts? If this is required for cloud providers, why not for non-cloud services too?

Consumer issues

I have focused mainly on infrastructure cloud services, but SaaS also merits mention. Understandable concerns regarding the massive collection of EU residents’ personal data, particularly by internet companies and advertising networks, seem to have provoked strong reactions from policymakers and regulators – arguably including parts of the proposed GDPR, whose recitals state that “consent should not be regarded as freely given if the data subject has no genuine and free choice and is unable to refuse or withdraw consent without detriment”, and that “Consent is presumed not to be freely given, if it does not allow separate consent to be given to different data processing operations despite it is appropriate in the individual case, or if the performance of a contract, including the provision of a service is made dependent on the consent despite this is not necessary for such performance.”

Such concerns may also have influenced regulators’ attitude towards cloud. Indeed, reactions have been triggered by consumers too, including the increasing use of ad blockers: the Interactive Advertising Bureau recently admitted that “we messed up… we built advertising technology to optimise publishers’ yield of marketing budgets… our scraping of dimes may have cost us dollars in consumer loyalty.”
Consumers do enjoy some benefits from free, ad-funded services – cloud-based or otherwise – many of which use personal data in return for providing free services. It may be counter-productive to prevent such services entirely, as could be the result if the proposed GDPR’s recitals are taken to prohibit conditional consent altogether. Although that issue is not cloud-specific, again the difficult question is how to strike an appropriate balance: how to allow free services to be provided without excessive collection or use of consumers’ personal data. The recitals quoted may reflect policymakers’ understandable reaction against many free services’ excessive collection/use of personal data, but care must be taken if consumers are not to be deprived of free services altogether. A more granular exchange of personal data for services might be ideal, if it can be achieved in a way that is not too time-consuming or burdensome for consumers or service providers.

In summary, fears about personal data collection and tracking may well lie behind strict approaches to cloud computing. Furthermore, there seems to be an inherent tendency to assume that new things are risky and to be feared. However, it is important not to take a one-size-fits-all approach to cloud, and to bear in mind its potential uses for innovation. Policymakers and regulators need to be better informed about the technological, commercial and social environments in order to strike the right balance.

A question of balance

Conflicts between different rights and freedoms

Another vital issue is how to strike an appropriate balance between the different rights and interests which democratic societies strive to safeguard and foster, but which may in certain situations conflict. A classic clash is that between privacy/data protection and freedom of expression, or freedom to conduct a business or indeed European Union citizens’ freedom to work and provide services in any member state – all of which are enshrined as fundamental rights under the EU Charter of Fundamental Rights. Cloud brings these clashes to the fore.

Consider the policy objective of fostering e-commerce, to which end the E-Commerce Directive (2000/31/EC) introduced certain ‘notice and take-down’ defences for neutral intermediaries, whereby such intermediaries are liable only if they know about infringing material and do not remove it. A web hosting provider does not know the nature of the content hosted on its servers, unless it looks or is notified that the content is copyright-infringing. If it takes down the content on receiving such notice, this provides it with a defence against liability. However, the E-Commerce Directive does not apply to personal data. Therefore, if the content hosted is personal data, unlawfully posted to the website concerned, it seems that the hosting provider could be a ‘processor’ (and liable as such under the proposed GDPR). If so, neutral e-commerce intermediaries, cloud or otherwise, would be liable for personal data regardless of their knowledge or control.

Is this truly the intended policy objective? Have the full ramifications of such an approach been considered, such as increased costs for EU customers or even the potential withdrawal of certain services from them? The proposed GDPR would simply state that it is ‘without prejudice’ to the application of the E-Commerce Directive, which fails to resolve the uncertainty (‘B is without prejudice to A, but A shall not apply to B’ – so what is the law?). Policymakers should make it clear whether neutral intermediary defences will be available for personal data.

 

 
