Platforms Are Public Spaces, So Let’s Treat Them Like One

14.02.2019

Applying a ‘duty of care’ provides flexibility as well as protection

When considering regulation of the internet, and specifically of social networks, two issues have challenged policymakers trying to reduce harm: whether platforms should be treated as publishers or as intermediaries (i.e. whether they are liable for the content of others), and how content can be monitored at scale. There is, however, another approach: to focus on what service providers are directly responsible for – the platform that each provides.

Platforms can be seen as forms of public space. Each carries its own expectations of appropriate behaviour and risks of participation, for which owners or controllers have a responsibility to their users. The developer of the platform controls the environment, since the code it produces enables everything that happens within it. Actions that take place on a platform result from corporate decisions about the software, the terms of service and the resources available to enforce those terms. So, just as ‘persuasive design’ can nudge users to act in one way or another, a service provider can also design in safety features to reduce the risk of reasonably foreseeable harm to its users.

At Carnegie UK Trust, we suggest that service providers should be under a statutory ‘duty of care’ to their users to provide an appropriately safe space. This does not mean that service providers must eradicate all harm. But they should take care to minimise it by adopting a precautionary principle focused on the risk of types of harm arising, rather than the details of causation.

This approach has a number of advantages. Specifying high-level objectives to safeguard the general public leaves room for service providers to act in a way that takes account of the type of service they offer, the risks it poses (particularly to vulnerable users) and the tools and technologies available at the time. The approach builds on the sector’s own knowledge base and allows for future-proofing: the steps taken are not prescribed but can change depending on their effectiveness and on developments in technologies and their uses. A final advantage applies to the content itself. Although the duty of care does not directly regulate content (though some national rules doing so may well remain in place), users might respond by changing their behaviours. This could mean that problematic content and behaviours (e.g. misogynistic abuse) do not become normalised and, perhaps, do not even arise in the first place.

In the emphasis on the architecture of the platform there are similarities with the ‘by design’ approach seen in data protection and information security (for example, in the EU General Data Protection Regulation). That approach, however, is often thought of as something ‘baked in’ at the beginning of service or product design and not considered thereafter. The statutory duty of care is different. While some decisions and structures may be fundamental to the service and therefore left unchanged, others may need to be revised, adapted or abandoned altogether. The duty of care is thus not a one-off action but an ongoing, flexible and future-proofed responsibility that can be applied effectively to fast-moving technologies and rapidly emerging new services.

Lorna Woods, Professor of Internet Law, University of Essex, UK