The dangers of pluralisation: A singular duty of care in the Online Safety Act

Rys Farthing
Lorna Woods

With Australia’s Online Safety Act under review, regulators should take heed of the UK’s experience in imposing duties of care on digital platforms.


22 April 2024

The progenitor of Australia’s Online Safety Act, the Enhancing Online Safety for Children Act, was world-leading when it was passed in 2015. The Act was an exemplar in its attempts to reduce harmful content on digital platforms through strong take-down powers and in establishing the world’s first dedicated digital safety regulator. More importantly, it showed that regulating platforms was indeed possible.

Despite successive updates and reforms, however, Australia’s legislative online safety framework has struggled to keep pace with the dynamics and growth of the digital risks Australians face online. The continued focus on harmful content, and taking it down piece by piece, overlooks the risks created by the increasingly sophisticated and powerful systems digital platforms deploy. These range from the risks posed by friend and content recommender systems to those posed by extended-use design features (which make platforms feel ‘addictive’), to name just a few. Not every risk online comes from content that can be taken down, and indeed taking down harmful content isn’t especially effective where recommender systems have already amplified it to the hilt. Australia is not alone in facing this challenge.

Given this, there was much excitement in April last year when the Government announced it was bringing forward the comprehensive review of the Online Safety Act, scheduling it for the first quarter of this year. And just last month, the terms of reference were announced, proposing a root-and-branch review of how to ensure the Act continues to protect Australians against the new scale and dynamism of digital risks. One specific proposal has caused much excitement across not-for-profits: whether the Act should introduce “a duty of care requirement towards users (similar to the United Kingdom’s Online Safety Act 2023 or the primary duty of care under Australia’s work health and safety legislation)”.

A duty of care requirement – placing broad obligations on digital platforms to safeguard end users – is an exciting possibility. It could replicate the strength of the take-down powers in the current Online Safety Act, but turn them onto the systems platforms deploy. No longer would we have a dual-track online safety act, with strong requirements for content take-down but less stringent requirements for safeguarding platforms’ systems and processes (such as algorithms and content moderation systems). We could have an online safety act that effectively creates legal responsibilities for platforms to address both content risks and systems risks. If platforms had a duty of care to their users, they would have an obligation to ensure that all their systems were safe, for example. This would largely be achieved by requiring risk assessments and the implementation of best-practice risk mitigation measures (see Carnegie’s submission). Such a systematic approach, requiring risk assessments and mitigation of systemic risks, replicates the approach of the EU’s Digital Services Act and the UK’s new Online Safety Act.

However, the experience of the UK is instructive. What began as a proposal for an overarching duty of care was eventually implemented as a series of overlapping duties of care, largely concerning illegal content, content that is risky to children and, for larger platforms, content that is risky to adults (see Table 1). This approach requires distinguishing between different types of content – such as criminal content, content harmful to children and content harmful to adults (for larger platforms) – and then attaching specific duties to each type of content.

Table 1: Duties of care in the UK Online Safety Act

All user-to-user services have duties regarding:

  • Illegal content risk assessments
  • Illegal content
  • Content reporting
  • Complaints procedures
  • Freedom of expression and privacy
  • Record keeping and review

All services likely to be accessed by children have duties regarding:

  • Children’s risk assessments
  • Protecting children’s online safety

The largest online services also have additional duties regarding:

  • Adults’ risk assessments
  • Protecting adults’ online safety
  • Protecting content of democratic importance
  • Protecting journalistic content

While this was inevitably the preferred approach for technology companies, as it reduces the breadth of their obligations, it has created “gaps” in protections for users. It is unclear, for example, how the UK’s Online Safety Act will address harms arising from overarching abusive designs that do not fall into a particular category of content, such as dark patterns that deceive users or abusive design techniques targeted at children.

It also introduces an unusual tension that stops the obligations from being truly systemic and preventative. A singular duty of care acknowledges that, for digital platforms, systems are developed and business decisions are made before those platforms are actually populated with content. Platforms decide how their content recommender systems will work, or how their moderation teams will be staffed, without knowing what content they will recommend or moderate on any given day.

A singular duty of care approach encourages platforms to safeguard these systems before any harm has occurred and before any designated content has been posted. However, implementing duties of care tied to particular sorts of content requires platforms to risk-assess their systems only after they are ‘populated’ with designated content. This seems at odds with the sort of “upstream” and preventative approach that a duty of care seeks to enable.

Implementing multiple duties of care, rather than a singular duty of care, moves the regulation away from a focus on systems and back towards specifying particular types of content. This skews the focus of compliance towards a content-first rather than a systems-first approach. This tendency was evident in much of the Parliamentary debate in the UK, which became very focused on which content would be removed and which would not.

As the Government explores the possibility of a duty of care in our Online Safety Act, this is definitely something to watch out for. The capacity of a duty of care to reduce online harms will be greatly hampered if it is splintered into multiple duties of care. The review team for the Online Safety Act is expected to release its first paper for public consultation shortly. The questions around a singular duty of care, or multiple duties of care, will hopefully be open to more interrogation then.

The review is being led by Delia Rickard, former Deputy Chair of the Australian Competition and Consumer Commission, and must report back to parliament by 31 October 2024.

Dr Rys Farthing is Director at Reset Australia and Associate Investigator at the Centre for the Digital Child. Reset Australia is an independent think tank, and the Australian affiliate of the global Reset initiative. We accept no funding from tech, and are funded by trusts and foundations, including Reset Global, Luminate and the Internet Society Foundation.

Lorna Woods is a professor in the Law School at Essex University and a member of the Human Rights Centre there. She received an OBE for her influential work at Carnegie UK that underpinned the UK government’s Online Safety Act.

