New draft codes from tech companies could pose risks to Australian children. A better approach—led by government—is needed instead.
New research released this week, to coincide with Human Rights Day, shows that the best way to advance children’s rights online is strong regulation of safety and privacy. For too long, Australia has allowed the tech industry to write its own rules on online safety and privacy.
Even our world-leading Online Safety Act ultimately allows industry to draft the codes that outline exactly how the Act will realise basic safety expectations for Australian children. It’s time for the Australian Government to ditch this approach—called co-regulation, where industry drafts a code and then a regulator oversees it—and allow regulators to draft robust codes themselves like they do in most other countries.
Recently, we witnessed just how badly this co-regulatory approach could fail children and young people. In October, we saw the draft Online Safety Codes written by industry representative groups, which are now before the eSafety Commissioner, who will decide whether to register them under the Online Safety Act. Comparing these drafts with similar codes written directly by regulators or legislators highlights the failings.
Codes drafted by regulators drive up safety and privacy standards for children
Codes written by regulators and legislators in the United Kingdom, Ireland and California have served to drive up safety and privacy standards for children. For example, these regulator-written codes all insist that where a digital platform allows children under the age of 18 to open an account, those accounts must have privacy settings “turned up to the max”. And this has translated into action. Every time a 17-year-old opens an Instagram account in the UK, it defaults to a private account. When they open a new account on TikTok in Ireland, the app sends them a nudge and a simple one-click pop-up notification to go private. Neither happens in Australia.
Alarmingly, the industry-drafted codes drive these standards down, meaning that the lesser protections afforded to Australian teens could be enshrined in regulation. The draft codes released in October propose having privacy settings “turned up to the max” only for those aged 15 and under. The Australian code deliberately proposes no such privacy protections for 16- and 17-year-olds, in contrast to regulator-drafted codes (see Figure 1).
Figure 1: Age under which young people’s accounts must ‘default to private’
|  | Who wrote the code? | On social media | In online games |
| --- | --- | --- | --- |
| UK | Regulator drafted, passed by legislators for extra teeth | 18 | 18 |
| Ireland | Regulators | 18 | 18 |
| California | Legislators | 18 | 18 |
| Australia | Industry | No minimum stipulated (but we assume 16) | 16 |
Privacy settings are important for children and young people. Meta, the parent company of Facebook, has outlined the critical value of private accounts, stating:
“Wherever we can, we want to stop young people from hearing from adults they don’t know or don’t want to hear from. We believe private accounts are the best way to prevent this from happening. So starting this week, everyone who is under 16 years old (or under 18 in certain countries) will be defaulted into a private account when they join Instagram.”
It is extremely disheartening to see that only 16- and 17-year-olds in “certain countries” will be protected from unwanted contact with adult strangers. Protection for children in “certain countries” appears to be reserved for where strong regulation, written by regulators and legislators, demands it.
The impact of this on young people’s lives cannot be overstated. Where a young person’s account is private, they are not recommended to strangers as “friends” or accounts to “follow”. An employee at Meta noted, in files leaked by Frances Haugen and used in court cases, that 75 per cent of all “inappropriate adult-minor contact” (i.e., grooming) on Facebook was, at one point, a result of its “People You May Know” friend recommendation system.
Likewise, requirements around children’s precise geographic location are much weaker in the industry-drafted Australian codes than in the codes written by regulators. Where regulators have written codes in the UK, Ireland and California, they stipulated that companies must not collect children’s precise geolocation data (GPS data) by default. In Australia, industry has again driven this standard down, proposing only that location data must not be broadcast by default (see Figure 2).
Figure 2: Protections for children’s precise location (GPS Location)
|  | Who wrote the code? | On social media | In online games |
| --- | --- | --- | --- |
| UK | Regulators/legislators | Must not collect by default | Must not collect by default |
| Ireland | Regulators | Must not collect by default | Must not collect by default |
| California | Legislators | Must not collect by default | Must not collect by default |
| Australia | Industry | Must not broadcast by default | Must not broadcast by default |
While the difference between “collection” and “broadcasting” may seem pedantic, it is not. Stopping a social media company from broadcasting a child’s location is a significantly weaker step than preventing the company from harvesting kids’ location data in the first place. It overlooks the risks that emerge from digital services storing troves of young people’s GPS data, including security breaches. After the Optus data breach, it is unrealistic to suggest there are no risks in digital service providers holding detailed personal information like children’s GPS data ad infinitum. It also ignores the commercial harms of allowing online services to harvest this data, including location-targeted advertising to children. Targeted advertising to children is poised to be banned in the EU as regulators and legislators drive up protections, yet it is another precaution driven down by Australia’s draft codes.
The public does not support co-regulation
Recent polling suggests that the Australian public does not find co-regulation acceptable. A poll of 1,508 adults found that 71 per cent did not trust the social media industry to draft codes in general; 73 per cent thought regulators should draft codes on children’s online safety, and 76 per cent said regulators should draft any codes on young people’s online privacy.
Likewise, only 14 per cent of teenagers polled in April this year said they trusted social media companies to write the rules about online privacy.
Civil society seems equally unimpressed. “The reliance on self- and co-regulation in the past has demonstrably failed many of us, including children and young people, and new approaches are required”, said the Australian Children’s Rights Task Force in a statement released on Human Rights Day.
Where regulators draft codes, children are better protected
Allowing industry to draft its own codes must be understood as part of a broader systemic failure to effectively regulate the digital world. Where Australia, and other countries, have relied on self- and co-regulation, we have allowed tech companies to dictate their own terms and practices. The multiple failings—from those highlighted by Frances Haugen to those uncovered by the ABC—have demonstrated that the tech industry does not consistently prioritise the interests of children.
Nothing prevents the tech industry from improving online safety; it chooses not to. Companies choose not to extend to all children the safety precautions they are required to turn on in countries where regulation demands it, such as defaulting under-18s’ accounts to private or not collecting geolocation data.
Co-regulation appears to be a loophole that could see weaker protections for Australian kids enshrined in our regulatory framework, hamstringing our regulators and harming kids in the process. By contrast, countries where regulators have written their own codes have seen safety and privacy standards for children comprehensively driven up.
As far as we know, few other countries allow co-regulation, and most that do are in Africa. Here too, regulators are rapidly moving away from this faulty approach. At a meeting of African data regulators in November, one of the four members of South Africa’s Information Regulator noted their broad resistance to industry-drafted codes, outlining that the last time the South African regulator registered an industry code, it took the threat of a court case for the industry to finally get it registered. Regulators present, from Uganda to Mauritius, said they would not even consider the approach. It is past time for Canberra to move on too.
The draft Online Safety Codes, written by industry, should not be registered by the eSafety Commissioner. Instead, the eSafety Commissioner herself should be empowered to draft a robust code that adequately protects Australian children and drives up safety standards. Likewise, proposals for improving children’s privacy online should be written by the Information and Privacy Commissioner herself.
If we want to see real improvements, it’s time we stopped letting social media companies write their own regulations, and let our regulators get on with it.
Dr Rys Farthing is Director at Reset Australia and Associate Investigator at the Center for the Digital Child. Reset Australia is an independent think tank, and the Australian affiliate of the global Reset initiative. We accept no funding from tech, and are funded by trusts and foundations, including Reset Global, Luminate and the Internet Society Foundation.
Judith Bessant AM is a professor at RMIT University. She researches and writes in the fields of politics, youth studies, policy, sociology, media-technology studies and history.
Image credit: Getty Images