Safeguarding Australia’s sovereignty with public digital infrastructure

Raffaele Ciriello

Digital technologies are evolving rapidly, offering enormous potential – but the platforms shaping this future are driven by profit, not necessarily the public interest. Securing Australia’s sovereignty over digital infrastructure is essential to building an online ecosystem grounded in democratic values.

23 April 2025

It’s 2035. In rural NSW, 13-year-old Ella learns from an AI tutor on her school-issued tablet. It tracks her mood, adapts lessons in real time and offers gentle encouragement when she struggles – a more constant companion than friends, teachers or even parents. Meanwhile, in suburban Melbourne, 16-year-old Jayden receives a daily feed of AI-curated “manosphere” content: misogynistic rants and conspiracy theories, delivered by social media algorithms.

Ella and Jayden are growing up under the influence of powerful digital platforms that operate beyond Australia’s regulatory reach. Their stories highlight a deeper challenge: without sovereign public digital infrastructure, foreign tech giants could shape the emotional and ideological development of young Australians – with little transparency and no public accountability to our communities.

Today, Australia faces a consequential decision: will it protect its sovereignty over the digital systems our youth depend on, or become a digital colony governed by offshore tech giants?

The hidden cost of imported infrastructure

Australia’s digital infrastructure is largely imported. From classrooms to bedrooms, young people spend hours each day on YouTube, Instagram, TikTok, ChatGPT, DeepSeek and Character.AI – platforms that are all owned and controlled by foreign companies.

These platforms are now central to how children learn, connect and grow. But unlike schools, libraries or public broadcasters, they largely escape Australian oversight. It is corporate executives overseas – not Australian citizens and regulators – who decide how algorithms function, what content is promoted, which values are embedded and how data is harvested. The priorities are engagement and profit, not the public good.

The societal harms are already surfacing. Algorithms tend to push boys toward extremist content, fuelling misogyny and civic disengagement. Deepfakes have been used to target schoolgirls and female politicians with synthetic abuse. In the United States, a family is suing a chatbot company after their son died by suicide following conversations with his AI “girlfriend”. Influencer-driven podcasts – part of a fragmented, largely unregulated media infrastructure – are amplifying polarisation and normalising extremist ideologies. Anyone with a computer and an internet connection can now shape public discourse at scale.

In schoolyards, AI companions are gaining ground – from homework helpers to emotional confidants – but their providers often exploit loopholes in fragmented regulations. Researchers have raised concerns about AI-driven misinformation and psychological risk in educational settings.

These are not isolated incidents. They signal a systemic inability to protect the public interest in the digital spaces where young Australians increasingly live. If we continue outsourcing these formative environments to unregulated foreign platforms, we risk losing control over youth wellbeing, democratic cohesion and civic development. Digital sovereignty means ensuring the systems that shape our next generation reflect Australian values – not Silicon Valley’s business model.

To address this challenge, Australia needs a clear and coordinated strategy. Just as we regulate schools, media and public health, we must establish a national strategy for digital infrastructure that treats youth-facing platforms as essential services governed in the public interest.

The case for Australian digital sovereignty

Safeguarding Australia’s digital sovereignty is as critical as safeguarding essential services such as water, electricity and broadband internet. AI companions and algorithmic content distributors are no longer exotic gadgets but foundational to how young Australians learn, communicate and form beliefs.

Ella’s AI tutor shows the potential: personalised support, continuous feedback and learning tailored to her needs – benefits even the most committed teachers struggle to provide in strained school systems. France has already introduced a national AI tutor, “MIA”, to support Year 9 students with maths and literacy. China has embedded AI literacy in its national curriculum, starting in Year 1.

But without regulation, these tools carry serious risks. An unregulated provider can misappropriate sensitive information, use psychological manipulation techniques, and alter or withdraw, at will, features that young users have come to rely on emotionally.

California has responded with Senate Bill 243, which bans addictive engagement features, restricts deceptive marketing of AI as human-like, and requires companies to detect self-harm risks and refer users to professional support. In the words of State Senator Steve Padilla: “Our children are not lab rats for tech companies to experiment on at the cost of their mental health.”

But Australia has no equivalent protections.

Jayden’s experience reveals a similar risk. He did not seek out extremist content; the algorithm pushed it to him. Social media platforms can quickly funnel young boys into the manosphere, amplifying sexist and conspiratorial content for ad revenue. As a result, teachers across Australia report rising hostility towards girls, linking it to figures such as Andrew Tate, whose content is amplified without local oversight.

This corrodes our civic fabric. Australia has strong anti-discrimination laws and education standards but multinational platforms often evade local regulation or delay enforcement, embedding offshore values into our local digital spaces.

Other jurisdictions are moving ahead. The EU’s Digital Services Act and AI Act enforce algorithmic transparency. New Zealand’s Algorithm Charter embeds fairness and Māori data sovereignty in automated decision-making systems. China’s AI+ strategy prioritises technological self-reliance and national skill-building.

Australia’s approach, however, remains fragmented. Digital regulation spans multiple agencies, including the eSafety Commissioner, the Australian Communications and Media Authority, the Office of the Australian Information Commissioner and the Australian Competition and Consumer Commission.

In the absence of a unified national strategy, their responses remain piecemeal – as seen in the inconsistent bans on DeepSeek across federal and state levels. Without such a strategy, we are policytakers, not policymakers, in our own digital territory.

A national strategy for public digital infrastructure

Like water, electricity or broadcasting, the digital platforms young Australians rely on every day should be treated as essential infrastructure – publicly governed, ethically designed and locally accountable. A national strategy should build on five pillars:

Pillar 1: Mandate essential safeguards

Just as we certify playgrounds and toys, youth-facing digital platforms must meet minimum safety standards. AI companions and algorithms should be independently audited for social risks – including the potential to incite harm, promote hate or foster emotional dependency. Essential safeguards should include crisis detection, protections against manipulation, and clear limits on data use. The EU’s AI Act and California’s Senate Bill 243 offer strong models. Australia can lead by making such safeguards mandatory before harms escalate.

Pillar 2: Offer home-grown public alternatives

Just as public broadcasters provide trusted information in a fragmented media landscape, we need publicly governed digital alternatives. This starts with sovereign digital infrastructure – national data centres, AI labs and education-focused platforms. In youth services, this means AI tutors offering culturally appropriate content without surveillance capitalism.

Pillar 3: Create a digital youth ombudsman

Young Australians need an independent advocate. A digital youth ombudsman would investigate complaints, audit youth-facing platforms and intervene when harm occurs. This role could complement – or be embedded within – the eSafety Commissioner, but with an expanded mandate focused on AI and content systems affecting children and teens. Similar roles exist in other jurisdictions, such as the Children’s Commissioner for England, reflecting a growing international policy shift toward statutory protection for young people in digital environments.

Pillar 4: Embed First Nations co-governance

True sovereignty must include First Nations voices. Safeguarding Indigenous data sovereignty and Indigenous knowledge systems is essential. The Commonwealth’s Framework for Governance of Indigenous Data is a good start. The next step is to establish an Indigenous AI advisory council to guide how technologies serve Country, community and culture.

Pillar 5: Strengthen international cooperation

Digital sovereignty requires global alliances. As a baseline, Australia should play an active role in implementing the UN’s Global Digital Compact, which sets shared international principles for an open, safe and inclusive digital future – particularly for children and young people.

Beyond this, Australia should position itself as a constructive alliance-builder in an increasingly fractured international system. The US has taken a fragmented, protectionist path. China’s model is more strategic but centralised. As a trusted middle power with a record of entrepreneurial multilateralism, Australia can help shape an alternative: a flexible, values-based approach to AI governance developed with partners including the EU, New Zealand and Canada. Such a model would be globally informed, locally adaptive and aligned with democratic values. 

Towards a sovereign digital future for Australia’s youth

Australia stands at a crossroads:

  1. The business-as-usual path would let foreign tech giants continue to manipulate young Australians’ emotions, beliefs and behaviours for profit – as with the unregulated rise of social media.
  2. The digital sovereignty path treats youth-facing platforms as critical infrastructure – subject to public oversight, Australian values and enforceable standards. If foreign platforms infiltrate Australian childhoods, they must meet Australian standards. Digital sovereignty means democratic control over the systems that shape our future.

With strategic leadership and sustained investment, Australia can build public digital infrastructure that protects mental health, strengthens civic cohesion and reflects our national character. We owe future generations a digital world built on authentically Australian values: fairness, inclusion and community.

Dr Raffaele F. Ciriello is a Senior Lecturer in Business Information Systems at the University of Sydney. His research focuses on compassionate digital innovation, ethics and the societal impacts of emerging technologies. He advises public and private organisations on responsible AI governance and digital infrastructure strategy.

Image credit: Funtap from Getty Images
