The Federal Government wants to compel tech companies to actively prevent harm to users of their platforms. 

The government has proposed a “Digital Duty of Care”, aligned with regulations in the United Kingdom and European Union, which would impose stringent, ongoing obligations on digital platforms to reduce risks associated with online content and activities.

“The duty of care will put the onus on industry to prevent online harms at a systemic level, instead of individuals having to ‘enter at their own risk’,” Minister for Communications Michelle Rowland says.

She said the duty represents a proactive shift away from “set-and-forget” content regulation, placing a stronger focus on prevention.

This duty of care would require tech giants such as Google, Meta, and X (formerly Twitter) to perform regular risk assessments. 

According to the government, these platforms would need to take “reasonable steps” to address potential online harms such as content promoting self-harm, cyberbullying, and other illegal activities. 

The proposal is part of a broader package to update Australia’s Online Safety Act, introduced in 2021, which the Albanese government reviewed a year early to ensure it remains relevant amid evolving online threats.

The government’s approach has gained support from advocacy groups, including the Human Rights Law Centre (HRLC), which welcomed the duty as a “first step” in addressing risks associated with social media platforms. 

“Digital platforms are eroding Australia’s democracy and getting away with treating people as products,” said David Mejia-Canales, a senior lawyer at the HRLC.

He argued for an expansive, rights-based duty of care that protects users’ rights, including freedom of expression, while enabling safer online environments. 

Mejia-Canales also cautioned against the government’s earlier proposed blanket ban on social media access for minors, arguing that such restrictions could inadvertently limit young people’s access to valuable online resources and communities.

Critics of the duty of care, however, have raised questions about its enforceability. 

Observers note that the UK and EU models are still being tested, with significant challenges in both monitoring compliance and determining what qualifies as “harm”. 

The European Union’s Digital Services Act, for example, allows consumers to submit complaints directly to digital platforms, and provides for fines of up to 6 per cent of global turnover for non-compliance.

Although Australia’s proposed laws align with these international standards, questions remain over how compliance will be enforced and what penalties may apply if tech companies fail to meet their obligations.

The proposed legislation forms part of a wider set of measures aimed at modernising Australia’s digital regulatory framework. This includes plans to introduce a minimum age limit for social media access, alongside other protections to curb misinformation and disinformation. 

CareerSpot News