In recent months the government has proposed cracking down on online anonymity. The idea is that attaching online posts to a person’s real name will reduce abuse and increase accountability.
Online bullying and misinformation are growing problems, and government action to address them is overdue.
However, limiting anonymity alone won’t make social media less toxic. It will work only if combined with broader reforms to platform design and business models, which drive polarisation, negativity, abuse and misinformation.
Reforms must also protect free speech and account for power imbalances between citizens and the state. The mooted changes come alongside suggestions of public funding for defamation actions by parliamentarians. Cynics might view these two suggestions together as an effort to silence reproach.
Potential anonymity reforms
In April this year, a parliamentary committee recommended requiring users to provide ID documents before opening social media accounts.
In September, the High Court held that media outlets can be liable for defamatory third-party comments on their social media posts.
Government comments indicate intent to further regulate online anonymity. Prime Minister Scott Morrison recently described social media as a “coward’s palace”, pressuring platforms to expose the identities of anonymous trolls.
Deputy Prime Minister Barnaby Joyce also criticised platforms professing to be “vessels of free speech” while enabling users to conceal their identities.
Reforms must be scrutinised to ensure they serve public rather than political interests. While the state stifling dissent may seem less of a concern in a democracy like Australia than in authoritarian regimes, it is important to ensure new measures won’t unreasonably compromise free speech and privacy.
In combination with Australia’s defamation laws, removing online anonymity may further expose users and chill democratic debate.
Complex drivers of toxicity
Anonymity is only one factor contributing to online toxicity.
Most current platforms are designed to maximise user engagement. Platform algorithms, in combination with human behaviour, mean negative and angry content outcompetes positive content. This promotes negativity, polarisation and extremism.
Research further shows sharing of political misinformation is driven by partisanship more than ignorance. Online polarisation therefore propels misinformation in aid of the culture wars.
For example, the COVID-19 hashtag “#Danliedpeopledied” was driven by hyper-partisan and fake accounts. An anti-vax “infodemic” now spreads online, propelled by tribal influencers and anti-vaxxer communities.
Online toxicity is exacerbated by social media’s addictiveness. Each “like” and comment gives users “a little dopamine hit”. Outrage and negativity equal more engagement, which means more dopamine rewarding the behaviour.
Tribalism can encourage group attacks, reinforcing tribal connection. Social media “pile-ons” can be devastating for the target. Such bullying would probably not occur in person. But online, we have fewer physical and visual cues to encourage empathy.
While some (especially anonymous trolls) find courage on social media, others are frightened off. Negative online encounters can create a “spiral of silence”, discouraging moderate users from participating. This creates more room for fringe voices emboldened by the echo chamber.
What reforms are needed?
Anonymity regulation will only help with bullying and misinformation if part of broader reforms tackling other drivers of toxicity, like engagement-driven polarisation. This means addressing platform business models and design – a complex task.
Reforms must also be fair.
If anonymity is regulated, it becomes even more crucial to ensure citizens are not gratuitously sued or threatened by politicians for voicing opinions online.
Protecting reputation and accuracy is important, but we must also safeguard fair debate. Politicians enjoy free speech bolstered by parliamentary privilege and media platforms.
Any anonymity regulation must be balanced by free speech protections, including more robust defamation defences accounting for power imbalances between citizens and the state.
Given their positions of power, politicians should accept a higher threshold of criticism.
- is a Senior Lecturer, Macquarie University
- This article was co-authored with Andrew Ball, who is an Associate Director at IT consultancy firm Accenture.
- This article first appeared on The Conversation