Online violence and misogyny are still on the rise – NZ needs a tougher response
Relying on voluntary commitments to reduce online harm puts New Zealand out of step with other countries with legally enforceable rules to protect social media users’ safety.
Cassandra Mudgway, Senior Lecturer in Law, University of Canterbury
According to reporting from RNZ, the Human Rights Commission told NZ Tech and Netsafe that social media companies X Corp. and Meta had failed to protect former prime minister Jacinda Ardern from misogynistic and dehumanising violence on their platforms.
The commission’s claim that the Code of Practice for Online Safety and Harms was not fit for purpose apparently drew a sharp legal response from the agencies, which argued the commission showed bias and had overstepped its remit.
But the incident raises important questions New Zealand has yet to grapple with properly.
Established in 2022, the code is a voluntary set of commitments co-designed with the technology industry, including some social media companies such as Meta and X Corp.
Companies become signatories to the code and agree to its commitments. The current signatories are Meta, Google, TikTok, Twitch and X Corp.
Among other provisions, the code asks signatories to take steps to reduce harmful content on their platforms or services, including harassment (where there is an intent to cause harm), hate speech (which includes sexist hate speech), incitement of violence and disinformation.
The code is not legally enforceable. Compliance relies on willingness to adopt such measures. But there is an accountability structure in the form of an oversight committee. The public can lodge complaints with the committee if they believe signatories have breached the code, and the committee can remove a signatory from the code.
When it was launched, the code received some international acclaim as an example of best practice for digital safety. But its critics argued that because it was co-written with social media companies, the commitments were not as strong or effective as they might have been.
Jacinda Ardern was the target of extreme levels of online misogyny and violent rhetoric. Hagen Hopkins/Getty Images
This raises obvious questions about the code’s effectiveness. Since the Human Rights Commission cited the extreme online violence directed at Jacinda Ardern, former Green Party MP Golriz Ghahraman has also spoken about the violent online misogyny and racism she experienced while in office.
The human rights implications matter because the New Zealand government has legal duties under international treaties to prevent online gender-based violence.
In its current form, the code is not effective. Its commitments aim to reduce harm rather than eliminate it, and it is not comprehensive about the kinds of harm signatories are asked to reduce.
For example, it does not include reference to “volumetric” attacks – the type of coordinated harassment campaigns against a person that were directed at Ardern.
Further, the code’s threshold for “harm” is high, requiring the online violence to pose an imminent and serious threat to users’ safety. This does not easily capture the types of gender-based violence, such as misogynistic hate speech, that over time normalise violence against women.
The code also emphasises the role of users in managing harmful content, rather than placing a responsibility on the platforms to investigate how their services and technologies might be misused to cause harm.

Placing that burden on users – to block, report or remove content – is merely reactive. It does not prevent harm; it responds only after the harm has occurred. And for some groups, such as MPs and public figures, the abuse can be overwhelming and seemingly endless.

Relying on voluntary commitments also puts New Zealand out of step with countries such as the United Kingdom and Australia, which have legally enforceable requirements for social media companies to protect online safety.
Preventing online gender-based violence requires proactive measures that are legally enforceable. To fulfil its international obligations, the government should urgently review the need for legal regulation that places the burden of online safety on large social media companies rather than on users.
Cassandra Mudgway does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
This article is republished from The Conversation under a Creative Commons license.