
Reform advocates point out flaws in proposed anti-trolling laws


Reform advocates have told a parliamentary committee that the proposed laws targeting online abuse and holding social media companies accountable are “utterly inaccessible” and narrowly targeted.

Noelle Martin, a lawyer and survivor of online abuse, stated that the financial penalties proposed for the companies that fail to keep people safe on their platforms are “chump change” and will not result in meaningful reform. 

“If this committee is serious about online safety, unmasking trolls and improving responsibility by social media companies … this bill contains fundamental flaws,” she said.  

As a teenager, Ms Martin found images of herself that had been edited onto pornographic pictures and distributed online. 

A decade later, the perpetrators have not been punished and many of the fake images remain on pornographic websites.

Ms Martin said that instead of focusing on the proposed laws, the government should compel the eSafety Commissioner to use its existing legislative powers to take action against abusive content online.

“Australia’s online safety laws and reforms, and in particular the office of the eSafety Commissioner, is woefully inadequate,” she said. 

“The regulator continues to underutilise its existing statutory powers, misguides the public on its perceived successes … and the whole regime is ineffective in providing meaningful support to survivors.”  

Meanwhile, anti-trolling campaigner and journalist Erin Molan shared her experience of being targeted with abusive direct messages on social media. 

Recounting some of the abuse, Ms Molan said it was almost impossible to get help from law enforcement or from the social media platforms themselves.

“I reported some horrific messages from an account that kept being recreated no matter how many times I blocked it, but Facebook said the messages ‘didn’t meet the threshold’ for inappropriate content,” she said. 

“The consequences should lie with big tech. They generate a huge amount of money and with that comes the responsibility to ensure users are safe.”  

Criminologist Michael Salter told the committee that Ms Molan's experience of reporting abuse to social media companies was common among victims, who are routinely failed by the platforms' lack of action.

“We’re asking for transparency because far too often what we’re provided from social media company reports on these issues … is statistics that are most friendly to them,” he said. 

“Having basic safety expectations built into platforms from the get-go is not too much to expect from an online service provider.”  

Prior to the hearing, Prime Minister Scott Morrison said people like Ms Molan who shared their stories of online abuse would help to hold social media to account. 

“(Tech companies) created these platforms and they have a responsibility to make them safe,” he said in a statement. 

Representatives from Facebook, Twitter and TikTok are expected to appear at an upcoming hearing. 

The committee will hold its next hearing on Thursday. 

This article was first published on CommsRoom


Eliza is a content producer and editor at Public Spectrum. She is an experienced writer on topics related to the government and to the public, as well as stories that uplift and improve the community.
