Snapchat has apologised while being questioned at a parliamentary inquiry about the tragic case of a NSW schoolgirl who took her own life after years of being tormented by bullies, including on its platform.
The Daily Telegraph revealed the heartbreaking story on Tuesday of Bathurst teen Matilda “Tilly” Rosewarne, 15, who died last month and was farewelled at a funeral the same day.
Her family revealed many disturbing accounts of bullying, including a tormentor circulating fake nude images of Tilly on Snapchat and on a Belgian porn website.
Her mother, Emma Mason, said a complaint was lodged with police, but they had “significant difficulties” confirming who the Snapchat account belonged to.
Police ultimately dropped the investigation.
At a hearing for the inquiry into social media and online safety on Wednesday, Liberal committee chair Lucy Wicks asked Snapchat what could be done to make sure the same thing didn’t happen to anyone else.
“Today, I opened a story from the Daily Tele about a young 15-year-old who took her life following a lot of bullying, including cyber bullying,” she said.
“It’s heartbreaking, from not being invited to parties to having fake nudes spread on social media.
“Her parents say they want to do everything they can to make sure no other little humans go through this.
“What do we do to make sure no little humans go through this?”
Snapchat’s head of public policy in the Asia Pacific, Henry Turnbull, said he had read the story and he was sorry for what Tilly’s family was going through.
“I just wanted to say how sorry I am for what they are going through right now,” he said.
“That case unfortunately highlights … how bullying and abuse can often take place among people who know each other very well in real life and are ostensibly friends or school mates and it is a real challenge across a whole bunch of areas.”
Ms Wicks, the member for Robertson, said she had had many conversations with parents and young people in her Central Coast community about images being shared on Snapchat without consent.
“I totally recognise how distressing it can be for people to have their private images or videos of themselves shared, particularly without their consent,” Mr Turnbull said.
He said Snapchat images were designed to disappear after viewing, but phones could take screenshots.
Users were alerted when this happened, he said, and could also quickly report harmful content.
Mr Turnbull said education about acceptable behaviour was an important tool to fix the problem.
“Education is important; educating our community both on the fact this behaviour is unacceptable and also on the security of their images and their phones themselves,” he said.
Mr Turnbull also warned during the inquiry that Australia had many different pieces of proposed legislation, regulation, codes and guidance for social media, making the landscape difficult for smaller social media companies such as Snapchat to navigate.
“At the moment it does feel very confusing from our perspective,” he said.
“If you have one clear regulatory framework, it makes it simpler for companies to understand their obligations, it makes it simpler for the government and regulators to hold people to account, and it makes it simpler for consumers to understand their rights.
“This, rather than complexity, is what leads to a safer environment for all.”
Mr Turnbull said the Online Safety Act should be broadened to include the Anti-Trolling Bill and the Online Privacy Bill in one piece of legislation with non-conflicting guidelines.
The Online Safety Act came into effect in January and gives the eSafety Commissioner the power to fine people or companies if material deemed to be bullying or abusive is not removed within 24 hours.
The anti-trolling legislation gives people the ability to unmask anonymous trolls who post defamatory material about them.
That information would only be released with the consent of the poster; if they do not agree, however, the victim can obtain their identity through a new court order.
The online privacy code would prevent tech companies from accessing a child’s data without their parents’ permission.