Self-regulation for social media “simply doesn’t work”, which is why the Government must introduce a regulatory framework for online safety, the head of education and innovation at the charity CyberSafeKids has said.
Philip Arneill told RTÉ radio’s Morning Ireland that the charity welcomed any moves towards raising standards and accountability. He was commenting after the Commissioner for Online Safety told an Oireachtas committee on Wednesday that a regulatory framework would be required to ensure the protection of minors online against incitement to hatred, xenophobic behaviour, racist content, child sex abuse and terrorism.
The discussion followed news of the death of a 14-year-old girl in Co Clare who is understood to have died after inhaling an aerosol substance and who may have participated in a so-called aerosol challenge on TikTok.
It is understood the girl, named locally as Sarah Mescall (14), from Kilnamona, Co Clare, fell ill last weekend and was rushed to Beaumont Hospital in Dublin and, later, Crumlin Children’s Hospital.
The girl, a second-year student at Coláiste Mhuire in Ennis, was pronounced dead at the hospital on Monday morning. Gardaí said they have notified the coroner of the death and an investigation file is being prepared.
Line of inquiry
One line of inquiry reportedly being pursued by investigators is whether the girl became ill after taking part in a so-called aerosol challenge on TikTok, known as “chroming”. The challenge has claimed a number of lives in recent years, particularly in Australia, where it has been linked to the deaths of several young people.
Mr Arneill said: “It’s very clear from the evidence that self-regulation, when it comes to social media platforms in particular, just simply doesn’t work. So we welcome any moves towards raising standards and accountability. And these kinds of definitions are absolutely key to getting that consistency across the board when it comes to reporting content, detecting content, and then getting it removed in a timely fashion.”
As well as the capacity to implement an individual complaints mechanism, responsibility and accountability had to lie with the social media platforms themselves to remove content quickly and effectively, he said.
‘Children and young people were being targeted daily’
CyberSafeKids knew from its research that children and young people were being targeted daily. “The two main areas are harmful content and harmful contact. So they might be coming across inappropriate or pornographic material.
“For example, there could be material around suicidal ideation, eating disorders, racist content and things like that as well. And then obviously with children, particularly underage children, when they're on these platforms and they're on different gaming platforms as well, there's also a real distinct possibility of contact from people they don't know.
“Sixty-one per cent of the children that we surveyed last year were contacted by a stranger in an online game. And some of those strangers may indeed be peers. They may be children of the same age. But obviously, if you don't know who your children are being contacted by, you know, that's not a safe situation to be in.”
He said the internet was a dynamic, constantly evolving environment. “So even when things are put in place to remove content, users can find other ways using differently spelt hashtags, for example. And we would say that stuff is not coming down quickly enough.
“We are regularly contacted by teachers and parents because they have reported content to these companies and it simply doesn’t come down. It’s very much a David and Goliath situation.
“If you’re faced with a very large social media platform and your child’s very distressed and you can’t get any response, that’s a terribly disempowering situation to be in. And so it’s not happening quickly enough. And sometimes even when it is reported, for whatever reason, it’s not meeting those in-house community guidelines. And so as a result, the content is not being taken down.”
Tougher regulations
Any move towards tougher regulations and restrictions on this kind of content was welcome, he said. “We’ve got to see how quickly this works and see how effectively this works. But the only way it’s going to work is if we have this consistency right across the board and we remove this sense of self-regulation where community guidelines are different from platform to platform.”
Imposing financial penalties on social media platforms could have a cumulative effect and would increase the pressure on them to better monitor harmful content, he said.
“That's something that we would absolutely welcome because quite frankly, at the moment, the system is not working and there's a lot of individual users, parents and children and in some cases teachers and schools as well who just feel completely powerless. So any legislative and regulatory framework that will give them more power in this area, we absolutely welcome.”
By Vivienne Clarke