Tuesday, September 10, 2024
Year : 2, Issue: 36
by Hayley Sorensen
Social media is bad for kids. It’s not that great for adults either, but for vulnerable and impressionable children, the dangers are amplified.
There are the obvious and everyday risks. For one, there's cyberbullying from their peers. Once, the bell at the end of the school day signalled at least some respite for those kids who found themselves being tormented by others. Now their bullies can reach them wherever they are, at any time.
We know excessive social media use is associated with an increased risk of anxiety and depression. And it's no wonder, when every time a child opens Facebook or Instagram they're bombarded with messages that, to be happier, they must be thinner, prettier, better dressed.
It gets even more sinister. Boys in particular are exposed to violent, misogynistic or racist content. And once kids interact with that content — possibly out of nothing more than a childlike curiosity — their algorithms push more and more upon them. In its most extreme examples, it can push young people to embrace radical ideologies.
Then there are the risks posed by sexual predators who use social media platforms as hunting grounds for young victims.
The latest and growing threat is sextortion, the potentially tragic consequences of which were brought into sharp focus earlier this year with the suicide of a NSW teenager who was being extorted by members of a Nigerian crime gang he had been duped into sending intimate images to.
Parents know social media is bad for kids. Mental health experts have been warning of its dangers to developing young minds for years. And now, finally, the social media companies themselves are beginning to wake up.
Appearing in front of a Senate inquiry on Wednesday, executives from Mark Zuckerberg’s Meta outlined a new plan to implement age verifications on its platforms, which include Instagram and Facebook.
Meta’s vice president and global head of safety Antigone Davis used the hearing to push for legislative change which would compel app stores to get parents’ approval whenever a child under 16 downloads an app.
It’s a marked shift from June, when Ms Davis told the same inquiry that she did not believe social media was harmful to children, instead claiming that mental ill health in teens was “complex and multifactorial”.
Conveniently for social media companies including Meta, the proposed fix would shift the responsibility for age verification onto companies such as Apple, which operate those app stores. Unsurprisingly, it's a responsibility Apple doesn't want. The company is currently fighting a push in the US to introduce such restrictions.
Meta says its solution isn’t a blame-shifting exercise but a way to create uniform standards across the industry to help keep children out of harm’s way. It’s also an admission that its apps aren’t safe places for kids. That’s something the rest of us have known for years. Now the onus is on these companies to take some responsibility.