If the US is standing up to social media then we need to talk content as well as algorithms
There’s been a lot of Sturm und Drang in recent weeks over the dangers of social media to young people. Australia kicked everything off by imposing a ban on teenagers using social media apps. In Europe, countries like France, Spain, Denmark, Greece and Portugal are moving ahead with similar measures. Here in Ireland, the government is being slightly more circumspect about the whole issue, probably because so many social media platforms have their EU headquarters in the country.
Now, a lawsuit has been launched in the US against Meta and Google, claiming that they deliberately designed their platforms to be addictive and used algorithms to keep users engaged for long periods of time.
This came just days after the European Commission accused TikTok of creating an addictive design and of failing to assess how its addictive features “could harm the physical and mental wellbeing of its users, including minors and vulnerable adults”.
But while I don’t doubt the good intentions behind seeking bans on social media platforms for young people, I’m not convinced such bans are the most effective measure governments can take to curb the harmful effects of social media.
What I’m going to say now isn’t novel; others have made the same argument. But I think it’s worth repeating, because there’s a danger that, in the cause of protecting young people’s mental health, governments and their citizens are being diverted from enforcing meaningful change on social media platforms and from tackling the much wider problem: the role those platforms play in coarsening our discourse and pushing fringe, extreme political and racist views into the mainstream.
Let’s consider what these bans are supposed to achieve. Will they change the behaviour of the social media platforms? Will they modify their algorithms? Will they make their platforms less addictive? Why would they? Is there any prospect of those platforms being unbanned for under-16s in the future if they do?
Influence and responsibility
What the bans effectively concede is that governments are unwilling to take meaningful action against the social media giants by making them change their algorithms and addictive designs, even though they believe those designs are potentially harmful to young people. The bans also fail to tackle the fact that social media platforms are addictive by design for people over 16 too, and potentially just as harmful to them. Possibly more so, because adults tend to have greater influence, authority and access in their day-to-day lives than teenagers.
I doubt there’s a single person reading this who doesn’t know someone who has ‘gone down a rabbit hole’ on Facebook, Twitter or YouTube and emerged a completely different person. And I bet few, if any, of those people were under 16! It’s not just us ordinary Joes and Joannas who are at risk. High-profile people in the arts, sport and politics have been radicalised by overexposure to toxic social media environments.
Now, we could argue that governments might be seeking to impose a measure of responsibility on social media companies by making them liable for any under-16s who access their platforms, but how stringently will that be enforced?
Wouldn’t it be better if, instead, the platforms were forced to take responsibility for their designs and algorithms, and for hosting content that a traditional media outlet, such as a newspaper, TV station, radio station, magazine or website, would be liable for disseminating?
As long as that remains untouched, far too many people, young, old and in-between, rich and poor, will be adversely affected by prolonged exposure to social media platforms. When it comes to the likes of Facebook, Twitter or YouTube, age is no guarantee of wisdom. We are all vulnerable to the algorithms. The rabbit hole is big enough for everyone.