Something needs to be done about the dangers of social media

The world of social media is vast. Around 350m photos are uploaded to Facebook every day. 500 hours of video are uploaded to YouTube every minute. As many as 6,000 tweets are sent every second.

In among all those uploads, posts and messages are a huge number of extremely unpleasant bits of communication: terror recruiting videos, child grooming, harassment and more.

We’ve known for a long time that something needs to be done to try to minimise the very real harm this does. On Wednesday, the UK government decided that it would be communications regulator Ofcom that would be in charge of trying to do just that. Whether it’s up to the challenge is another matter entirely.

The big social media companies, particularly Facebook, YouTube and Twitter, have so far largely avoided this kind of government regulation. They already use automated processes to remove content deemed illegal or harmful, or even simply offensive. On top of that, they employ many thousands of people, often badly paid and working in difficult conditions, to moderate content posted by their users. Facebook took action against 3.2 million pieces of content deemed to constitute bullying and harassment alone in the third quarter of 2019.

The government has decided that this isn’t enough, and last year set out proposals for legislation to deal with “online harms”.

The scale of the challenge is extraordinary. In the past, Ofcom, which currently oversees broadcasting and communications services such as mobile phone providers, has understandably been resistant to being given responsibility for the internet. Now it is being asked to get a grip on all the ways people are hurting each other online.


Ofcom employs around 1,000 people in total. Its new role will require additional resources, but unless the government is also planning to plough billions into a huge job creation scheme, the regulator will not be charged with looking at every single nasty piece of content online.

The proposals suggest that Ofcom’s role will be more focused on setting guidelines and rules, and imposing sanctions when it is clear that companies have failed to meet them. Whether those are fines or criminal charges for individuals, and exactly how either can be used against huge multinational companies based for the most part in the US, is not yet clear.

Exactly which content will be deemed harmful is also going to pose a huge problem. The social media firms are already relatively good at dealing with illegal or clearly harmful posts, such as child pornography or violent videos, though the spread of the Christchurch shooter’s livestreamed video from New Zealand shows there remain some extremely big holes. But other areas, such as cyber-bullying, trolling and users discussing self-harm, begin to move into far more subjective territory.

There are some other thorny issues. It is not just the big companies that will fall under the legislation’s remit. It will apply to all “companies that allow the sharing of user-generated content – for example, through comments, forums or video-sharing”. The government’s insistence that this will only affect “fewer than 5 per cent of UK businesses” does not sound all that reassuring.
