After much delay, the UK Government will today publish a draft of its new Online Safety Bill (Online Harms), which among other things hands Ofcom a new “legal duty of care” that can be used to force websites to remove “harmful” internet content. Websites that fail to do so risk being fined or blocked by mobile and broadband ISPs.
The proposed legislation, which is intended to replace the old model of self-regulation that has struggled to keep pace with the changing online environment (i.e. the Government believes too much “harmful” content is allowed to slip through a fairly weak net), faces the unenviable task of trying to strike the right balance between freedom of expression and outright censorship.
Various examples of “harmful” content exist (e.g. terrorism, child abuse, and self-harm or suicide imagery), such as the rise of the ISIS terrorist group online a few years ago, as well as state-sponsored propaganda from hostile countries, online bullying, and the spread of COVID-19-related 5G conspiracy theories (which encouraged criminal attacks against infrastructure and engineers). Social media firms did eventually catch up, but they’re often late to the party.