Regulation is coming and Facebook knows it, so it has reportedly persuaded Ofcom’s Director of Content Standards, Licensing and Enforcement to join the team.
The news comes courtesy of the Times, which reports that Tony Close resigned last week and was placed on gardening leave as soon as it became clear where he was headed. Close had been at Ofcom since 2003 and was most recently one of the people heading up Ofcom's regulatory strategy with regard to social media, a role that became a lot more interesting when the regulator was given new censorship powers earlier this year.
EXC Senior Ofcom executive involved in drawing up new online harms regulations poached by Facebook – to help them respond to the regime he helped devise https://t.co/rQsSGaT1AN
— Matthew Moore (@mattkmoore) April 29, 2020
Neither Ofcom nor Facebook seems to have confirmed the move and we hadn't received a response to our enquiry to Ofcom at the time of writing. However, there's no sign of Close on Ofcom's content board page, which seems to confirm he's legged it. Facebook seems to have a taste for UK establishment figures, having nabbed former Deputy PM Nick Clegg to head up its government relations in 2018.
Close continues the rich tradition of public servants taking lucrative private-sector positions late in their careers to help companies navigate their former beat. He will be able to fill Facebook in on the latest thinking when it comes to regulating social media companies, something Facebook insists it welcomes, but presumably also wants to ensure doesn't get in the way of business.
The regulation of big social media will be a defining issue of the next few years. They are supposed to be neutral platforms that allow public discussion without any editorial involvement of their own. Increasingly, however, pressure from advertisers, politicians and regulators has compelled them to take an active role in censoring their platforms to ensure the ‘wrong’ kinds of content don’t appear on them.
That kind of activity is associated with publishers, not platforms, but the likes of Facebook, Twitter and YouTube still don’t produce their own content. That, along with the practical impossibility of editing every single piece of content before it’s published, means social media companies can’t be classified as publishers for the purposes of regulation.
So it seems clear that a new category needs to be created for services that facilitate publication but don’t produce their own content. Regulators would then need to create a unique set of rules and obligations for that category to abide by, such as parameters of acceptable speech, as well as a proper structure to protect the interests of those who publish on those platforms.
It's very hard to see where best to draw those lines. This publication would prefer minimal censorship combined with robust public challenges to contentious content, but we're apparently in a minority. Mainstream sentiment seems to err towards a more censorious approach to 'preventing harm' and it will be the job of regulators like Ofcom to define that. Facebook has quite sensibly used some of its abundant funds to gain greater insight into what form that definition may take.