Few industries are more controversial, and for some the only real fix would be shutting this one down. But it is booming, so it is worth tackling the most egregious faults that everyone should agree are problems.
In December, the New York Times accused Pornhub—one of the biggest adult sites in the world, attracting billions of clicks each month—of hosting recordings of child abuse and rape. Pornhub denied this, insisting it has robust processes in place to remove illegal content, but nonetheless soon removed most of its videos.
Platforms like this work a bit like YouTube: anyone can upload content, and some get paid for popular videos. As in so many industries, the internet has demolished the old barriers to entry. Gone is the monopoly of top-shelf magazines and Soho stores. Now anyone can produce, consume and sell porn—and regulation of this multi-billion-dollar business is resisted as puritanical. Women who find explicit videos of themselves uploaded without consent, often by ex-partners, are left campaigning from the margins.
Ultimately, it was not regulation that forced Pornhub to change, but likely Mastercard, which said it would no longer allow payments to the site after its own review of the content. While claiming changes were already planned, Pornhub removed all “unverified” videos—those uploaded by ordinary users—and said it would introduce a new verification system.
But this is about more than just one site. Its changes are voluntary and there is still no system to hold platforms accountable. Illegal content abounds on some sites, yet platforms have very limited liability for user-uploaded content. Similar criticisms have plagued social media sites, including Twitter and Facebook, which finally blocked Trump after January’s storming of the Capitol. Pornhub argues it is unfair to target adult sites and that mainstream social media platforms have a bigger problem with illegal content. The Internet Watch Foundation recently agreed on that point, although clearly the nature of adult sites makes it especially hard to spot violations of a sexual nature in their material.
Campaigners see a simple fix: make publishing or hosting sex tapes without consent a criminal offence. That would put the onus on platforms to ensure anyone who features in a video is of legal age and has consented to be there. Critics say that is too high a burden, requiring platforms—whether “adult” or not—to implement pre-publication checks on content. Without careful limits, some argue it would put disproportionate constraints on users’ speech.
But there is a compromise. At the very least, platforms should be required to have adequate safeguards, whether dedicated moderators or simple ways for users to flag harmful or illegal content, and a requirement—on pain of sanctions—to remove that content quickly. Just as other businesses are bound by a “duty of care,” platforms should mitigate the risks associated with their offering. According to plans announced just before Christmas, the government may take this approach in the Online Safety Bill—with possible fines of up to 10 per cent of annual turnover, and Ofcom acting as the regulator. It is not a perfect solution—a lot will depend on how stringent the rules are and how well they are enforced—but it is something. And it is not “anti-porn” puritanism. Non-consensual content is abuse, and the rules would apply to all platforms.
This is arguably just the start of the industry’s problems—some feminists argue the misogyny and racism of our world are distilled in its pornography, and then reflected back out, warping attitudes to sex. But this web of problems is not amenable to a simple regulatory tweak. Let’s start by making sure the only people naked on the net are those who have agreed to be so.