A mistake I made as a young man has haunted me ever since.
I was in my early twenties, working for an internet start-up called Tripod that offered free homepages—an early precursor to social media—to anyone who wanted one. Eventually, more than 15m people took us up on our offer, which forced us to find a way to pay for servers and bandwidth. Like virtually all start-ups in the mid-1990s, we gravitated to banner ads, gaudy digital billboards that users rarely clicked. We told advertisers they were useful for “brand building” even if they didn’t lead to transactions.
There was a major problem. Advertisers liked banner ads on sites like Wired, the technology magazine, where they appeared next to carefully edited content. But on our site, a homepage author might be talking about fringe politics or sexual fetishes, and our advertisers told us we were not “brand safe”. My boss asked me to figure out a way to “distance” the ads from our sometimes-sketchy users, and I came up with an idea: when we loaded a user’s homepage into a web browser, we would also open another window, independent from the page, with an advertisement in it. Yes, dear reader, I invented the pop-up ad.
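The mechanism was almost embarrassingly simple. A minimal sketch of the idea in JavaScript—`openAd` is an invented helper for illustration, though `window.open` is the real browser call, which modern browsers now block unless it follows a user gesture:

```javascript
// Sketch of the original pop-up technique: alongside the user's
// homepage, open a second, independent browser window containing
// an ad. The helper name and the ad URL are invented here.
function openAd(win, adUrl) {
  // A separate window keeps the ad visually "distanced" from
  // whatever the homepage author has published.
  return win.open(adUrl, 'ad', 'width=468,height=60');
}

// In a browser: openAd(window, 'https://ads.example/banner.html');
```

Because the ad lived in its own window, it shared no page with the user's content—which was exactly the "distance" the advertisers had asked for.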
Before you turn away in disgust, let me mention two things. First, I’ve apologised, extensively and in print. But second: you probably haven’t encountered many pop-up ads lately. Most modern web browsers block pop-ups by default, at least on your laptop. (As we’ll discuss in a moment, it’s another thing entirely on mobile phones.) Even if the website owner really wants you to see an ad, your browser is loyal to you, giving you a degree of control over your online experience that the business selling ads wishes you didn’t have.
This battle between what users want and what companies selling ads want has taken two dramatic turns in the past dozen years. First, as social media came to dominate the web, companies like Facebook and Twitter got very good at claiming lots of users’ attention. The updates from your friends turned into an infinite scroll, and, if your friends didn’t have enough to say, the algorithm would suggest other content to you, paired with an endless sequence of ads. That had major downsides, but if you were using a web browser and were tired of seeing promotional material, you could at least install an ad blocker. If you were worried about developing a dependence, you could install software to limit the time you could spend on a social media site.
The second revolution—the rise of the mobile phone—fundamentally shifted the balance of power. On our laptops, we interact with the internet through web browsers, which we can customise and control. But on our mobile phones, we mostly use apps. These apps are made by the same companies that run websites like Facebook or Twitter. While we might be able to use our loyal web browser to shape our experience of Facebook on our computer, we have no such freedom with the Facebook app. The app is loyal to Mark Zuckerberg, his ad sales team and his engagement metrics—not to you.
In 2021, programmer Louis Barclay came up with a clever way to make Facebook less addictive, even if you accessed it through the disloyal mobile app. He wrote a little piece of software called Unfollow Everything. If you installed it in your web browser, it would log into Facebook on your behalf and, well, unfollow everything. Your algorithmically curated feed of updates from your friends would be empty, though you could check in on any of your friends by visiting their pages. Barclay was delighted with his less-addictive Facebook experience and made his software free for anyone to download.
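Conceptually, the tool automated a tedious manual chore. Here is a hedged sketch of the idea—the selector and function name are invented for illustration; Facebook's real markup is obfuscated and changes frequently, and Barclay's actual extension drove the site's genuine unfollow flow:

```javascript
// Conceptual sketch only: find every "unfollow" control on the
// page and activate it, emptying the algorithmic feed while
// leaving friendships intact. The data-action selector is
// invented; real Facebook markup differs.
function unfollowEverything(doc) {
  const controls = doc.querySelectorAll('[data-action="unfollow"]');
  controls.forEach(control => control.click());
  return controls.length; // number of follows removed
}
```

The key point is that nothing is deleted: you remain friends with everyone, but the feed has nothing left to rank.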
Meta, the parent company of Facebook, was not delighted. It deleted Barclay’s accounts on Instagram and Facebook and banned him for life. And it threatened a costly legal battle unless he removed his software from the web. Barclay was forced to comply.
It seemed to me that Barclay should have a right to encounter Facebook the way he wanted it, much as you have the right to block my dreadful pop-up ads. My friends at the Knight First Amendment Institute, a non-profit law and research organisation that argues cases defending freedom of speech in a digital age, agreed with me, and over the course of many months we found a legal argument that looked promising.
In 1996, the US passed a piece of legislation called the Communications Decency Act, a law focused on combating the perceived threat of obscenity online. Much of the law has been overturned, but a critical section has remained—section 230—which American internet scholars sometimes describe as “the 26 words that created the web”. These words limit liability for platforms like Facebook by declaring that a user who posts content—rather than the owner of the website that hosts it—will be considered its publisher. Businesses like Facebook would be legally untenable if they were considered the publisher of everything their users—in Facebook’s case, three billion of them—say. By providing limits to liability, section 230 made businesses like Facebook, and like Tripod, possible.
But there’s more to section 230 than those 26 words. Much of the section concerns the idea that users have the right to control what they encounter online, stating that it is the policy of the United States “to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services.”
That sounded a lot like what Barclay was trying to do with Unfollow Everything. And so, on May Day, 2024, I sued Meta in US federal court.
The complaint, drafted by the brilliant lawyers at the Knight First Amendment Institute, explains that I want to release an updated version of Barclay’s tool and conduct a research study to see whether users feel more in control of their Facebook experience when using it. Because Meta forced Barclay to take down his software, we are asking for a “declaratory judgement” that what we propose to do is protected and legal.
It would be great to recreate Barclay’s software, but there’s much more at stake. Should a court find in our favour, it might open a path towards limiting the power of the big web platforms not just through legislation, but through technology. My lab at the University of Massachusetts Amherst has been building tools that allow social media users to choose what algorithms they want sorting posts from their friends. We can use this technology on only a few social networks—Mastodon, Bluesky, Reddit. The most popular networks—Twitter/X, Facebook and Instagram—prohibit tools like ours. But a finding that section 230 protects our right to develop tools to increase user control would mean we might be able to force these powerful companies to cooperate.
I made a mistake years ago when I tried to inflict pop-up ads on an unwary internet population. Facebook is making a mistake when it insists that its tools and preferences are the only ones that count. It’s possible that a US federal court will give us a chance to put control back in the hands of social media users.