Like a lot of middle-aged people, I am on Facebook not so much because I want to be, but because it feels essential to keep up with the minutiae of daily life in my circle of friends and family. Facebook gives me an ambient awareness of what’s going on in the lives of people I don’t see very often: their children’s sporting achievements, who’s been accepted into what colleges, anniversaries and other milestones and—increasingly these days—the deaths of parents and occasionally of friends.
Facebook let me know the other day that the elderly mother of a friend had passed away after a long illness. I scrolled down to the comments to leave my “care” emoji and a note of sympathy and then I noticed that Facebook had added a new feature: it offered to write a sympathy note on my behalf, giving me some useful controls. I could click to make sure my sympathy note was “heartfelt”, or I could risk going it alone and have Facebook check my grammar and spelling.
This new feature—AI-assisted grieving—was not exactly a surprise. Like many tech giants, Meta (Facebook’s parent company) has invested heavily in artificial intelligence, producing the Llama open-source model, a useful technology that underpins much research in the field of generative AI, the technology behind chatbots. Determined to bring about an AI age, Meta has been pushing AI on users at every possible moment, inviting me to use its image generation software to imagine humans and penguins having a meeting. It often annotates my friends’ posts in ways that are charmingly clueless. When a friend cracked a joke about his schedule being packed here in snowy western Massachusetts because he has to shovel his driveway every two hours, the Meta AI chimed in with an earnest offer to explain why having a clear driveway was so important.
This particular intrusion by Meta AI seems to run counter to a broader societal shift. At a moment when AI is garnering enormous commercial attention, we are seeing a powerful cultural and political push for authenticity. Earlier this year, before Trump had started exiling migrants to gulags in El Salvador, before his senseless global trade war, before he declared war on American universities and detained students for exercising their right to protest, I reached out to a friend, a Democratic party insider, to ask: “Why aren’t Democratic politicians speaking up?” Deeply enmeshed in his party’s postmortem on the 2024 elections, my friend confessed to me, “We don’t know how to speak to voters anymore. Whatever we say, they don’t think we’re being sincere. Somehow, the other side has figured out how to be authentic, and we haven’t.”
Authenticity seems like a useful explanation for the shifting media landscape, at least in US politics. Donald Trump won the 2024 election in part by mobilising low-propensity voters, young men in particular, who tend not to be involved with elections. He reached many of these voters through unusual media channels: longform podcasts by influencers like Joe Rogan, whose brand is that of authentic, laid-back, casual chat.
These podcasts can seem excruciating to those not familiar with the form. Their hosts ramble for hours on end, and the conversation appears designed less as insightful interview and more as background noise. As a listener, you are invited to hang out with your digital friends, rather than to be informed as you would be by a newscast or documentary. These podcasts create a parasocial bond, a sense that you personally know the people you are virtually hanging out with, and that you look at the world in similar ways.
We can trace the power of this form of digital co-presence to the Covid lockdowns. Livestreaming was a common practice on the internet before then, but largely in the video gaming community, where services such as Twitch made it possible for viewers, many of them young people, to hang out with influencers as they played video games and broadcast their progress. During the pandemic, when many forms of in-person socialisation were cut off, there is evidence that podcasts, livestreams and short-video platforms such as TikTok and YouTube saw a sharp increase in use. My lab, which studies YouTube and TikTok, observed a spike in TikTok usage as the pandemic unfolded. It seems likely that, faced with time away from work or school, faced with loneliness or isolation, we got into the habit of sharing parts of our life online with whoever would watch.
Unpolished, personal expression during the pandemic was probably a useful emotional release valve for millions of people who engaged with it. Influencers like Rogan have raised that expression to an art form, not to mention a powerful commercial force. In 2020, Spotify is said to have paid as much as $200m to acquire the exclusive hosting rights to Rogan’s podcast. When Rogan renewed the deal in 2024, he secured an additional $250m as well as the rights to stream it on other platforms. Thousands of other influencers have copied his formula of speaking off the cuff and apparently from the heart. The sheer length of the form seems to favour authenticity. After all, who could remain carefully scripted for such long periods? For at least some audiences, this ability to see someone perform authenticity is satisfying in a way that carefully scripted, packaged entertainment is not.
But authenticity is a complicated thing. As the talk show host Joe Franklin said, “The key to success is sincerity. Once you can fake that, you’ve got it made.” In talking on a podcast for multiple hours a day, Rogan may well be sharing his authentic views of the world, but he is also very much a professional and a performer.
One of the tricks to appearing authentic may be particularly harmful to society as a whole. On some online platforms, it is common for content creators—figures such as Andrew Tate—to openly express misogynist and hateful views. This may be part of the performance of authenticity. After all, what better way to convince an audience that you’re being sincere than to cast yourself in a bad light? If the speaker wasn’t sincere, why would he admit to being a misogynist?
My friend the Democratic insider worries not only that centrist and left-leaning politicians have not learned how to communicate authenticity, but also that the priorities of the American left read as insincere virtue signalling rather than as genuine concern and engagement. The leadership of the resistance, he tells me, is unlikely to come from established politicians, but from someone on the left who figures out how to speak authentically about their fears, values and priorities.
Where does this leave the multibillion-dollar AI industry? Grindr, the gay dating platform, is testing an AI Wingman designed to help you navigate the first awkward conversations on a dating app. I imagine that discovering someone was using AI to chat you up might be a serious red flag. But given that many users are turning to AI as counsellors and companions, maybe what I’m experiencing is my own middle-aged culture shock: I have trouble seeing this AI-generated text as anything but insincere by definition.
Perhaps there’s a market for high-quality performance of authenticity by machines. Or perhaps Meta and its ilk just have it wrong. People may be excited to use AI to write computer code more quickly. But I hope it remains a faux pas to have AI tell your friend you’re sorry about their mother’s death.
As AI embeds itself more deeply in our society, we need a conversation about authenticity in culture, technology and politics. Arguably, the 2024 US presidential election was won by someone with sincerely held terrible ideas. But he did a better job persuading voters of his authenticity than the candidate with smarter ideas, who struck voters as a scripted, polished politician. If performances of authenticity are a shortcut to success and power, we need a way to understand what we should truly view as authentic. Because if we can’t communicate authenticity ourselves, companies such as Meta will try to make machines do it for us.