Media Confidential

Has Musk gone too far?

Alan Rusbridger and Lionel Barber explore the question: how do you solve a problem like X?

September 05, 2024

This episode is sponsored by Reviewed & Cleared

  

X, the social media platform formerly known as Twitter, has become the champion of free speech—but at what cost? During a summer of unrest in the UK, a hornet’s nest of hatred was stirred up on X. Elon Musk himself gave the nest a good kicking when he provocatively claimed that the UK was heading to civil war. And when Alan responded with a column about the post, he was met by a barrage of abuse from Musk’s supporters. So, is Elon Musk—with his 196m followers—out of control? And what does his support for Donald Trump in the upcoming US election tell us about X’s relationship with democracy?

One person who understands the inner workings of the platform is Twitter’s former vice president of Europe, the Middle East and Africa, Bruce Daisley. He believes that the platform was weaponised to stoke the racist riots in England and Northern Ireland this summer—and that Musk played a role as well. One way to stop this happening again, Daisley says, would be to threaten him with personal legal liability. So, what kind of regulation could make a difference?

 

This is an AI-generated transcript. Please check against delivery.

Alan Rusbridger [00:01:00] We're back. Welcome to Media Confidential. We're back after our six-week summer break. Lionel, you're here in London. When we last saw each other, Joe Biden was the presidential candidate for the Democrats. Keir Starmer was still enjoying a honeymoon. So much has changed in six weeks. 

Lionel Barber [00:01:17] Extraordinary. That race in America. I wish I was there. I have to say, I have many times said, sitting in New York, that this is the most important presidential election ever. You know, 2000 with George W Bush, and then Obama in 2008, but this is the big one, and it's turned completely upside down. Harris is running ahead in the national polls. Not a done deal yet, but Trump is struggling. 

Alan Rusbridger [00:01:45] And so, Lionel, tell me, you've been following all this from lying on a beach? 

Lionel Barber [00:01:49] No, I've been following it very closely from my bicycle seat in the Surrey Hills. Is that possible? It is, if you're a middle-aged man in Lycra. 

Alan Rusbridger [00:02:02] On your bike. 

Lionel Barber [00:02:03] Yeah. And, yes, I have been in the Surrey Hills. Back now in London, but off to Italy for some biking, for a little short break. I know you've essentially said we've had a longer break than those people in Brussels with their grandes vacances. 

 Alan Rusbridger [00:02:22] And we've had. 

 Lionel Barber [00:02:22] It. But you've been in Austria. 

Alan Rusbridger [00:02:24] And elsewhere. I mean, hiking and listening to music and following the news from a distance. But anyway, we're back, and we've got a packed episode with a former vice president of Twitter. And maybe we could just talk a little about Twitter, because we're going to be talking to Bruce Daisley, who was a vice president. The other thing that's happened since we last met was these riots in England. And I wrote a few columns about Elon Musk and his role. And I thought I could just share some of the response I got, because it turns out that when you write in criticism of Elon Musk, you get the most amazing response from the Musk fans, who wake up about lunchtime. 

Lionel Barber [00:03:10] This is a family podcast, by the way, Elon. I mean, be careful. 

Alan Rusbridger [00:03:13] He might want to switch off now, because I was told that I was a commie mouthpiece from a country which was in the pisser. There was an anonymous user, because most of them are anonymous, who told me I was screaming like a feminist at a free tampons rally. I quite liked that one. And that I had, quote, "micro-penis anxiety." 

Lionel Barber [00:03:35] Oh, go on, give us some more. 

Alan Rusbridger [00:03:37] "You're a freaking faggot. This is a scary place for pussies like you," I think. I feel I'm reading this in the wrong accent. I was a shit libertarian with bitch tantrums. I was a Jew, a loser, a purveyor of drivel, a Hampstead elitist. But my favourite one was somebody who actually took the trouble of sending me a private message saying, "Get your testosterone checked. It seems low." No kidding. Okay, that's what happens. 

Lionel Barber [00:04:04] And clearly moderation is going well. Why are you still on it, Alan? 

Alan Rusbridger [00:04:10] Well, I had this idea that we should all leave at the same time, and that would need some coordination. You know the dilemma. I mean, I've got 220,000 followers, and it's still the place where things happen, where news is broken. Very difficult to go unless everybody else decides they're going to go as well. But I do have this anxiety that by staying there, you're lending some credibility to a site that really has become, especially during the summer riots, a very worrying place of racial hatred, of incitement, of violence, of falsehoods. And the proprietor of the site, the publisher, the owner of the site, Musk, was just out of control this summer. 

Lionel Barber [00:05:00] The problem is that he's the owner, he's the CEO and he's also the editor-in-chief. And he's got, what is it, 220 million followers. I mean, it's a massive number, which he's sort of mobilising. I would say that I took him to task for invoking the likelihood of Britain falling into a civil war, which he tweeted just ahead of the riots, so around the time of the riots. And I took him to task, and guess what happened, Elon? What happened? My Twitter followers suddenly started disappearing.  

Alan Rusbridger [00:05:38] There you go. I mean, there you go. You know. 

Lionel Barber [00:05:40] I'm not a conspiracy theorist, you know that. I'm a very connected journalist. But I've noticed around 2,500 have gone. And are they bots? Are they some of the people that you've just been citing, or what?  

Alan Rusbridger [00:05:54] As he goes, you know, so. 

Lionel Barber [00:05:55] 112. 

Alan Rusbridger [00:05:56] You're sharing my creepiness anxiety. Anyway, what else has been happening? We've got Kamala. We'll be talking about the Daily Telegraph, which hit the. 

Lionel Barber [00:06:08] Telegraph, been. 

Alan Rusbridger [00:06:09] Having to go on. 

Lionel Barber [00:06:10] Right. The Telegraph reported that The Spectator, the excellent magazine edited by Fraser Nelson, is likely to be sold to Paul Marshall. Your friend.  

Alan Rusbridger [00:06:21] Says, said. 

Lionel Barber [00:06:22] Sir Paul Marshall, for around 100. 

Alan Rusbridger [00:06:25] Million pounds. That is not a good price, right? 

Lionel Barber [00:06:27] Well, I would say that's a very handsome price. But crucially, and a little birdie tells me this, the one thing that Fraser Nelson cares about is beating the 44-times multiple at which the Financial Times was sold to Nikkei back in 2015. We were sold for $1.3 billion, 844 million pounds, which was a 44 multiple. Well, if The Spectator is sold for 100 million, and The Spectator makes around 2.5 million, that would be a better multiple than the FT's. 

Alan Rusbridger [00:07:08] The FT's period as a world-record holder would be over. 

Lionel Barber [00:07:10] On. I'm worried about that, but let's see what happens. 

Alan Rusbridger [00:07:15] And the other thing that's happened while we've been away is the Standard. London, one of the great cities in the world, no longer has a daily newspaper; it's gone. But we'll come back and talk about that in future episodes.  

Lionel Barber [00:07:26] One other, I think, very important news item was the arrest of Pavel Durov, the founder and CEO of Telegram, the messaging app. He was arrested in Paris, and is essentially facing charges that he allowed his platform, Telegram, to be a platform for fraud, paedophiles and general criminal abuse. And I think this is really serious in terms of establishing owners' responsibility for content on platforms. But there are also many questions to be answered about who Pavel Durov actually is: a citizen of France, Russia, the UAE and, wait for it, Saint Kitts and Nevis. I mean, this is really someone who is the pure globalist, the man beyond the laws. Yeah, he was arrested in Paris, and he's facing these serious charges.  

Alan Rusbridger [00:08:25] Which is a nice link to our first guest, Bruce Daisley. He's a former vice president of Twitter for Europe, the Middle East and Africa. He has been vocal in his criticisms of Elon Musk since Musk bought the platform in 2022. He describes Musk's management style as chaotic and damaging, and he's talked about how Musk's focus on speech absolutism, and his disbanding of all the teams that used to look after content moderation, could lead to a rise in misinformation and harmful content on the platform. Welcome, Bruce, to Media Confidential. Perhaps you could just start by talking about the work you were doing at Twitter, to give us some sense of what Twitter was like before the day that Musk bought it? 

Bruce Daisley [00:09:23] Yeah, I think Twitter always felt flawed from the outside, and, you know, that was never taken as a given. Even before Musk bought it, people used to declare it to be a hellsite. We did try to make it a slightly more palatable version of what it was. And so I was responsible for Twitter across Europe, the Middle East and Africa. The job, effectively, was trying to defend the organisation's reputation, and part of that was to try and win political support, or support from high-profile users. And then we also just tried to expand the reach of Twitter; Twitter's not evenly distributed in different countries. And then, ultimately, to bring revenue in. So it was about audience, reputation and revenue, really. 

Alan Rusbridger [00:10:17] The jobs, and the bits, that Musk was the first to disband were around things like trust and safety. You were trying to minimise "bad" content, in inverted commas, and you were trying to promote the most useful and reliable content. Can you just talk a bit about those teams and what the aim was? I know it was with mixed success, but at least there was an intention there, wasn't there? 

Bruce Daisley [00:10:46] There was. I mean, the one thing that you encounter when you're a British person, or actually any non-American, joining a tech firm is that you learn very quickly that the things that might be on your agenda aren't necessarily on the agenda of the Silicon Valley firm. So, you know, candidly, we had a series of episodes where we saw a lot of abuse directed at feminist campaigners. We saw a lot of abuse directed at black footballers. And, you know, from our perspective, we thought, well, these things are relatively easy to resolve. But what we found generally was that headquarters, and I think this is true across all tech platforms, generally weren't that interested, because, as Musk then went on to really clearly evidence, these things are expensive. To use a simple metaphor, it's far easier to run a leisure centre with no lifeguards around the swimming pool than it is to try and keep everyone safe there. And broadly, what you find is that these functions aren't lavishly expensive, but they do require resource, because if someone reports something, a pair of human eyes needs to take a look at it and decide whether it's acceptable or not. 

Lionel Barber [00:12:06] Tell us how many lifeguards you actually had operating at Twitter? 

Bruce Daisley [00:12:10] Yeah, well, that's one of the reasons why I frequently say to people that if we wanted to get more political control over these platforms, or just a sense of governmental autonomy over what happens inside our countries, one of the best questions you can ask tech platforms is: how many people do you have working full time on administering these things? And the caveat I normally add is: how many native English speakers working in the United Kingdom? The reason I give that caveat is that at Twitter, at any stage, we might have had a team of 30 people in total, but that generally means about six people working on it at any time. And what you principally find is that this work migrates very quickly to the cheapest place to do it. So what you find is that we would have a group of six people in Hungary trying to decide whether tweets are acceptable or not. 

Lionel Barber [00:13:07] Hungary is not the most obvious place, but there we are, given Mr Orban in power. 

Bruce Daisley [00:13:11] Yeah, but what you very quickly find is that the abuse generally comes in quite colloquial expression. It often comes in memes, it comes in swearwords. And so if someone isn't speaking English as a native language, what you discover is that there's an incentive for the person reviewing it to say that there's no abuse. Because if you say there's no abuse, then generally there's no appeal, and so they don't get their work rechecked by their manager. 

Lionel Barber [00:13:44] Okay, let's move to Musk, though, and what happened in the Musk era. Is there any reason to suggest he's right: that a lot of speech was being filtered or censored, and that the tilt towards more free speech is actually not a bad thing? 

Bruce Daisley [00:14:00] You could definitely make a case that some of the things that might be considered subject to nuance and discussion were treated absolutely. And, you know, probably themes of gender are the most obvious ones, where you could say that the platform got into a degree of editorial control. Principally, that was because I think transgender or non-binary people were often subject to disproportionate amounts of abuse. They were subject to pile-ons that often were completely disproportionate. And so, as a result, many platforms decided that deliberately misgendering someone was going to be an actionable event; someone could get suspended for it. Now, what you often found is that if that was just in discourse, there was no action taken. It was mainly when it was targeted, so people setting out to send abuse to a particular person. Now, you might say, well, you know, themes of gender enter into realms of opinion. You could make that claim. And so you could say that for platforms to get involved in saying that certain things weren't permitted was overstepping the mark. If you tried to mark Musk's homework with a favourable eye, you could take that stance.  

Alan Rusbridger [00:15:24] But the other thing he did: Twitter had thrown a number of people off the platform, and he invited them all back on. In the end, people like Alex Jones and Andrew Tate and Tommy Robinson were all welcomed back, despite their record of using the platform for disinformation, for stirring up hatred, for creating the conditions for violence. I mean, this was all part of Musk's attachment to a different idea of speech, wasn't it? That if you're a libertarian West Coast American, you don't really believe in any shackles on speech, or at least that's the claim made, right? Whereas there's a different tradition in Europe, isn't there? 

Bruce Daisley [00:16:12] Yeah, that's right. And look, what you often find is that people from certain perspectives don't necessarily get a full insight into the abuse that other people experience. And so, you know, what we have found is that abuse disproportionately falls onto minority ethnic groups, it falls onto minority sexuality groups, it falls onto women. And so someone like Musk, who doesn't fall into any of those categories, probably gets a different form of abuse than a lot of people receive. Or someone like him might just brush it off, because, you know, he's the richest guy in the world; how could anyone hurt him? So yeah, he's definitely decided he wants a different version. It's an interesting take on this, because this immediately becomes a subject that you can frame as free speech. And in fact, the notion of free speech sounds so intoxicating that it's often used as an argument for everything. Over the last week, we've seen the arrest of the CEO of Telegram, and people who have an issue with that have screamed, well, this seems to be injurious to free speech. But it's worth saying that most of the things he was arrested for were more related to child sexual material that was on the platform, and the fact that the platform was used for illegality. I think it's a very convenient excuse to say that, actually, he was setting out to be the home of free speech. It's expressly using that as a means to try to disguise everything else that happens under that umbrella. 

Alan Rusbridger [00:17:55] After the break, we'll have more on our discussion about X, formerly Twitter. 

Speaker 4 [00:18:01] You can still get three months of unlimited digital access to Prospect for only 3 pounds, or take advantage of our introductory print-and-digital bundle offer of three issues for 3 pounds. Just type prospectmagazine.co.uk into any browser that you might be using, and click on subscribe to take advantage of this incredible offer. I'm Ellen Halliday, and in this week's Prospect podcast I talk to Serhii Plokhy, an award-winning writer and historian. Serhii joined me to talk about his new book, Chernobyl Roulette, and just how dangerous it is when war, in this case in his native Ukraine, comes to a country with nuclear power. Plus, we decide whether the Oasis comeback is a banger or a dud. I want to start, first of all, by caveating this with the fact that I like Oasis. I like Oasis's music. The theme tune to The Royle Family, I think, is one of the most beautiful songs ever written. I'm not, like, a huge fan, but I like them, I've got to be honest. I'm trying to conjure it in my mind. I can't remember, but I'm going to spare everybody me singing it. For those who remember it, you can look it up afterwards. So with that caveat, I want to say that I am highly judgmental of very wealthy musicians fleecing their fans, who are looking for a bit of nostalgia, to kind of remember the concerts of their youth or whatever, for lots and lots of money, because they very clearly want to make a bit more money. The fact that they might reform and make music again and perform is great, and could be a kind of good creative musical moment, but it just looks like such a cynical business decision. I find it really hard to see it as a positive thing.  

Alan Rusbridger [00:19:49] So follow the Prospect podcast wherever you get your podcasts. 

Lionel Barber [00:19:58] I want to talk about the riots. You've been outspoken. You think that Musk and Twitter have been an enabling element in stoking violence on the streets in the UK, don't you? 

Bruce Daisley [00:20:11] The take that I've got is that if we find ourselves in a situation where a platform is being weaponised, then we shouldn't say to ourselves, this is outside of our jurisdiction and there is nothing we can do. And so I'm not necessarily saying expressly that anyone should be arrested. But what I am saying is that, in my experience, and I worked at YouTube, I worked at Twitter before it was X, and I also have a lot of friends who worked at TikTok, Instagram and Facebook, the thing I know really clearly is that the threat of personal legal liability is far more mobilising than any prospect of corporate fines. 

Lionel Barber [00:20:53] Well, that's true, but do you believe that these social media platforms were being weaponised this summer, stoking riots in the country? Is that your view? 

Bruce Daisley [00:21:05] Yeah. Most definitely. 

Lionel Barber [00:21:07] And what would you point to? 

Bruce Daisley [00:21:08] I think there are two things on a platform where you can see that the platform has been used, has been exploited. The first one is allowing false information to be peddled and to develop momentum. And look, you know, critically, what platforms generally can't do is be the arbiter of truth. They can't verify every tweet. However, when things are very clearly intent on peddling misinformation, what you would previously do, and what most platforms do but don't talk about, is operate something called shadow banning, where, effectively, certain tweets, if they are harmful, even if you don't want to take them down, you prevent them being amplified by the algorithm. And what we've seen on X is that the inflammatory tweets have not only been promoted by the algorithm, they've been promoted by the CEO, the owner of the business. Musk has famously gone out and reposted, retweeted, patently untrue allegations about what the British government is doing. He set about amplifying these things both personally and through the algorithm. And I think those things are actionable. Firstly, I think it shouldn't be down to the platforms themselves to have the final say over whether Tommy Robinson should be allowed to use the platform, or whether Andrew Tate should be allowed to use the platform, within our own geographical boundaries. We should be able to call the shots on that. But in addition, I think there should be the ability for us to assert that we don't want certain toxic tweets and certain toxic messages to be amplified. 

Lionel Barber [00:23:02] Well, Musk predicted, in as many words, that there could be, or would be, a civil war eventually in the UK as a result of immigration and tensions in the inner cities. That's an opinion. You wouldn't necessarily strike that down, would you?  

Bruce Daisley [00:23:18] No, no, absolutely. 

Lionel Barber [00:23:19] So that's harder to deal with. But would you be proposing new regulation of the platforms in the wake of the summer violence in the UK? 

Bruce Daisley [00:23:31] Yeah, I think platforms need to demonstrate accountability, and we're seeing some of this playing out in Brazil right now. Critically, as we entered 2024, we knew that there were more elections around the world than ever before. And one of the things that's been raised in Brazil is that there are a large number of weaponised misinformation accounts that are deliberately posting false information. And Brazil took issue with that and said that this is, in many ways, subverting democracy. It's not about one person giving an opinion. 

Lionel Barber [00:24:03] A judge on the Supreme Court in Brazil.  

Bruce Daisley [00:24:06] Yeah, that's right. And there was a ban implemented, or they asserted that these things needed to be taken in hand. There was a fine levied. And Musk not only has, in a quite infantile way, taken to abusing the Brazilians and the judges involved, but they've just not complied with it. And it just strikes me that if you've got platforms that have this degree of influence, the anti-democratic action here is not having them be accountable to elected governments. And I think that's the challenge with Musk and his ilk. I sort of hesitate to use the word oligarch, but an oligarch is a really good description: a supremely rich person with large amounts of political influence. And until now, the oligarchs from the tech world have used their influence lightly. You know, the likes of Mark Zuckerberg or the Google founders haven't really tried to intervene in politics, domestically or internationally. What we're seeing with Musk is something far closer to our notion of oligarchs. He's trying not only to have an impact on the US election, but he appears to be trying to influence politics around the world. 

Alan Rusbridger [00:25:20] Bruce, there's been a 30-year discussion about whether the tech companies are more like a telephone company or more like a newspaper. Initially, the legislation treated them more like a telephone company: you can't be responsible for what's going down the pipes of the telephone company. And there's been a big debate now as to whether they should be treated more like newspapers, so that they should have this responsibility. So, you've talked about personal liability. I mean, the way the law seems to be going in Europe and in the UK is not to say, yes, you are liable for every single piece of content, but that you have to show that you've got the processes and mechanisms in place to deal with systems that are going wrong. Can you just talk a bit about how that regulation is going to work, and what jeopardy it has for the Musks of this world? 

Bruce Daisley [00:26:14] Yeah, I mean, look, you know, you probably know more about this than me, from firsthand exposure to Meta. And look, I think, critically, we need a demonstration of good faith from tech firms that they are willing to engage and interact. Actually, if anyone's curious about what this looks like, then Germany's a really good example of it. In Germany, there are very clear rules about how Nazi content posted online is dealt with. And if you ever doubt that tech firms can react very quickly, and "react", I think, is the critical thing, because they're not doing this proactively, but when something is reported, tech firms in Germany generally get Nazi content down in about five to ten minutes. They've got an ability to react. Now, the critical thing is that Mark Zuckerberg was out last week saying that maybe Covid interventions had gone too far. Yeah, he would say that, because it's immensely cheap not to intervene. It's immensely cheap not to have any checks and balances. And I think what we do know is that these firms, when run properly, can be highly profitable businesses. There's no reason why there shouldn't be some degree of accountability for these firms, which have become the utilities of modern life. And I've never loved our attempts to apply an old-fashioned heuristic, whether it's a cat or a dog, because they're different, and the convenience of trying to apply an old model is kind of irrelevant. But we should know that these things can be run for the public good, and there's probably a strong need. 

Alan Rusbridger [00:28:01] For that. The stick in Germany, if I'm right, was enormous fines. I mean, really, really ginormous fines. But what we're not talking about with the DSA, the European legislation, or the online harms bill in the UK, is actually personal liability: that the executives themselves could be arrested, as we've seen happening with Durov. I think you're saying that's the only thing that they understand. 

Bruce Daisley [00:28:27] Yeah. And that is a part of the German law. It's also in Australia as well. And I saw this first hand. If you remember the horrors of the Christchurch shooting a few years ago now, one of the things that was happening at the time was that a lot of the content from Christchurch was being re-uploaded to social platforms, so it was proving immensely difficult to take down. At the time in Australia, there was personal liability on executives to get this taken down. And let me tell you, nothing produces a more immediate response than people feeling that they are in personal jeopardy. So, you know, that's what I would emphasise: big fines don't mean a lot to the likes of Google and Meta. They've both received billion-dollar fines in the EU, and it hasn't radically changed their approach to doing business. The best way that we can try and do this is the threat, or the prospect, of individual liability. And so that's why I raised the prospect that Elon Musk could find himself with a summons, a writ, if he landed in the UK. He's a playboy; if anyone wants to track Elon's movements, there's an Instagram account that tracks Elon's jet week to week, and he's in Europe every couple of months. And so, you know, if he found himself suddenly recognising he was liable for what he was doing in countries beyond the US, I think it would probably focus his mind a little. 

Alan Rusbridger [00:30:03] What did you make of the episode this summer where the European commissioner, Thierry Breton, wrote to him? A kind of warning shot across his bows. And Musk's response, forgive my French, was a meme that said, "and literally fuck your own face." 

Bruce Daisley [00:30:24] Just this last weekend, he told the judge in the Brazilian Supreme Court that he was going to end his life in jail. So, you know, the closest comparison is that these are Trumpian ways to respond. And, you know, Trumpian responses do move the Overton window on what you think is acceptable and what is not. But I think it demonstrates that Musk still perceives himself as being some way above the law and some way above international regulation. He's also got an army of a million minions who follow him, who are willing to abuse and tackle online anyone who speaks up. And, you know, anyone who's been on the receiving end of the blue-tick people, the ones who pay for his subscription, will know that they're a pretty effective force that can be quickly mobilised. I think it's a demonstration that he's kind of thumbing his nose at other democracies. He's almost challenging them to say, you can't do anything about this. 

Lionel Barber [00:31:35] If we stay parochial for just a moment, to wrap up: do you think that the UK will wait for Brussels to act first under the Digital Services Act? Or do you think that something may happen in Parliament, with new regulation or legislation? 

Bruce Daisley [00:31:57] The challenge is that it's taken us about five or six years to get to the law that's just been passed, and before it's even been implemented. 

Lionel Barber [00:32:05] This is the online. 

Bruce Daisley [00:32:07] The Online Safety Act, yeah. And it's pretty clear at the moment, before it's even been implemented, that it's not fit for the job. You can read it and try and fill in some of the gaps, thinking, well, if they push this here, push this there. Effectively, it's about protecting children first and foremost. There is stuff in there about stirring up racial hatred, but it doesn't speak to the weaponisation of these platforms. And it doesn't speak to, I guess, the sense of corporate rules being abandoned, corporate niceties. Most of the law is kind of like an unwritten constitution: it just presumes these companies are going to be broadly positive in their approach, or broadly consensual in their approach. So it's not fit for the task. The challenge then is, if it's taken us this long to get this, is there really going to be the appetite to either amend it or redo it? It also raises the question, of course: is Ofcom, an organisation that has had rings run around it by GB News of all operations, really fit for this task? So I suspect we are going to see something from Europe before we see something in the UK. 

Alan Rusbridger [00:33:29] That was going to be my question. Apart from its flabby behaviour over GB News, what makes you think that it would struggle with this kind of task? 

Bruce Daisley [00:33:37] Because I just think this is non-linear in terms of the way it comes up. I don't think they've demonstrated any muscularity in dealing with very traditional media, and how they would seek to respond to someone as maverick and unpredictable as Musk just seems to be a test of anyone. And for me, I've not seen enough to make me think Ofcom is ready to step up to this. 

Alan Rusbridger [00:34:06] Bruce, thank you so much for joining us today. That's been fascinating. 

Lionel Barber [00:34:10] Thank you Bruce. 

Alan Rusbridger [00:34:12] Well, thank you so much to Bruce. 

Lionel Barber [00:34:14] And that's all from Media Confidential. We'll be back next week with more analysis as we delve behind the headlines. 

Alan Rusbridger [00:34:21] And we're going to have a new bonus show that will land this Sunday, because each week we're going to be answering your questions about the industry and shedding some light on some of the most mind-boggling aspects of the media. 

Lionel Barber [00:34:32] So send us your questions: Media Confidential at prospectmagazine.co.uk, or at Media Pod. 

Alan Rusbridger [00:34:40] That's on Twitter. 

Lionel Barber [00:34:41] And thank you for listening to Media Confidential, brought to you by Prospect and Fresh Air.