
Zuckerberg Goes “Full MAGA” as Meta Ends Fact-Checking in U.S. & Paves Way for More Hate Speech

10/01/2025



This is a rush transcript. Copy may not be in its final form.

NERMEEN SHAIKH: Out with the fact-checkers. That was the message delivered by Meta CEO Mark Zuckerberg as he announced sweeping changes at Facebook, Instagram and Threads. Zuckerberg outlined the new changes in a video posted online.

MARK ZUCKERBERG: The recent elections also feel like a cultural tipping point towards once again prioritizing speech. So we’re going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms. … First, we’re going to get rid of fact-checkers and replace them with Community Notes, similar to X, starting in the U.S. After Trump first got elected in 2016, the legacy media wrote nonstop about how misinformation was a threat to democracy. We tried in good faith to address those concerns without becoming the arbiters of truth. But the fact-checkers have just been too politically biased and have destroyed more trust than they’ve created, especially in the U.S.

NERMEEN SHAIKH: That’s Meta CEO Mark Zuckerberg.

Nicole Gill, the head of Accountable Tech, called Meta’s move a, quote, “gift to Donald Trump and extremists around the world.” The decision comes weeks after Zuckerberg dined with Trump at Mar-a-Lago and after Meta donated a million dollars to Trump’s inauguration. On Tuesday, Meta also appointed a close Trump ally, Dana White, to the company’s board. White is the CEO of Ultimate Fighting Championship. Trump praised the new Meta policies.

PRESIDENT-ELECT DONALD TRUMP: Honestly, I think they’ve come a long way, Meta, Facebook. I think they’ve come a long way.

AMY GOODMAN: Mark Zuckerberg also announced other changes, including a loosening of rules over what kind of content can be posted on Facebook and the other sites. Under the changes, women can be referred to as “household objects.” Meta users will once again be allowed to say gay and trans people have mental illness, and more.

To talk about what Meta’s new policies mean in the U.S. and worldwide, we’re joined by three guests.

In Manila, Philippines, we’re joined by Maria Ressa, founder, CEO and executive editor of the Philippine independent news site Rappler. She won the Nobel Peace Prize in 2021 for her work defending free expression in the Philippines. Her book is titled How to Stand Up to a Dictator: The Fight for Our Future.

Siva Vaidhyanathan is the author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. He’s professor of media studies and director of the Center for Media and Citizenship at the University of Virginia. His piece for The Guardian, just out, “Mark Zuckerberg has gone full Maga.”

And in Doha, we’re joined by Marc Owen Jones, associate professor of media analytics at Northwestern University in Qatar and an expert on disinformation. He’s the author of Digital Authoritarianism in the Middle East: Deception, Disinformation and Social Media.

Let’s begin with you, Professor Jones. Digital authoritarianism. Can you talk about how Mark Zuckerberg’s announcement — coming soon after he met with Donald Trump, gave well over a million dollars to the inauguration and put one of Trump’s closest allies on Meta’s board — fits into that concept, and what digital authoritarianism means?

MARC OWEN JONES: Yeah, absolutely. Thank you for having me.

I mean, digital authoritarianism, broadly speaking, is the use of digital technology for authoritarian ends. You know, it’s fundamentally anti-democratic. And I think Zuckerberg’s move is indicative of the authoritarian direction that the United States is going. He is essentially saying that the social media company of Meta is subservient to the whims of the political elite.

And I think a key part of digital authoritarianism here that’s really important is this notion of the post-truth. Authoritarianism thrives on the absence of facts. It thrives on a war on reality. Why is that the case? Well, because authoritarian leaders are not interested in the truth, because the truth can be used to attack them, to discredit them. As we know, Trump was documented as one of the most dishonest presidents and politicians in history. But they want an alternative reality, alternative facts. And the idea of a war on fact-checkers really fits in with this, because what it means is that authoritarian leaders want people to respect them, they want people to fear them. They fundamentally want people to see the world through their eyes and be submissive to their wishes. And I think that’s what this represents: Mark Zuckerberg is clearly bending the knee to Trump, not only doing his bidding, but also making this massive donation.

And I think what we need to understand about digital authoritarianism is that we are in a space where we increasingly communicate through digital technologies. Zuckerberg’s move is designed not just to, quote-unquote, “prevent censorship.” Here, “censorship” is a euphemism for, actually, harm against minorities. Some of the examples you mentioned were women, but also transgender people. And immigrants were also mentioned by Zuckerberg himself. These are all right-wing dog whistles. Essentially, what this means is that it’s going to be open season on these digital platforms for people to attack minorities and to harm them. It’s not about reducing censorship. It’s actually about increasing digital violence against minorities, which is part of the new, I would say, increasingly autocratic America.

NERMEEN SHAIKH: So, Maria Ressa, can we get your response to Meta’s new policies? You’ve said that Zuckerberg’s decision would lead to, quote, “a world without facts” and that that was “a world that’s right for a dictator.” So, could you respond to what Marc said, and what you think this means, in particular, for the Philippines, for social media and Meta’s sites in the Philippines?

MARIA RESSA: Well, I mean, first of all, the Philippines, we had a bout with our own dictator starting in 2016. And I would say we were in hell under our previous president, and now we’re in purgatory, while the United States is just headed to hell, it feels like.

Let me quickly say, it’s not a free speech issue, which is what he claims. It’s a safety issue. And exactly what we just heard, what you’re seeing is that now this platform, one that unites more than 3.2 billion people around the world, has declared open season on minorities. Actually, it’s no longer safe for any person, because one of the things that we’ve learned in the Philippines is that the insidious manipulation on this takes away will. Right? So, we already know around the world that it has become a platform that has enabled genocide. This is in Myanmar in Southeast Asia. Meta itself sent a team, and they concurred with the United Nations in two separate findings. The second one is, this is a platform that hacked our biology, that changes the way we feel, fear, anger and hate, to change the way we see the world and the way we act.

Election interference. What you’ve seen now is, as early as 2016, on Brexit, the U.K. actually fined Facebook then, the largest amount at that point in time, because of what would be equivalent to election interference. Nothing else was done beyond that. But again, part of the reason the world is where it is, where, as of last year, 71% of the world is under authoritarian rule, we are electing illiberal leaders democratically, partly because our public information ecosystem, social media, Facebook, is corrupting our individual communications with each other, taking away agency.

NERMEEN SHAIKH: And, Maria, could you say, specifically, I mean, in a place like the Philippines, there’s — almost 90% of the population in the Philippines is on Facebook. And how is Facebook used, not only in terms of posting information, but also as a means of communication, and the way in which misinformation has spread on Facebook in the past, not only, of course, in Myanmar, which is the example that you cited, but also in the Philippines and elsewhere?

MARIA RESSA: I mean, very specifically, our past president, Rodrigo Duterte, used the design of what was an advertising and marketing platform of Facebook to do information operations. I was getting an average of 90 — nine, zero — hate messages per hour, free speech used to stifle free speech. But beyond that, these networks of information warfare, networks of disinformation literally changed our history in front of our eyes, setting the stage for the election of Ferdinand Marcos Jr., the only son of our dictator who was in power for 21 years, Ferdinand Marcos. You’ll remember Imelda Marcos and her shoes. Her son, Ferdinand Marcos Jr., won overwhelmingly in our presidential elections, because, literally, Filipinos were taught in information operations — YouTube, Meta and Facebook — that Marcos was not a dictator, in 1986 ousted in people power; instead, he was the greatest leader the Philippines has ever known.

AMY GOODMAN: In 2021, Rohingya refugees sued Facebook’s parent company Meta over its failure to stop violent hate speech on its platform, which contributed to the military’s bloody 2017 crackdown on the Rohingya Muslim community in Burma, also known as Myanmar. Law firms in the U.S. and U.K. launched the legal effort on behalf of the Rohingya around the world, including in Bangladeshi refugee camps. Plaintiffs demanded over $150 billion in damages. The U.N. found as many as 10,000 Rohingya were killed by Burmese forces during the 2017 genocide, though some estimates put the death toll at twice that. More than 730,000 people were forced to flee the country. I just want to go to part of a video produced by Amnesty, featuring a survivor of the Rohingya genocide.

SAWYEDDOLLAH: My name is Sawyeddollah. I am from Myanmar, but now I am living in Bangladesh, Cox’s Bazar, as a forcibly displaced Myanmar national. [translated] I personally experienced that online hate speech turned into offline hate for us. There were many different things on Facebook against us, which turned the people of Myanmar against us. I saw many different anti-Rohingya posts on Facebook. For example, I still remember a post that said, “The birth rate of these Bengali [Rohingya] people is very high. If it remains continuously like this, we will become their slaves. Now they are our slaves. Let’s go for action.” I knew about the option of reporting hate speech to Facebook. I already reported some hate speech to Facebook, but it only said, “Thank you for reporting. This post doesn’t go against our community standards.”

AMY GOODMAN: I wanted to bring Siva Vaidhyanathan into this conversation, author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, a professor at the University of Virginia. And I wanted to go to this issue of hate and how deadly it can be, whether we’re talking about Burma, Myanmar, or whether we’re talking about the United States. To give a few examples: roughly seven in 10 lesbian, gay and bisexual adults have encountered harassment online, and fully 51% have been targeted for more severe forms of online abuse. Roughly a third of women under 35 have been sexually harassed online. And a study of 51 countries found 38% of women had personally experienced online harassment. Professor Vaidhyanathan, can you talk about this connection to hate, and how it connects to the headline of your Guardian piece, “Mark Zuckerberg has gone full Maga”?

SIVA VAIDHYANATHAN: Right. Well, let’s be clear about this change. And I’m really thrilled that you’ve started this conversation by going beyond the United States. What Mark Zuckerberg announced the other day, in its specifics, clearly only applies to the United States. In the United States, since 2017, Meta has been trying to at least appear that it’s taking content moderation seriously. And let’s remember, the proper phrase to describe what we’re talking about is “content moderation.” Fact-checking was never part of the formula. It’s not about facts. It’s about protecting users. It’s about protecting people from things like harassment, and about protecting groups of people from campaigns and movements, people like citizens of my town, Charlottesville, Virginia, who encountered a flood of violent right-wingers in the summer of 2017 through a movement that was largely organized through Facebook. So, the idea here — and why Zuckerberg has appeared before Congress multiple times swearing that he’s doing a good job — has all been about the United States.

The content moderation policies and procedures and infrastructure beyond the United States have never been sufficient and have never been serious, with the exception of Western Europe. Facebook, like every other digital media company, pays particular attention to the demands and laws of the countries in which it operates. And therefore, it didn’t pay much attention to the demands of content moderation and protecting users in the Philippines. It’s never paid attention to protecting users in India or in Pakistan, in Cambodia, in Uzbekistan. These are places where tremendous propaganda flows through Facebook channels unmitigated.

Now, the United States question, though, in this sense, is: What is Zuckerberg changing? Zuckerberg is scrapping a rather elaborate and expensive system, that sort of worked, to keep the Nazis at bay and to somewhat limit things like harassment and somewhat make sure that the user experience in the United States is not horrifying, the way that one would encounter horrifying material on X today, for instance. He’s gone against that.

And I think it’s wrong to think of this as a pandering to Trump or getting down on his knee to Trump. It’s actually just the opposite. In the United States, the government is subservient to corporations, especially this corporation. Zuckerberg always gets what he wants out of the United States government. He sees an opportunity to get even more of what he wants out of the Trump administration, because what he wants is to be able to threaten and win in a power struggle against Lula in Brazil, who has put forth very strong regulations against all social media companies. So, Zuckerberg absolutely cares more about Brazil than he does the United States. Brazil is still one of the most fervent and potentially important growth markets for Meta. He’s extremely concerned that the United States be able to use its power to limit regulation in Europe, as well. So what he’s hoping is Trump’s irascible, off-the-hook, unpredictable foreign policy, throwing sticks around the world, might help limit Europe’s willingness to regulate Facebook and Brazil’s willingness to regulate Facebook. This is all about opportunity, not subservience. He is hoping that once again the U.S. government is subservient to Facebook.

NERMEEN SHAIKH: So, Marc Owen Jones, if you could respond to what Meta has said the new form of content moderation will be, this idea of community notes, which is what is operative on Elon Musk’s X, and why you think this decision has been taken now, and what the effect will be?

MARC OWEN JONES: Well, I think it’s very clear that the Community Notes system — which, in X’s case, involves a certain group of sort of anointed users, not fact-checkers, commenting on viral posts to determine whether they’re truthful or not — is really problematic. It doesn’t work. And there’s a number of reasons why it doesn’t work, because, essentially, what it’s resulted in is people not correcting facts, but just posting their opinions as fact. So, what we’re getting is not fact-checking or, in theory, content moderation; we’re getting this relativization of truth, where people just respond to some information they disagree with, politically or for some other reason, and then say the opposite. And so, all you’re doing is getting this kind of situation where it’s not even about verifiable facts anymore, but just about presenting an alternative opinion.

And how this is going to be rolled out on Meta’s platforms, we’re not quite sure. But it won’t work. And crucially, I think there’s another point here that’s very interesting. Firstly, for Zuckerberg, abandoning his fact-checking system is going to save him money. Community Notes is also, in theory, going to increase engagement, which might mean more profit for Facebook.

But fundamentally, what it’s going to do is just create a space in which people are going to engage in increasingly contentious arguments. And there’s a really important point we need to bear in mind here: Community Notes also implies that there’s a level playing field when it comes to the kind of information that’s put out there on Facebook, right? We don’t know which posts are getting promoted, which posts will get community noted. And I really want to make this point: sometimes this Community Notes thing is framed as a bottom-up form of free speech that allows users to engage in debate. It’s not about free speech; it’s about the free market of speech. And the difference is that the free market of speech allows people with authority and influence to have a louder voice than others, whether that’s paid advertisers or Facebook’s algorithms deciding which information gets promoted more, and therefore community noted.

And that’s really important, because we know from Facebook all this talk of free speech is misleading, because Facebook have been known, through their testing, to promote content that is, quote-unquote, “bad for the world,” the kind of content that will make you angry or potentially laugh. So, Facebook are in the business of modulating public conversations. And this Community Notes thing is a smokescreen to suggest that Facebook are actually permitting free speech and user-led fact-checking. It’s a system that won’t work, and it’s a system that will just lead to increased polarization and the undermining of democracy by polluting the information space with ideological disinformation.

AMY GOODMAN: Professor Vaidhyanathan, Zuckerberg said in his announcement Wednesday that Meta is moving its trust and safety content moderation teams out of California to Texas, a move he said would, quote, “help us build trust to do this work in places where there’s less concern about the bias of our teams.” Now, this comes after Elon Musk’s companies Tesla and SpaceX changed their incorporation state from Delaware to Texas — run, of course, with Republican leadership, from Senator Ted Cruz to the governor of Texas. Talk about the significance of this and, as you put it, Meta going MAGA.

SIVA VAIDHYANATHAN: Yeah, I mean, it’s purely symbolic. It has no operational significance. You know, the people who work in that field — and there will be many fewer of them. That’s important. But the people who work in that field, it doesn’t matter where they sit, right? They’re trained the same way. They have to follow the same rules. It’s a very structured system for content moderation.

The real question here is about all of those thousands of contracted workers, many of whom are in English-speaking or partially English-speaking countries around the world: low-wage workers, people in places like India and the Philippines, people who were subjected to hours and hours of horrible images and videos, animal mutilation, child abuse, horrible violence. Their fate is that, basically, they won’t have any jobs anymore. You know, they had horrible jobs before, but those contracts will all be canceled.

The small group of domestic Facebook employees will be moved to Texas, or — I’m not sure exactly what he means by that. And he may not even follow through on it, right? There’s no reason to believe anything he said. No one will hold him to it. And I don’t think it means much, except that it was a symbolic signal to the Republicans, some sort of candy for them. But I don’t think it’s going to change the Facebook experience in any significant way.

What really will change the Facebook experience is the fact that Community Notes will fail and that our Facebook and Instagram experiences will be filled with hate speech almost immediately. It’s going to become deeply unpleasant. The real question is: Will it become so unpleasant for advertisers that they start pushing back at Meta? And if that happens, because that’s the only other countervailing power at work in this system, then there could be another correction to this policy. And I wouldn’t be surprised by that. Zuckerberg has no principles on these matters, except that what is good for Facebook is good for the world. And everything he does follows from that.

NERMEEN SHAIKH: So, Maria Ressa, as Siva said, now we can anticipate these platforms being filled with hate speech. You’ve said that you will do everything possible to, quote, “ensure information integrity.” So, tell us, how do you think individuals and independent news organizations like yours can ensure that there is truthful, fact-checked information available as these sites get flooded with misinformation?

MARIA RESSA: Well, it’s already happened, right? Cory Doctorow coined the phrase “the enshittification of the internet.” And, in fact, I don’t know if you guys saw Shrimp Jesus on Facebook, right? AI-generated crap. And Facebook says that it’s going to be adding more of this.

So, I think the first step is to actually do what we did, which is to collaborate, collaborate, collaborate. The way we were able to take over the center of the Facebook information ecosystem was by creating something we called #FactsFirstPH, a collaboration of about 150 different groups in a four-layer pyramid. The bottom layer was 60 news organizations working together, right? Fact-checks don’t distribute, don’t spread as fast and as far as lies laced with fear, anger and hate. So what do you do? Well, you have 60 news groups doing daily fact-checks. The second layer, that’s civil society groups, the church, business groups. That’s the group that is essentially distribution. We called it the mesh layer. We organized ourselves so that every person in these groups would share one fact-check every day, but would share it with emotion.

The third layer was our academic partners. We gave them a pipeline of the fact-check data — because, again, who are the first groups under attack? Just like Mark Zuckerberg showed in that quote, the journalists. So, as our credibility was being pushed backwards because we were under constant attack, the academe came in. And every week leading up to our 2022 elections, academics came in and told people what meta-narratives were being seeded, what was going viral, which candidate was actually winning in this.

And finally, the last layer is lawyers, six different law groups — left, right and center — that protected rule of law. You cannot have rule of law without integrity of facts. And just three or four days before our elections, we did take over the center of the Facebook information ecosystem.

But here’s your problem. Just in order to have facts, you have to work quadruple, four times, five times harder, and you have to collaborate more. That’s an interim solution. The longer-term solution, of course, is the other group that’s abdicated responsibility for protecting the public are actually democratic governments who have failed to build a public tech stack. This is not a free speech issue — again, just to pull together, right? This is — in the end, Facebook is after power and money. And President Trump gave his blessing. Mark Zuckerberg now feels more comfortable. Maybe the cases won’t come, right? That’s what you’re seeing jostling. But the second part is, it will — surveillance capitalism, unchecked, will bring Facebook more money, a company that makes more than $300 billion a year at the expense of the safety of the people on the platform.

AMY GOODMAN: We’re going to leave it there. And, of course, Mark Zuckerberg, as CEO of Meta, does face in the coming months, under President Trump, the FTC case around the breakup of Meta, something he’s concerned about losing billions over. Maria Ressa, founder, CEO and executive editor of the Philippine independent news site Rappler, Nobel Peace Prize winner for her work defending free expression. Marc Owen Jones, associate professor of media analytics at Northwestern University in Qatar, author of Digital Authoritarianism in the Middle East: Deception, Disinformation and Social Media. And Siva Vaidhyanathan, author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, professor at the University of Virginia. We’ll link your piece in The Guardian, “Mark Zuckerberg has gone full Maga.”

Next up, to Los Angeles, where the devastating fires have killed at least five people and destroyed 2,000 homes and buildings. More than 130,000 people have been evacuated. We’ll look at what’s happening, hear the story of one person who had to evacuate her whole family, and then talk about the world on fire. We’ll talk about climate change. Back in 20 seconds.

[break]

AMY GOODMAN: “A Voice from On High” by the late Charlie Haden, who lived in L.A. To see our interview with him, go to democracynow.org.


