UW-Milwaukee professor reacts to Meta's decision to end fact-checking programs on social media platforms


MILWAUKEE (CBS 58) -- On Tuesday, Jan. 7, Meta CEO Mark Zuckerberg announced the company will soon end its fact-checking program on social media giants Facebook and Instagram.

This change means that instead of relying on third-party moderating systems, the platforms' more than three billion users will now be responsible for identifying false content.

Zuckerberg broke the news in a video posted to Instagram which has since garnered more than 200,000 likes.

"So, we're going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms," Zuckerberg said. "The reality is that this is a tradeoff. It means we're going to catch less bad stuff, but we'll also reduce the number of innocent people's posts and accounts that we accidentally take down."

Now Meta will adopt a system similar to Community Notes on X, the platform run by Elon Musk, which allows users to point out what is real and what is not.

"I think community notes is multi-billion-dollar platforms getting audiences to do their work for them," said Michael Mirer, an assistant professor of communications at the University of Wisconsin-Milwaukee. "And so, one way to look at this is that Facebook is just surrendering rather than trying to fight the fight against misinformation."

However, Mirer added that it's not fully clear how this change will actually help prevent the spread of misinformation.

"If the actual move toward a free expression platform is that there's no moderation, you know, I don't think that that necessarily is conducive to like, being a welcoming space," Mirer said.

In the video, Zuckerberg noted the election had a major influence on Meta's decision, and that the company will work with the Trump administration to promote free speech worldwide.

"It really comes down to the math problem of trying to figure out what you're going to allow, what you're going to throttle, what you're going to encourage on your platform," Mirer said.

The announcement also comes as Facebook is moving its trust and safety and content moderation teams from California to Texas.

Zuckerberg noted that as Meta works on free expression, he believes the move "will help us build trust to do this work in places where there's less concern about the bias of our teams."
