Facebook Disputes Claims It Fuels Political Polarization And Extremism

Facebook is stepping up its defenses against claims its algorithms favor inflammatory content. (Jenny Kane / AP)

Facebook is making changes to give users more choice over what posts they see in their news feeds, as the social media company defends itself from accusations that it fuels extremism and political polarization.

The changes, announced Wednesday, include making it easier for people to switch their feeds to a "Most Recent" mode, where the newest posts appear first, and allowing users to pick up to 30 friends or pages to prioritize. Users can also now limit who can comment on their posts.

The goal is to "give people real transparency in how the systems work and allows people to pull levers," Nick Clegg, Facebook's vice president for global affairs and communications, told NPR's Morning Edition. "You'll be able to, in effect, override the algorithm and curate your own news feed."

Facebook has come under escalating scrutiny over the impact of its platform on society since the Jan. 6 assault on the U.S. Capitol by a pro-Trump mob, which was planned and documented on social media sites including Facebook.

Many critics have zeroed in on the role of Facebook's algorithms, which determine what posts users are shown and which groups and accounts they are recommended to join or follow, and on how those systems may push people toward more inflammatory content.

In his interview with NPR and in a 5,000-word Medium post published on Wednesday, Clegg clapped back at that criticism.

"Central to many of the charges by Facebook's critics is the idea that its algorithmic systems actively encourage the sharing of sensational content and are designed to keep people scrolling endlessly," Clegg wrote in the Medium post.

While "content that provokes strong emotions is invariably going to be shared," he acknowledged, he said that was because of "human nature" — not Facebook's algorithms.

"Facebook's systems are not designed to reward provocative content. In fact, key parts of those systems are designed to do just the opposite," he wrote.

Clegg disputed claims that social media contributes to political partisanship, saying academic research into the matter has been "mixed." He also defended the benefits social media provides, from personalized advertising to "a dramatic and historic democratization of speech."

Clegg told NPR his intention was not to blame Facebook users for the divisiveness on the platform, but to highlight the "complex" interactions between humans and technology.

"It's foolish to say it's all the users fault, but equally to say it's all somehow a faceless machine's fault," he said.

"People want simple answers to what are complex issues. I'm urging us nonetheless to try and grapple with the complexity of this, and not ... reduce it to some faceless machine that we blame for things that sometimes lie deep within society itself."

Facebook CEO Mark Zuckerberg gave a similar defense of the platform last week at a congressional hearing about the spread of extremism and misinformation on social media.

When lawmakers pressed him about whether Facebook bore responsibility for the Jan. 6 attack, Zuckerberg pinned blame on the rioters and on former President Donald Trump.

"I think the responsibility lies with the people who took the action to break the law and do the insurrection," he said. "And secondarily with the people who spread that content."

Facebook's stepped-up efforts to defend its platform and promise users greater transparency and control come as the prospect of new internet regulations looms larger.

Several bills are circulating that would attempt to hold companies, including Facebook, more responsible for the content posted by their users and for the real-world consequences of online activity.

Facebook itself is calling for reform in an extensive ad campaign, and Zuckerberg laid out his vision for updating regulations in his testimony last week.

"Everybody accepts that new rules of the road need to be written," Clegg told NPR.

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.