Facebook test warns users who may have seen 'harmful extremist content'


Facebook is testing new prompts to reach users who may be “becoming an extremist.” The in-app messages, which Facebook has confirmed are a test, direct users to resources aimed at combating extremism.

CNN first reported the new prompts, which have been spotted by Twitter users in recent days. One version is aimed at people who may know someone falling into extremism. “Are you concerned that someone you know is becoming an extremist?” it reads.


Another prompt appears to warn users who may have encountered extremist content on the platform. “Violent groups try to manipulate your anger and disappointment,” it says. “You can take action now to protect yourself and others.”

Facebook spokesperson Andy Stone confirmed the messages are “part of our ongoing Redirect Initiative work.” The initiative is part of a broader effort by Facebook to fight extremism on its platform by working with groups like Life After Hate, which helps people leave extremist groups. The prompts will send users to Life After Hate or other resources, according to CNN.

It’s not clear how Facebook is determining which users may be most likely to be affected by extremism, but the issue has become a hot-button topic for Facebook. The company was widely criticized for not doing enough to prevent QAnon and other fringe groups from using its platform to grow their followings. Facebook has also been accused of downplaying its role in enabling the events of January 6th. And when the Oversight Board recommended the company conduct its own inquiry into the issue, the company said investigations should remain in the hands of law enforcement and elected officials.
