ATLANTA (CNN) — Some Facebook users in the United States are being served a prompt that asks if they are worried that someone they know might be becoming an extremist. Others are being notified that they may have been exposed to extremist content.
It is all part of a test the social media company is running that stems from its Redirect Initiative, which aims to combat violent extremism, Andy Stone, a Facebook spokesperson, told CNN. Screen shots of the alerts surfaced on social media Thursday.
"This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk," Stone said.
"We are partnering with NGOs and academic experts in this space and hope to have more to share in the future," Stone added.
One of the alerts asks users, "Are you concerned that someone you know is becoming an extremist?"
"We care about preventing extremism on Facebook," the alert continues. "Others in your situation have received confidential support."
The alert then redirects the user to a support page.
"Violent groups try to manipulate your anger and disappointment," another alert reads. "You can take action now to protect yourself and others."
That alert also redirects the user to a support page.
Stone, the Facebook spokesperson, said the company is directing users to a variety of resources, including Life After Hate, an advocacy group that helps people leave violent far-right movements.
Over the last several years, Facebook has come under intense scrutiny for not doing enough to curtail extremist content on its platform. In 2020, for instance, the company was criticized for failing to shut down the page of a militia group that urged armed citizens to take to the streets of Kenosha, Wisconsin.
The company has also repeatedly promised to do better at stopping the flow of misinformation and conspiracy theories. Facebook's independent oversight board even urged the company in May to investigate the role its platform played in the January 6 insurrection.