Facebook Inc. is starting to warn some users that they may have seen “extremist content” on the social media site, the company said Thursday.
Screenshots shared on Twitter showed a note asking, “Are you concerned that someone you know might become an extremist?” and another that alerted users that they “have recently been exposed to potentially harmful extremist content”. Both included links to “Get Support”.
The world’s largest social media network has long been under pressure from lawmakers and civil rights groups to fight extremism on its platforms, including American movements that took part in the January 6 riot at the US Capitol, when groups tried to stop Congress from certifying Joe Biden’s victory in the November election.
Facebook said the small test, which is running only on its main platform, is being piloted in the US ahead of a global approach to preventing radicalization on the site.
“This test is part of our broader work to assess ways to provide resources and support to people on Facebook who may have engaged with or been exposed to extremist content, or who know someone who is at risk,” a Facebook spokesman said in an emailed statement.
“We are working with NGOs and academic experts in this area and we hope to share more in the future.”
The effort is part of the company’s commitment to the Christchurch Call to Action, a campaign involving major technology platforms to combat violent extremist content online, launched after a 2019 attack in New Zealand that was streamed live on Facebook.
Facebook said the test identifies both users who may have been exposed to extremist content that breaks its rules and users who have previously been the subject of Facebook enforcement.
The company, which has tightened its rules against violent and hate groups in recent years, said it proactively removes some content and accounts that violate its rules before the material is seen by users.