Because misinformation spreads fast, and its consequences range from cementing ignorance to violent insurrection attempts, Facebook is testing new prompts aimed at users it thinks have been exposed to extremist content. The prompts inform you that certain groups, posts, and profiles you’ve been viewing may contain extremist content, and they include links to resources meant to steer people away from extremism.
Facebook’s Extremism Prompts Are More Than Words
The prompts, first reported on by CNN, began circulating around social media on Thursday. One is aimed at providing support for users worried that their friends and family may be consuming extremist content, asking, “Are you concerned that someone you know is becoming an extremist?” It then redirects you to a support page. Another alert, with the same redirect link, reads: “Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others.”
That is the modus operandi of most extremist groups: they appeal to the angry and disappointed, providing them with a target for their emotions, a person or group to blame.
The prompts are currently being tested with American users. A Facebook spokesperson told CNN: “This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk.”
Facebook has recently been re-upping its efforts to thwart harmful content on the platform, particularly in light of the ongoing pandemic and the chaos of the 2020 US presidential election. This latest test aims to prevent more people from falling into extremist groups and potentially harming others around them, and, unlike that controversial new privacy policy, it’s something we think most folks can get behind.