Facebook is expanding its fight against fake news and tackling the most direct way it affects elections: voter suppression.
In a post titled “Expanding Our Policies on Voter Suppression,” Facebook’s public policy manager Jessica Leinwand explains how the social media platform is ramping up its battle against any content that is “designed to deter or prevent people from voting.”
While the company already had policies in place to stop voter suppression tactics from spreading on the site, this latest update extends Facebook’s rules to cover misinformation about the methods by which users can vote.
You’ve probably seen a viral example of this type of content spreading around the internet: posts informing voters that they can vote via text message instead of physically heading to the polls. That is false, and it is exactly the type of content that will no longer fly on Facebook.
Facebook’s new guidelines will also cover disinformation about whether or not your vote will count. It gave this as an example of a false claim: “If you voted in the primary, your vote in the general election won’t count.”
The updates to Facebook policy went into effect last month. Facebook also unveiled a new reporting option that brings this kind of content to the company’s attention.
When giving feedback on a post, as seen below, there is now an option to report “incorrect voting info.” The company says it has also set up a “dedicated reporting channel for state election authorities” through which they can report their findings to Facebook as well.
As pointed out in the post, Facebook policy had already prohibited content that promoted “misrepresentations about the dates, locations, times and qualifications for casting a ballot.” Facebook offered the following example as the type of content that would be covered by rules it had established in 2016.
This latest move by Facebook comes hot on the heels of a controversial mass purge of hundreds of pages and accounts the company labeled as spam or fake news.