Facebook to implement policies aimed at dialing down post-election unrest


Facebook executives are putting in place several emergency measures, previously reserved for “at-risk” countries, in an attempt to dial down post-election anger and mistrust next month.

On Sunday, The Wall Street Journal reported that the social media behemoth will limit the reach of viral content while lowering the standard for suppressing posts that could inflame emotions, measures that were previously utilized in Sri Lanka and Myanmar.

Intelligence reports, including some seen by BizPac Review, indicate groups are preparing various protest and disruption scenarios following the election, especially if President Donald Trump is reelected. They include organizing large street protests, potential rioting, and occupation of transportation and distribution hubs.

Also, the campaigns of President Trump and Democratic rival Joe Biden are preparing for legal challenges to election results stemming from disagreements over the counting of ballots.

Those disagreements and legal battles between campaigns and candidates will inflame political passions among Americans and play out, in large part, on social media.

That said, Facebook executives have indicated they will only deploy the suppression tools in the direst of circumstances, including violence related to election results, people familiar with the planning told the WSJ, adding that the company was preparing for any eventuality.

Measures include broad suppression of posts as they begin to go viral, as well as changes to news feeds that alter what kinds of content users see, sources told the paper. The platform may also lower the standard for algorithms that detect content viewed as dangerous.

The paper adds:

Deployed together, the tools could alter what tens of millions of Americans see when they log onto the platform, diminishing their exposure to sensationalism, incitements to violence and misinformation, said the people familiar with the measures. But slowing down the spread of popular content could suppress some good-faith political discussion, a prospect that makes some Facebook employees uneasy, some of the people said.

“We’ve spent years building for safer, more secure elections,” Facebook spokesman Andy Stone told the WSJ. “We’ve applied lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios.”

Facebook — and Stone, in particular — have faced pushback and criticism in recent days from the Trump campaign and conservatives over the platform’s limiting of New York Post stories last week containing allegations of corruption linked to Joe Biden and his son, Hunter Biden.

Meanwhile, Democrats have said the platform doesn’t do enough to prevent the spread of false information and that it is deferential to the right.

Either way, the platform’s attempts to limit the spread of information and tweak content post-election will no doubt lead to new allegations of improper manipulation, the WSJ reported. That said, Facebook routinely adjusts its algorithms either to bolster engagement or to punish users operating in bad faith.

Facebook executives have previously warned the platform was preparing tools to deploy in the event of post-election unrest.

“We need to be doing everything that we can to reduce the chances of violence or civil unrest in the wake of this election,” Facebook founder and CEO Mark Zuckerberg told Axios in September.

Also, the platform’s global head of communications and policy, Nick Clegg, told USA Today that “break-glass tools” were needed in the event of a crisis, though he refused to elaborate “because it will no doubt elicit [a] greater sense of anxiety than we hope will be warranted.”

Earlier this month during a companywide meeting, Zuckerberg discussed potential post-election violence and what the platform will be doing to help dial down the anger and mistrust.

“Once we’re past these events, and we’ve resolved them peacefully, I wouldn’t expect that we continue to adopt a lot more policies that are restricting of a lot more content,” the billionaire CEO said, according to BuzzFeed News.


Jon Dougherty

