Facebook executives are putting in place several emergency measures, previously reserved for “at-risk” countries, to help reduce post-election unrest next month and dial down anger and mistrust.
On Sunday, The Wall Street Journal reported that the social media behemoth will limit the reach of viral content while lowering the standard for suppressing posts that could inflame emotions, measures that were previously utilized in Sri Lanka and Myanmar.
Intelligence reports, including some seen by BizPac Review, indicate groups are preparing various protest and disruption scenarios following the election, especially if President Donald Trump is reelected. They include organizing large street protests, potential rioting, and occupation of transportation and distribution hubs.
Also, the campaigns of President Trump and Democratic rival Joe Biden are preparing for legal challenges to election results stemming from disagreements over the counting of ballots.
Those disagreements and legal battles between campaigns and candidates will inflame political passions among Americans and play out, in large part, on social media.
That said, Facebook execs have indicated they will only deploy the suppression tools in the direst of circumstances including violence related to election results, people familiar with the planning told the WSJ, adding that the company was preparing for any eventuality.
Measures include broad suppression of posts as they begin to go viral, as well as changing news feeds to alter what kinds of content users will see, sources told the paper. The platform may also lower the standard for algorithms that detect content viewed as dangerous.
The paper adds:
Deployed together, the tools could alter what tens of millions of Americans see when they log onto the platform, diminishing their exposure to sensationalism, incitements to violence and misinformation, said the people familiar with the measures. But slowing down the spread of popular content could suppress some good-faith political discussion, a prospect that makes some Facebook employees uneasy, some of the people said.
“We’ve spent years building for safer, more secure elections,” Facebook spokesman Andy Stone told the WSJ. “We’ve applied lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios.”
Facebook — and Stone, in particular — have faced pushback and criticism in recent days from the Trump campaign and conservatives over the platform’s limiting of New York Post stories last week containing allegations of corruption linked to Joe Biden and his son, Hunter Biden.
While I will intentionally not link to the New York Post, I want be clear that this story is eligible to be fact checked by Facebook's third-party fact checking partners. In the meantime, we are reducing its distribution on our platform.
— Andy Stone (@andymstone) October 14, 2020
Meanwhile, Democrats have said the platform doesn’t do enough to prevent the spread of false information and that it is deferential to the right.
Either way, the platform’s attempts to limit the spread of information and tweak content post-election will no doubt lead to new allegations of improper manipulation, the WSJ reported. That said, Facebook routinely adjusts its algorithms either to bolster engagement or to punish users operating in bad faith.
Facebook executives have previously warned the platform was preparing tools to deploy in the event of post-election unrest.
“We need to be doing everything that we can to reduce the chances of violence or civil unrest in the wake of this election,” Facebook founder and CEO Mark Zuckerberg told Axios in September.
Also, the platform’s global head of communications and policy, Nick Clegg, told USA Today that “break-glass tools” were needed in the event of a crisis, though he refused to elaborate “because it will no doubt elicit [a] greater sense of anxiety than we hope will be warranted.”
Earlier this month during a companywide meeting, Zuckerberg discussed potential post-election violence and what the platform will be doing to help dial down the anger and mistrust.
“Once we’re past these events, and we’ve resolved them peacefully, I wouldn’t expect that we continue to adopt a lot more policies that are restricting of a lot more content,” the billionaire CEO said, according to Buzzfeed News.
Jon is a staff writer for BizPac Review with 30 years' worth of reporting experience, as well as an author and U.S. Army veteran. He has a BA in political science from Ashford University and an MA in national security studies/intelligence analysis from American Military University.