Facebook Fears 'Election Chaos'; Prepares 'Emergency Measures' to Avoid 'Dire Circumstances'

There they go again? Less than two weeks after Facebook faced massive criticism for censoring a New York Post story on Hunter Biden’s emails, Zuck & Co. are implementing aggressive “emergency measures” to limit the spread of viral content and lower the benchmark for suppressing “inflammatory” posts over fears of “election unrest.”

As reported by the Wall Street Journal on Sunday, the tools include slowing the spread of “certain posts” and tweaking users’ news feeds, measures that sources say would be used only under “dire circumstances.” As Yogi Berra famously said, it’s déjà vu all over again.

As reported by Fox Business, the move comes days after Facebook (and Twitter) faced harsh backlash from President Trump and congressional Republicans who have long criticized the platform’s role in regulating (censoring conservative) content. In response, Facebook CEO Mark Zuckerberg said the company would impose fewer restrictions on content after the election, but that they have implemented “policy changes” to “address any uncertainty and the perpetuation of disinformation for the time being.”

What does this mean?

The “emergency” measures would lower Facebook’s previously established threshold for content deemed “dangerous” (by Facebook, of course) and would slow the sharing of specific posts as they begin to gain traction. In addition, news feeds would be adjusted to limit users’ exposure to “dangerous content.”

As reported by the WSJ (emphasis mine):

Deployed together, the tools could alter what tens of millions of Americans see when they log onto the platform, diminishing their exposure to sensationalism, incitements to violence, and misinformation, said the people familiar with the measures. But slowing down the spread of popular content could suppress some good-faith political discussion, a prospect that makes some Facebook employees uneasy, some of the people said.

Facebook spokesman Andy Stone told the Journal that the company has “spent years building for safer, more secure elections,” and that its strategy is based on “lessons from previous elections, hired experts, and new teams with experience across different areas to prepare for various scenarios.”

Hang on. Why is it Facebook’s job to “build” what it determines to be “safer, more secure elections”? If Facebook is the arbiter of “safe and secure elections,” as it sees fit, who is the arbiter of Facebook? Mark Zuckerberg? Gimme a break.

In a September 3 Facebook post, Zuckerberg wrote:

“[W]ith Covid-19 affecting communities across the country, I’m concerned about the challenges people could face when voting. I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country.

“This election is not going to be business as usual. We all have a responsibility to protect our democracy. That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.

“Today, we’re announcing additional steps we’re taking at Facebook to encourage voting, connect people with authoritative information, and fight misinformation. These changes reflect what we’ve learned from our elections work over the past four years and the conversations we’ve had with voting rights experts and our civil rights auditors.”

Among those “additional steps”:

  • We’re going to block new political and issue ads during the final week of the campaign. It’s important that campaigns can run get out the vote campaigns, and I generally believe the best antidote to bad speech is more speech, but in the final days of an election, there may not be enough time to contest new claims.
  • We’re reducing the risk of misinformation and harmful content going viral by limiting forwarding on Messenger. You’ll still be able to share information about the election, but we’ll limit the number of chats you can forward a message to at one time.
  • We will attach an informational label to content that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods, for example, by claiming that lawful methods of voting will lead to fraud.
  • We’ll enforce our violence and harm policies more broadly by expanding our definition of high-risk people to include election officials in order to help prevent any attempts to pressure or harm them, especially while they’re fulfilling their critical obligations to oversee the vote counting.
  • We’ve already strengthened our enforcement against militias, conspiracy networks like QAnon, and other groups that could be used to organize violence or civil unrest in the period after the elections.

In conclusion, Zuck said:

“I believe our democracy is strong enough to withstand this challenge and deliver a free and fair election — even if it takes time for every vote to be counted. […] We can do this. […] We all have a part to play in making sure that the democratic process works, and that every voter can make their voice heard where it matters most — at the ballot box.”

And, as reported by Fox Business in reference to Facebook’s implementation of its latest shiny new “tools” to help ensure “a free and fair election,” Zuckerberg said:

“Once we’re past these events and we’ve resolved them peacefully, I wouldn’t expect that we continue to adopt a lot more policies that are restricting of a lot more content.”

Therein lies the debate raging in the wake of Facebook’s (and Twitter’s) suppression, as I said at the top, of the New York Post’s explosive reporting on the controversy surrounding Democrat presidential nominee Joe Biden and his gadfly son, Hunter.

It would be incorrect to suggest that Biden’s refusal to answer questions about the controversy isn’t buoyed by a combination of suppression by Facebook and Twitter and the “mainstream” media sock puppets flying wingman for him and the Democrats 24×7.

To suggest that Facebook’s Mark Zuckerberg and Twitter’s Jack Dorsey should be entrusted with “building a safer and more secure election” would be laughable.

And, of course, the larger issue is, who died and left Zuckerberg and Dorsey in charge of deciding which news Americans should have access to and which news is too “dangerous” or “inflammatory” for them to see? Who gave them the right to do that? Answer: Nobody. Drunk with power and laden with smug left-wing ideology, they usurped it for themselves.
