For social media, it’s not Election Day. It’s judgment day.
All the preparations of the past four years to protect the election have come down to this. Facebook, Twitter and Google-owned YouTube are pulling on their battle fatigues. They will be on high alert Tuesday and in coming days for any effort to destabilize the election or delegitimize the results.
This heightened state is the result of the 2016 presidential election, when the major online platforms were caught with their guard down as Russians inflamed the electorate with divisive messages while falsehoods and hoaxes ran rampant.
This time, the stakes are even higher. The full-blown partisan warfare of presidential campaigning that has ripped through social media in recent months may lead to an unprecedented torrent of misinformation, voter suppression efforts and even fomenting of violence, observers say.
On Election Eve, Facebook and Twitter posted a warning label on a President Donald Trump post. Twitter said his assertion that a recent Supreme Court decision could lead to problems and even violence in the election in Pennsylvania was misleading, and Twitter users were prevented from liking or replying to the tweet.
Facebook also fact-checked the president, with a label that says voting fraud is “extremely rare.” Under a policy announced in September, Facebook applies the label to posts that seek to delegitimize the outcome of the election or dispute the legitimacy of voting methods.
“Platforms know this is a referendum on their futures and how they’ll be regulated. The public is aware of the risks and social media companies know they need to demonstrate they are trying,” Jennifer Grygiel, a communications professor at Syracuse University who studies social media, told USA TODAY.
“This election,” they said, “is about as safe as skydiving without a parachute.”
Facebook says it has invested billions of dollars and assigned more than 35,000 people to fight harmful content from “coordinated inauthentic behavior” (accounts that work together to spread misinformation) to foreign interference to election-related misinformation.
“Our Election Operations Center will continue monitoring a range of issues in real time, including reports of voter suppression content. If we see attempts to suppress participation, intimidate voters, or organize to do so, the content will be removed,” the company said Monday night.
The team staffing that center will also track other issues, such as the swarming of Joe Biden campaign buses over the weekend, Facebook said.
“We are monitoring closely and will remove content calling for coordinated harm or interference with anyone’s ability to vote,” the company said.
If a presidential candidate or party declares premature victory before the race is called by major media outlets, Facebook said it will add labels on candidates’ posts and will put a notification at the top of News Feed to alert voters that no winner has been projected. It will also continue to show updated and accurate information in the Voting Information Center.
After polls close, Facebook will run a notification at the top of Facebook and Instagram and label voting-related posts from everyone, including politicians, with a link to its Voting Information Center giving the latest state-by-state results for president, the Senate, and the House.
A voting alerts tool will allow state and local election authorities to reach constituents with notifications on Facebook. The voting alerts will also appear in the Voting Information Center.
Facebook has readied “break-glass” tools to deploy if election-related violence erupts. These tools slow the spread of inflammatory posts.
Facebook stopped accepting new political ads a week before the election and plans to suspend all political ads on Facebook and Instagram after the polls close Tuesday.
It has also temporarily limited popular features, turning off political and social group recommendations, removing a feature in Instagram hashtag pages and restricting the forwarding of messages in its WhatsApp messaging app.
Before the election, Twitter made aggressive changes to curb misinformation by labeling tweets on mail-in voting and COVID-19, even from prominent political figures including President Trump.
On Election Day, the company says it will take action against any tweet that claims victory before the election is called by state election officials or projected by authoritative national news outlets.
Tweets that include premature claims of victory will be labeled and will direct users to Twitter’s election page which contains credible election information. The company says it may add a warning label or remove tweets that incite people to interfere with the election or encourage violence.
Twitter said it would prioritize labeling tweets about the presidential race and any other hotly contested races “where there may be significant issues with misleading information.”
When people attempt to retweet a tweet with a misleading information label, they’ll see a prompt pointing them to credible information before they are able to amplify it.
During the election and at least through election week, Twitter will try to slow down the spread of misinformation by encouraging users to add their own commentary before amplifying content (in other words prompting them to “Quote Tweet” instead of retweeting someone else’s post).
You will have to tap through a warning to see tweets with misleading information from U.S. political figures, U.S.-based accounts with more than 100,000 followers, or accounts that get a lot of engagement, Twitter says. You will not be able to retweet those posts, but you can quote tweet them. Likes, retweets and replies will also be turned off.
In September, Twitter launched an election hub to make it easier for users to find accurate voting and election information. It also banned political ads before the election. Its stand: political reach “should be earned, not bought.”
On Election Day, YouTube, the popular video platform owned by Google, says it will prominently display election results in an information panel at the top of search results for a wide range of queries related to the election and under videos that discuss the election.
The panel will warn users that the results may not be final and will link to a Google election page which will track results in real time based on data from The Associated Press.
As polls close, YouTube will also point users to live streams of election night coverage from authoritative news outlets. YouTube said it would “elevate” these outlets in election-related news and information queries in search results and “watch next” panels, too.
YouTube says it does not allow videos that mislead voters or that encourage people to interfere in the election and will quickly take down any violating videos.
Google says it will temporarily suspend election ads on Google and YouTube after the polls close.