Electoral Dysfunction: The Anticipated Legal Implications for Social Media Beyond the 2020 Election

Posted on Sunday, October 4, 2020 in Blog Posts.

By Danielle Schaefer

Social media giants are eager to avoid a repeat of the 2016 election and the scrutiny that followed, and have since developed new techniques to identify and address misinformation posted on their platforms that may affect the outcome of the election. For example, Twitter updated its existing Civic Integrity Policy to increase focus on flagging or removing any information on the platform that is meant to “undermine public confidence in an election or other civic process.” Facebook also announced additional steps it would be taking to fight misinformation surrounding the election, including blocking new political ads on the site in the week leading up to the upcoming election.

Social media platforms are on especially high alert for misinformation as the country approaches the 2020 presidential election in the midst of a global pandemic. Twitter and Facebook’s updated measures were announced last month following Trump’s repeated urging for North Carolina residents to attempt to vote twice to test the validity of the mail-in voting system. In response, the two platforms announced how they planned to address these posts, which they both say possibly violate their policies against encouraging voter fraud or other illegal conduct. More recently, platforms like Facebook, Twitter, and YouTube had to kick efforts into high gear to mitigate the potential spread of misinformation surrounding President Trump’s coronavirus diagnosis mere weeks before election day.

Misinformation in the weeks leading up to the election, however, is not the only concern for social media platforms. With the ongoing COVID-19 pandemic, more people are likely to vote by mail, which could delay the results of the election. Platforms fear the civil unrest that might follow if one, or both, of the candidates declares victory prior to a full tally of the votes and uses their sites to announce these declarations. Social media platforms are seemingly feeling the weight of the responsibility that has accompanied their increased role and prominence in political discourse.

Social media platforms were able to reach the monumental levels of power and influence that we have seen over the past two presidential election cycles primarily due to Section 230 of the Communications Decency Act of 1996. Section 230 allows these platforms to moderate what appears on their sites, even content that is not illegal, while shielding them from liability for the content of their users’ posts. While some insist this provision was essential to allowing social media platforms to exist in the first place, others argue that it has ultimately allowed platforms’ power to expand too broadly and remain relatively unchecked.

In fact, government officials on both sides of the political aisle are now trying to check this power. Conservatives have been particularly vocal on this issue, claiming that their political views are being over-moderated by biased social media platforms like Facebook and Twitter. In May, Trump issued an executive order calling for a change in the immunity protections under Section 230, claiming that platforms engage in “selective censorship” that promotes a clear political bias. Just last month, the Justice Department sent draft legislation to Congress that, if enacted, would limit the circumstances under which platforms could moderate political speech without facing legal liability. Democratic presidential nominee Joe Biden has also remarked that the provision should be revoked.

However, proposed legislation on this issue is unlikely to move forward quickly during an election cycle, and any legislation that does manage to eventually pass congressional muster is sure to face constitutional challenges in court. First Amendment challenges from both moderated users and the platforms themselves may arise if the federal government passes legislation that effectively prohibits legal speech through delegation to social media platforms.

While there is growing agreement that Section 230 needs to be changed, there remain stark divisions over what the proper changes would be. There may be alternative methods that Congress can use that instead focus on encouraging platform diversity rather than outright restrictions regarding platform speech, but the path forward for such legislation is equally unclear. In the meantime, social media platforms will be free to moderate their users’ speech, including the president’s, as they see fit.

Danielle Schaefer is a 2L from North Carolina. She plans to practice law in New York after graduation to be closer to her family (and her dog). When she manages to find free time, Danielle enjoys binging Netflix shows and perfecting her macaroni and cheese recipe.
