By Thomas Long, J.D.
The proposals would mandate transparency in content moderation by online service providers to encourage open discourse, while also carving certain categories of user-posted content out of Section 230's immunity protections.
Draft legislation sent to Congress today by the Department of Justice, on behalf of the Trump Administration, would make changes to the immunity protections afforded to online platforms by Section 230 of the Communications Decency Act. According to a Justice Department media release, the proposals are meant to promote transparency and open discourse and to ensure that platforms are fairer to the public when removing lawful speech from their services. The proposals—which include carve-outs from immunity for content involving child abuse, terrorism, and cyberstalking, as well as greater transparency about content moderation—reflect a set of recommendations issued by the Department in June after a yearlong review of Section 230. The legislation would implement a recent executive order by President Trump aimed at limiting companies’ ability to claim Section 230’s liability shield for third-party content if they remove or limit access to content.
Among the provisions of Section 230 is protection from civil liability for third-party content when an interactive computer service provider or user voluntarily acts in good faith to restrict access to or availability of material it considers to be "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." Some in government and the private sector have accused interactive computer services of misusing Section 230 to facilitate the willful distribution of illegal material.
In addition, Republicans have accused tech giants, such as Alphabet, Facebook, Squarespace, and Twitter, of moderating content in bad faith in order to exclude conservative viewpoints from the platforms. Senator Mike Lee (R., Utah), chair of the Senate Judiciary Committee’s antitrust subcommittee and a member of the Senate Commerce Committee, sent letters to the CEOs of Alphabet, Facebook, Twitter, and Squarespace expressing concern over "corporations wielding their power unilaterally to silence opinions they dislike, and thus warp the public debates their platforms present to the American people." The platforms have denied such conduct.
The proposed legislation drafted by the Department focuses on two areas of reform, both of which, according to the Department, "are, at minimum, necessary to recalibrate the outdated immunity of Section 230." A redlined version of Section 230 showing the Department’s proposed changes is available here. A section-by-section breakdown is available here.
Promoting transparency and open discourse. First, the legislation attempts to promote transparency and open discourse and ensure that platforms are fairer to the public when removing lawful speech from their services, the Justice Department said.
"The current interpretations of Section 230 have enabled online platforms to hide behind the immunity to censor lawful speech in bad faith and is inconsistent with their own terms of service," according to the Department. "To remedy this, the department’s legislative proposal revises and clarifies the existing language of Section 230 and replaces vague terms that may be used to shield arbitrary content moderation decisions with more concrete language that gives greater guidance to platforms, users, and courts."
The proposal also adds language to the definition of "information content provider" to clarify when platforms should be responsible for speech that they affirmatively and substantively contribute to or modify. The added language says, "Being responsible in whole or in part for the creation or development of information includes, but is not limited to, instances in which a person or entity solicits comments upon, funds, or affirmatively and substantively contributes to, modifies, or alters information provided by another person or entity."
Addressing illicit activity online. Other proposed amendments would seek to incentivize platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation claims.
"Section 230 immunity is meant to incentivize and protect online Good Samaritans," said the Department. "Platforms that purposely solicit and facilitate harmful criminal activity—in effect, online Bad Samaritans—should not receive the benefit of this immunity." The Department also said that platforms should not receive blanket immunity when they continue to host known criminal content even after receiving repeated pleas from victims to take action.
The proposed legislation would more clearly carve out federal civil enforcement actions from Section 230. Federal criminal prosecutions have always been outside the scope of Section 230 immunity; in the Department’s view, the growing problem of online crime warrants removing any roadblock that prevents the federal government from pursuing civil enforcement.
Finally, the proposal would carve out categories of civil claims that are "far outside" Section 230’s core objective, the Department said. These categories include offenses involving child sexual abuse, terrorism, and cyberstalking.
Stakeholder reactions. Internet Association Deputy General Counsel Elizabeth Banker criticized the draft legislation. "The DOJ’s proposal would severely limit people’s ability to express themselves and have a safe experience online," she said. "Small community listservs, religious forums, and anyone else who hosts and moderates online content would face new legal restrictions and requirements on every content decision. Current good-faith moderation efforts that remove things like misinformation, platform manipulation, and cyberbullying would all result in lawsuits under this proposal."
Free Press Action Senior Policy Counsel Gaurav Laroia said, "Section 230 enables websites and users, regardless of their size, to tend to their own gardens and set standards for the kinds of discourse they allow without having the government improperly peering over their shoulders to determine whether their attempts to moderate lawful but harmful third-party content are right or wrong." He continued, "Members of Congress would be wise to follow the sound logic of their predecessors who created Section 230. A regime of government-mandated speech completely undermines efforts to promote free speech and a diversity of views online."
Computer & Communications Industry Association president Matt Schruers said, "Amid a pandemic and an election, undermining the tools social media companies use to respond to problematic content like disinformation is more dangerous than ever. The U.S. Government should be enabling efforts to address nefarious content and behavior, not hamstringing them in misguided pursuit of political gain."
Other legislation. On September 8, Senate Commerce, Science, and Transportation Committee Chairman Roger Wicker (R-Miss.), Senate Judiciary Committee Chairman Lindsey Graham (R-S.C.), and Sen. Marsha Blackburn (R-Tenn.) introduced a Section 230 reform measure. Senate Bill 4534, titled the "Online Freedom and Viewpoint Diversity Act," would—like the Justice Department’s proposal—remove the phrase "otherwise objectionable" from the list of content categories whose good-faith moderation does not disqualify a platform provider from liability immunity, replacing it with concrete terms such as content "promoting terrorism," content determined to be "unlawful," and content promoting "self-harm." It would also condition the content moderation liability shield on an objective reasonableness standard, permitting immunity only when a platform provider has an objectively reasonable belief that the content falls within a certain, specified category.
Representatives Doug Collins (R-Ga.) and Paul Gosar (R-Ariz.) introduced a reform bill in the House on July 28 (H.R. 7808). The proposed "Stop the Censorship Act" would replace the phrase "otherwise objectionable" with the phrase "unlawful, or that promotes violence or terrorism." The measure also would protect providers from civil liability for "any action taken to provide users with the option to restrict access to any other material, whether or not such material is constitutionally protected."
Antitrust Law Daily: Breaking legal news at your fingertips