Technology and the Private Sector: A Breakdown of Net Neutrality, Antitrust Laws, and the 1st Amendment

Amazon’s Suspension of Parler in 2021

“AWS provides technology and services to customers across the political spectrum…” So began an email (Parler Suspension, 2021) Amazon sent shortly before abruptly suspending Parler (an openly conservative social media site) from its services, on the allegation that Parler had not sufficiently moderated content posted on its platform. This action, along with others such as Twitter’s removal of the U.S. President from its service, has raised questions about the rights of social media companies, and private-sector companies in particular, when it comes to content moderation. Some suggest that regulating user content based on political standards, or any standards at all, violates the First Amendment and is accordingly illegal. Others believe that private companies have not only the right to moderate their own content but the obligation, and that failure to do so should leave a company liable for any resulting damages.

First Amendment’s Application to Private Companies

The First Amendment of the United States Constitution reads as follows: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.” (United States Government, n.d.). With regard to free speech, this amendment applies only to laws enacted by the government; it has no jurisdiction over the affairs of private corporations. Any private company that allows users to post content, whether written, graphic, or otherwise, has its own terms of service stating how it treats user content. If those terms specify that the company reserves the right to remove user content as it sees fit, as many do, that provision binds every user of the service, since users inherently agree to the terms upon signing up and upon continued use. As a result, in most cases a user’s objection to a company’s removal or other treatment of their content is invalidated: the user provided the equivalent of a digital signature agreeing to the platform’s terms of service, even if they neglected to read those terms. In addition to being exempt from the free speech clause of the First Amendment, private companies are, under Section 230 of the Communications Decency Act, explicitly granted the right to moderate the user content uploaded to their platforms (Puckett, 2021).

Rights of Content Moderation

Although the private sector is not beholden to the First Amendment, some argue that the right to content moderation should not be extended to private companies hosting user interaction, at least not to the degree that it currently is. As UNL law professor Eric Berger puts it, “Just because the first amendment doesn’t apply to private actors, does not necessarily mean there might not be problems with these companies having actors decide what speech to allow and what speech not to allow.” (Austin, 2021). There have been numerous arguments over whether private communications companies (those involved in social media, advertising, search engines, messaging, and publishing, including Facebook, Twitter, Google, Apple, Microsoft, and Amazon) should be allowed to limit, remove, rank, or otherwise manipulate the order or accessibility of one kind of information over another, or of the same information from different sources. Beyond the direct question of free speech and the First Amendment, these arguments tie into the heavily controversial topics of net neutrality, conflicts of interest, and the overall role of the private sector in communications, including how much power it can and should have.

Proposed Antitrust Restrictions on Google

In response to the potential for private-sector entities to gain too much control and influence, along with the issues surrounding user-content moderation, Congress has passed laws to regulate each, known respectively as the antitrust laws and Section 230 (Federal Trade Commission, n.d.; Department of Justice, 2020). According to the Federal Trade Commission, the antitrust laws, implemented with the intention of limiting the power of individual private entities, specifically prevent monopolization, certain restraints of trade, unfair methods of competition, and unfair or deceptive acts or practices (Federal Trade Commission, n.d.). These restrictions cover attempts by multiple companies to fix prices or otherwise hinder healthy competition, as well as attempts by a single company to gain complete or near-complete control over a market, whether through direct monopolization or through dominance of one field that yields an advantage in others. A well-illustrated example of this concept can be seen in recent allegations against Google regarding its alleged preservation of a monopoly in the online sphere. According to eMarketer (Nieva, 2019), Google controls nearly 75% of the online advertising market and has been accused of using that advantage to promote its own products and services, and those of its affiliates, in the results of Google Search, Shopping, Travel, Maps, and the like, along with advertisements through Google AdSense. These and other allegations have raised concern that large companies are deliberately using such advantages to eradicate competition and preserve their control over the market, which led to a recent investigation of Google, as well as of Facebook, Apple, and Amazon, by 50 U.S. attorneys general to determine whether such actions violate current antitrust laws (Nieva, 2019). Similar concern has been raised over Facebook’s acquisitions of Instagram and WhatsApp, and over eBay’s acquisition of PayPal. Still, these potential antitrust violations, though seen by many as harmful to competition, are not viewed the same way by everyone. Zachary Karabell of WIRED, for example, writes: “…competition… is not a virtue or need in and of itself. It is the means to a set of ends—namely, economic liberty, unfettered trade, lower prices, and better services for consumers. 
By itself, competition does not guarantee anything.” (Karabell, 2021). People like Karabell may argue that breaking up the larger companies, a solution sought by many who are displeased with their current state, would actually be counterproductive: the resulting firms would be forced to spend more on advertising and other competitive initiatives rather than keeping most of their focus on the quality of the services they provide, and they would no longer have the staff, finances, and other resources they currently possess to ensure that quality. For these reasons and others, the breakup of these companies remains widely disagreed upon.

Section 230’s Implications on the Private Sector

While the question remains of how much longer some of these companies will continue in their current state, a question that depends on many factors including the views of the incoming presidential administration, there is also tension over what will happen with Section 230 of the Communications Decency Act. This protection currently limits the liability of the platforms through which user content is uploaded and distributed, specifically through the following clause: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” (Title 47, n.d.). While many companies have relied heavily on Section 230, a number of people wish to eliminate or amend the protection. President Joe Biden, for example, said the following in an interview with the New York Times: “…Section 230 should be revoked, immediately should be revoked, number one. For Zuckerberg [referring to Facebook] and other platforms.” (New York Times, 2021). In addition, former President Donald Trump issued an executive order in May 2020 stating: “the immunity should not extend beyond its text and purpose to provide protection for those who purport to provide users a forum for free and open speech, but in reality use their power over a vital means of communication to engage in deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints.” (Dept. of Infrastructure and Technology, 2020). These quotes illustrate the willingness of modern U.S. presidents on both sides of the political spectrum to partially or entirely remove Section 230’s protection of social media and similar companies. The Democratic viewpoint tends toward forcing these companies to moderate their content more effectively to eliminate “falsehoods,” lest their protections be removed, while the Republican viewpoint tends toward removing the protection of companies that moderate content to excess (Guynn, 2021). In both cases, the measures are seen by many as political: although both sides threaten the same action (removing the protections of social media companies), that threat appears designed to create a mechanism of leverage within the federal government, one that would bend the private information industry toward compliance with the federal government, lest it lose Section 230 protection and become vulnerable to lawsuits over user content.

Alleged Overreach of Company Power

Few dispute that it was within Amazon’s rights to deny service to another platform, especially since its stated reason for doing so aligned with its terms of service. Still, the fact that Amazon, Facebook, and Google all took steps to dismantle a social media competitor has disturbed, and even outraged, many who believe the action was taken for political reasons, and that it was enabled by what some consider a monopoly: Amazon’s AWS web hosting, the Google Play Store, and the Apple App Store are nearly the only feasible means by which an online platform can gain traction. Moreover, the stated reasoning behind these actions, Parler’s failure to effectively moderate its users’ content (Shieber, 2021), has raised the question of whether these companies are taking on a systematic role of governing other companies, a role many feel they are not entitled to, especially given how few alternatives to their services exist.

Ramifications Going Forward

Altogether, these questions constitute an ongoing issue in the United States and in many other parts of the world. At stake are the extent to which the government regulates private companies’ content moderation practices, how much growth these companies are allowed, how much of a given sector they are permitted to control, and what conditions may be imposed as a result of such control, such as further regulations preventing conflicts of interest. Also at stake is the extent to which companies are allowed to dictate the affairs of those using their services (as in the case of Amazon and Parler), especially when those clients have little to no alternative and are therefore, in all practicality, accountable to that company to nearly the same extent that they are accountable to the government. While the government cannot constitutionally demand that social media companies moderate their content (attempting to do so would violate the First Amendment), and while it also cannot demand a complete lack of content moderation (per Section 230), it may be able to establish laws governing agreements between separate companies, specifically how one exercises privileged authority and leverage over another, especially when those companies are competitors or when that exertion of power works to the economic, political, or social benefit of the positionally advantaged party. Such laws could justifiably be categorized under the antitrust laws, as they would help preserve competition. Whether such laws will be enacted remains to be seen, and will be determined largely by events to come and, most importantly, by the effect that the actions described above, and others like them, have on the economy and the overall state of competition in the private sector.
