Alina Utrata

Now that the 2020 US presidential election has concluded, the post-mortem evaluation of how well social media platforms performed will begin. Since the content moderation debate has mostly focused on platforms’ willingness or unwillingness to remove content or accounts, the post-election coverage will almost inevitably center on who and what was removed or labeled.

In 2019, the New York Times published a feature story about individuals whose Facebook accounts had been suspended—possibly because they had been misidentified as fake accounts in a general security sweep. These users cannot know for certain: they were told only that their accounts had been disabled because of “suspicious activity.” The appeal process for restoring suspended Facebook accounts is not a transparent one, and cases frequently drag on for extended periods with no resolution. (Facebook, as the article documents, has quite sophisticated techniques for catching individuals who attempt to create multiple accounts, foreclosing the workaround of a “second Facebook account” for suspended users.)

Facebook CEO Mark Zuckerberg has frequently said that he does not want the platform to become the “arbiter of free speech.” Free speech, however, is a constraint that applies only to governments. As the existence of these suspended accounts shows, in reality Facebook can limit speech or ban users for almost any reason it cares to put in its terms of service. It is a private corporation, not a government.

The problem of the exclusionary policies of private corporations might be less acute in a competitive marketplace. For example, it could be inconvenient if a personal feud with your local corner store leads to your being banned from the shop; but it is always possible to buy milk from another store down the road. It is different, of course, if you happen to live in Lawton, Oklahoma—or in one of the hundreds of communities across the US where Walmart is the dominant monopoly, capturing 70% or more of the grocery market. Being banned from Walmart (whether for riding its electric scooters while intoxicated or for violating its policy against carrying guns) might be far more significant for your life and livelihood.

Facebook’s form of monopoly power means that being banned from the platform can have significant consequences for individuals’ lives: the loss of data hosted on the platform (like photos or old messages), of the ability to use Messenger to connect with friends and family, or of access to professional or social groups organized only on Facebook. Some people depend on Facebook for their livelihoods, communicating with customers or selling on Facebook Marketplace; or for political campaigns, reaching out to voters in a run for local city council, for example. The same dynamics hold for other digital monopolies, like Amazon. The recent House Judiciary report found that Amazon can, and often does, arbitrarily lower third-party sellers’ products in its search rankings, lengthen their shipping times, or kick them off the site entirely. About 850,000 businesses, or 37% of third-party sellers on Amazon, rely on Amazon as their sole source of income. Monopolies can be, as Zephyr Teachout argues, a form of economic tyranny.

There are two general approaches floated for remedying this monopoly power. The first is to “break them up.” Facebook’s or Amazon’s policies might matter less if there were many e-commerce or social networking sites in town—and perhaps their policies would improve if they had to compete with other platforms for users or sellers. On the other hand, Facebook might argue that the value of a social networking site lies precisely in its consolidation. As the sudden surge in popularity of the app Parler may soon demonstrate, there is very little point in being on a social networking site if the people you want to reach aren’t there too. Alternative social networking sites may simply be complementary, rather than competitive. Similarly, Amazon might argue that it is convenient, and beneficial, to both consumers and sellers that e-commerce is located all in one place. Instead of searching online (using another monopoly, Google) through hundreds of webpages with no guide as to quality, you can go to one portal at Amazon and find exactly what you want.

A second approach to tackling monopolies is regulation. For example, the state can and does get involved if a private corporation excludes you on the basis of a protected identity, such as race or sexual orientation. US Senator Elizabeth Warren’s call for Amazon, Apple or Google to choose whether they want to be “market platforms” or “market participants” is another example of the state’s attempt to impose regulations to make these monopolies fairer. The state also intervenes on questions of product safety. For example, the Forum on Information and Democracy just published a report recommending principles for regulating quality standards on digital platforms, in the same way that governments might require standards for food or medicine sold on the market. In this approach, the state imposes limits or controls on corporations to try to curb or reform their power over consumers. However, this approach requires active government enforcement and involvement. As the House Judiciary report documented, even though they are equipped with antitrust laws, many US regulatory agencies have been slow or unwilling to take on the Big Tech monopolies. Corporations also point out that government involvement can stifle innovation and entrepreneurship.

However, there might be a third approach: democratization. Mark Zuckerberg has said that, “in a lot of ways Facebook is more like a government than a traditional company.” If that is the case, then it has been a long time since the United States tolerated a government with the kind of absolute power Mark Zuckerberg exerts over Facebook (as CEO and founder, Zuckerberg retains majority voting power). So could we democratize Facebook, and make it a company ruled by the consent of the governed rather than the fiat of a king? Could Facebook users appoint representatives to a “Constitutional Convention” to draft Facebook’s terms of service, or adopt a Bill of Rights to guide design and algorithmic principles? Facebook’s Oversight Board has already been compared to a Supreme Court, so why not add a legislative branch too? Could we hold elections for representatives to a Facebook legislature, which would pass “laws” about how the online community should be governed? (A Facebook legislature would arguably be more effective than the referendum process Facebook tried the last time it experimented with democratization.)

Crucially, however, any democratization process would have to be coupled with genuine democratic reform of Facebook’s corporate governance: a Facebook Parliament in name only wouldn’t achieve much if Mark Zuckerberg retained absolute control of the company. True democratization would require a change not just in who we think represents Facebook, but in who owns Facebook—or, rather, in who ought to own Facebook. Mark Zuckerberg? Or we, its users? If the answer is Mark Zuckerberg, a Facebook account will always be a privilege, not a right.