As you can read anywhere on the Internet, Facebook is in trouble. Vice President of Global Affairs Nick Clegg has been on the talk show circuit, saying things. I am today specifically interested in one statement Clegg made to NBC:
“We’re not saying this is a substitution of our own responsibilities, but there are a whole bunch of things that only regulators and lawmakers can do. I don’t think anyone wants a private company to adjudicate on these difficult trade-offs between free expression on one hand and moderating or removing content on the other. Only lawmakers can create a digital regulator … we make the best judgment we possibly can but we’re caught in the middle. Lawmakers have to resolve that themselves.”

Nick Clegg
Essentially, Clegg is pleading that no one wants Facebook to make decisions about its own product, and that the government should make those decisions for Facebook. This is a blatant and transparent blame-shifting tactic, but we’ll take his argument in good faith for the purposes of this blog post.
First, Clegg is wrong on the merits of the argument. There are people who want private companies to adjudicate these trade-offs. Most people want every company to make decisions about its products. Users want their opinions taken into consideration and implemented, but usually (setting open source aside for the moment) they don’t want to make those decisions themselves. In the user’s estimation, the company should make product decisions that the user likes. This is a highly generic statement, but even at this level of generality, Clegg is wrong.
In the specifics of the argument, Clegg is right. Most people don’t want Facebook making those trade-offs, because Facebook has been abjectly terrible at making good decisions on these fronts in the past. Some platforms make content decisions decisively and transparently, like Pinterest. So it’s not that content moderation decisions are impossible, it’s that there’s something particular about Facebook that makes it untenable for Facebook specifically to make content moderation judgments effectively.
Similarly, Clegg’s argument that “only lawmakers can create a digital regulator” is narrowly true but misleading. Companies can and do regulate content on their platforms all the time. Pinterest eliminated anti-vaccination content, Tumblr eliminated pornography after having allowed it for many years, and Ravelry banned discussion of Donald Trump. Clegg is dancing around a truth: Facebook is terrible at regulating itself and thus needs someone else to make rules for it. But other companies regulate themselves better than Facebook does. Meta (the overall org) needs rules because (again) there is something particular about Facebook that makes it untenable for Facebook specifically to make content moderation judgments effectively.
Here is the something particular: Facebook is too big. No one could possibly do the moderation it needs to do. The scale is too great.
Thus, I am sympathetic to Facebook’s argument that it needs rules. I would like to give it rules! Here is the first one: Facebook is to be broken up into regions, the way the settlement of United States v. AT&T (1982) turned the AT&T monopoly into the Regional Bells (the “Baby Bells”). I imagine that 16 regions would currently be sufficient:
- West Coast America, Alaska, and Hawai’i;
- Mountain America;
- Midwest America (includes the Great Lakes States);
- American South;
- American East Coast and New England;
- South America;
- Mexico, Central America, and Caribbean;
- the UK;
- Western Europe;
- Eastern Europe (includes Russia);
- North/Saharan Africa;
- Sub-Saharan Africa (with a note that they will need to break up into more parts if/when FB grows its share there; there are many ways that this could be divided, all of which should be up to people in Africa);
- Northern/Eastern Asia (unless China allows in FB, and then we need a lot more Asias);
- Southern Asia; and
I wish we could get away with having fewer American regions, but the America-wide monopoly of AT&T was broken up into more regional Bells than I offered up here. I personally would have to maintain accounts on at least three and perhaps many more of these, so I know I am suggesting something onerous for transregional, transnational folks. I also support the very radical idea of each country having its own Facebook (or choosing to not have one), which would be an enormous departure from the current situation and even more onerous for transregional/transnational folks. So I’m not proposing that now, but I reserve the right to propose it in the future.
Outside of the United States, these regions are not particularly culturally sensitive. Deriving culturally sensitive units for communal discussion is necessary, but I think creating culturally sensitive spaces within the regional units would be more possible than it is currently, with all 3 billion users dropped into a one-policy bucket. Companies overseeing smaller groups of people can be more attuned to individual groups of users than current-Facebook can.
Accordingly, these 16 regions would all be their own companies, each with its own CEO, board of directors, and oversight board. Zuckerberg would get to pick one to be in charge of (if he manages to stay CEO of Meta long enough to see his company be broken up).
Breaking up Facebook’s 3 billion users into 16 chunks would not solve all of its problems. It may not even directly solve any of its problems. But Clegg said it himself: no one wants Facebook to make content moderation decisions, and Facebook wants the government to make those decisions for it. Sounds good! To make headway on content moderation, we’re going to have to do some breaking up. Once the content to moderate is present in much smaller amounts, then we can figure out how each of the regions wants to do regionally appropriate, culturally aware moderation.