
Book Review - Custodians of the Internet

By Christian Wickert

Some technology books age well, and Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media has already achieved this status. As the cover suggests, content moderation is a hot topic, and it gained momentum in 2020 with Congressional hearings as U.S. policymakers discussed pandemic misinformation, censorship, and social media's influence on the last two presidential elections. Tarleton Gillespie sheds light on the crucial elements behind content moderation and on potential improvements to social media platforms, offering a detailed and objective analysis that predates the current political arguments.

The author has a Ph.D. in Communication and writes with authority on the topic. As a Principal Researcher at Microsoft Research and an Associate Professor at Cornell University, he combines business and academic knowledge and approaches.

The scope of the book is broad. The author defines platforms as online sites and services that (i) host users' content, (ii) do not produce that content, and (iii) use internet infrastructure (i.e., hardware, transmission media, and software) to process data, serve customers, and sell advertising. He also argues that they should (iv) moderate content through detection, review, and enforcement. The author includes a wide range of companies in this definition, including social networks (e.g., Facebook, LinkedIn), blogging platforms (e.g., Blogger, WordPress, Twitter), photo-sharing services (e.g., Instagram, Pinterest, Snapchat), video-sharing platforms (e.g., YouTube, Vimeo), discussion tools (e.g., Reddit, Digg), dating apps (e.g., Tinder, Grindr), collaborative knowledge sites (e.g., Wikipedia, Quora), app stores (e.g., iTunes, Google Play), live-broadcasting sites (e.g., Periscope), recommendation sites (e.g., TripAdvisor, Yelp), exchange platforms (e.g., Etsy), video game worlds (e.g., Minecraft, Second Life), and even search engines (e.g., Google, Bing, Yahoo).

The central argument of the book is that content moderation is an integral and essential element of platforms. Platforms must moderate to protect users from one another (or from organized groups), to remove illegal content, and to curate and present themselves to new users and advertisers. Moderation is "essential, constitutional, definitional" to what platforms do, and acknowledging its inherent importance would change the way social media platforms operate and how they are regulated. The book relies on an extensive list of sources for each chapter and assertion, which adds much-needed substance to the debate.

In the first chapter, the book jumps right into the main argument, showing that all platforms moderate content, if only by removing spam. Platforms operate between the pillars of free speech and protecting the community. Underneath these objectives lies an economic goal: keeping people engaged, interacting, and on the platform for as long as possible in order to increase advertising revenues and gather more data and insights into consumer behavior. It is a delicate balance: too little moderation breeds a toxic environment, while too much drives users away through intrusion and censorship. Users therefore hold a limited form of power over the platforms, since their behavior influences the level of moderation a platform can apply. These dynamics of behavior and reaction between users and platforms end up shaping the public discourse, and platforms are inherently responsible for all of it.

Next, Gillespie debunks the myth that platforms are, or should be, neutral. Anybody interested in social media regulation should read this second chapter, as it tells the history of Section 230 and its implications for the current discussions around content moderation, misinformation, and censorship. The first regulations of internet intermediaries in the U.S. focused on copyright, then moved to decency and pornography, and finally arrived at Section 230 of the Communications Decency Act, the safe harbor for intermediaries. "Safe harbor" is a common term for protection from certain legal liability. In this case, Section 230 shields internet intermediaries in two ways. First, intermediaries are not responsible for the speech of their users. Second, intermediaries may police the speech of their users without losing their safe harbor status. In other words, they cannot be sued for what appears on their sites, and they may legally regulate content.

In fluid prose, Gillespie argues that platforms are nonetheless outgrowing the safe harbor provision. The use of social media by terrorist groups for online propaganda and by extremists to spread hate speech has increased users' and governments' desire to hold platforms more accountable. Gillespie calls on us to rethink liability for social media platforms, for example by amending Section 230 to set minimum obligations for content moderation. Otherwise, social media platforms will keep pushing the boundaries of their right to moderate speech without taking any responsibility for the effects of their actions.

The book then explains that platforms present their rules to users in two documents: a legal one, the terms of service, and one in lay language that spells out what is accepted or prohibited under the platform's principles. The author explores how platforms struggle with the actual application of those guidelines. Social media platforms have relied on three strategies to tackle unwanted content: editorial review, community flagging, and automatic detection. The latter uses advanced software to identify mainly pornography and hate speech, as it is most effective when one already knows what to look for. Community flagging can be weaponized for political and social ends through organized flagging campaigns. And editorial review is resource-intensive and difficult to scale. These solutions are far from perfect, though artificial intelligence has the potential to make automatic detection scale.

The author takes a didactic approach, explaining each concept and thoroughly demonstrating the immense challenge of implementing these solutions. Content moderation still requires intensive human labor. That labor is organized in a pyramid of multiple layers, and as Gillespie notes, the larger social media companies employ thousands of moderators. In smaller companies, of course, the process remains more artisanal. As an extreme example, the author highlights that "the Chinese government employs hundreds of thousands of people to scour social media for political criticism." The book also shows that in 2009 Facebook had 150 moderators (out of 850 employees); that number grew to 4,500 in 2017, with promises of an additional 3,000 after the criticism over murders and suicides broadcast on Facebook Live. As of April 2021, there may be as many as 15,000 moderators at Facebook, the majority outsourced to third-party companies.

The book then explores a practical case, diving into Facebook's decade-long struggle with moderating nudity and its overlap with breastfeeding images. The opacity of the rules and processes generated strong reactions from affected mothers and activist groups. The example then leads to a healthy debate on filtering and removing content and on the impact on affected users. There is a crucial duality in the way platforms work: on one side, they remove or filter content; on the other, they promote, recommend, or highlight it. These dual criteria define the entire experience of the platform.

In the last chapter, Gillespie suggests a possible roadmap for improving the moderation process. It includes making moderation practices transparent and sharing more data about those decisions. The author also suggests interoperability of user data (profiles and preferences) among platforms, so that users can quickly quit a platform whose moderation proves unsatisfactory. Moreover, the author asks for greater diversity, in both teams and ideas, to improve moderation. Platforms distinguish themselves from the open web precisely because they moderate, recommend, and curate content. Yet, judging by the negative reactions from users, policymakers, and the media, Gillespie concludes that those differentiating levers are out of tune and do not deliver the expected results. Recognizing the central role of moderation in platforms should reshape not only their internal processes but also the external pressure from society and policymakers. The author also calls on all platform users to exercise authority and collaborate. Finally, platforms should have a public obligation to compensate society for their safe harbor benefit, a tangible next step and a fundamental concept to take from the book.

The book's conversational language makes complex concepts very easy to understand. Nevertheless, it is surprising that the author does not explore political or religious discussions, which are even more complex and ambiguous zones for companies to moderate. Furthermore, Gillespie states that it is good to see platforms converging on what could become best practices, but he omits the downside of all major platforms potentially banning the same points of view. It would be fantastic if Gillespie explored these topics in a sequel.

Another aspect the author could have explored is the tension between content moderation and platforms' business goals; the book does not address advertising, filter bubbles, promotion, or manipulation. Lastly, the author could have examined ethical conflicts such as platforms investing in lobbying even as policymakers are obliged to use those same platforms to campaign.

The book is delightful to read: a masterpiece balancing strong research, complex topics, and possible solutions. The reader will absorb the foundational elements that shape platforms and public discourse in a politically unbiased way, which is rare and deserves immense appreciation. Understanding that platforms do moderate, and have to, will transform the lenses through which practitioners read and think in this field. The book pairs well with Speech Police on the governance of platforms; for another perspective, one could also approach the topic through a business lens with The Business of Platforms. But start with Gillespie's book.

About the Author: 

Christian Wickert is a current graduate student in the Master's in International Service program. An engineer, MBA, and expert in strategic planning, Christian Wickert started his career as a software programmer in a garage in Sao Paulo, Brazil. He then worked in a range of industries, from treasury risk management to strategy consulting, from telecommunications to an NGO active in digital education and sustainability. He then focused on regulation, negotiating with governments and ministries, which led him to join the businesses of Merck KGaA, Darmstadt, Germany in 2015, where he works in Corporate Affairs and Policy, based in Washington, DC. Christian has also delivered a TED talk on how fiction can help us better understand our reality.

*THE VIEWS EXPRESSED HERE ARE STRICTLY THOSE OF THE AUTHOR AND DO NOT NECESSARILY REPRESENT THOSE OF THE CENTER OR ANY OTHER PERSON OR ENTITY AT AMERICAN UNIVERSITY.
