Legal and regulatory complicity in building up big tech
In Between Truth and Power, Julie Cohen situates the ongoing transformation of society in an age of informational capitalism. Drawing on Karl Polanyi’s The Great Transformation, she shows how these social and economic changes are enabled by the American legal system, with technology companies obtaining favorable court interpretations and even immunity from publisher liability by law (Section 230 of the Communications Decency Act) despite now being the dominant distributors of information and news.
The key enabler of this approach to policy and regulation is what Cohen terms “managerial neoliberalism”: the belief that markets should be left undisturbed, and that any policy or government action should serve their proper functioning. This perspective, which seems to have pervaded many of the regulatory bodies with the mandate and power to act, has produced a laissez-faire bias, case after case, in decisions that protect company profits and have come to characterize the era.
Let us first recount some of Cohen’s most salient accounts of what actually happened to bring us to the present day of informational capitalism. Given that discussion of Section 230 (which grants tech platforms immunity for hosted content) has been extensive, I will focus on three other key points that have shaped the contemporary landscape:
- The use and abuse of people for digital businesses
- The normalization of user terms of service and privacy policies as a way to evade liability
- Lastly, the problems of scale, which courts have largely sidestepped.
The enabling legal construct
Cohen describes an “enabling legal construct” that she names the biopolitical public domain, and it is worth explaining at length. It is “a source of raw materials that are there for the taking and that are framed as inputs to a particular type of productive activity.” (Cohen 48) The understanding of a user’s data as a resource available for extraction provides “appropriative privilege” for technology businesses. “It justifies the pervasive redesign of networked environments for data harvesting and positions the new data refineries as sites of legal privilege, it naturalizes practices of appropriation by information platforms and data brokers.” (Cohen 73)
As Cohen herself remarks, the private enclosure of a public domain is far from a new idea. It undergirds the great transformation Polanyi describes, in which feudal lands were enclosed and put to use in commercial agriculture. In Polanyi’s description: “Some of this was achieved by individual force and violence, some by revolution from above or below, some by war and conquest, some by legislative action, some by administrative pressure, some by spontaneous small-scale action of private persons over long stretches of time.” (Polanyi 189) Reading about this history, dating back to the 14th century, resonates with Cohen’s own writing.
There are also important analogues in the biological arena. Cohen recounts the Moore case from 1984, in which a court rejected a UCLA patient’s wrongful appropriation complaint. John Moore had his spleen removed by his doctors, and from it the institution created and patented a valuable cell line. As Cohen writes, “The lawsuit reached the California Supreme Court, which rejected Moore’s conversion theory on the ground that diseased tissue removed from the human body could not be the subject of a property interest.” (Cohen 72) The court was apparently persuaded by the logic of private appropriation, believing “scientific research might be impeded if researchers or companies had uncertain title over cells and the patents that derived from them.” (Kapczynski 1506) Such a practice of courts enabling companies to “consume consumers”, harvesting either their data or their organs for value while depriving them of any claim to ownership, betrays more than neoliberalism.
Perhaps it reveals the obscene underbelly of neoliberalism: far from mere non-intervention, it means intervening to the benefit of large corporate institutions and against the claims of wronged citizens, all while alleging neutrality. Cohen explains how the idea of a public domain depends on abundance and on the absence of prior claims, as Europeans conveniently imagined of the lands in the Americas (50). Yet even in the Moore case, where there was neither abundance nor an absence of claims, the court ruled in favor of dispossessing Moore, arguing that his claim to his own extracted organ was illegitimate. The rationale the court showed betrays a sacrificial mindset: an individual’s rights must be yielded to companies for good things to happen in society.
Forced consent
Tech companies have also devised a specific maneuver to deprive humans of possible legal claims against them: the sad growth and proliferation of the terms of service agreements and privacy policies that tech platforms display with hushed, hurried motions before getting people to “engage”, spinning the wheels on feeds to generate advertising profits. The story Cohen describes is as follows. In the late ’90s, companies focused on applying tracking mechanisms (like the browser cookie) and advertising dynamics to monetize their user base. They did this in the absence of a clear regulatory framework, which still does not exist to this day. Thus, “notice and consent became the dominant regulatory framework for evaluating online businesses’ use of tracking techniques.” Given that both the “turgid” privacy policy and the user terms of service agreement are, twenty years later, still the reigning standard, the courts have effectively upheld these documents as valid. Users, for their part, have realized that their only real alternatives are to accept and be on their way, or not to use these technologies at all.
In this sequence of events, we see a relation in which the sustained collective intelligence of corporations is used as power to silence the claims and disputes of actual human beings, in other words, to force consent. Meanwhile, the courts willfully ignore the limits of human cognition and of the time available to read and process these inane documents. Instead, they double down on the fairy tale that such an asymmetric, consent-forcing arrangement means that people, and collectively society, approve of companies’ machinations to surveil people, manipulate them, and profit from doing so. No rational discussion has been had on this basis; companies have instead sprinted to establish these practices as a standard before society could notice. We should also question whether any text, irrespective of its word count or obfuscating phrasing, should be considered appropriate and binding. The obvious truth of the matter is that it is not possible for a human being to process all the terms and conditions documents they are supposed to abide by. And thus we have a sham regulatory framework which, rather than prevent bad practices, shields companies from legal liability arising from their abuse of human beings. This is a stunning inversion: the use of law and legalistic tactics to produce injustice. And clearly, this problem carries through a legal structure that likes to pretend that weaving words ad infinitum in any way advances the cause of fairness for a being with a limited life span, a limited attention span, and limited cognition. Contrast this with corporations, which suffer no limit but profit.
Asymmetrical handling
The rise of informational capitalism has also impacted the structure of the decisions facing the legal system. While legal structures pretend that people have unlimited attention with which to process terms and conditions, the courts recognize their own limits in processing so many cases amid the explosion in the use of digital technologies. Cohen terms this the problem of numerosity: “an unmanageably large and ever increasing number of claimants and interests”. (Cohen 144) Unsurprisingly, the claims of the tech companies, “for the recognition and enforcement of intellectual property rights and similar property interests — are afforded extensive and creative process.” Meanwhile, those that concern people, “those for recognition and vindication of information privacy, data protection, consumer, and worker claims — are afforded only minimal or notional process.” (Cohen 145) So rather than treat numerosity as an indicator that something is broken and must be fixed, the courts use the very abundance of claims to reduce the probability that meaningful process will resolve the issue for many people.
Further, the transition to digitally mediated interaction has given rise to the problem of harm: “a proliferation of asserted harms that are intangible, collective, and highly informationalized”. (Cohen 144) The courts have thus relied on demanding that plaintiffs establish “injury in fact”, which Cohen describes as a “mid-twentieth century invention, constructed by the Supreme Court in response to cases in which the claimed injuries were predominantly informational and seemed too general and intangible to count as redressable wrongs.” (Cohen 146) And yet even this doctrine is selectively applied so as to benefit corporations. For abstract wrongs involving intellectual property violations, Cohen tells us how the courts can translate them into specific statutory damages even though the harms are vague and abstract. Meanwhile, users alleging information privacy harms run into the “logics of appropriative privilege and innovative and expressive immunity” (Cohen 149).
Conclusion
Julie Cohen’s description of the ongoing transformation is compelling and eye-opening. While Polanyi reasoned that society would learn to insulate parts of itself from the market in order to thrive in a complex economy, Cohen shows us that he perhaps underestimated the extent to which ideas such as neoliberalism, and the power of market forces, would capture much of society, and how information technology companies have exploited this. In her review of both Cohen’s work and Zuboff’s The Age of Surveillance Capitalism, Amy Kapczynski endorsed Cohen’s account and deemed it the better framework from which to analyze the law of informational capitalism. I view the accounts as complementary, and believe that precision about the analytical framework matters less than popularizing an understanding of the significant risks we face, so that vibrant solutions may be proposed and some actually tried.
However, while Cohen’s description of the biopolitical public domain is clear and compelling, she misses human attention as another public domain whose appropriation powers the digital economy. This is an area where Zuboff has powerfully described the potential for technology companies to mold human behavior. While such appropriation is implied by advertising generally, digital technologies take it to a new level, to the level of thought itself. Ads as one would have found in a magazine or on a billboard could easily be ignored, and could be tailored neither to the particular individual nor to the particular moment. Now, technology companies have the power to mediate every moment of experience through their screens. And with that, they have the power to harvest attention, to shift it, to manipulate us at will. This is a resource appropriation of colossal proportions, which avoids scrutiny because it resembles the established practice of advertising in physical media. Both Cohen and Kapczynski seem to minimize the significant danger that computation applied to human cognition itself could represent. Indeed, Kapczynski shrugs it off explicitly: “The existing evidence suggests that until behavioral advertising becomes much more sophisticated, it will have at most a small impact on behavior.” (1473) It seems to me that she misreads the evidence, concluding that because the number of purchases resulting from ads was small (1 purchase per 7,700 ad impressions), a digital platform’s power to shape a person’s experience of the world and manipulate their behavior is negated. Indeed, the outcome of most behavioral nudges may simply be that the user spends more time on a platform, or posts more, keeping more people online, not that they buy a product.
Second, while Cohen documents the degree to which capture has occurred, providing de facto and de jure immunities and privileging corporate interests when it comes to claims of harm, she probably overstates the extent to which legal and regulatory capture is a distinctive feature of informational capitalism. These seem like the same tactics used in other domains, made extra salient because the digital technology economy rose in the 1990s under the grip of neoliberal ideologies in whose sway we are still very much caught. This suggests that while Cohen’s analysis and description are apt, the true characteristic of informational capitalism is likely to be found in the technology itself, not in the captured legal structures. Zuboff’s account starts from these technological capabilities, and thus remains the more convincing account of the novel danger of the era in which we live. This does not diminish the importance of Cohen’s careful diagnosis of managerial neoliberalism; it merely places it in the Polanyian context from which she draws. These are issues that have carried through the legal and economic structure for centuries.
Lastly, Cohen could strengthen her claims about the dangers we face if legal capture is not remedied. Specifically, she has shown how courts refuse to recognize probabilistic harms to people, such as the likelihood of damages brought about by a leak of a user’s personal data. However, the nature of technological progress and digital technologies is that their scale is often enormous, and low-probability effects mount until the negative events are all but certain. Neither the courts nor Cohen nor Kapczynski and other interlocutors have adequately expressed the dangers of business as usual between a legal system that behaves in a managerial neoliberal way and technologies that are “eating the world.” The danger is ruin. This is a large price to pay for fancy text and images on screens.
That is why it is important for others to learn from Cohen’s account, which conclusively shows how a managerial neoliberal approach to the legal system has failed society. It is time to turn back calls to favor innovation at all costs, and to resist the capture of the elements of government. Instead, we should restore the precautionary principle when dealing with technologies that span the global population and engage it at the level of thought itself.
Cited Sources
- Cohen, Julie. Between Truth and Power: The Legal Constructions of Informational Capitalism. Oxford University Press, 2019. Print.
- Kapczynski, Amy. “The Law of Informational Capitalism.” Yale Law Journal, vol. 129, 2019.
- Polanyi, Karl. The Great Transformation: The Political and Economic Origins of Our Time. Boston, MA: Beacon Press, 2001. Print.
- Norman, Joseph, Bar-Yam, Yaneer, and Taleb, Nassim Nicholas. “Systemic Risk of Pandemic via Novel Pathogens — Coronavirus: A Note.” New England Complex Systems Institute, January 26, 2020.
- Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs, 2019.