Facebook, Google, Amazon — “they track our every move, they monitor every moment in our lives, and they exploit our data for profit.” Thus stammered Richard Hendricks, the bedraggled, beleaguered founder and CEO of Pied Piper, a fictional Internet company at the heart of the HBO comedy Silicon Valley, in his surprisingly eloquent testimony before a hostile congressional committee investigating the abuses of Big Tech. “These companies are kings, and they rule over kingdoms far larger than any nation in human history. They won, we lost.”
But far from giving up, Hendricks suggested a path toward victory: “The way we win is by creating a new, democratic, decentralized Internet, one where the behavior of companies like this will be impossible forever. One where it is the users, not the kings, who have sovereign control over their data. This I promise to you: I will help you end this tyranny by building an Internet that is of the people, by the people, and for the people, so help me God,” Hendricks concluded, to the rousing applause of his employees watching on TV.
Only one small problem: viewers later learn that Pied Piper itself was engaging for years in the very practice Hendricks so roundly condemned, as a popular video game operating under its umbrella had been collecting and exploiting customer data. It turns out it’s not as easy as we think to turn off the information spigot, for we have more or less accepted the fundamental bargain of the Internet: In exchange for our personal data, we gain access to an unprecedented cornucopia of digital goods.
Shoshana Zuboff picks up Hendricks’s mantle in examining the fraught relationship between Internet companies, their users, and those users’ data in Surveillance Capitalism, a term she succinctly defines as “a new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales.” Fundamentally, she worries that our “rights to privacy, knowledge, and application have been usurped by a bold market venture powered by unilateral claims to others’ experience and the knowledge that flows from it.” While Zuboff provides an impassioned, eloquent, and thought-provoking analysis of the interplay between technology and personal autonomy, her book suffers from deep flaws, largely ignoring the tremendous benefits Big Tech companies have conferred on society, failing to reckon with Americans’ choice to accept this tradeoff, and neglecting to offer any serious or realistic solutions.
Zuboff trained in philosophy at the University of Chicago and in social psychology at Harvard, and in the 1980s became one of the first tenured female professors at the Harvard Business School. There she conducted pioneering scholarship on how the information society was changing culture and economics. Her 1988 book In the Age of the Smart Machine: The Future of Work and Power foresaw the growth of AI and machine learning, and anticipated the core contention of Surveillance Capitalism that digital technology is upending traditional ownership roles and concepts, fundamentally transforming the nature of the market.
The strongest articulation of her claim arrives early on, where she seeks to slaughter certain sacred cows of the digerati in order to reveal the true nature of the tech industry:
Surveillance capitalism’s products and services are not the objects of a value exchange. They do not establish constructive producer–consumer reciprocities. Instead, they are the “hooks” that lure users into their extractive operations in which our personal experiences are scraped and packaged as the means to others’ ends. We are not surveillance capitalism’s “customers.” Although the saying tells us “If it’s free, then you are the product,” that is also incorrect. We are the sources of surveillance capitalism’s crucial surplus: the objects of a technologically advanced and increasingly inescapable raw-material-extraction operation. Surveillance capitalism’s actual customers are the enterprises that trade in its markets for future behavior.
In Zuboff’s telling, the technology industry extracts nothing short of actual human experience from its users, denuding our existence of its richness and manipulating our behavior toward the industry’s own commercial ends. What initially began as a laudable effort to liberate information and improve customer experience rapidly degenerated into a naked power grab that threatens the very foundation of democratic society.
Worse, Zuboff argues, tech companies have managed to harness the power of government to their voracious, avaricious engine. Far from meaningfully regulating or curbing the companies’ malign influence, entities like the National Security Agency only supercharge their efforts by conspiring to further eradicate privacy and individual autonomy.
In some ways, Zuboff’s quarrel with digital companies has less to do with their being digital than with their being companies. She gives away the game early in the book, where she labels surveillance capitalism “parasitic and self-referential” and likens it to “Karl Marx’s old image of capitalism as a vampire that feeds on labor, but with an unexpected turn. Instead of labor, surveillance capitalism feeds on every aspect of every human’s experience.” And she points the finger at economists like Friedrich Hayek, who plowed the ground for digital exploitation by envisioning the “self-regulating market as a natural force of such complexity and perfection that it demanded radical freedom from all forms of state oversight.” (This would come as a surprise to Hayek, who argued in The Road to Serfdom that “coercive interference … may very considerably assist” economic competition, which “requires certain kinds of government action.”)
In Zuboff’s account, digital companies are uniquely guilty of feeding on human experience. Whereas old-economy companies regarded us as “the subjects of value realization,” digital companies treat us as “the objects from which raw materials are extracted and expropriated.” Think of a traditional transaction, like a car sale. Old-economy companies see mutual beneficiaries in this value exchange: You get the car, I get the money. But in the Google economy, Zuboff says, ordinary people are no longer the consumers; advertisers are. And what they consume are products based on data about users. “This new market form declares that serving the genuine needs of people is less lucrative, and therefore less important, than selling predictions of their behavior.”
But it’s not obvious what is genuinely new in this form of capitalism. Zuboff is right that most of Google’s users are not its actual customers — yet they are nevertheless engaged in an exchange from which both parties derive real benefit. Even if what we trade is now data about our online behavior rather than money, in getting a reliable search engine we are plainly still subjects of value realization. Might we also be objectified? Clearly so — but then, every profit-maximizing company to some extent treats us as objects from which it extracts revenue, just as we treat companies as tools to be used for our own benefit. This instrumentalist approach, after all, is fundamental to capitalism writ large, not simply to surveillance capitalism. The arrangement of the players in the digital exchange may be novel, but its fundamental nature is not.
This, coupled with her repeated invocations of “neoliberal ideology,” suggests that Zuboff’s central objection is to capitalism itself, not just to surveillance.
What if the digital economy is far more straightforward and less sinister than Zuboff allows? After all, she acknowledges that companies like Google and Facebook promised to apply their products to “new domains of critical importance, rescuing information and people from the old institutional confines, enabling us to find what and whom we wanted, when and how we wanted to search or connect.” This amounted to nothing less than the “promise that quickly lodged at the very heart of the commercial digital project.”
This promise has indeed largely been realized, thanks to relentless innovation by the industry Zuboff castigates. Take, for example, the tracking of driving behavior through vehicle telematics — like the tracking devices some car insurance companies offer that promise lower rates for less aggressive drivers. Zuboff derides this emerging form of data-gathering as a greedy, overweening, micromanaging technological trend, or “behavioral control” by nefarious insurance companies. But this use of our data is voluntary, and it is a feature, not a bug: it financially incentivizes safe driving and ultimately saves lives. While Zuboff dwells at length on the dangers posed by Big Tech’s exploitation of user data, she breezily brushes aside the benefits we can gain from such data-driven technologies as fitness tracking, electronic gathering and analysis of public health data, automated transportation, traffic management, and many others.
Elsewhere, in fact, Zuboff concedes that digital companies have been reasonably upfront about the tradeoffs involved in using their products: “Privacy, they said, was the price one must pay for the abundant rewards of information, connection, and other digital goods when, where, and how you want them.” This isn’t just what tech companies say. It is also true.
For instance, in exploring the meteoric rise of Google, whose revenues surged by a whopping 3,600 percent in the four years after it introduced its advertising platform — from $86 million in 2001 to over $3 billion by 2004 — Zuboff laments a colossal missed opportunity: “What other pathways to sustainable revenue might have been explored or invented? What alternative futures might have been summoned to keep faith with the founders’ principles and with their users’ rights to self-determination?”
But we more or less know the answer to that question: Without commercializing its search engine, Google would most likely either have gone out of business or shifted to a non-profit model along the lines of the Mozilla Foundation, a highly respected and thoughtful entity with only marginal impact on the lives of Americans, the many virtues of its Firefox browser notwithstanding. Nor, without that revenue, would Google have transformed the ways we interact with maps, products, culture, books, and one another — innovations we have all come to expect to be both reliable and free.
Zuboff’s colorful and charged language maintains the reader’s interest throughout nearly 700 pages of exposition, but at times it also oversells her thesis. At one point she likens Google’s adoption of its advertising strategy to a transformation from Dr. Jekyll to Mr. Hyde. At another she labels Sheryl Sandberg, the onetime Google executive whom Mark Zuckerberg hired to serve as Facebook’s chief operating officer, as the “‘Typhoid Mary’ of surveillance capitalism,” alluding to Mary Mallon, the Irish immigrant cook who was the first person in the United States identified as an asymptomatic carrier of typhoid fever. Her writing tends toward the overdramatic, even the bombastic, as when she labels surveillance capitalism a “Faustian compact” because “it is nearly impossible to tear ourselves away, despite the fact that what we must give in return will destroy life as we have known it.” While these piquant characterizations undoubtedly spice up her prose, which is zesty enough without them, they also undermine its seriousness.
But by far the most problematic facet of Surveillance Capitalism is its fundamental, even philosophical, misapprehension — whether accidental or intentional — of the nature of digital companies’ acquisition of data, which Zuboff cleverly, repeatedly, but inaccurately casts as “dispossession.” She discusses what Hannah Arendt called the “original sin of simple robbery” — the idea that capitalism first became possible by grabbing land and natural resources and turning them into capital. Zuboff argues that Google committed this sin again when it realized that “human experience … could be extracted.” User information, photos of public spaces, and traffic patterns can all be “rendered as behavioral data,” creating a surplus that can be sold for profit.
Google dispossesses us, per Zuboff, by way of its relentless “incursion” into our lives: “your laptop, your phone, a web page, the street where you live, an e-mail to your friend, your walk in the park,” and so on. “Incursion moves down the road … laying claim to decision rights over whatever is in its path. ‘I’m taking this,’ it says. ‘These are mine now.’” When we take Google to court, it “seduces, ignores, overwhelms, or simply exhausts its adversaries.”
Consider Google Street View, for which “the company did not ask permission”:
It simply repeated the “original sin of simple robbery” and took what it wanted, waiting for resistance to run its course as it devoured and datafied the world’s public spaces, streets, buildings, and homes.
In her telling, personal information and human experience itself aren’t simply accessed but captured, extracted, seized.
Yet robbery and dispossession are deeply flawed terms to describe these “takings” precisely because the user does not in any meaningful sense lose her data or experiences or photos or tweets or political opinions. Unlike plundering diamond mines or extracting natural gas, where every gemstone or cubic foot of methane changes hands, using “behavioral surplus” — the models of human behavior generated from user data — is not a zero-sum game. True, Big Tech companies have accessed this information, but quite apart from the question of whether they did so with public consent, there has been no dispossession, because the public — as a whole, and as individuals — still very much continues to possess the information, even if not in the exact same form. Put differently, we may be unhappy when Facebook serves us a targeted trattoria advertisement on the basis of a video we posted of our children cavorting at the Trevi Fountain, but it strains credulity to suggest that we have somehow lost that video, or the underlying experience and sentiments it reflects, simply because Facebook has used it.
So if Zuboff misdiagnoses and exaggerates the problems created by what she ominously calls Big Other, how do her proposed solutions stack up? Not well. At a high level, Zuboff appears to believe the problem unsolvable by conventional methods. She blames the absence of a way out on consumers’ “radical indifference,” and on the interest the government and the military share with industry in expanding the surveillance apparatus. She even likens surveillance capitalism, for whose destruction she yearns, to the Berlin Wall, and her advice boils down to an exhortation “to rekindle the sense of outrage and loss over what is being taken from us.”
But while a comprehensive fix to surveillance capitalism isn’t forthcoming, Zuboff does advance several partial solutions. First, she homes in on Internet companies’ exploitation of a legal scheme supposedly designed to shelter them. Section 230 of the 1996 Communications Decency Act shields certain websites from liability for the content posted on them by others, under the theory that these sites function more as neutral platforms than as publishers. This notion, of course, has recently come under heavy fire from left and right alike, with Senator Josh Hawley (R-Mo.) introducing legislation to unwind this protection on the grounds that by imposing certain limits on offensive or otherwise harmful posts, sites like Facebook and Twitter have forfeited their neutral-intermediary status and now more closely resemble content providers, like traditional publishers.
Here Zuboff joins forces with Hawley, although on slightly different grounds, noting that “Section 230’s protection of the ‘intermediaries’ now functions as another bulwark that shelters this extractive surveillance capitalist operation from critical examination.” In her telling, because Google, Facebook, and the like depend so heavily on the content their users upload to the sites, they should also bear responsibility as publishers of that content. But she fails to explain why the mere fact that digital companies profit from content voluntarily posted by their customers should make them legally answerable for that content, whether in defamation suits or in tort claims for harms, such as pain and suffering, inflicted on other users who encounter it. Her proposed approach would be akin to imposing liability on a for-profit book-of-the-month club for the offensive content of one of its books simply because the club uses its members’ data — with their permission — to target them for additional sales.
More substantively, she champions a beefed-up “right to be forgotten,” currently in vogue in Europe following a May 2014 European Court of Justice ruling that empowered ordinary individuals to request that Internet companies remove damaging information about them from search results. If they can prove this information is “inadequate, irrelevant or no longer relevant, or excessive,” the digital platform must delist it. Zuboff hails this ruling and highlights a predecessor decision four years earlier, in which the Spanish Data Protection Agency elevated the right to be forgotten to a binding legal principle. The European Union’s General Data Protection Regulation (GDPR), which took effect in 2018, enshrined this right in EU law, signifying its full maturation.
Yet this right enjoys far more popularity in Europe than in the United States, and is hardly without its detractors even on the continent. Data on public opinion is scant; much of what we know comes from a five-year-old survey of 500 respondents that suggests that fewer than 40 percent of Americans support a European-style right to be forgotten, while another 15 percent favor such a right only for minors. Americans largely recognize that privacy is no unalloyed good and instead must be balanced against other societal goods. At the same time, when tech giants overreach, they provoke a backlash even among Americans, as Zuboff acknowledges was the case with Google Glass, which at least until now has proven too creepy and invasive for all but the most enthusiastic of tech nerds.
More broadly, both in the United States and abroad, the tension between an individual’s desire to avoid scrutiny for past indiscretions and the public’s right to know about others’ dangerous or problematic acts is difficult to resolve, especially when someone has, for instance, been charged with an offense but never convicted. One recent New York Times article illustrated this tension through the troubling story of an Italian journalism website dedicated to exposing corruption, in a country notorious for corruption, that wound up having to shutter itself after numerous court rulings required it to delete information about locally prominent individuals who had been arrested but whose charges had been dropped.
For its part, Google claims to have complied with nearly half of GDPR requests to delist specific websites from its search engine. It contends that the decision to reject the remaining requests involves factors such as “the requester’s professional life, a past crime, political office, position in public life, or whether the content is self-authored content, consists of government documents, or is journalistic in nature.”
Nevertheless, for all of Zuboff’s inflation of the putative predations in which the tech industry indulges, for all her analytical and rhetorical excesses, and for all her profoundly problematic suggested remedies for those harms, she identifies troubling trends requiring redress. Surveillance Capitalism challenges readers who, like me, often wax enthusiastic about the tremendous benefits and value created by Google, Facebook, and others to reconsider their prior assumptions and examine whether our society should recalibrate itself in favor of greater protection of user data.
Dangers abound in this realm, to be sure. Nobody wants Alexa accidentally recording their private conversations and sending them out to acquaintances, or Target inadvertently informing a teenager’s father that she was pregnant by mailing out maternity clothes coupons. We may find it jarring to receive push notifications from supermarkets urging us to buy Green Giant frozen peas when we approach strip malls, unaware that one or more apps on our phones have been tracking our location. There is no shortage of changes that tech companies can make to improve user experience and safeguard our information, such as shorter and clearer privacy policies, more definitive opt-outs for location tracking, more regular destruction of aging or leftover user records, and more transparent disclosure of which third-party applications have access to our information.
But in the end, on the question of technology and privacy, American consumers have voted with their thumbs and fingers and pocketbooks: We’re fundamentally willing to exchange some measure of privacy for great gobs of technology. And condemning this popular practice as “surveillance” is unlikely to alter that deep-seated truth.