The future of prediction

Google and other digital masters have the capability to capture, store and process the content and meta-data of our communications, movements and interactions. At least, that’s according to Shoshana Zuboff’s argument about life and power in The Age of Surveillance Capitalism.

Aided by algorithms that sort, categorise, and make inferences, Google can predict our desires and actions with sufficient accuracy to implement measures that influence us to carry out particular actions, like voting for a political party, or buying products. Whether or not Google, Facebook and their clients exercise that power, such influence could operate on individuals and groups, and mostly by way of nudging and conditioning.

“As digital signals monitor and track a person’s daily activities, the company gradually masters the schedule of reinforcements — rewards, recognition, or praise that can reliably produce the specific user behaviors that the company selects for dominance” (295).
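The "schedule of reinforcements" Zuboff describes can be sketched, very loosely, as a bandit-style learning loop: try different nudges, observe which one most reliably produces the target behaviour, and settle on it. The following toy Python sketch is purely illustrative — the nudge names, click rates and simulated user are all invented, and no claim is made that this resembles any real company's system.

```python
import random

# Illustrative only: an epsilon-greedy bandit that "masters the schedule
# of reinforcements" by learning which nudge most reliably produces a
# target behaviour from a simulated user. All names and rates are made up.

NUDGES = ["badge", "discount", "social_proof"]
TRUE_CLICK_RATES = {"badge": 0.1, "discount": 0.3, "social_proof": 0.6}  # hidden from the learner

def simulated_user(nudge):
    """Return True if the simulated user performs the target behaviour."""
    return random.random() < TRUE_CLICK_RATES[nudge]

def estimate(shown, clicked, nudge):
    """Observed success rate for a nudge (0 if never shown)."""
    return clicked[nudge] / shown[nudge] if shown[nudge] else 0.0

def run(trials=5000, epsilon=0.1, seed=42):
    random.seed(seed)
    shown = {n: 0 for n in NUDGES}
    clicked = {n: 0 for n in NUDGES}
    for _ in range(trials):
        if random.random() < epsilon:   # occasionally explore a random nudge
            nudge = random.choice(NUDGES)
        else:                           # otherwise exploit the best estimate so far
            nudge = max(NUDGES, key=lambda n: estimate(shown, clicked, n))
        shown[nudge] += 1
        if simulated_user(nudge):
            clicked[nudge] += 1
    # the nudge the system "selects for dominance"
    return max(NUDGES, key=lambda n: estimate(shown, clicked, n))

print(run())  # converges on "social_proof", the most effective nudge
```

The point of the sketch is how little the loop needs to know: it never models the user, only which stimulus pays off — which is also why its apparent mastery says nothing about understanding the person being nudged.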

These data and the inferences drawn from them can be sold on to advertisers and other interests. This is the “behavioural surplus” generated by consumers that Google monetises, though not for our personal benefit.

“Predictions about our behavior are Google’s products, and they are sold to its actual customers but not to us” (94).

Exploitation of the everyday

For Zuboff, our everyday lives are amongst the casualties of this exploitation. The everyday is a target for the corrupting effects of surveillance capitalism. “The everyday” serves in her narrative as an indicator of the ordinary and habitual field of activities and things that populate our individual, personal, and often private worlds. Surveillance capitalism subjects us to “aggressive extraction operations that mine the intimate depths of everyday life” (19).

“so everyday life is set to become a mere canvas for the explosion of a new always-on market cosmos dedicated to our behavior and from which there is no escape” (268).

Contrary to Zuboff’s pessimistic outlook, for some writers the everyday is less a site to defend than a medium of actions and resistance tactics. I alluded to some of the thinking about that in my post, Tags and codes, with reference to Jean-François Augoyard’s book Step by Step: Everyday Walks in a French Urban Housing Project.

Under the influence

The concerns expressed by Zuboff are indeed profound and important, especially if we believe that the technology and those who marshal it are as effective as she assumes. I think there are several moments of vulnerability in her assumed chain of cause and effect. I’ll start with the end result, influence.

Are influence campaigns and their subtle incremental incursions really so effective? Are people so influenced, for all time and all of the time? Zuboff seems to assume the systems actually work as advertised to affect behaviour.

Others have expanded on the ways that propaganda and influence do and don’t work, especially in advertising, e.g. Vance Packard’s The Hidden Persuaders. The novelty claimed for surveillance capitalism is that this persuasion is automated, customised and targeted, and operates via nefarious data gathering and opaque processes. That’s new, but what does it mean to be under the influence of an outside agent?

The autonomous individual

Zuboff’s argument about influence is most potent if we assume there is some pure trajectory of independent human action that is capable of operating prior to any external influence.

“surveillance capitalism is a rogue force driven by novel economic imperatives that disregard social norms and nullify the elemental rights associated with individual autonomy that are essential to the very possibility of a democratic society” (512).

She cites the behavioural psychologist B.F. Skinner as a leading advocate for diminishing the sense of the individual, as one “denouncing the autonomous self” (439). I think Skinner is a false target here. There are other, stronger antagonists to Zuboff’s implied assertion of the sanctity of the autonomous individual.

Following hermeneutical scholars I would say that we are all, always under the influence of family circumstances, upbringing, education, peer groups, media, political movements and trends. There’s no getting back to a pure, autonomous individuality.

The challenge for the social critic therefore is to expose and weigh influences against one another. In the case of surveillance capitalism, there are counter-influences and resistances of varying strengths. Books such as Zuboff’s are further sources of influence, serving to mobilise resistance for some readers at least.

Unpredictable you

Irrespective of how prone we are to their influence, are the behavioural predictions of Google’s systems sufficiently accurate? This is an empirical question. Presumably you can influence another person’s behaviour if you can predict how they will respond to a particular stimulus, and if you can adjust the stimulus to sway the behaviour of a particular individual, or of enough individuals to make a difference, in a particular direction.

The rogue firm of Cambridge Analytica (CA) claimed to influence election results, and its operations provide a good test case. Some scholars and journalists have challenged the accuracy of CA’s predictions. Organisations in the prediction business rely on being able to sell predictions, i.e. their claimed ability to produce accurate predictions. As we know from the art of fortune telling, predictions can exert influence quite unrelated to their accuracy.

Automated inference

Are the automated inference engines well founded? How well do they generate, explicate and justify choices and decisions? Can they cope with all the data available, and in real time? Zuboff seems to assume that these systems do what they say they do.

“These machine intelligence operations convert raw material into the firm’s highly profitable algorithmic products designed to predict the behavior of its users” (65).

There is a difference between what these systems are designed to do and whether they do it effectively. I still subscribe to the skepticism of AI critics such as Hubert Dreyfus: disembodied algorithms are incapable of accounting for the wealth of factors, contingencies and values that come to the fore as human beings filter, discuss, debate and compromise — as they propose, implement and take responsibility for effective plans of action.

Never enough data

Is the data gathered through surveillance in fact as useful as claimed? However much algorithms can infer from people’s clicks, taps, the disposition of facial features, location, posture, gait, text feeds and demographics, there will always be something missing. To require ever more, and more detailed, data implies an infinite regress of clues and signals needed to form a meaningful profile, accommodation, or prediction. These are some of the challenges of big data — its redundancy, non-uniformity, messiness, and relentless feeds across multiple channels.

I agree that under the sway of big data, only trivial, reductive operations come to count as worthy of analysis: the volume of clicks on like buttons carries more weight than complicated written accounts or face-to-face dealings, de-privileging other cues, clues and signals as people interact with each other and their material and digital worlds.

I maintain that the challenge is not whether these systems deliver what is promised, but that the narratives generated around them draw people in, hold out false hopes, induce anxieties, direct research programmes and deflect funds.

We are often told that life is confusing and complicated, and the digital world makes it more so. On the other hand, the next app will fix it for us. Digital tech can serve as both cause and cure. I rehearsed these points in an earlier post (What’s wrong with the future) where I looked at Google CEO Eric Schmidt’s book The New Digital Age, a book that Zuboff also criticises.


  • Augoyard, Jean-François. 2007. Step by Step: Everyday Walks in a French Urban Housing Project. Trans. David Ames Curtis. Minneapolis: University of Minnesota Press. First published in French in 1979.
  • Dreyfus, Hubert L. 1972. What Computers Can’t Do: The Limits of Artificial Intelligence. New York: Harper and Row.
  • Packard, Vance. 2007. The Hidden Persuaders. Brooklyn, NY: Ig Publishing. First published in 1957.
  • Schmidt, Eric, and Jared Cohen. 2013. The New Digital Age: Reshaping the Future of People, Nations and Business. London: John Murray.
  • Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books.


  • Image: kit involved in LiDAR scanning of the spaces in our workplace.
