Amidst the many anxieties that surround datafication is the recurring question of how numbers and statistics have become cemented throughout all areas of society as a dominant way of understanding human experience (or, in academic terms, a dominant onto-epistemological framework). Regardless of the terminology, people from all backgrounds are increasingly wondering how numbers have become so integral to human reasoning. Across the social sciences, this question is more often tinged with regret than pride – how and why have we let the richness of daily life be reduced and simplified to comply with spreadsheets and algorithmic processing?
In his 1990 book The Taming of Chance, Ian Hacking offers a detailed historical response to this question. Over the course of 23 chapters he develops a chronological framework that lays out how numeration and statistics became a dominant way of viewing the social and biological world. This account starts with the establishment of ‘determinism’ as a dominant form of reasoning during the sixth century (i.e. the philosophical notion that all events, including moral choices, are determined completely by previously existing causes). Hacking argues that it was not until the 20th century that this belief was usurped. For Hacking, this relatively recent turn away from determinism was a decisive conceptual moment in human history.
However, this is not to say that there was a shift back to pre-determinist thinking – namely chaotic and ‘vulgar’ beliefs in chance and superstition. Instead, Hacking argues that numeration and statistics rose to prominence at this time as a ready means of ‘taming chance’:
Determinism was subverted by laws of chance. To believe there were such laws one needed law-like statistical regularities in large populations. How else could a civilization hooked on universal causality get the idea of some alternative kind of law of nature or social behaviour? (p.3).
This impetus, Hacking reasons, led to a new kind of reasoning based upon numbers and statistics. From births and deaths to suicide rates, enthusiasm for statistically grounded biopolitics gave rise to comprehensive measures aimed at social groups or populations. This information could be linked to economics in the form of taxation and medical insurance, while also serving a more fundamental purpose in the creation of administrative nation states and the expansion of empires into new territories. What followed was a ‘sheer fetishism for numbers’ of all kinds, so that the ‘numbering of the world was occurring in every branch of human inquiry and not merely in population and health statistics’ (p.60).
In terms of our own small-scale interest in making sense of the use of data in schools, Hacking’s thesis strikes a particular chord. First, it reminds us that there have been (albeit a long time ago) other ways of understanding everyday life. As such, it is worth remembering that dominant epistemological frameworks can be shifted. Second, it helps us to see the deep-rooted origins of key data-related concepts such as ‘categories’ and ‘normalcy’, and therefore helps to unpack their ideological power. Third, Hacking reminds us that perhaps our greatest point of concern (even with school data) should lie with the linking of statistics with economics. We therefore need to consider, during the course of our study, how we might find ways to uncouple this link. For example, we may try to ensure that data used to evaluate teacher performance is linked with optimising learning, rather than serving as ‘evidence’ that impacts on career progression or the funding of a particular program within the school. What follows is a brief summary of a few key ideas in Hacking’s book:
i) ‘Making up’ people – the creation of statistical laws
An immediately engaging idea in Hacking’s book is the notion of ‘making up people’ (p.3). He contends that the systematic collection of statistics and measurements led to the creation of categories and classifications into which people subsequently came to recognise themselves. This, Hacking reasons, changed the way that people viewed both themselves and others:
The systematic collection of data about people has affected not only the ways in which we conceive of a society but also the ways in which we describe our neighbour. It has profoundly transformed what we choose to do, who we try to be, and what we think of ourselves (p.3).
In this sense, numbers and data that can be used for the purpose of social control are especially valuable. As such, most of the systematically measured ‘regularities’ that became ‘law-like’ in 18th century Europe were those that were ‘first perceived in connection with deviancy: suicide, crime, vagrancy, madness, prostitution, disease’ (p.3). The question then became, ‘What other human choices might display regularities?’ (p.41). Moreover, how could measurement of these phenomena be used to construct what Hacking calls ‘the sweet despotism of reason’ (p.42)? By the 19th century statistics had become ‘state oriented and intended to give the state its means of direction and control’. Statistics were no longer published openly, instead becoming a source of state power and control.
ii) Statistical fatalism
Related to the notion of ‘making up people’ is the idea of ‘statistical fatalism’, which is in many ways a precursor to predictive analytics. Once regularities in populations can be deduced, particular behaviours and actions appear predetermined or inevitable. For example, if a person is abused as a child then the probability that they will become an abuser as an adult is much greater. In this sense, the statistics become self-fulfilling for particular groups. As Hacking explains, statistical fatalism is achieved not only through crime statistics, but also through a ‘technology of distribution’ (p.117) – i.e. the more complex social processes through which sensitive information about particular groups is disseminated. Once a statistical law is applied to a group of people ‘then the freedom of individuals in that group was constrained’ (p.121). Hacking reasons that since it is the state that has collected the numbers and defined the categories and classifications of deviance and crime, the state comes to play a role in what kind of person one is.
iii) The ideological power of ‘normal’
In the latter chapters of the book, Hacking discusses how statistical reasoning inevitably led to the construction of ‘norms’ against which human behaviours were understood. Previously, theorists might have used the more open and ambiguous term ‘human nature’ (p.161); statistics, however, gave rise to the notion that human behaviour could be either ‘normal’ or ‘pathological’. Normality was not only used to describe behaviours and medical conditions but took on greater ideological power when it moved into the social and political spheres. For example, normal became not only an ‘existing average’ but also the ‘figure of perfection to which we may progress’ (p.168). In this way, the idea that there is (or should be) a norm pushes individuals toward a particular destiny. This, Hacking explains, is why ‘the benign and sterile-sounding word “normal” has become one of the most powerful ideological tools of the twentieth century’ (p.169).
Concluding thoughts…
One particular quote used by Hacking to describe Prussia in the 19th century has relevance to many data issues evident in schools today. Hacking describes the Prussian use of statistics as ‘descriptive’, leading to a situation in which ‘bureaucratic efficiency was combined with mathematical naivete’ (p.25). Indeed, much of the copious data collection and processing in schools today is driven by bureaucratic compliance, even though the numbers and mathematics underpinning these processes are weak. The focus of such statistical reasoning, Hacking reminds us, is the appearance of efficiency and productivity, rather than the social or natural phenomena that these numbers represent.
It is impossible to capture the sheer breadth of scholarship in a book like The Taming of Chance. However, this blog post has attempted to identify three key ideas that might be helpful in organising our thinking about numbers and statistics in the more mundane setting of the school. Hacking’s work draws our attention to the ontological and epistemological power inherent in statistical reasoning, and also helps us to understand just why it is so difficult to encourage people to think differently when it comes to data. Statistical fatalism and normalcy are clearly key tenets of our current turn to digital data across contemporary society. The notion of ‘making up’ people has clear parallels with our current concerns over ‘data subjects’ and ‘data doubles’. In all these ways, then, Hacking reminds us that developing people’s ‘data imaginations’ requires unravelling a way of reasoning that is embedded in everyday life – from state bureaucracies to social media platforms.