Some more examples of data-driven schooling

Northland College runs a few different ‘big ticket’ data analyses. These are computer-based calculations and quantifications that play a prominent part in school life. Often these ‘data-driven’ processes replicate things that would have previously been done in a school through a small group of teachers getting together to talk things over and relying on their collective professional judgement. Now, the school relies on its data-sets and the small team of staff that forms its ‘data management group’.

As we reported in another post, Northland’s use of assessment data to ‘predict’ their Year 12 students’ likely performance in the national end-of-school rankings was seen as a success. This had prompted the Deputy Principal (Russ) to think through how a similar analysis might be run to reward students’ current performance. At the beginning of the 2018 school year Russ had tasked the school’s ‘data lead’ teacher (Stephen – a senior maths teacher) with developing an ‘Academic Awards’ analysis which could identify the top performing students in each year. Russ had a notion that the top students in each Year group could be awarded ‘academic colours’ for achieving high grades in their subjects.

Stephen ‘played around with Excel’ to develop a slightly convoluted formula that ensured that a reasonable number of students would get the award (“if only a handful of kids can get it, that mean that no-one’s going to hunt for it … [so] we had to fiddle around with the model ‘til we thought we were on average going to get about 10 percent of kids”). A key part of the ‘fiddling around’ with the data involved Stephen and Russ making a number of ad hoc judgements about the quality of the data that they had at their disposal. This resulted in them deciding to exclude performance data from subject areas that they did not consider to constitute ‘academic’ work (e.g. Sport), and to split the remaining subjects into two differently-weighted groups according to how academically challenging they were seen to be. It also prompted them to make adjustments for classes where they considered particular teachers to be consistently “marking harder than others”.
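For the technically curious, here is a minimal sketch of the kind of calculation being described – written in Python rather than Excel, and with entirely hypothetical subject groupings, weightings, teacher adjustments and grades (none of these details of Northland’s actual model were shared with us):

```python
# A minimal sketch of an 'Academic Awards'-style calculation. All column
# names, subject groupings, weightings and adjustments here are
# hypothetical illustrations, not Northland's actual model.
import pandas as pd

# Hypothetical grades: one row per student per subject (marks out of 100).
grades = pd.DataFrame({
    "student": ["Ana", "Ana", "Ana", "Ben", "Ben", "Ben", "Cy", "Cy", "Cy"],
    "subject": ["Maths", "English", "Sport"] * 3,
    "teacher": ["T1", "T2", "T3"] * 3,
    "mark":    [92, 85, 70, 78, 74, 95, 88, 90, 60],
})

NON_ACADEMIC = {"Sport"}          # subjects excluded as 'non-academic'
GROUP_A = {"Maths"}               # subjects treated as more 'challenging'
WEIGHTS = {"A": 1.5, "B": 1.0}    # assumed relative weighting of the groups
HARD_MARKERS = {"T2": 3.0}        # assumed uplift for a consistently hard marker

df = grades[~grades["subject"].isin(NON_ACADEMIC)].copy()
df["mark"] = df["mark"] + df["teacher"].map(HARD_MARKERS).fillna(0)
df["group"] = df["subject"].map(lambda s: "A" if s in GROUP_A else "B")
df["weighted"] = df["mark"] * df["group"].map(WEIGHTS)

# One score per student, then a cut-off tuned so that roughly the top
# 10 percent of the cohort receive the award.
scores = df.groupby("student")["weighted"].mean()
cutoff = scores.quantile(0.90)
print(scores[scores >= cutoff])
```

The quantile cut-off at the end stands in for Stephen’s iterative ‘fiddling around’: in practice the groupings, weightings and adjustments were tweaked by hand until roughly 10 percent of students qualified.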

Given that Northland is a fee-paying private school, it is perhaps surprising that it had not established an ‘Academic Colours’ scheme earlier in its history, but this was apparently the first time that such a scheme had ever been run. Compared to relying on a small committee of staff simply nominating and discussing students on the basis of perceived merit, Stephen’s computer-based analysis was felt to give the new process legitimacy and accountability:

[With Excel] we’re doing it in a way that’s robust that when that one day when a parent challenges we can say, ‘Well, these are the rules on how we do it … and there’s the grades. So, you can see for yourself why they did or didn’t get it’.

That said, Stephen conceded that the ‘fiddling around’ work with Excel that had ensured a 10 percent return of student awards for 2019 would have to be repeated each year – as he put it: “I imagine there will always be tweaking”. This inconsistency of what was otherwise described as a ‘robust’ analysis was justified in terms of ensuring that each year’s model was “fit for purpose”.

Interestingly, these ad hoc analyses were praised by Stephen and Russ for revealing all manner of ‘insights’ about the school that had not previously been recognised. One favourite story about the Academic Colours data exercise was the data-led identification of one sporadically high-flying student who had previously not been noticed by their teachers as excelling:

This one particular kid had generated, you know, their academic awards essentially on Group A [challenging subjects] only … only across three semesters … but only with two B pluses in their group B [less challenging] subjects.  Here’s a kid who’s only focused on one area of learning. It throws up something that’s interesting to ask as a question. …maybe that’s just where their interest is? … or maybe that kid might be Asperger’s? … who knows what it is!

Nevertheless, there seemed to be little appetite for extending these ‘useful’ insights into regular, systematic, holistic analyses of the school’s data-sets. We talked about the possibility of the school running regular ‘deeper dives’ into their available data-sets to see what other unexpected patterns or outliers might be revealed. Yet it was implied that school staff had neither the time nor the inclination to begin ‘digging around’ in the school’s data. At present Northland was felt to be functioning relatively well – meaning that its teachers were content to use data for specific purposes that informed their actions on particular points. It was clear that data analysis was only being conducted at certain times for very specific purposes. Throughout all these processes it seemed that no-one had the time to be distracted by extraneous information!

This narrowly-focused approach was illustrated by another student ‘performance prediction’ process that Stephen had been asked to run – this time to identify “top-flying students” in junior year groups. As with the university prediction process, Stephen chose to do this by ranking students by Z scores in their main subject areas. Significantly, the Head of Middle School was interested only in seeing the data for these ‘top-flying’ students. She was very clear that she did not want to see the full analysis of the whole cohort, which Stephen had to produce in order to identify the top performing students:

So I said, ‘Do you want to know the bottom [students] as well, so that you can work with them?’ She said, ‘No, that’s a different day. I don’t … for this day I want to know who’s the top and who we push, you know. I want to be sure that we doing something with them’. I think that for me, a kid that needs extending is not necessarily a kid who’s doing fantastically. They might be an average student who needs extending before they will do well. … But it’s what she’s asked for. … She wants the data so she can go to the teachers and say, ‘I want to be assured that you’re appropriately challenging these top kids’.
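Again, purely as an illustration of the mechanics, here is a sketch of a Z-score ranking of this kind – with hypothetical data and column names (the actual analysis was done in Excel):

```python
# A sketch of ranking students by Z scores within their subjects.
# Data and column names are hypothetical illustrations.
import pandas as pd

grades = pd.DataFrame({
    "student": ["Ana", "Ben", "Cy", "Ana", "Ben", "Cy"],
    "subject": ["Maths"] * 3 + ["English"] * 3,
    "mark":    [92, 78, 88, 85, 74, 90],
})

# Z score within each subject: (mark - subject mean) / subject std dev.
grades["z"] = grades.groupby("subject")["mark"].transform(
    lambda m: (m - m.mean()) / m.std()
)

# Rank the whole cohort by mean Z score, but hand over only the top --
# as the Head of Middle School requested.
ranking = grades.groupby("student")["z"].mean().sort_values(ascending=False)
print(ranking.head(2))
```

Note that the full cohort still has to be standardised and ranked in order to produce even the short ‘top-flyers’ list – the narrowness of the request does not make the underlying analysis any narrower.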

These are all perfectly understandable uses of data. Schools are highly pressured environments, and lack of time is a key determining factor in everything that takes place (and does not take place) in a school. If a Head of School has committed herself to tackling a perceived issue with top performing students, then she will remain primarily focused on that task. There is simply not enough time to ‘explore’ what else the data might reveal.

Similarly, a school like Northland is understandably mindful of accountability – especially to fee-paying parents who might challenge a decision not to give their child an award. Of course, schools have been awarding ‘colours’ for well over a century without the aid of Excel and Z scores. In this sense, it might be argued that Northland’s new process was perhaps indicative of an underlying deprofessionalisation of the staff, or a diminishment of teachers’ professional judgement. Yet the number-work being carried out by Stephen and Russ ‘behind’ the production of the ‘Colours’ list clearly involved a great deal of professional judgement and discretion. This was not simply a case of ‘the computer deciding’ whether to include students’ scores from subjects such as Sport. Moreover, carrying out the process through spreadsheets was allowing these teachers to notice existing inconsistencies in their colleagues’ professional judgements (the student going ‘under the radar’, the varying severity of grading practices).

Stories such as these all further our understanding of the ‘messiness’ of data in schools. These are compromised, contingent and context-driven uses of data – determined by the actions and agendas of a few staff members with the technical skills and allotted time to make use of spreadsheets. On the one hand, then, Northland can frame itself as making good use of data and evidence in supporting its decision-making. Yet on the other hand, these uses of digital data seem far removed from academic concerns over the rise of the ‘smart school’ and ‘precision education’.

This raises an important emerging tension for us to explore further in our work. Are the low-key data practices that we are coming across in our fieldwork simply the initial stages of datafied logics (such as predictive analytics) being established in schools? Or, alternatively, do these practices suggest entrenched school-specific counter-logics of how teachers ‘do data’, which might themselves shape (and compromise) more extensive attempts to introduce data-driven processes into the context of a school such as Northland? Schools are certainly distinct institutional contexts, and as Larry Cuban has demonstrated, there is a long history of ‘Technology Meets Classroom, Classroom Wins’. Are we beginning to see a similar phenomenon in terms of how schools respond to the introduction of data-driven innovations? Watch this space …