‘Computer says no’

[FIELDNOTES]

I was in a school yesterday for the first day of the new school year. For the first half hour or so, a lot of energy and emotion was being directed toward which students had (and had not) been allocated to each form group.

Although the school has mixed-ability classes, staff still take care to juggle students around the four different form groups … avoiding personality clashes, discouraging cliques, making sure the most troublesome kids are placed with the most robust teachers, and so on.

One teacher was blithely batting away awkward questions from students about ‘Why is so-and-so not with us this year?’ with simple responses: ‘The computer decided’, ‘It’s what the computer came up with’.

One of the conveniences of the data-driven school is the ability to hide behind the data, to blame the computer, to deflect attention and move on from awkward questions and tricky situations.

Craig Gent refers to this as ‘managerial distanciation’ – a common workplace tactic by the managerial classes to divert blame and disingenuously suggest an exaggerated authority on the part of algorithms.

Data certainly offers a welcome get-out for harassed teachers. Classrooms are places where a few small ‘white lies’ can go a long way to reducing conflict and making the working day a little smoother.

However, if everyone in the school is blaming the computer and refusing to take responsibility for their actions, then surely this diminishes the moral core of the school? Schools are surely better places when people in positions of authority are willing to stand behind the decisions that they have made?

If one were being cynical, this could be described as an instance where data was undoubtedly helping teachers in their decision-making … but also helping them to duck out of the consequences of these decisions!

 

PS … This instance also raises an interesting point about teachers being cognisant of the socially-constructed nature of the ‘data’ that they are presented with at school.

The teacher in this instance is well aware of the subjective human input behind these allocations, but still goes along with the charade of the objective, neutral algorithm.

An example such as this might be an interesting ‘crack’ through which teachers can start to reflect on the provenance of other data that they themselves treat as unquestionable. Where should they be pushing back against being asked to conform to what the ‘computer says’?