‘Measure what matters’? Emotion detection and behaviour screening in schools

Most schools see ‘data’ primarily in terms of: (i) performance data (i.e. students’ grades from in-school tests, high-stakes examinations and various other assessments), and (ii) attendance data (i.e. class roll calls, late attendance data, and so on).

Alongside this, however, is a growing interest in gaining a better sense of ‘who students are’ and ‘how students are doing’. To date, most schools are tentatively making use of ‘student satisfaction’ surveys, compiling student ‘personal profiles’ and other low-key indicators that might ‘add a bit of colour’ to the baseline data described above.

At the same time, however, the EdTech industry is developing and selling all manner of data-driven ‘solutions’ whose ambitions stretch well beyond administering a quick Google Form survey at the end of the school term.

Take, for instance, the growing number of software products pitched toward forms of behavioural and emotional detection. These include various products that generate analyses of students' 'states and traits' – gauging levels of engagement with schoolwork, detecting when students might be becoming frustrated or demotivated, and generally responding to students in ways intended to keep them learning.

In some cases, the data that these judgements are based on might simply come from student ‘clicks’ on a school’s learning management system or piece of learning software. More sophisticated systems might use webcams to glean data from students’ facial expressions or eye movements – generating hundreds of data-points that might provide an indication of how a student is feeling.

Current examples of such products include Pearson’s ‘aimswebPlus’ software. This is sold to schools and colleges with the claim that its online ‘Behavioral and Emotional Screening System’ offers “a quick and reliable method for measuring behavioral and emotional strengths and weaknesses” after only 2-4 minutes of online assessment. Other systems purport to provide accurate measures of a student’s “willingness to learn” and “personal stability”.

While the validity of these claims is hotly contested, these systems are beginning to be adopted by some schools and districts in the US. So might software such as this (and the inherent logic of digitally screening students' emotions, feelings and intentions) become an established part of school practice in the near future?

At the moment, none of our Australian research schools are making use of such systems. In fact, there appears to be a reluctance amongst most of the teachers that we speak with toward the over-use of such 'analytics'. As one school leader from Brookdale High School put it:

“A lot of the [analytics] products that we see come from higher education. It makes sense there – universities are constantly worried about students dropping out, or not coming into class and slacking off. Their students are adults who have to choose to learn. It’s different here. Our students *have* to come in each day. They *have* to attend class. [Student] retention isn’t an issue for us. How well they do in their exam *is*.”

Nevertheless, it will be interesting to see if this resolve continues when these analytic features begin to be integrated into larger school systems, and/or begin to be offered in simplified form as ‘free’ apps for teachers to download onto their own tablets and use in the classroom. We will certainly be exploring the creep of these forms of ‘school data’ throughout our three years working in Northland, Brookdale and Weston.