This recent post from Springpoint consultants provides a neat overview of some of the main areas of enthusiasm currently shaping debates over the potential of school data. As might be expected, the blog post does a good job of distilling the prevailing rhetoric that has grown in the US education system in the wake of the NCLB initiative and its emphasis on data-driven school reform. As such, data is presented as a means of gaining accurate, ‘authentic’ and actionable insights into students – ‘articulating’ students’ critical needs and assets, and then using these insights to “devise important programmatic responses in the design of school”.
One notable element of such talk is the framing of school leaders as empirical investigators – drawing on diverse quantitative and qualitative data to support an ongoing cycle of deductive enquiry. Springpoint stress the importance of schools having access to raw data and engaging in a ‘cycle of enquiry’ at key points repeated across the school year, where hypotheses can be tested and then rejected or refined. Data-based improvement is therefore described as an ongoing iterative process – all rooted in the maxim that “there is no question that schools better serve students when they understand the students they serve”.
Of course, these ideals are not always easy to enact. Indeed, based on Springpoint’s experiences, the blog post then highlights a number of recurring barriers. Many of these issues are familiar, for example:
- Schools often lack adequate access to useful data – for example, data is often made available only at an aggregated district level, accessible through complex data portals and/or provided to schools at short notice for ‘reporting purposes’ only. Many schools also lack the skills and aptitude to analyse data thoroughly, while district-level data experts rarely have time to provide ‘school-facing’ analyses.
- In addition, data that is collated and processed by district-level data specialists is often limited by state requirements. For example, cross-sectional analyses may not include groups with relatively small sample sizes – often obscuring the most vulnerable and marginalised students that individual schools are most concerned with supporting.
- Another significant impediment is a lack of data interoperability – i.e. where different data systems and data-sets “are not set up to speak to one another”. In addition, inconsistent data practices by different members of staff can compound interoperability issues – rendering large swathes of school data practically unusable. In a neat turn of phrase, the authors talk of schools struggling (and quickly giving up) with such “unruly data systems”.
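The sample-size suppression noted in the second point above can be made concrete with a small sketch. The Python snippet below is purely illustrative – the threshold, subgroup labels and scores are all invented – but it shows how a district-style aggregation that suppresses any subgroup below a minimum-n reporting threshold leaves a school with no figure at all for precisely its smallest, most vulnerable groups:

```python
from collections import defaultdict

MIN_N = 10  # illustrative suppression threshold; actual state thresholds vary

# Hypothetical student records as (subgroup, test score) pairs - all invented
records = [("general", s) for s in [68, 72, 65, 75, 74, 69, 71, 66, 73, 67, 70, 70]]
records += [("newcomer_EL", s) for s in [41, 38, 44]]  # small, vulnerable group

def district_report(records, min_n=MIN_N):
    """Aggregate mean scores per subgroup, suppressing groups below min_n."""
    scores = defaultdict(list)
    for group, score in records:
        scores[group].append(score)
    report = {}
    for group, vals in scores.items():
        if len(vals) < min_n:
            report[group] = None  # suppressed: the school sees no figure at all
        else:
            report[group] = sum(vals) / len(vals)
    return report

print(district_report(records))
```

Here the (fictional) ‘newcomer_EL’ group vanishes from the report entirely, even though it may be the group the school most needs to understand.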
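To give a flavour of what the interoperability problem looks like in practice, the sketch below (again Python, with all system names, ID formats and values invented) joins attendance data from a student information system with scores from an assessment vendor – two systems keying the same students differently – by first normalising their identifiers:

```python
# Hypothetical exports from two systems that "are not set up to speak to one
# another": the same students, keyed in incompatible ways (all data invented).
sis_attendance = {"S-0042": 0.91, "S-0107": 0.84}   # SIS: "S-" prefix, zero-padded
vendor_scores = {"42": 55, "107": 61, "256": 70}    # assessment vendor: bare digits

def normalise_id(raw: str) -> int:
    """Reduce either system's student ID to a shared canonical integer key."""
    return int(raw.removeprefix("S-"))

def merge_records(attendance: dict, scores: dict) -> dict:
    """Join the two datasets on the normalised student ID."""
    merged: dict[int, dict] = {}
    for raw, rate in attendance.items():
        merged.setdefault(normalise_id(raw), {})["attendance"] = rate
    for raw, score in scores.items():
        merged.setdefault(normalise_id(raw), {})["score"] = score
    return merged

print(merge_records(sis_attendance, vendor_scores))
```

The normalisation step is trivial here; in real school systems the mismatches are rarely this tidy, which is why consistent data-entry conventions matter so much.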
Addressing these issues is therefore seen to require various shifts and changes. Firstly, the authors spend time emphasising the importance of specialist agencies and individuals working at the state and district levels. These data specialists need to expand their focus beyond ‘state reporting requirements’ and, instead, take a greater interest in supporting the use of this data within individual schools. This might include taking responsibility for training school staff, or conducting school-specific ‘data cuts’ throughout the academic year for individual schools to then work further with.
Secondly, in terms of fostering a culture of data interoperability, it is also suggested that school leaders take responsibility for developing standard operating requirements for data systems and school-based data entry. Issues of interoperability should also be a key factor in any procurement decisions that schools make about data-based systems and products.
Finally, it is argued that school leaders need to be supported in developing their data ‘aptitude’. This involves being able to formulate questions that are appropriate in terms of their granularity and specificity, and then to follow through on the consequences of any data-driven changes on a cyclical basis – as the blog post puts it, taking time to “see the results, refine, and repeat”.