By Daniela Doyle
When I begin working with a new school district to understand a challenge it's facing and how best to proceed, I always start with the data.
More specifically, I explore whatever quantitative data I can get my hands on. Those data, I’ve learned, hold all kinds of lessons. They helped me to uncover that a district was only “sort of” implementing its new budgeting formula as it tried to maintain the sacred link between teacher pay and years of service. They showed that a nonprofit’s pricing model fell far short of its costs, so it would have to increase its rates or find a stopgap until it could reach scale. And then there was the dataset I created mapping my team’s capacity against the tasks on our plate. It made clear that every member of my team would need to work 12 hours a day for the next six months to get it all done – or we could grow the team.
Provocative as some of these findings were, they were never gotcha moments. On the contrary, they opened the door to more questions and pushed the organizations I was working with to confront the realities of their decisions and consider next steps that would help them achieve their goals in light of those realities.
Data-driven instruction has long been a buzzword in education. But after nearly two decades working in the field, I'm struck by how often school districts struggle to use data to drive their own decision-making. In particular, I've seen the same three challenges pop up over and over again:
Districts often don’t collect all the data they need to answer their most pressing questions. For example, a district recently asked me about historical trends for a special education program, but the district could only share data from the last five months. Another time, a district wanted to know the impact of a new staffing model on student performance, but there wasn’t a marker capturing which students were learning from teachers participating in the model. And more than once, human resources data have failed to capture when a teacher stopped teaching (or whether they moved into an administrative position or left the profession altogether), so my team couldn’t determine how turnover rates were changing year-over-year. Quite simply, without the right data, districts can’t answer the right questions.
Even if districts are collecting the right data, they often struggle to do so in a way that allows them to analyze it. I've seen districts collect all the data they need, but do so across a variety of programs that don't automatically link to one another – such as PowerSchool, Google Sheets, and a Word doc on the previous director's laptop. I've also seen districts use different ID numbers in different systems for the same students – and spent days creating crosswalks by matching on names (want to guess how often a child's name is spelled differently in different data files?!). The choices districts make about when and how they collect data have huge implications for the analysis that follows (or doesn't). Moreover, when there isn't an easy way to pull the relevant pieces together into a single dataset, odds are a district won't leverage all the data it has to offer.
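To make the crosswalk problem concrete, here is a minimal sketch of linking the same students across two systems that use different IDs, matching on (imperfectly spelled) names. The rosters, column names, and similarity threshold are all hypothetical; real work of this kind also needs manual review of the near-misses.

```python
from difflib import SequenceMatcher

# Hypothetical exports from two systems that assign different student IDs.
sis_roster = [
    {"sis_id": "1001", "name": "Jonathan Smith"},
    {"sis_id": "1002", "name": "Maria Garcia"},
]
assessment_roster = [
    {"test_id": "A-77", "name": "Jonathon Smith"},  # note the spelling variant
    {"test_id": "A-78", "name": "Maria Garcia"},
]

def similarity(a, b):
    """Rough score in [0, 1] of how alike two names are, ignoring case."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def build_crosswalk(left, right, threshold=0.85):
    """Pair each left record with its best-matching right record, if close enough."""
    crosswalk = []
    for rec in left:
        best = max(right, key=lambda r: similarity(rec["name"], r["name"]))
        if similarity(rec["name"], best["name"]) >= threshold:
            crosswalk.append({"sis_id": rec["sis_id"], "test_id": best["test_id"]})
    return crosswalk

print(build_crosswalk(sis_roster, assessment_roster))
```

Even in this toy version, the misspelled "Jonathon" still links correctly – which is exactly why matching on names alone, at scale, eats days of an analyst's time.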
Many school districts balk at investing in the talent needed to collect and analyze data effectively. Most people can filter information on a spreadsheet, but real data analysts speak a whole other language. They understand the commands needed to analyze the data, when to create an indicator variable, and the data infrastructure a district should build up front to prepare for the changes and questions it will face three years in the future. The best data analysts also understand how the education programs they're examining work. These are all skills learned and honed over time; they are not skills everyone who has opened a free Google Sheet can execute.
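An "indicator variable" is simpler than it sounds: a 0/1 flag that marks group membership so outcomes can later be compared across groups. As a small, hypothetical sketch tied to the earlier staffing-model example (the IDs and scores are invented):

```python
# Hypothetical: flag which students learned from a teacher participating in a
# new staffing model, so their performance can later be compared to peers'.
participating_teachers = {"T-14", "T-22"}

students = [
    {"student_id": "1001", "teacher_id": "T-14", "score": 82},
    {"student_id": "1002", "teacher_id": "T-09", "score": 75},
]

for s in students:
    # 1 if the student's teacher is in the model, 0 otherwise.
    s["in_new_model"] = 1 if s["teacher_id"] in participating_teachers else 0

print(students)
```

The flag itself is trivial; the analyst's judgment lies in knowing it must be captured at data-collection time – which is precisely the marker the district above never created.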
I recently worked with a district that was required to report on a long list of questions about one of its programs. The team responding to the questions was doing everything it could to gather the necessary information, but there were gaps in what they had, the data crossed several files, and data analysis wasn’t a formal part of anyone’s job description.
My team worked with the district to create a single dataset with all the information it needed. Rewarding as it was to hear how useful they found our work, I couldn't help but despair that they had been left without that information to begin with.
The good news is that more districts seem to be making better use of data. Laurel Public Schools in Montana developed a rubric that allowed it to code evidence-based literacy practices and identify gaps in instruction. Mashpee Public Schools in Massachusetts adopted a new tool that mines its student performance data and flags students who need additional support. And in a 2022 survey, the Data Quality Campaign found that 93% of school superintendents started collecting new data during the pandemic.
Data isn't just for quant majors; it's a fundamental tool that allows organizations to learn from the past and chart a path towards a better, brighter future. Without it, districts have no choice but to guess at what works and how to get the greatest impact out of their scarce resources. Any organization that truly wants to improve needs to put data analysis at the center of its work.
Daniela Doyle is a Director at Thru, based in North Carolina, and is focused on PK-12 research and district-wide strategies, as well as non-profit operations and leadership. She can be reached at email@example.com