Instead, the EI focused on changing how teachers made decisions in their classrooms. The reforms were built around the idea that data can be transformative, but only if people know how to use it. To change students’ lives, educators had to understand how to transform all the spreadsheets and statistics and online dashboards into insights and plans. They had to be forced to interact with data until it influenced how they behaved.

By the time Dante entered the third grade, two years after the EI started, the program was already so successful it was hailed by the White House as a model of inner-city reform. South Avondale’s test scores went up so much that the school earned an “excellent” rating from state officials. By the end of Dante’s third-grade year, 80 percent of his classmates were reading at grade level; 84 percent passed the state math exam. The school had quadrupled the number of students meeting the state’s guidelines. “South Avondale drastically improved student academic performance in the 2010–11 academic year and changed the culture of the school,” a review by the school district read.

The school’s transformation was so startling that researchers from around the nation soon began traveling to Cincinnati to figure out what the Elementary Initiative was doing right. When those researchers visited South Avondale, teachers told them that the most important ingredient in the schools’ turnaround was data—the same data, in fact, that the district had been collecting for years. Teachers said that a “data-driven culture” had actually transformed how they made classroom decisions. When pressed, however, those teachers also said they rarely looked at the online dashboards or memos or spreadsheets the central office sent around. In fact, the EI was succeeding because teachers had been ordered to set aside those slick data tools and fancy software—and were told instead to start manipulating information by hand.
This reminds me of something I have seen far too often in organizations that set out to build a “data-driven culture”. Usually, a group of data analysts and engineers gathers superficial requirements from the business teams, then builds all the dashboards and metrics for the end users with little feedback along the way.
For these end users, the underlying logic behind the metrics remains a black box. No attempt is made to internalize how the metrics are calculated. The “data-driven culture” goes no further than cursory discussions of the numbers at the regular weekly meeting.
Should we blame the engineers or the business users? Neither, really; it is simply human nature. It is extremely difficult for us to internalize abstract things like numbers and metrics unless we have worked through the calculation logic ourselves. As the passage shows, only by making the teachers manipulate the information themselves could the district get them to use data to improve their performance.
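To make that concrete, here is a minimal, entirely hypothetical sketch in Python of what “manipulating the information by hand” might look like for an analyst or a teacher: instead of reading a pass rate off a dashboard, you recompute it yourself from the raw records. The student names, scores, and the passing threshold below are invented for illustration; the point is the act of walking through the calculation, not the numbers.

```python
# A toy, hand-rolled version of the kind of metric a dashboard would
# normally report for you. All data below is invented for illustration.

# Raw records: one (student, reading_score, math_score) tuple per student.
records = [
    ("Dante", 78, 91),
    ("Aisha", 85, 67),
    ("Marcus", 62, 74),
    ("Lena", 90, 88),
]

PASS_MARK = 70  # assumed passing threshold, purely hypothetical


def pass_rate(scores, threshold=PASS_MARK):
    """Fraction of scores at or above the threshold, computed step by step."""
    passed = [s for s in scores if s >= threshold]
    return len(passed) / len(scores)


reading_rate = pass_rate([reading for _, reading, _ in records])
math_rate = pass_rate([math for _, _, math in records])

# Printing the intermediate result is the point: you see exactly which
# students cleared the bar and how the headline percentage was formed,
# rather than accepting a single number from a report.
print(f"Reading at/above threshold: {reading_rate:.0%}")
print(f"Math at/above threshold:    {math_rate:.0%}")
```

The specifics do not matter; what matters is that nothing in the result is a black box once you have produced it yourself.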
“What I cannot create, I do not understand” – Richard Feynman. The quote feels ever more relevant in the context of this anecdote. To truly understand your data, you have to compel yourself to work through the details and the chain of logic on your own. Have fun, or not :p